In the mists of an old blogger's memory, there is a punch line to a hoary old joke that went something like, "Patience, jackass." The moral of today's post for all Dumbos/Teabaggers under every rock is the equivalent: "Patience, jackass(es)." The Dumbos/Teabaggers get all of their information about the world from Faux News and that black hole of journalism spews the anti-science blather that most Dumbos/Teabaggers quote verbatim. If this is a (fair & balanced) antidote to science stupidity, so be it.
[x New Yorker]
Cleaning Up Science
By Gary Marcus
A lot of scientists have been busted recently for making up data and fudging statistics. One case involves a Harvard professor whom I once knew and worked with; another a Dutch social psychologist who made up results by the bushel. Medicine, too, has seen a rash of scientific foul play; perhaps most notably, the dubious idea that vaccines could cause autism appears to have been a hoax perpetrated by a scientific cheat. A blog called RetractionWatch publishes depressing notices almost daily. One recent post mentioned that a peer-review site had been hacked; others detail misconduct in dentistry, cancer research, and neuroscience. And that’s just in the last week.
Even if cases of scientific fraud and misconduct were set aside entirely, my field (and several other fields of science, including medicine) would still be in turmoil. One recent examination of fifty-three medical studies found that further research was unable to replicate forty-seven of them. All too often, scientists muck about with pilot studies and keep tweaking something until they get the result they were hoping to achieve. Unfortunately, each fresh effort increases the risk of getting the right result for the wrong reason, and of winding up with a spurious finding that doesn’t turn out to be scientifically robust, like a cancer drug that seems to work in trials but fails to work in the real world.
How on Earth are we going to do better? Here are six suggestions, drawn mainly from a just-published special issue of the journal Perspectives on Psychological Science. Two dozen articles offer valuable lessons not only for psychology, but for all consumers and producers of experimental science, from physics to neuroscience to medicine.
Restructure the incentives in science. For many reasons, science has become a race for the swift, but not necessarily the careful. Grants, tenure, and publishing all depend on flashy, surprising results. It is difficult to publish a study that merely replicates a predecessor, and it’s difficult to get tenure (or grants, or a first faculty job) without publications in elite journals. The stretch from the time a young scientist starts a Ph.D. to the time they are up for tenure is typically thirteen years (or more), at the end of which the no-longer-young apprentice might find themselves out of a job. It is perhaps small wonder, in hindsight, that some wind up cutting corners. Instead of, for example, rewarding scientists largely for the number of papers they publish—which credits quick, sloppy results that might not be reliable—we might reward scientists to a greater degree for producing solid, trustworthy research that other people are able to successfully replicate and then extend.
Encourage people to publish studies that fail, as well as those that succeed. Thomas Edison realized that the key to inventing a light bulb was to keep track of every experiment that didn’t work. But few journals nowadays are willing to publish efforts that don’t yield immediate fruit, and young scientists who lack immediate success rarely get grants. Without a faithful record of solid efforts that have been tried and have failed, the scientific picture inevitably becomes distorted, like the memory of a gambler who remembers only his wins. A registry of studies, in which methods are declared before experiments are conducted so that scientists cannot simply go fishing for analyses that fit their preconceived notions, can go a long way toward helping with this and other related problems. Fuller disclosure of experimental methods can help, too.
Recognize that no single study ever proves anything. Without replication, all results should be taken as preliminary. Here, both science and the media are complicit; there is a tendency to trumpet every new finding as if it proved something, but most new studies are merely evidence toward a conclusion, not the conclusion itself. Everyone—from the public, to the media, to Congress, to the scientists themselves—needs to be more patient.
Promote meta-analysis. This technique combines results from many different labs, allowing researchers to distinguish tiny but consistent effects that are hard to see in individual studies from spurious effects that occurred by accident. The Cochrane Collaboration’s meta-analytic reviews have become a vital part of evidence-based medicine, and a new website called Open Science Framework is laying the groundwork for more systematic meta-analysis in all areas of science.
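To make the idea of pooling concrete, here is a minimal sketch (in Python) of the simplest flavor of meta-analysis, inverse-variance or “fixed-effect” pooling. The lab names and numbers below are invented purely for illustration and come from no real study; real meta-analyses, such as Cochrane’s, add heterogeneity checks and bias corrections that this toy example omits.

    import math

    # (lab name, effect estimate, standard error) -- all numbers invented for illustration
    studies = [
        ("Lab A", 0.30, 0.20),
        ("Lab B", 0.10, 0.15),
        ("Lab C", 0.25, 0.25),
        ("Lab D", 0.18, 0.10),
    ]

    # Weight each study by its precision (1 / variance), so noisier studies count for less.
    weights = [1.0 / (se ** 2) for _, _, se in studies]
    pooled = sum(w * est for (_, est, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))

    low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print(f"Pooled effect: {pooled:.2f} (95% CI {low:.2f} to {high:.2f})")
    # In this made-up example, no single lab's confidence interval excludes zero,
    # but the pooled estimate's does: a small, consistent effect emerges from the
    # combination, while a one-off fluke in any single lab would be down-weighted.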
Create an ethical code. Hippocrates was right: Every profession needs formal standards of conduct. Doctors, lawyers, and engineers have them; science should not be exempt. There has been some movement in this direction; there needs to be more so that young scientists know what is expected of them ethically as well as intellectually.
Give science some cops. As bitter experience has shown, even the most elite can turn out to be cheats. Business has its Madoffs; science has its own set of high-fliers whose work is too good to be true. A bright spot in the last couple of years has been the emergence of a new breed of detectives, one that uses statistics rather than identikits to ferret out dubious results. “Meta-researchers” like John Ioannidis have led the way in medicine, and now researchers like Uri Simonsohn are doing the same for psychology.
In the long run, science is self-correcting. Ptolemy’s epicycles were replaced by Copernicus’s heliocentric system. The theory that stomach ulcers were caused by spicy foods has been replaced by the discovery that many ulcers are caused by a bacterium. The dogma that primates never grow new neurons held sway for forty years, based on relatively little evidence, but was finally chucked when scientists revisited the old question with better methods that had newly become available. The best science is cumulative, not just a list of fun results; as people push deeper, invalid ideas eventually crumble. Even if nothing changed, we would eventually achieve the deep understanding that all scientists strive for. But there is no doubt that we can get there faster if we clean up our act.
The good news is that scientists across many fields have finally begun to reckon with the magnitude of the many challenges we are collectively facing. The articles in last month’s issue of Perspectives on Psychological Science have already been downloaded over two hundred thousand times (the academic equivalent of going platinum). And new venues and consortia like PLoS, PeerJ, and Open Science Framework are leading the way toward improved systems for screening new results. Those of us working in science have lately learned some very hard lessons, but in the long run science will be better for it. Ω
[Gary Marcus is a Professor of Psychology at New York University. He received a BA from Hampshire College and a PhD from the Massachusetts Institute of Technology. He has written The Birth of the Mind (2003), Kluge (2008), and Guitar Zero: The Science of Becoming Musical At Any Age (2012).]
Copyright © 2012 Condé Nast Digital
Sapper's (Fair & Balanced) Rants & Raves by Neil Sapper is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. Based on a work at sapper.blogspot.com. Permissions beyond the scope of this license may be available here.
Copyright © 2012 Sapper's (Fair & Balanced) Rants & Raves