Science isn’t always science. At least, that’s what many of us were reminded of recently with the revelation that a retracted study by Dr. Andrew Wakefield linking autism to the MMR vaccine was an “elaborate fraud,” according to an investigation by the British medical journal BMJ.
My colleague Jacob Serebrin wrote that this was one example of how inept peer review and the scientific method can be at detecting fraudulent research. While better safeguards may need to be put in place to ensure studies like Wakefield’s never see the light of day, there may never be a way to make the peer review system bulletproof. In the meantime, in an age where information is more accessible than ever before, and misinformation can spread more quickly than ever before, there needs to be more transparency in how scientific research is conducted and reviewed.
To put it simply, people need to be more aware of how science becomes science.
Brian Deer, the British journalist who helped expose Wakefield’s study as a “fraud,” told CTV News that the issues with the study were not about vaccines but “about the integrity of science, generally.” He pointed out that it’s often difficult for the average person to decipher the complicated nature of medical research.
“Science publications are all ‘anonymized’: you don’t know anything about the patients, where the data comes from. You can’t check it for yourself, unlike newspapers and broadcasting where other competitive organizations and the public generally can check if the facts are true,” Deer said.
University of British Columbia professor Rosie Redfield shed some light on how scientific research becomes accepted as scientific fact when she heavily criticized a paper by a NASA researcher, published in the journal Science, which claimed that bacteria grown in arsenic instead of phosphorus were able to survive, and that the arsenic had replaced phosphorus as one of the chemical components of the bacteria’s DNA. If true, the finding would broaden the possibilities for life on other planets. However, Redfield wrote that the paper had “lots of flim-flam, but very little reliable information.”
“If this data was presented by a PhD student at their committee meeting, I’d send them back to the bench to do more cleanup and controls,” she wrote on her blog.
By critiquing this research online, Redfield not only raised a red flag for a potentially flawed piece of research, but also opened up what is usually an internal debate between scientists to the general public.
CBC science and technology columnist Stephen Strauss wrote that Redfield’s writing allowed non-scientists to “understand how much and how often scientific findings aren’t seen by other scientists as The Truth, but rather a tentative step toward such a truth,” and declared her blog the Canadian science story of the year.
“What she did, almost by accident, is illustrate dramatically that the methodology of scientific evaluation has changed,” Strauss wrote.
In a later blog post, Redfield noted that before the internet, informal discussions between researchers were held face-to-face, by mail and over the phone. Those discussions often influenced the formal papers but “weren’t available to anyone but the direct participants,” she wrote.
“Now that we’re all online, published papers are also being discussed more publicly, in blogs and other places. Such discussions are extraordinarily valuable for the progress of science — they’re written public evaluations, drawn from a wide range of expertise, and usually greatly enriched by comments from and links [to] other researchers.” She also suggested that bloggers should be included in these discussions.
However, NASA spokesperson Dwayne Brown told CBC News that the agency felt peer-reviewed material should be debated only in scientific publications, igniting the wrath of Wired blogger David Dobbs, who called Brown’s statements “pre-enlightenment thinking.”
“Even the best peer-reviewed journals make mistakes. Hype can take over. Groupthink can rule. People screw up. And sometimes journals defend mistakes by refusing to publish sharp critiques of them. All this stuff happens, and not just once in a blue moon,” Dobbs wrote.
To truly understand science, you need to think like a scientist. That doesn’t mean distrusting science; it means thinking critically about the studies you read, and about how the research behind their conclusions was conducted, before making up your mind.