The Scientific Scourge Of Fake Data

Almost 10 years ago, a significant study on personal honesty was published. It indicated that a simple method reduced lying by respondents who were filling out forms: if people signed an honesty declaration at the beginning of the form, rather than at the end, they were reportedly less likely to lie in their answers. The study was cited by other researchers and featured in a bestselling book written by one of its principal authors.

Now that study is being retracted. Over the years, efforts to replicate its results have been unsuccessful, but a more serious issue has since been uncovered. Academics who took a close look at the underlying data have determined that one of the study's main experiments was faked and that the associated data is fraudulent. The researchers who published the study agree and have asked the journal that published it, the Proceedings of the National Academy of Sciences, to formally retract it.

It’s ironic that a study drawing conclusions about personal honesty would be based on fake data, but it’s the latest high-profile example of a significant problem in the scientific community, one some have called the “replication crisis.” We remember from our high school science classes that the scientific method involves developing a hypothesis, designing and conducting an experiment to test it, describing the experiment and honestly publishing its results, and then letting the rest of the scientific community challenge the hypothesis, the experiment, and the data. The last step, in which other scientists play the role of skeptic, fact-checker, and verifier by trying to replicate the experiment and test its results, is a key part of the whole process. And in the past, peer-reviewed journals played an important role in ensuring that experimental results could, in fact, be faithfully replicated and that the conclusions drawn were credible.

But something has obviously gone wrong: a number of high-profile research findings can’t be replicated, and there is increasing concern that data isn’t being collected or reported honestly or accurately. The social sciences, which encompass the honesty study noted above, have been especially affected by the replication problem. And in the case of the honesty study, no one seems to know how the faked data was created in the first place. Four of the five authors of the study say they weren’t involved with collecting the false data, and the fifth denies that he had anything to do with it. So how did it happen, and why didn’t the authors carefully review the data and question its bona fides before publishing their results? Some observers wonder whether the behavioral studies that are now a staple of news feeds are being shaped by the desire to generate headlines and clicks, leading researchers to overlook questionable data or methodologies.

You see signs these days that say “science is real.” That’s obviously true, but the replication crisis demonstrates that not all scientific results are real. There’s nothing wrong with maintaining a healthy skepticism about groundbreaking studies or sweeping pronouncements until the underlying data has been thoroughly vetted and other researchers have replicated the results. As our high school science teachers taught us, that’s what should have been happening all along.
