Big New Replication Study in Nature! Read All About It!
[From the abstract of the article “Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015”, published in Nature Human Behaviour by Colin Camerer et al.]
“Being able to replicate scientific findings is crucial for scientific progress. We replicate 21 systematically selected experimental studies in the social sciences published in Nature and Science between 2010 and 2015. The replications follow analysis plans reviewed by the original authors and pre-registered prior to the replications. The replications are high powered, with sample sizes on average about five times higher than in the original studies. We find a significant effect in the same direction as the original study for 13 (62%) studies, and the effect size of the replications is on average about 50% of the original effect size. Replicability varies between 12 (57%) and 14 (67%) studies for complementary replicability indicators. Consistent with these results, the estimated true-positive rate is 67% in a Bayesian analysis. The relative effect size of true positives is estimated to be 71%, suggesting that both false positives and inflated effect sizes of true positives contribute to imperfect reproducibility. Furthermore, we find that peer beliefs of replicability are strongly related to replicability, suggesting that the research community could predict which results would replicate and that failures to replicate were not the result of chance alone.”
To read the article, click here.
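For readers who want to check the headline proportions in the abstract against the raw counts, here is a minimal sketch in Python. The counts (13, 12 and 14 out of 21 studies) come straight from the quoted text; the Wilson interval at the end is my own illustrative addition for the sampling uncertainty around 13/21, and is not part of the paper’s analysis.

```python
# Back-of-the-envelope check of the headline numbers quoted in the abstract.
# The counts (13, 12, 14 out of 21) are taken from the quoted text; the Wilson
# interval below is an illustrative addition, not an analysis from the paper.
from math import sqrt

N_STUDIES = 21    # studies replicated
REPLICATED = 13   # significant effect in the same direction as the original

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion (illustrative only)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

print(f"primary indicator: {REPLICATED}/{N_STUDIES} = {REPLICATED / N_STUDIES:.0%}")  # 62%
print(f"complementary indicators: {12/21:.0%} to {14/21:.0%}")                        # 57% to 67%
lo, hi = wilson_interval(REPLICATED, N_STUDIES)
print(f"illustrative 95% Wilson interval for 13/21: {lo:.0%} to {hi:.0%}")
```

Running this prints 62%, 57% to 67%, and an interval of roughly 41% to 79% for the primary indicator, a reminder of how much sampling uncertainty remains with only 21 studies.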
The study has generated much buzz in the media:
Buzzfeed: “In the last four years, 125 journals, mostly in the behavioral sciences, have adopted ‘registered reports,’ … Similarly, more than 20,000 studies have been preregistered on the Center for Open Science’s website.”
Science News: “‘Replication crisis’ spurs reforms in how science studies are done”.
The Atlantic: “Fortunately, there are signs of progress. The number of pre-registered experiments—in which researchers lay out all their plans beforehand to obviate the possibility of p-hacking—has been doubling every year since 2012.”
NPR: “‘The social-behavioral sciences are in the midst of a reformation’ … Scientists are … announcing in advance the hypothesis they are testing; they are making their data and computer code available so their peers can evaluate and check their results.”
Wired: “Thousands of researchers now pre-register their methodology and hypothesis before publication, to head off concerns that they’ll massage data after the fact.”
Washington Post: “the experiments scrutinized in this latest effort were published prior to a decision several years ago by Science, Nature and other journals to adopt new guidelines designed to increase reproducibility, in part by greater sharing of data”.