Archives
BROWN: Is the Evidence We Use in International Development Verifiable? Push Button Replication Provides the Answer

[This post is cross-published on FHI 360’s R&E Search for Evidence blog] There are many debates about the definitions and distinctions for replication research, particularly for internal replication research, which is conducted using the original dataset from an article or study. The…

Read More

Using Bayesian Reanalysis to Decide Which Studies to Replicate

[From the preprint “When and Why to Replicate: As Easy as 1, 2, 3?” by Sarahanne Field, Rink Hoekstra, Laura Bringmann, and Don van Ravenzwaaij, posted at PsyArXiv Preprints.] “…a flood of new replications of existing research have reached the…

Read More

Making Meta-Analyses More Replicable

[From the article “Automatic extraction of quantitative data from ClinicalTrials.gov to conduct meta-analyses” by Richeek Pradhan et al., published in the Journal of Clinical Epidemiology] “Systematic reviews and meta-analyses are labor-intensive and time-consuming. Automated extraction of quantitative data from primary…

Read More

Oh No! Not Again!

[From the article “Push button replication: Is impact evaluation evidence for international development verifiable?” by Benjamin Wood, Rui Müller, and Annette Brown, published in PLoS ONE] “…We drew a sample of articles from the ten journals that published the most…

Read More

Excellent, Cross-Disciplinary Overview of Scientific Reproducibility in the Stanford Encyclopedia of Philosophy

[From the article “Reproducibility of Scientific Results”, by Fiona Fidler and John Wilcox, published in The Stanford Encyclopedia of Philosophy] “This review consists of four distinct parts. First, we look at the term “reproducibility” and related terms like “repeatability” and…

Read More

IN THE NEWS: New York Times (November 19, 2018)

[From the article, “Essay: The Experiments Are Fascinating. But Nobody Can Repeat Them” by Andrew Gelman, published in The New York Times] “At this point, it is hardly a surprise to learn that even top scientific journals publish a lot…

Read More

Intro to Open Science in 8 Easy Steps

[From the working paper, “8 Easy Steps to Open Science: An Annotated Reading List” by Sophia Crüwell et al., posted at PsyArXiv Preprints] “In this paper, we provide a comprehensive and concise introduction to open science practices and resources that can help…

Read More

BROWN: How to Conduct a Replication Study – What Not To Do

[This post is based on a presentation by Annette Brown at the Workshop on Reproducibility and Integrity in Scientific Research, held at the University of Canterbury, New Zealand, on October 26, 2018. It is cross-published on FHI 360’s R&E Search for Evidence blog]…

Read More

What Can Meta-Analyses Tell Us About Reproducibility?

[From the abstract of the article “What Meta-Analyses Reveal About the Replicability of Psychological Research” by T.D. Stanley, Evan Carter, and Hristos Doucouliagos, published in Psychological Bulletin] “Can recent failures to replicate psychological research be explained by typical magnitudes of statistical power, bias…

Read More

BROWN: How to Conduct a Replication Study – Which Tests, Not Witch Hunts

[This post is based on a presentation by Annette Brown at the Workshop on Reproducibility and Integrity in Scientific Research, held at the University of Canterbury, New Zealand, on October 26, 2018. It is cross-published on FHI 360’s R&E Search for…

Read More