Archives

The Social Science Prediction Platform Makes Its Debut

One Question + Many Researchers = Many Different Answers

[Excerpts taken from the article “Crowdsourcing hypothesis tests: Making transparent how design choices shape research results” by Justin Landy and many others, posted at the University of Essex preprint repository] “…we introduce a crowdsourced approach to hypothesis testing….

The Garden of Forking Paths Strikes Again!

[From the abstract of the article “Many Analysts, One Data Set: Making Transparent How Variations in Analytic Choices Affect Results”, published by Silberzahn et al. in Advances in Methods and Practices in Psychological Science] “Twenty-nine teams involving 61 analysts used the same…

A Unified Framework for Quantifying Scientific Credibility?

[From the abstract of the paper “A Unified Framework to Quantify the Credibility of Scientific Findings”, by Etienne LeBel, Randy McCarthy, Brian Earp, Malte Elson, and Wolf Vanpaemel, published in the journal Advances in Methods and Practices in Psychological Science] “…we…

Anyone Interested in Participating in an Economics Replication Experiment?

[From the “2018 Economics Replication Project” posted by Nick Huntington-Klein and Andy Gill of California State University, Fullerton] “In this project, we are asking recruited researchers to perform a “blind” replication of one of two studies. Without telling researchers the…

CLAIRE BOEING-REICHER: Crowdsourcing a Journal’s Replication Policy

As reported in a previous blog post, the Economics E-Journal has launched a new replication section. As part of this initiative, we have developed a set of guidelines for replication submissions. These guidelines seek to strike a reasonable balance among…
