Archives


Intro to Open Science in 8 Easy Steps

[From the working paper, “8 Easy Steps to Open Science: An Annotated Reading List” by Sophia Crüwell et al., posted at PsyArXiv Preprints] “In this paper, we provide a comprehensive and concise introduction to open science practices and resources that can help…


Modelling Reproducibility

[From the preprint “A Model-Centric Analysis of Openness, Replication, and Reproducibility”, by Bert Baumgaertner, Berna Devezer, Erkan Buzbas, and Luis Nardin, posted at arXiv.org] “In order to clearly specify the conditions under which we may or may not obtain reproducible results,…


REED: An Update on the Progress of Replications in Economics

[This post is based on a presentation by Bob Reed at the Workshop on Reproducibility and Integrity in Scientific Research, held at the University of Canterbury, New Zealand, on October 26, 2018] In 2015, Duvendack, Palmer-Jones, and Reed (DPJ&R) published…


HIRSCHAUER et al.: Why replication is a nonsense exercise if we stick to dichotomous significance thinking and neglect the p-value’s sample-to-sample variability

[This blog is based on the paper “Pitfalls of significance testing and p-value variability: An econometrics perspective” by Norbert Hirschauer, Sven Grüner, Oliver Mußhoff, and Claudia Becker, Statistics Surveys 12(2018): 136-172.] Replication studies are often regarded as the means to…
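
To see what “sample-to-sample variability” means in practice, here is a minimal Python sketch (the effect size, group size, and seed are illustrative assumptions, not values from Hirschauer et al.): repeated samples from the same population, with the same true effect, yield p-values that swing from well below 0.05 to well above it.

```python
# Minimal sketch (illustrative values, not from Hirschauer et al.):
# the p-value is itself a random variable with large sample-to-sample spread.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2018)
true_effect, sd, n, reps = 0.3, 1.0, 50, 10_000  # assumed illustrative values

pvals = np.empty(reps)
for i in range(reps):
    control = rng.normal(0.0, sd, n)          # same population every time
    treated = rng.normal(true_effect, sd, n)  # same true effect every time
    pvals[i] = stats.ttest_ind(treated, control).pvalue

print(f"5th-95th percentile of p: {np.percentile(pvals, 5):.4f} "
      f"to {np.percentile(pvals, 95):.4f}")
print(f"share significant at 0.05: {(pvals < 0.05).mean():.2f}")  # ~ power
```

Under these assumed settings only about a third of replications come out significant, so a “failed” replication of a “successful” study is the expected outcome of dichotomous significance thinking, not evidence that either study got it wrong.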


ALL INVITED: Workshop on Reproducibility and Integrity in Scientific Research

DATE: Friday, 26 October 2018. PLACE: University of Canterbury, Business School, Meremere, Room 236, Christchurch, NEW ZEALAND. REGISTRATION (important for catering purposes): email to tom.coupe@canterbury.ac.nz. COST: Nada ($0). Supported by the University of Canterbury Business School Research Committee. OVERVIEW: There is more…


And How Are Things Going In Political Science?

[From the working paper “Why Too Many Political Science Findings Cannot be Trusted and What We Can Do About It” by Alexander Wuttke, posted at SocArXiv Papers] “…this article reviewed the meta-scientific evidence with a focus on the quantitative political science…


VASISHTH: The Statistical Significance Filter Leads To Overoptimistic Expectations of Replicability

[This blog draws on the article “The statistical significance filter leads to overoptimistic expectations of replicability”, authored by Shravan Vasishth, Daniela Mertzen, Lena A. Jäger, and Andrew Gelman, published in the Journal of Memory and Language, 103, 151-175, 2018. An open…


Significant Effects From Low-Powered Studies Will Be Overestimates

[From the article, “The statistical significance filter leads to overoptimistic expectations of replicability” by Shravan Vasishth, Daniela Mertzen, Lena Jäger, and Andrew Gelman, published in the Journal of Memory and Language] Highlights: “When low-powered studies show significant effects, these will…
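
The highlighted claim can be illustrated with a short simulation (all numbers below are assumptions for illustration, not the paper’s data): when power is low, only estimates that happen to come out large clear the p < 0.05 threshold, so conditioning on significance inflates the apparent effect.

```python
# Minimal sketch (illustrative values, not from Vasishth et al.):
# with low power, significant estimates systematically overestimate the effect.
import numpy as np
from scipy import stats

rng = np.random.default_rng(103)
true_effect, sd, n, reps = 0.2, 1.0, 20, 20_000  # small n -> low power

estimates = np.empty(reps)
significant = np.empty(reps, dtype=bool)
for i in range(reps):
    sample = rng.normal(true_effect, sd, n)
    estimates[i] = sample.mean()
    significant[i] = stats.ttest_1samp(sample, 0.0).pvalue < 0.05

print(f"power (share significant):  {significant.mean():.2f}")
print(f"mean estimate, all studies: {estimates.mean():.2f}")   # close to 0.20
print(f"mean estimate, significant: {estimates[significant].mean():.2f}")  # inflated
```

The filter, not misconduct, produces the inflation: selecting on p < 0.05 keeps only the estimates far enough from zero to clear the significance threshold.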


Reproducibility. You Can Do This.

[From the paper, “Practical Tools and Strategies for Researchers to Increase Replicability” by Michele Nuijten, forthcoming in Developmental Medicine & Child Neurology] “Several large-scale problems are affecting the validity and reproducibility of scientific research. … Many of the suggested solutions are…


A Unified Framework for Quantifying Scientific Credibility?

[From the abstract of the paper, “A Unified Framework to Quantify the Credibility of Scientific Findings”, by Etienne LeBel, Randy McCarthy, Brian Earp, Malte Elson, and Wolf Vanpaemel, published in the journal Advances in Methods and Practices in Psychological Science] “…we…
