Archives


ALL INVITED: Workshop on Reproducibility and Integrity in Scientific Research

DATE: Friday 26 October
PLACE: University of Canterbury, Business School, Meremere, Room 236, Christchurch, NEW ZEALAND
REGISTRATION (important for catering purposes): email tom.coupe@canterbury.ac.nz
COST: Nada ($0)
Supported by the University of Canterbury Business School Research Committee.
OVERVIEW: There is more…


And How Are Things Going In Political Science?

[From the working paper “Why Too Many Political Science Findings Cannot be Trusted and What We Can Do About It” by Alexander Wuttke, posted at SocArXiv Papers] “…this article reviewed the meta-scientific evidence with a focus on the quantitative political science…


VASISHTH: The Statistical Significance Filter Leads To Overoptimistic Expectations of Replicability

[This blog draws on the article “The statistical significance filter leads to overoptimistic expectations of replicability”, authored by Shravan Vasishth, Daniela Mertzen, Lena A. Jäger, and Andrew Gelman, published in the Journal of Memory and Language, 103, 151-175, 2018. An open…


Significant Effects From Low-Powered Studies Will Be Overestimates

[From the article, “The statistical significance filter leads to overoptimistic expectations of replicability” by Shravan Vasishth, Daniela Mertzen, Lena Jäger, and Andrew Gelman, published in the Journal of Memory and Language] Highlights: “When low-powered studies show significant effects, these will…


Reproducibility. You Can Do This.

[From the paper, “Practical Tools and Strategies for Researchers to Increase Replicability” by Michele Nuijten, forthcoming in Developmental Medicine & Child Neurology] “Several large-scale problems are affecting the validity and reproducibility of scientific research. … Many of the suggested solutions are…


A Unified Framework for Quantifying Scientific Credibility?

[From the abstract of the paper, “A Unified Framework to Quantify the Credibility of Scientific Findings”, by Etienne LeBel, Randy McCarthy, Brian Earp, Malte Elson, and Wolf Vanpaemel, published in the journal, Advances in Methods and Practices in Psychological Science] “…we…


Oh No! Not Zebra Finches Too!

[From the article, “Replication Failures Highlight Biases in Ecology and Evolution Science” by Yao-Hua Law, published at http://www.the-scientist.com] “As robust efforts fail to reproduce findings of influential zebra finch studies from the 1980s, scientists discuss ways to reduce bias in such…


Making Replications Mainstream: It Takes a Research Community

Just in case you missed it, the latest issue of Behavioral and Brain Sciences includes an article by Rolf Zwaan, Alexander Etz, Richard Lucas, and Brent Donnellan entitled “Making Replications Mainstream”. It is something of a tour-de-force by four prominent…


Solving the Replication Crisis: Audits Would Help

[From the article “Randomly auditing research labs could be an affordable way to improve research quality: A simulation study” by Adrian Barnett, Pauline Zardo, and Nicholas Graves, published at PLoS One] “The “publish or perish” incentive drives many researchers to increase the…


Royal Netherlands Academy of Arts and Sciences Calls for Replications To Be a “Normal Part of Science”

[From the article “Make replication studies a normal part of science,” posted at the website of the Royal Netherlands Academy of Arts and Sciences] “The systematic replication of other researchers’ work should be a normal part of science. That is the…
