[From the working paper, “8 Easy Steps to Open Science: An Annotated Reading List” by Sophia Crüwell et al., posted at PsyArXiv Preprints] “In this paper, we provide a comprehensive and concise introduction to open science practices and resources that can help…
[From the preprint “A Model-Centric Analysis of Openness, Replication, and Reproducibility”, by Bert Baumgaertner, Berna Devezer, Erkan Buzbas, and Luis Nardin, posted at arXiv.org] “In order to clearly specify the conditions under which we may or may not obtain reproducible results,…
[This post is based on a presentation by Bob Reed at the Workshop on Reproducibility and Integrity in Scientific Research, held at the University of Canterbury, New Zealand, on October 26, 2018] In 2015, Duvendack, Palmer-Jones, and Reed (DPJ&R) published…
[This post is based on the paper “Pitfalls of significance testing and p-value variability: An econometrics perspective” by Norbert Hirschauer, Sven Grüner, Oliver Mußhoff, and Claudia Becker, Statistics Surveys 12(2018): 136-172.] Replication studies are often regarded as the means to…
DATE: Friday 26 October. PLACE: University of Canterbury, Business School, Meremere, Room 236, Christchurch, NEW ZEALAND. REGISTRATION (important for catering purposes): email tom.coupe@canterbury.ac.nz. COST: Free ($0). Supported by the University of Canterbury Business School Research Committee. OVERVIEW: There is more…
[From the working paper “Why Too Many Political Science Findings Cannot be Trusted and What We Can Do About It” by Alexander Wuttke, posted at SocArXiv Papers] “…this article reviewed the meta-scientific evidence with a focus on the quantitative political science…
[This post draws on the article “The statistical significance filter leads to overoptimistic expectations of replicability”, authored by Shravan Vasishth, Daniela Mertzen, Lena A. Jäger, and Andrew Gelman, published in the Journal of Memory and Language, 103, 151-175, 2018. An open…
[From the article, “The statistical significance filter leads to overoptimistic expectations of replicability” by Shravan Vasishth, Daniela Mertzen, Lena Jäger, and Andrew Gelman, published in the Journal of Memory and Language] Highlights: “When low-powered studies show significant effects, these will…
[From the paper, “Practical Tools and Strategies for Researchers to Increase Replicability” by Michele Nuijten, forthcoming in Developmental Medicine & Child Neurology] “Several large-scale problems are affecting the validity and reproducibility of scientific research. … Many of the suggested solutions are…
[From the abstract of the paper, “A Unified Framework to Quantify the Credibility of Scientific Findings”, by Etienne LeBel, Randy McCarthy, Brian Earp, Malte Elson, and Wolf Vanpaemel, published in the journal Advances in Methods and Practices in Psychological Science] “…we…