Archives
[From an announcement on the BITSS website] “Research Transparency and Reproducibility Training (RT2) provides participants with an overview of tools and best practices for transparent and reproducible social science research.” “RT2 is designed for researchers in the social and health sciences,…
[From the preprint “An empirical assessment of transparency and reproducibility-related research practices in the social sciences (2014-2017)” by Tom Hardwicke, Joshua Wallach, Mallory Kidwell, & John Ioannidis posted at MetaArXiv Preprints] “In this study, we evaluated a broad range of…
[From the preprint “Abandoning statistical significance is both sensible and practical” by Valentin Amrhein, Andrew Gelman, Sander Greenland, and Blakely McShane, available at PeerJ Preprints] “Dr Ioannidis writes against our proposals to abandon statistical significance…” “…we disagree that a statistical…
[From the article, “Effect of Impact Factor and Discipline on Journal Data Sharing Policies” by David Resnik et al., published in Accountability in Research] “…we coded … 447 journals … The breakdown was: 18.1% biological sciences, 18.8% clinical sciences, 21.7%…
[From the article “Assessing data availability and research reproducibility in hydrology and water resources” by Stagge, Rosenberg, Abdallah, Akbar, Attallah & James, published in Nature’s Scientific Data] “…reproducibility requires multiple, progressive components such as (i) all data, models, code, directions,…
[From the article “DARPA Wants to Solve Science’s Reproducibility Crisis With AI” by Adam Rogers, published in Wired] “A Darpa program called Systematizing Confidence in Open Research and Evidence—yes, SCORE—aims to assign a “credibility score” … to research findings in the…
[From the working paper, “Open science and modified funding lotteries can impede the natural selection of bad science” by Paul Smaldino, Matthew Turner, and Pablo Contreras Kallens, posted at OSF Preprints] “…we investigate the influence of three key factors on the…
[From the press release “Can machines determine the credibility of research claims? The Center for Open Science joins a new DARPA program to find out” from the Center for Open Science] “The Center for Open Science (COS) has been selected…
[From the article “Push button replication: Is impact evaluation evidence for international development verifiable?” by Benjamin Wood, Rui Müller, and Annette Brown, published in PLoS ONE] “…We drew a sample of articles from the ten journals that published the most…
[From the article “Reproducibility of Scientific Results”, by Fiona Fidler and John Wilcox, published in The Stanford Encyclopedia of Philosophy] “This review consists of four distinct parts. First, we look at the term “reproducibility” and related terms like “repeatability” and…