This short podcast, just 12 minutes, is worth a listen. It is an interview on Science Friday with Dan Simons, Professor of Psychology at the University of Illinois, and Barbara Spellman, former editor of Perspectives on Psychological Science, on a new…
[From the website of the Netherlands Organisation for Scientific Research (NWO)]: “NWO is making 3 million euros available for a Replication Studies pilot programme. In this programme, scientists will be able to repeat research that has been carried out by others….
[From 3ie — International Initiative for Impact Evaluation]: “3ie requests expressions of interest from researchers interested in conducting replication studies under 3ie’s Replication Window 4: Financial Services for the Poor… Funding is available to conduct internal replications of seven highly influential…
This story about academic negligence, if not outright fraud, has many similarities with previous posts about “data mistakes,” though there is enough unique in the story to make it interesting in its own right. To paraphrase Tolstoy, “each unhappy article…
Recently, GARRETT CHRISTENSEN, project scientist at BITSS, reviewed the literature on research transparency in a talk at the Western Economic Association meetings. You can access his slides here.
Recently, ANDREW GELMAN blogged about a communication he received from Per Pettersson-Lidbom, an economist at Stockholm University. Pettersson shared three stories of “scientific fraud” in papers published in top economics journals. Gelman writes, “… I’m sharing Pettersson’s stories, neither endorsing nor disputing their particulars but…
ETIENNE LEBEL, in a blog post for BITSS, gives a brief but wide-ranging summary of the status of “open science” in psychology. Topics include: (i) the use of “badges” to encourage provision of research materials, (ii) pre-registration, (iii) reproducibility, (iv) replications,…
[From the article “Come Again”]: “The GRIM test, short for granularity-related inconsistency of means, is a simple way of checking whether the results of small studies of the sort beloved of psychologists (those with fewer than 100 participants) could be…
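The article describes the check in words only, but the arithmetic behind it is simple enough to sketch. Below is a minimal, hypothetical Python illustration (the function name, sample size, and example means are mine, not taken from the article): with integer item scores, a sample mean must equal an integer sum divided by the sample size, so a reported rounded mean can be tested against every nearby integer sum.

```python
# Minimal sketch of a GRIM-style consistency check (illustrative only; the
# function name and example values below are hypothetical, not from the article).

def grim_consistent(reported_mean: float, n: int, decimals: int = 2) -> bool:
    """Could a mean reported to `decimals` places arise from n integer scores?"""
    # With integer responses, the true mean is (integer sum) / n, so we test
    # the integer sums closest to reported_mean * n. A +/-1 window suffices
    # for the small samples (n < 100) that GRIM targets at two decimal places.
    approx_sum = round(reported_mean * n)
    for total in range(approx_sum - 1, approx_sum + 2):
        if f"{total / n:.{decimals}f}" == f"{reported_mean:.{decimals}f}":
            return True
    return False

# With n = 28, no integer sum yields a mean that rounds to 5.19
# (145/28 = 5.18, 146/28 = 5.21), so a reported 5.19 is GRIM-inconsistent.
print(grim_consistent(5.19, 28))  # False
print(grim_consistent(5.18, 28))  # True (consistent with a sum of 145)
```

The appeal of the test is that it needs only numbers already printed in a paper (the rounded mean, the sample size, and the reporting precision), so it can flag impossible values without access to the raw data.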
So another study finds that X affects Y, and you are a sufficiently cynical TRN reader to wonder whether the authors p-hacked their way to that result. Don’t have time (or the incentive) to do a replication? You…
[From the article “Muddled meanings hamper efforts to fix reproducibility crisis” in Nature]: “A semantic confusion is clouding one of the most talked-about issues in research. Scientists agree that there is a crisis in reproducibility, but they can’t agree on what ‘reproducibility’…