Posted on by replicationnetwork
INTRODUCTION: Recently, one of us gave a workshop on how to conduct meta-analyses. The workshop was attended by participants from several disciplines, including economics, finance, psychology, management, and health sciences. During the course of the workshop, it…
Read More | Category: GUEST BLOGS | Tags: Disciplines, Effect size, Estimation, Fixed Effects, Journals, Meta-analysis, Random Effects
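The fixed-effect and random-effects models named in the tags above differ only in how the individual study estimates are weighted when they are pooled. Below is a minimal sketch, assuming standard inverse-variance weighting and the DerSimonian-Laird estimator of the between-study variance; the effect sizes and standard errors are invented for illustration and are not from the workshop.

```python
import numpy as np

def pooled_effect(effects, std_errs, model="fixed"):
    """Inverse-variance pooled estimate under a fixed-effect or a
    DerSimonian-Laird random-effects model (illustrative sketch)."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(std_errs, dtype=float) ** 2
    w = 1.0 / variances                         # fixed-effect weights

    if model == "random":
        # DerSimonian-Laird estimate of the between-study variance tau^2
        fixed_mean = np.sum(w * effects) / np.sum(w)
        q = np.sum(w * (effects - fixed_mean) ** 2)   # Cochran's Q
        df = len(effects) - 1
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)
        w = 1.0 / (variances + tau2)            # random-effects weights

    est = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est, se

# Hypothetical per-study effect sizes (e.g., standardized mean differences)
d  = [0.30, 0.12, 0.45, 0.05]
se = [0.10, 0.15, 0.20, 0.08]
for m in ("fixed", "random"):
    est, s = pooled_effect(d, se, model=m)
    print(f"{m:6s}: {est:.3f} (SE {s:.3f})")
```

When the estimated between-study variance is zero the two models coincide; with heterogeneous studies the random-effects weights are more even across studies and the pooled standard error is larger.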
Posted on by replicationnetwork
An estimated 85% of global health research investment is wasted (Chalmers and Glasziou, 2009); a total of one hundred billion US dollars in 2009, the year the estimate was made. The movement to reduce this research waste recommends that previous…
Read More | Category: GUEST BLOGS, Uncategorised | Tags: Accumulation bias, Meta-analysis
Posted on by replicationnetwork
[Excerpts taken from the article “Conducting a Meta-Analysis in the Age of Open Science: Tools, Tips, and Practical Recommendations” by David Moreau and Beau Gamble, posted at PsyArXiv Preprints] “In this tutorial, we describe why open science is important in the…
Read More | Category: NEWS & EVENTS | Tags: Meta-analysis, Open Science, Pre-registration, Psychology, R script, Templates, Transparency
Posted on by replicationnetwork
[Excerpts taken from the article “Crowdsourcing hypothesis tests: Making transparent how design choices shape research results” by Justin Landy and many others, posted at the preprint repository at the University of Essex] “…we introduce a crowdsourced approach to hypothesis testing….
Read More | Category: NEWS & EVENTS | Tags: crowdsourcing, Meta-analysis, Multiple research teams, Psychology, Random Effects, Reproducibility, Research design
Posted on by replicationnetwork
[Excerpts taken from the article, “The Statistics of Replication” by Larry Hedges, published in the journal Methodology] Background “Some treatments of replication have defined replication in terms of the conclusions obtained by studies (e.g., did both studies conclude that the…
Read More
Posted on by replicationnetwork
[Excerpts taken from the report “Examining the Reproducibility of Meta-Analyses in Psychology: A Preliminary Report” by Daniel Lakens et al., posted at MetaArXiv Preprints] “…given the broad array of problems that make it difficult to evaluate the evidential value…
Read More | Category: NEWS & EVENTS | Tags: BITSS, Daniel Lakens, Meta-analysis, Open Science Framework, Psychology, Reproducibility
Posted on by replicationnetwork
[From the preprint “Accumulation bias in meta-analysis: the need to consider time in error control” by Judith ter Schure and Peter Grünwald, posted at arXiv.org] “Studies accumulate over time and meta-analyses are mainly retrospective. These two characteristics introduce dependencies between…
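The dependencies the excerpt refers to are easy to reproduce in simulation. Below is a minimal sketch, not the authors' analysis: every per-study result is generated under a true null effect, a conventional fixed-effect meta-analysis (pooled z-test at the 5% level) is recomputed each time a new study arrives, and the share of study series in which at least one of those analyses comes out significant is counted. The series length and significance criterion are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

n_series    = 20_000   # simulated study series, all with a true null effect
max_studies = 10       # studies that may accumulate in each series
crit        = 1.96     # two-sided 5% criterion on the pooled z-statistic

ever_significant = 0
for _ in range(n_series):
    z = rng.standard_normal(max_studies)        # per-study z-scores under H0
    k = np.arange(1, max_studies + 1)
    pooled = np.cumsum(z) / np.sqrt(k)          # pooled z after each new study
    if np.any(np.abs(pooled) > crit):           # any interim meta-analysis "significant"?
        ever_significant += 1

print("Nominal level: 0.05")
print(f"P(some meta-analysis in the series rejects): {ever_significant / n_series:.3f}")
```

The printed rate is well above the nominal 5% and grows with the number of studies that may accumulate; this time-dependent loss of error control is the kind of problem the preprint addresses.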
Read More
Posted on by replicationnetwork
[From the blog “Where Do Non-Significant Results in Meta-Analysis Come From?” by Ulrich Schimmack, posted at Replicability-Index] “It is well known that focal hypothesis tests in psychology journals nearly always reject the null-hypothesis … However, meta-analyses often contain a fairly…
Read More
Posted on by replicationnetwork
[From the blog “Be careful what you wish for: cautionary tales on using single studies to inform policymaking” by Emmanuel Jimenez, posted at http://www.3ieimpact.org.] “For a development evaluator, the holy grail is to have evidence from one’s study be taken…
Read More | Category: NEWS & EVENTS | Tags: 3ie, 3ie replication programme, Dengvaxia, Meta-analysis, Philippines, replication, Systematic reviews
Posted on by replicationnetwork
[From the article “Automatic extraction of quantitative data from ClinicalTrials.gov to conduct meta-analyses” by Richeek Pradhan et al., published in the Journal of Clinical Epidemiology] “Systematic reviews and meta-analyses are labor-intensive and time-consuming. Automated extraction of quantitative data from primary…
Read More | Category: NEWS & EVENTS | Tags: Automatic extraction, ClinicalTrials.gov, EXACT, Meta-analysis, Python, replication, Systematic reviews
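The EXACT tool itself is described in the article; the snippet below is only a hedged sketch of the general idea of pulling structured trial records programmatically, assuming the current public ClinicalTrials.gov v2 REST API (https://clinicaltrials.gov/api/v2/studies). The endpoint, parameter names, and field paths are assumptions, not the authors' code.

```python
# Minimal sketch: query ClinicalTrials.gov for studies matching a condition
# and list basic identifiers. Assumes the public v2 API; field paths may differ.
import requests

API = "https://clinicaltrials.gov/api/v2/studies"   # assumed public endpoint

def fetch_studies(condition: str, page_size: int = 20) -> list[dict]:
    resp = requests.get(
        API,
        params={"query.cond": condition, "pageSize": page_size},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("studies", [])

for study in fetch_studies("hypertension"):
    ident = study.get("protocolSection", {}).get("identificationModule", {})
    print(ident.get("nctId"), "-", ident.get("briefTitle"))
```

The quantitative outcome data that the article extracts automatically sit deeper in each record, in a study's results section; parsing those fields consistently is the labor-intensive step the authors automate.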