Using Bayesian Reanalysis to Decide Which Studies to Replicate

[From the preprint “When and Why to Replicate: As Easy as 1, 2, 3?” by Sarahanne Field, Rink Hoekstra, Laura Bringmann, and Don van Ravenzwaaij, posted at PsyArXiv Preprints.]
“…a flood of new replications of existing research has reached the literature, and more are being conducted. In theory, this uptick in the number of replications being conducted is a good development for the field…”
“… however in practice, so much interest in conducting replications leads to a logistical problem. There is a vast body of literature that could be subject to replication. But the question is: how does one select which studies to replicate from the ever-increasing pool of candidates out there?”
“In this paper, we apply a Bayesian reanalysis to several recent research findings, the end-goal being to demonstrate a technique one can use to reduce a large pool of potential replication targets to a manageable list. The Bayesian reanalysis is diagnostic in the sense that it can assist us in separating findings into three classes, or tiers of results:”
“(1) results for which the statistical evidence pro-alternative is compelling;”
“(2) results for which the statistical evidence pro-null is compelling;”
“(3) results for which the statistical evidence is ambiguous.”
“… p-values are unable to differentiate between results which belong in the second of these classes and those that belong in the third.”
“…we will make an initial selection based on those studies in tier 3: those whose results yield only ambiguous evidence in relation to support for their reported hypotheses. For this purpose, we will judge such ambiguity, or low evidential strength, as when a study’s BF10 lies between 1/3 and 3, which, by Jeffreys’ (1961) classification system, provides no more than ‘anecdotal’ evidence for one hypothesis over the other.”
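The triage described above can be sketched in a few lines of code. The following is a minimal illustration, not the authors' implementation: it applies Jeffreys' cut-offs of 1/3 and 3 to precomputed BF10 values and pulls out the tier-3 (ambiguous) studies as replication candidates. The study labels and BF10 values are hypothetical.

```python
# Illustrative triage using Jeffreys' cut-offs on the Bayes factor BF10.
# The tier numbering follows the paper's three classes; the example
# findings below are made up for demonstration.

def classify_bf10(bf10: float) -> int:
    """Assign a reanalysed result to one of three tiers.

    Tier 1: compelling evidence for the alternative (BF10 > 3).
    Tier 2: compelling evidence for the null (BF10 < 1/3).
    Tier 3: ambiguous, no more than anecdotal evidence (1/3 <= BF10 <= 3).
    """
    if bf10 > 3:
        return 1
    if bf10 < 1 / 3:
        return 2
    return 3

# Hypothetical reanalysed findings: (study label, BF10 from reanalysis).
findings = [("Study A", 8.2), ("Study B", 0.21),
            ("Study C", 1.4), ("Study D", 0.9)]

# Tier-3 studies form the initial pool of replication candidates.
candidates = [label for label, bf in findings if classify_bf10(bf) == 3]
print(candidates)  # ['Study C', 'Study D']
```

In practice the BF10 values themselves would come from a Bayesian reanalysis of each study's reported statistics (e.g., a default Bayes factor computed from a t statistic and sample sizes); this sketch only shows the classification step.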
“…The approach we advocate and apply in this article can be simple and relatively fast to conduct, and affords the user access to important information about the strength of evidence contained in a published study. While efficient, this approach also has the potential to maximize the impact of the outcomes of those replications and minimize the waste of resources that could result from a haphazard approach to replication. Combining a quantitative reanalysis with a qualitative assessment of a large group of potential replication targets, in a simple approach such as the one presented in this paper, allows information from multiple sources to be used to prioritize replication targets…”
