2017 Leamer-Rosenthal Award Winners Announced

Each year, the Berkeley Initiative for Transparency in the Social Sciences (BITSS) awards prizes to researchers who have made substantial contributions to improving transparency in research practices. The prizes are named after Ed Leamer (economics) and Robert Rosenthal (psychology) for their early efforts in identifying problems in scientific publishing. On October 12th, BITSS announced the recipients of the 2017 Leamer-Rosenthal Awards.
A total of eight awards were made this year. Two were made in the area of education: Daniel Lakens for, among other things, his creation of the online course “Improving Your Statistical Inferences”; and Simine Vazire, co-founder and President of the Society for the Improvement of Psychological Science. Six other awards were made to emerging researchers for promoting and demonstrating transparent practices in their research.
To read more, click here.

So…Do They Publish Replication Studies in Neuroscience?

[From the article “Do Neuroscience Journals Accept Replications? A Survey of the Literature,” published by Andy Yeung in the September issue of Frontiers in Human Neuroscience]
“Recent reports in neuroscience, especially those concerning brain-injury and neuroimaging, have revealed low reproducibility of results within the field and urged for more replication studies. However, it is unclear if the neuroscience journals welcome or discourage the submission of reports on replication studies.”
“Of the 465 journals reviewed, 28 (6.0%) explicitly stated that they accept replications, 394 (84.7%) did not state their position on replications, 40 (8.6%) implicitly discouraged replications by emphasizing the novelty of the manuscripts, and 3 (0.6%) explicitly stated that they reject replications.”
To read more, click here.

GRUNOW: Say Hello to IREE – A New Economics Journal Dedicated to the Publishing of Replication Studies

Replications are pivotal for the credibility of empirical economics. Evidence-based policy requires findings that are robust and reproducible. Despite this, there has been a notable absence of serious effort to establish the reliability of empirical research in economics. As Edward Leamer famously noted, “Hardly anyone takes data analysis seriously. Or perhaps more accurately, hardly anyone takes anyone else’s data analysis seriously.” This is evidenced by the fact that replication studies are rarely published in economics journals.
However, the situation may be changing. Recently, the Deutsche Forschungsgemeinschaft (DFG) released a Statement on the Replicability of Research Results in which it emphasized the importance of replication to ensure the reliability of empirical research. Accordingly, DFG is funding a new scientific journal, the “International Journal for Re-Views in Empirical Economics (IREE)”.
IREE is a joint project of Leuphana University of Lüneburg (Joachim Wagner), the German Institute for Economic Research (DIW Berlin) (Gert G. Wagner), the Institute of Labor Economics (Hilmar Schneider), and the ZBW. Nobel laureate Sir Angus Deaton (Princeton University), Jeffrey M. Wooldridge (Michigan State University), and Richard A. Easterlin (University of Southern California) are members of the advisory board of IREE.
The International Journal for Re-Views in Empirical Economics (IREE) is the first journal dedicated to the publication of replication studies based on economic micro-data. Furthermore, IREE publishes synthesizing reviews, micro-data sets and descriptions thereof, as well as articles dealing with replication methods and the development of standards for replications. Up to now, authors of replication studies, data sets and descriptions have had a hard time gaining recognition for their work via citable publications. As a result, the incentives for conducting these important kinds of work have been greatly diminished. Richard A. Easterlin notes the paradox when he states: “Replication, though a thankless task, is essential for the progress of social science.”
To make replication a little less thankless, all publications in IREE are citable. Each article, data set, and computer program is assigned a DOI. In addition, data sets are stored in a permanent repository, the ZBW Journal Data Archive. This provides a platform for authors to gain credit for their replication-related research.
Up to now, publication of replication studies has often been results-dependent, with publication being more likely if the replication study refutes the original research. This induces a severe publication bias. When this happens, replication, rather than improving things, can actually further undermine the reliability of economic research. Compounding this are submission and publication fees which discourage replication research that is unlikely to get published.
IREE is committed to publishing research independent of the results of the study. Publication is based on technical and formal criteria without regard to results. To encourage open and transparent discourse, IREE is open access. There are no publication or submission fees, and the journal is committed to a speedy and efficient peer-review process.
To learn more about IREE, including how to submit replication research for publication, click here.
Dr. Martina Grunow is Managing Editor of the International Journal for Re-Views in Empirical Economics (IREE) and is an associate researcher at the Canadian Centre for Health Economics (CCHE). She can be contacted by email at m.grunow@zbw.eu.
REFERENCES:
Duvendack, M., Palmer-Jones, R.W. & Reed, W.R., 2015. Replications in Economics: A Progress Report, Econ Journal Watch, 12(2): 164-191.
Leamer, Edward E., 1983. Let’s Take the Con Out of Econometrics, The American Economic Review, 73(1): 31-43.
Mueller-Langer, F., Fecher, B., Harhoff, D. & Wagner, G. G., 2017. The Economics of Replication, IZA Discussion Papers 10533, Institute for the Study of Labor (IZA).

Abandon Statistical Significance?

[From the abstract of a recent working paper by Blakeley McShane, David Gal, Andrew Gelman, Christian Robert, and Jennifer Tackett.]
“In science publishing and many areas of research, the status quo is a lexicographic decision rule in which any result is first required to have a p-value that surpasses the 0.05 threshold and only then is consideration—often scant—given to such factors as prior and related evidence, plausibility of mechanism, study design and data quality, real world costs and benefits, novelty of finding, and other factors that vary by research domain. There have been recent proposals to change the p-value threshold, but instead we recommend abandoning the null hypothesis significance testing paradigm entirely, leaving p-values as just one of many pieces of information with no privileged role in scientific publication and decision making. We argue that this radical approach is both practical and sensible.”
To read more, click here.

Not One, Not Two, Not Three, But Eight Replications?

In a recent blog post at the Open Philanthropy Project summarizing some of his research on the effect of incarceration on crime, David Roodman conducted, count them, 8 replications. Here is what he said: “Among three dozen studies I reviewed, I obtained or reconstructed the data and code for eight. Replication and reanalysis revealed significant methodological concerns in seven and led to major reinterpretations of four.”
To read more, click here.
To read the response of one of those whose work was replicated (Alex Tabarrok), click here.

RCTs To Fix Science?

[From a letter published in the September 22 issue of Science entitled “Addressing scientific integrity scientifically”]
“To introduce greater rigor into the study of research integrity and the factors that foster or discourage responsible behavior, the participants at the Fifth World Conference on Research Integrity endorsed the “Amsterdam Agenda” (1). Under this Agenda, the newly created World Conferences on Research Integrity Foundation plans to establish a registry for research on research integrity. The registry will ask researchers to describe the integrity problem they are addressing, how the problem affects research, the intervention they are introducing to address the problem, why they hypothesize that the intervention will work, how they will assess the outcome, and their plans for data sharing.”
To read more, click here (the full text is behind a paywall).

Results-Free Peer Review: The Video

Previous posts at TRN have highlighted “results-free peer review” (RFPR) efforts at a variety of journals: see here, here, and here. The journal BMC Psychology recently put together a short (approximately 2 minutes) video discussing their new policy of “results-free peer review.” For those not familiar with RFPR, the video is a nice introduction to the subject.
To see the video, click here.
To the best of TRN’s knowledge, no economics journals do RFPR, though the subject (“results-blind review”) is addressed in Christensen and Miguel’s forthcoming JEL paper, “Transparency, Reproducibility, and the Credibility of Economics Research“. The closest would be the AEA’s Randomized Control Trial registry, where researchers post their research plans prior to undertaking their data collection and analysis.

GRUNOW: Notes from a Workshop: “Replications in Empirical Economics – Ways Out of the Crisis”

“Next year, this topic should not be discussed in a pre-conference workshop but in the opening plenum of the conference!” This statement by a young researcher not only concluded the workshop but also gave bright prospects to replications in Economics.
On September 8, 2017 the ZBW Leibniz Information Center for Economics hosted the workshop “Replications in Empirical Economics – Ways out of the Crisis” at the Annual Conference of the Verein für Socialpolitik in Vienna, Austria. Thirty participants and four speakers engaged in lively and stimulating discussions about replications and the publication of replications in Economics.
Hilmar Schneider, Director of the Institute of Labor Economics (IZA, Germany), made an enthusiastic plea for replications in economics and sharply criticized the credibility of much economic research. He argued that without replication research, knowledge can only grow horizontally rather than vertically. According to Schneider, the core problem lies in the pressure for novelty and original research that is exerted by the academic culture and the scientific publication system. As a consequence, pseudo-innovations are preferred to research that may actually have an impact on society. Arbitrary findings that lack verification are insufficient to properly inform and support policy decisions.
Christiane Joerk, Program Director from the German Research Foundation (DFG), summarized the DFG Statement on the Replicability of Research Results. The DFG already financially supports projects that strengthen the infrastructure for replications (e.g. IREE – see below) and will also fund replication studies in the future.
“Is it possible to publish replication studies in the International Journal for Re-Views in Empirical Economics using a pseudonym?” This question reflects the situation of young researchers, who are caught between good scientific practice and the current academic culture. Alarmingly, the two apparently stand in conflict.
The wish for the anonymous publication of replication studies also reflects the bad reputation of replication studies in economics. On the one hand, most researchers state that replication studies are a critical part of scientific progress and indispensable for good scientific practice. On the other hand, as documented by Maren Duvendack from the University of East Anglia (UK), researchers who do replications are seen to be engaged in “bullying” and “persecution”, and referred to as “research parasites”. Together with her colleagues Richard Palmer-Jones and Bob Reed, she analyzed the publication market for replications in economics. The authors found that published replications in economics have been increasing since the mid-seventies. However, the absolute number is still alarmingly low (with a peak of 22 published replications in economics in 2012). Compared to other disciplines such as psychology and political science, economics lags behind with respect to replication efforts. According to Duvendack, a key obstacle could be the lack of publication outlets for replication studies in economics.
As a partial solution to the replication crisis, Martina Grunow, Project Manager at the ZBW Leibniz Information Center for Economics (Germany), concluded the talks with her presentation of the International Journal for Re-Views in Empirical Economics (IREE). She summarized the incentive problems authors face in conducting replication studies and the resulting lack of publication outlets for this kind of research. IREE is a peer-reviewed, open-access e-journal dedicated to the publication of replication studies in empirical economics. Replication studies are published without regard to their results. IREE papers are made citable with DOIs and are internationally disseminated via EconStor and RePEc. Because the project is funded by the German Research Foundation (DFG) and the ZBW Leibniz Information Center for Economics, IREE does not charge any publication fees. By providing a publication platform for replication studies, IREE aims to strengthen the credibility of economics research through robust and replicable findings.
With the prospect of publishing replication studies, doctoral students participating in the workshop expressed the view that replications should become a mandatory part of their dissertation research. They argued that, by doing so, replication research would receive the recognition it deserves and not be buried under the pressure for pseudo-innovations. This is especially important at the beginning of an academic career. The participants also agreed on the importance of teaching students the value of replications.
REFERENCES:
Duvendack, M., Palmer-Jones, R.W. & Reed, W.R., 2015. Replications in Economics: A Progress Report, Econ Journal Watch, 12(2): 164-191.


Incentives to Replicate: Show Me the Money!

[From the article “Go Forth and Replicate: On Creating Incentives for Repeat Studies” by Michael Schulson at the web magazine Undark]
“Suggested reasons for the [replication] crisis are many. Some researchers have blamed the scientific publishing culture itself …. But some stakeholders also point to a more concrete and obvious root cause: a lack of direct incentives to replicate other researchers’ work, including very little funding to do replications. This raises some key questions, including whether the federal government — the largest single funder of basic science in the world — ought to be doing more to encourage and underwrite basic replication research. At the moment, only token reforms have been undertaken in the U.S., but there are signs that novel programs are gestating elsewhere, including a European pilot program that could serve as a model for replication funding in other countries.”
To read more, click here.

All Researchers Want Open Data, Right?

[From the blog by Justin Gallagher entitled “Public Data that Isn’t (or Wasn’t) Public” posted at BITSS]
“I recently completed a coauthored working paper, together with Paul J. Fisher, that examines whether electronic monitoring via red light traffic cameras leads to fewer vehicle accidents and injuries.  As part of the project, we encountered the bizarre situation where researchers at The University of Texas at Austin, a flagship public state university, fought tooth and nail against releasing vehicle accident data originally collected by the Texas Department of Transportation (TxDOT).”
To read more, click here.