The findings of numerous replication studies in economics have raised serious concerns about the credibility and reliability of published applied economic research. The literature suggests several explanations for these findings: beyond missing incentives and rewards for the disclosure of data and code, only a small proportion of journals in the economic sciences have implemented functional data availability policies. Moreover, authors frequently do not comply with the demands of these policies, or provide insufficient data and code. Our paper (“Journals in Economic Sciences: Paying Lip Service to Reproducible Research?”) addresses an additional aspect and asks to what degree editorial offices enforce the data policies of their journals.
To date, only a minority of journals in economics have a policy on the disclosure of the data and code used to produce the results of an empirical research article, but their number is growing. Likely drivers of this shift are the ongoing debates on replicable research and the growing demands of research funders and science policy.
In our paper we ask to what extent journals with a data policy have enforced it in the past. To answer this question, we started our analyses with an evaluation of 599 articles published by 37 journals with a data availability policy. All articles were published in 2013 and 2014.
First, our analysis determined the share of articles that fall under a data policy, because replication data is only needed to verify the results of empirical articles. In total, we classified more than 75% of the articles as empirical – or, as we defined it, as ‘data-based’.
Afterwards, we checked the journal data archives (where available) and the supplemental information section of each data-based article for the availability of replication files. We distinguished between articles using restricted data and those using non-restricted data. On average, only slightly more than a third of the data-based articles had accompanying data and code available.
Subsequently, we compared in detail the demands of journals’ data policies with the replication files available for a subsample of 245 articles published by 17 journals. This allowed us to determine, for each journal investigated, how strictly it enforces its data policy.
For the years 2013 and 2014, our findings suggest a mixed picture: while one group of journals achieved high or very high compliance rates, a significant share of journals provided replication files only sporadically.
Due to the limited sample size and our focus on two years of publication, our analysis only provides a snapshot of journals’ practices at that time. Therefore, our findings should not be seen as a general overview of journals’ willingness to enforce their data policies. Also, our findings make no statement regarding the replicability of the research findings: we only checked for the availability of the prerequisites for potential replication attempts.
Based on the outcome of our analyses, we recommend that editorial offices pay more attention to whether their journal’s data policy has been fulfilled by authors. Journals should be stricter in enforcing their data policies, because replicability of published research is a cornerstone of the scientific method.
Editors are primarily accountable for enforcing journals’ data policies, but reviewers should also feel responsible for upholding a periodical’s data policy. Both editors and reviewers invest time in ensuring that authors comply with a journal’s style sheet. Investing similar effort in ensuring that replication files are available according to the journal’s data policy would further strengthen the scientific reputation of a periodical.
First and foremost, journals play a crucial role in scientific quality assurance. They are therefore also important for promoting a culture of research integrity, because published articles are the most visible output of research.
In the spirit of replicable research, you can find the data, code and supplemental information of our analysis in the ZBW Journal Data Archive.
Sven Vlaeminck is a research assistant at the ZBW – Leibniz Information Centre for Economics. He is also the product manager of the ZBW Journal Data Archive. His biographical and contact information can be found here.