The problems of publication misconduct – manipulation, fabrication and plagiarism – and other questionable practices such as salami-style publication are attracting increasing attention. In the newly published paper “Misconduct, Marginality, and Editorial Practices in Management, Business, and Economics” (full text available here), we present findings on these problems in MBE journals and on the diffusion of editorial practices to combat them (Karabag and Berggren, 2016).
The data were collected through bibliometric searches for retracted papers in the seven major databases that cover almost all MBE journals, and through two surveys of the editors of these journals: the first sent to 60 journals with at least one public retraction, the second to all journals covered in the bibliometric study. A total of 298 journal editors answered the second survey.
The bibliometric study identified a strongly increasing trend in retractions, from 1 retraction in 2005 to 60 in 2015. Compared to the number of published papers, the figure is still very small, but since there are strong disincentives for editors to engage in retractions, the reported number is only the tip of the misconduct iceberg. As for the problem of marginality, more than half of the editors reported experiences of salami publication, which can be defined as slicing research output into the smallest publishable units.
The survey also enquired about specific practices for dealing with these problems. We found that 42% of the journals have started to use software to detect possible plagiarism, while 30% ask authors to provide a data file. Only 6% ask the author(s) to provide information on each author’s specific role. Many editors stated that they rely on reviewers to detect and prevent misconduct. Despite the importance of reviewers, less than 40% of the journals publicly reward good reviewers, and less than half add good reviewers to the advisory board.
Only 10% of the journals publish replication studies, and even this figure may be doubted, given that other studies report a much lower percentage. According to Duvendack et al. (2015), for example, only 3% of the studied journals actually publish replication papers.
The discrepancy between the findings may be explained by social desirability: editors may believe that publishing replication studies is socially desirable, or they may plan to publish them in the future. Our survey included free-text space for comments, but no editor commented on replication studies. Our interpretation is that editors are still not particularly interested in replication studies, even though many affirm the importance of replication for building theory and for disclosing manipulation.
The paper also presents constructive ideas for combating marginality by supporting more creative contributions. Ninety-eight journal editors provided a rich menu of ideas, which we classified into 14 sub-themes grouped under four major themes. The first theme, “editorial vision and engaged board”, included suggestions that editors should open up the journal and take risks, be visionary, have engaged editorial board members, and change editorial teams regularly. The second theme, “curate papers and connect authors”, comprised three sub-themes: curating and developing manuscripts, connecting authors, and constructive screening. The third theme, “open up the discussion”, contained suggestions such as publishing criticism instead of rejecting papers, inviting comments, and involving industry specialists. The fourth theme, “go beyond the mainstream”, involved ideas on mixing disciplines, limiting individual authors, and avoiding perfectionism.
Overall, the findings indicate broad editorial engagement with misconduct and marginality; however, we remain concerned about the amount of undetected misconduct. Several authors have argued that there is a knowledge gap in MBE due to the small number of replications and of papers testing previously presented models (cf. Duvendack et al., 2015; Kacmar and Whitfield, 2000).
Two recent cases show how valuable replication studies are. In 2010, Reinhart and Rogoff, drawing on a comparative sample of countries, claimed that public debt beyond a specific threshold (90 per cent of GDP) would stifle growth during periods of crisis. This conclusion was widely cited both within and outside the academic community, but when a graduate student at the University of Massachusetts Amherst tried to replicate the study, he uncovered serious data flaws in the original paper which undermined both its conclusions and its theoretical assumptions. The critical study was published (Herndon et al., 2014), but not in the American Economic Review, which had carried the original paper.
In the management field, Lepore (2014) re-studied the famous disk drive industry cases in Christensen’s (1997) theory of disruptive innovation. Extending the time period, the re-study found a very different pattern from the one suggested by Christensen. This finding calls for a reassessment of the theoretical framework of disruptive innovation (Bergek et al., 2013).
The crucial point here is that neither of these important critical replication papers was published in the journal that published the original paper, nor did these journals actively encourage the submission of other replication studies. Public retraction of published papers will always be a means of last resort for a journal. That is why it is so important to encourage close reading, reflection and replication!
Bergek, A., Berggren, C., Magnusson, T. and Hobday, M., 2013. Technological discontinuities and the challenge for incumbent firms: Destruction, disruption or creative accumulation? Research Policy, 42(6), pp.1210-1224.
Christensen, C.M., 1997. The innovator’s dilemma: When new technologies cause great firms to fail. Boston: Harvard Business School Press.
Duvendack, M., Palmer-Jones, R.W. and Reed, W.R., 2015. Replications in Economics: A progress report. Econ Journal Watch, 12(2), pp.164-191.
Herndon, T., Ash, M. and Pollin, R., 2014. Does high public debt consistently stifle economic growth? A critique of Reinhart and Rogoff. Cambridge Journal of Economics, 38(2), pp.257-279.
Kacmar, K.M. and Whitfield, J.M., 2000. An additional rating method for journal articles in the field of management. Organizational Research Methods, 3(4), pp.392-406.