Discussion Papers 1572, 19 pages, with appendix
Benedikt Fecher, Mathis Fräßdorf, Gert G. Wagner
We live in a time of increasing publication rates and growing specialization of scientific disciplines. More than ever, the research community faces the challenge of assuring the quality of research and maintaining trust in the scientific enterprise. Replication studies are necessary to detect erroneous research. The replicability of research is therefore considered a hallmark of good scientific practice, and it has lately become a key concern for research communities and science policy makers alike. In this case study we analyze perceptions and practices regarding replication studies in the social and behavioral sciences. Our analyses are based on a survey of almost 300 researchers who use data from the German Socio-Economic Panel Study (SOEP), a multidisciplinary longitudinal multi-cohort study. We find that more than two thirds of respondents disagree with the statement that replications are not worthwhile because major mistakes will be found at some point anyway. Nevertheless, most respondents are not willing to spend their own time conducting replication studies. This situation can be characterized as a “tragedy of the commons”: everybody knows that replications are useful, but almost everybody counts on others to conduct them. Our most important finding with practical consequences is that among the few replications that are reported, a large majority are conducted in the context of teaching. In our view, this is a promising detail: one avenue for fostering replicability may be to make replication studies a mandatory part of curricula as well as of doctoral theses. Furthermore, we argue that replication studies need to become more attractive for researchers. For example, successful replications could be listed in the publication lists of the replicated authors. Conversely, data sharing needs to receive more recognition, for example by considering data production and subsequent data sharing as scientific output.
Keywords: Replicability, good science, data sharing, research policy
Freely accessible version: (econstor)