Ludwig-Maximilians-Universität München

Psychology

The acid test – reproducibility

München, 11/20/2018

An international team of 186 researchers has analyzed the reproducibility of published studies in psychology: half of the studies retested could not be reproduced. The new work enhances our understanding of the conditions for replicable research.

One of the repeated studies had set out to measure attitudes toward climate change. It asked subjects to reconstruct a jumbled text by rearranging its sentences in the correct order. The authors had concluded that people who were confronted (primed) with terms such as ‘warm’ or ‘hot’ were more likely to agree that climate change is real. This finding could not be reproduced.

An international team of researchers in the field of psychology has published a massive new study in the journal Advances in Methods and Practices in Psychological Science. More than 15,000 experimental subjects took part in the project, the purpose of which was to repeat 28 psychological studies that had already been published in the literature. The outcome of their efforts was a sobering one: the results of only 14 of them could be reproduced. In all, 60 research groups working at institutions all over the world were involved in the replication study, including one led by Dr. Felix Schönbrodt of the Faculty of Psychology and Educational Sciences at LMU.

The authors tested the reliability of classical as well as more recent studies in psychology. The studies were selected on the basis of an agreed set of criteria. Among the factors considered was the overall impact of each study on the field as a whole, as reflected by its citation frequency. Furthermore, only studies that could be performed online were chosen, so that they could be repeated on an Internet platform to which all participating research groups had access. The investigation was coordinated by the Center for Open Science, a US-based international network of scientists, research institutions, professional associations and publishers, whose primary aim is to promote the application of good scientific practice in research.

The design of the new study differed significantly from those of previous efforts to test the reproducibility of published experiments. In this case, all 28 of the studies assessed were repeated independently by all 60 research groups involved. The take-home lesson – that the findings of only half of the previously published studies could be confirmed – is compatible with the results of previous replication projects in the field. However, as Felix Schönbrodt points out: “It must be emphasized that the 28 studies involved do not represent a random sample. Consequently, the result cannot be simply extrapolated to all published research studies in psychology.”

Nevertheless, the results of the new study do permit reliable inferences with respect to the factors that may have played a role in determining whether or not the studies chosen proved to be reproducible. For example, the new data – collected by 186 researchers in 36 countries – refute the widespread assumption that cultural differences between the study populations used in the original experiments can account for non-reproducibility. Studies whose original conclusions could be verified were in fact confirmed, with minor deviations, by all of the contributing research groups. “In cases where a clearly defined effect could be measured, that same effect was observed by essentially all participating groups,” says Schönbrodt.

“Meanwhile, in response to the unexpectedly low replication rates in such studies, self-critical and very constructive reform movements have emerged in many scientific disciplines,” says Schönbrodt, Managing Director of the Open Science Center (OSC) set up at LMU in 2017. The purpose of the interdisciplinary OSC is to promote and foster openness, transparency and replicability in research.