A recent study published in Nature reflects the growing attention in the social sciences to the reproducibility of results, which is increasingly considered central to ensuring the quality and reliability of research.
The study, to which the Scuola Superiore Sant’Anna also contributed through the work of Mario Martinoli, re-examined 100 studies published between 2009 and 2018. In more than half of the cases, the new results diverge from the original ones. The aim of the study is “to contribute to producing more robust results and to improve the way scientific evidence is evaluated, compared, and accumulated.”
What happens if, years later, a scientific study in the field of social sciences is analyzed by different researchers? Will the results be the same as the original, or will they vary depending on who conducts the analysis, the methodology used, and how the data are interpreted?
The study, published in Nature under the title Investigating the Analytical Robustness of Social and Behavioural Sciences, highlights how decisive analytical choices are in the construction of scientific results. A total of 457 scientists from universities and institutions around the world re-examined 100 studies published between 2009 and 2018. In more than half of the cases, the new results diverge from the original ones.
The Scuola Superiore Sant’Anna participated in the study thanks to the contribution of Mario Martinoli, a research fellow at the Institute of Economics.
“The aim of the study is to assess how much the results of research in the social and behavioural sciences remain valid when the same data are analyzed using different methodological choices, highlighting how conclusions depend on the decisions made throughout the research process,” Martinoli states.
The study selected 100 scientific papers produced between 2009 and 2018 in the fields of economics and finance, management, marketing and organizational behaviour, political science, psychology, sociology, criminology, and demography. For each study, the original data were independently reanalyzed by at least five different researchers, and the outcomes obtained were compared with the original conclusions.
The results showed that only in one-third of the cases do the new findings match the original ones. In the remaining cases, the divergence is attributable to what the authors describe as analytical variability: the range of defensible methodological choices different researchers can make when analysing the same data.
The study does not aim to challenge the validity of the original findings but, as Martinoli emphasizes, to “make research more transparent, more replicable, and less sensitive to arbitrary choices. Considering analytical decisions as a central part of inference, rather than a mere methodological detail, can help produce more robust results and improve the way scientific evidence is evaluated, compared, and accumulated.”