AAU Lecturer Contributes to Major International Study on Reproducibility and Robustness in Economics and Political Science
AAU congratulates Professor Zuzana Irsova Havrankova, who contributed to a major study on reproducibility and robustness in economics and political science, published in Nature, one of the world's most prestigious scientific journals.
The study found high rates of computational reproducibility and substantial robustness among recent articles from leading economics and political science journals that require data and code sharing. It offers encouraging evidence about the field's credibility and highlights the positive impact of open data policies. This is good news for science: for years, there has been concern that many published studies may be flawed, exaggerated, or impossible to replicate, a problem widely known as the 'replication crisis.'
Researchers from across the career spectrum, from graduate students to senior professors, and from institutions large and small, all contributed to the project. This achievement highlights AAU's exceptional faculty and its commitment to academic excellence and research.
“This was a massive collaborative effort: over 300 researchers from institutions around the world participated, so my contribution was part of a much larger collective endeavor organized by the Institute for Replication, led by Abel Brodeur from the University of Ottawa. What made it exciting was that each of us brought our own expertise to bear on a shared scientific question: can we actually trust published research findings?” said Havránková.
How was it done?
The research was a large-scale “reproduction” study: independent researchers took the shared data and code from 110 recently published articles in top economics and political science journals and checked whether they could reproduce the published results.
The researchers also ran “robustness checks,” re-analyzing the data with alternative methods to see whether the original conclusions would still hold up under scrutiny.
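The idea behind a robustness check can be sketched in a few lines of Python. This is a toy illustration with made-up numbers, not the study's actual code: estimate an effect under an original specification, re-estimate it under a reasonable alternative (here, trimming an outlier), and ask whether the direction of the effect survives.

```python
from statistics import mean

# Hypothetical data: outcomes for a "treated" and a "control" group.
treated = [2.1, 2.4, 1.9, 2.6, 8.0]   # 8.0 is a deliberate outlier
control = [1.0, 1.3, 0.9, 1.2, 1.1]

def effect(t, c):
    """Original specification: simple difference in group means."""
    return mean(t) - mean(c)

def effect_trimmed(t, c):
    """Alternative specification: drop the largest value in each group."""
    return mean(sorted(t)[:-1]) - mean(sorted(c)[:-1])

original = effect(treated, control)        # 2.30
alternative = effect_trimmed(treated, control)  # 1.20

# A minimal robustness verdict: does the effect keep its sign under reanalysis?
same_direction = (original > 0) == (alternative > 0)
print(f"original estimate:    {original:.2f}")
print(f"alternative estimate: {alternative:.2f}")
print(f"same direction:       {same_direction}")
```

In the actual study each reanalysis was far richer, but the logic is the same: a finding that flips sign or loses significance under a defensible alternative specification is fragile; one that does not is robust.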
“The methodology is very transferable. In fact, similar approaches have already been used in psychology and medicine. Any field that relies on data analysis: public health, education research, climate science, all could benefit from this kind of systematic, independent verification. The infrastructure we built, including the ‘replication games’ format where researchers gather for a day to reproduce a study together, is scalable and could be adapted across disciplines,” said Havránková.
What did the study find?
The study found that over 85% of the published claims were computationally reproducible using the original data and code. Furthermore, 72% of the originally significant findings remained statistically significant and pointed in the same direction even when researchers changed the testing methods.
The median size of the effects found in the reproduced studies was almost identical (99%) to what the original publications reported. However, there were also lessons to learn: some studies contained coding errors, and there was evidence of publication bias, with some results just barely crossing the threshold of statistical significance.
Why does this matter?
For years, researchers and others in academia feared that many studies could not be replicated and were therefore neither accurate nor meaningful in their contributions. This mega-study suggests that many findings in this selective sample are reproducible and often robust to reasonable reanalysis.
“The broader implication is that transparency and open science practices, so sharing data, sharing code, inviting independent scrutiny, those practices genuinely work. Journals that introduced data editors saw rapid improvements in reproducibility. If this kind of large-scale, community-driven verification becomes a normal part of science, it could substantially increase public and policy trust in research findings. That matters enormously at a time when evidence-based decision-making is under pressure,” concluded Havránková.
Find the full study on the reproducibility and robustness of economics and political science research here.