An analysis of psychological meta-analyses reveals a reproducibility problem

Meta-analyses in psychology aren't always reproducible due to a lack of transparent reporting in the meta-analysis process, according to a new study published May 27, 2020 in the open-access journal PLOS ONE by Esther Maassen of Tilburg University, the Netherlands, and colleagues.

Meta-analysis is a widely used method for combining and comparing results from multiple primary studies. The statistical approach used in meta-analyses can reveal whether study outcomes differ based on particular study characteristics, and it can help compute an overall effect size (for instance, the magnitude of a treatment effect) for the topic of interest. However, many steps of a meta-analysis involve decisions and judgments that can be arbitrary or differ between researchers.
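To make that idea concrete, here is a minimal sketch, in Python with invented numbers, of inverse-variance (fixed-effect) pooling, one common way a meta-analysis combines primary-study effect sizes into an overall estimate. It illustrates the general technique only and is not the specific procedure used in the meta-analyses examined in the study.

```python
# Minimal illustration of fixed-effect, inverse-variance pooling.
# All effect sizes and variances below are invented for demonstration.
import math

effect_sizes = [0.42, 0.10, 0.35, 0.28]   # hypothetical per-study effect sizes
variances    = [0.02, 0.05, 0.03, 0.04]   # hypothetical sampling variances

# Studies with smaller sampling variance (more precision) get larger weights.
weights = [1.0 / v for v in variances]
pooled = sum(w * es for w, es in zip(weights, effect_sizes)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))        # standard error of the pooled estimate

print(f"Pooled effect size: {pooled:.3f} (SE {se:.3f})")
```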

In the new study, the researchers analyzed 33 meta-analysis articles in the field of psychology. The meta-analyses were all published in 2011 or 2012, all included data tables of primary studies, and all drew on at least ten primary studies. For each meta-analysis, the team searched for the corresponding primary study articles, followed the methods detailed in the meta-analysis article, and recomputed a total of 500 effect sizes reported in the meta-analyses.

Out of the 500 primary study effect sizes, the researchers were able to reproduce 276 (55%) without any problems. (In this case, reproducibility was defined as arriving at the same result after reanalyzing the same data following the reported procedures.) In other cases, however, the meta-analyses did not contain enough information to reproduce a study's effect size, or the recomputed value differed from the one stated: 114 effect sizes (23%) showed discrepancies compared with what was reported in the meta-analytic article. Overall, 30 of the 33 meta-analyses contained at least one effect size that could not be easily reproduced.
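To illustrate what recomputing a single effect size can look like, the hypothetical sketch below recalculates a standardized mean difference (Cohen's d, one common effect size metric) from the kind of summary statistics a primary article might report and compares it with a reported value. All numbers are invented, and the meta-analyses examined may have used other effect size metrics and procedures.

```python
# Hypothetical recomputation of one primary-study effect size (Cohen's d)
# from reported group summary statistics; all values are invented.
import math

m1, sd1, n1 = 23.4, 6.1, 48   # treatment group: mean, SD, sample size
m2, sd2, n2 = 20.1, 5.8, 51   # control group: mean, SD, sample size

# Pooled standard deviation across both groups
sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
d = (m1 - m2) / sd_pooled

reported_d = 0.58             # value listed in the meta-analysis table (hypothetical)
print(f"Recomputed d = {d:.2f}; reported d = {reported_d:.2f}; "
      f"discrepancy = {abs(d - reported_d):.2f}")
```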

When the erroneous or unreproducible effect sizes were fed back into each meta-analysis, the team found that 13 of the 33 meta-analyses (39%) showed discrepancies in their overall results, although many of the differences were negligible. The researchers recommend additions to the existing guidelines for publishing psychological meta-analyses to make them more reproducible.

The authors add: "Individual effect sizes from meta-analyses in psychology are difficult to reproduce due to inaccurate and incomplete reporting in the meta-analysis. To increase the trustworthiness of meta-analytic results, it is essential that researchers explicitly document their data handling practices and workflow, as well as publish their data and code online."

More information: Maassen E, van Assen MALM, Nuijten MB, Olsson-Collentine A, Wicherts JM (2020) Reproducibility of individual effect sizes in meta-analyses in psychology. PLoS ONE 15(5): e0233107. doi.org/10.1371/journal.pone.0233107

Journal information: PLoS ONE
Citation: An analysis of psychological meta-analyses reveals a reproducibility problem (2020, May 27) retrieved 18 March 2024 from https://medicalxpress.com/news/2020-05-analysis-psychological-meta-analyses-reveals-problem.html
