Approach used to conduct meta-analyses may affect outcomes

August 12, 2014

Estimates of treatment outcomes in meta-analyses may differ depending on the analysis strategy used, and these differences may substantially alter the conclusions derived from the analysis, according to a study in the August 13 issue of JAMA.

Meta-analyses of randomized clinical trials (RCTs) are generally considered to provide among the best evidence of the efficacy of medical interventions. They should be conducted as part of a systematic review, a scientifically rigorous approach that identifies, selects, and appraises all relevant studies. Which trials to combine in a meta-analysis remains a persistent dilemma: a meta-analysis of all trials may produce a precise but biased estimate, according to background information in the article.

Agnes Dechartres, M.D., Ph.D., of the Centre de Recherche Epidemiologie et Statistique, INSERM U1153, Paris, and colleagues compared treatment outcomes estimated by meta-analysis of all trials and several alternative strategies for analysis: single most precise trial (i.e., trial with the narrowest confidence interval), meta-analysis restricted to the 25 percent largest trials, limit meta-analysis (a meta-analysis model adjusted for small-study effect), and meta-analysis restricted to trials at low overall risk of bias. The researchers included 163 meta-analyses published between 2008 and 2010 in high-impact-factor journals and between 2011 and 2013 in the Cochrane Database of Systematic Reviews: 92 (705 RCTs) with subjective outcomes and 71 (535 RCTs) with objective outcomes.
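The strategies compared above can be illustrated with a small sketch. The trial data below are hypothetical (not from the Dechartres et al. study), and the fixed-effect inverse-variance model is a simplifying assumption; it shows how pooling all trials, selecting the single most precise trial (smallest standard error), and restricting to the largest 25 percent of trials can yield different estimates.

```python
import math

# Hypothetical trial data: (effect estimate, standard error, sample size).
# Illustrative values only; small trials show larger effects than the one
# large, precise trial, mimicking a small-study effect.
trials = [
    (-0.60, 0.30, 120),
    (-0.45, 0.25, 180),
    (-0.50, 0.40, 80),
    (-0.10, 0.08, 2400),  # one large, precise trial
]

def pooled_estimate(subset):
    """Fixed-effect inverse-variance pooled estimate and its standard error."""
    weights = [1 / se**2 for _, se, _ in subset]
    est = sum(w * e for w, (e, _, _) in zip(weights, subset)) / sum(weights)
    return est, math.sqrt(1 / sum(weights))

# Strategy 1: meta-analysis of all trials.
all_est, all_se = pooled_estimate(trials)

# Strategy 2: single most precise trial (narrowest CI = smallest SE).
most_precise = min(trials, key=lambda t: t[1])

# Strategy 3: restrict to the largest 25 percent of trials by sample size.
k = max(1, len(trials) // 4)
largest = sorted(trials, key=lambda t: -t[2])[:k]
large_est, large_se = pooled_estimate(largest)

print(f"All trials:   {all_est:.3f} (SE {all_se:.3f})")
print(f"Most precise: {most_precise[0]:.3f} (SE {most_precise[1]:.3f})")
print(f"Largest 25%:  {large_est:.3f} (SE {large_se:.3f})")
```

With these made-up numbers, pooling all trials yields a more extreme estimate than either best-evidence alternative, which is the general pattern the study reports.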

The researchers found that estimates differed depending on the analytic strategy used, with treatment outcomes frequently being larger with meta-analysis of all trials than with the single most precise trial, meta-analysis of the largest trials, and limit meta-analysis. The difference in treatment outcomes between these strategies was substantial in 47 of 92 (51 percent) meta-analyses of subjective outcomes and in 28 of 71 (39 percent) meta-analyses of objective outcomes. The authors did not find any difference in treatment outcomes by overall risk of bias.

"In this study, we compared meta-analysis of all trials with several 'best­evidence' alternative strategies and found that estimated treatment outcomes differed depending on the strategy used. We cannot say which strategy is the best because … we cannot know with 100 percent certainty the truth in any research question. Nevertheless, our results raise important questions about meta-analyses and outline the need to re­think certain principles," the researchers write.

"We recommend that authors of meta-analyses systematically assess the robustness of their results by performing sensitivity analyses. We suggest the comparison of the meta-analysis result to the result for the single most precise trial or meta­analysis of the largest trials and careful interpretation of the meta-analysis result if they disagree."

Jesse A. Berlin, Sc.D., of Johnson & Johnson, Titusville, N.J., and Robert M. Golub, M.D., Deputy Editor, JAMA, write in an accompanying editorial that "findings such as those in the study by Dechartres et al reinforce concerns that journals and readers have about meta-analysis as a study design. Those findings deserve consideration not only in the planning of the studies but also in journal peer review and evaluation. They also reinforce the need for circumspection in study interpretation."

"Meta-analysis has the potential to be the best source of evidence to inform decision making. The underlying methods have become much more sophisticated in the last few decades, but achieving this potential will require continued advances in the underlying science, parallel to the advances that have occurred with other biomedical research design and statistics. Until that occurs, an informed reader must approach these studies, as with all other literature, as imperfect information that requires critical appraisal and assessment of applicability of the findings to individual patients. This is not easy, and it requires skill and intelligence. Whatever clinical evidence looks like, and wherever it is placed on a pyramid, there are no shortcuts to truth."

More information: DOI: 10.1001/jama.2014.8166
DOI: 10.1001/jama.2014.8167


