Evid Based Spine Care J 2014; 05(01): 002-005
DOI: 10.1055/s-0034-1371445
Science in Spine
Georg Thieme Verlag KG Stuttgart · New York

Credibility Matters: Mind the Gap

Andrea C. Skelly
1   Spectrum Research, Inc., Tacoma, Washington, United States

Publication Date: 28 March 2014 (online)

Introduction

Clinicians, policy makers, and patients need to be able to rely on high-quality scientific research to make informed decisions about health care options and policy. Frustration ensues at all levels when confidence in the quality and integrity of the available research on spine care is low. When research quality is low and its reporting is poor, clinicians and patients may be left confused about the best health care options, and policy makers may decline to reimburse treatments or diagnostic modalities deemed ineffective on the basis of the available evidence. At the most basic level, all parties want the same thing: to do what “works,” yet all suffer when confidence in the evidence is low.

There is a credibility gap that spans all aspects of medical research, from study planning and reporting to the availability of data for verification and final publication. Several studies provide empirical evidence on publication and related biases and show how conclusions may differ depending on what is and is not reported, and how.[1] [2] [3] One example of publication-related bias is the recent controversy surrounding results from the Yale Open Data Access (YODA) studies[4] [5] as compared with the original trial publications on bone morphogenetic protein. A primary conclusion drawn from these reports was a call for timely and complete transparency in data reporting.[6] [7]

Subsequently, media and scientific circles have reiterated strong calls to reduce bias in study analysis and reporting.[8] [9] [10] [11]

Outcome reporting bias is one type of publication-related bias and remains an under-recognized problem.[12] [13] It occurs when some outcomes are selectively reported and others are not, often depending on the nature and direction of the findings. Beyond the ethical concerns such selective reporting raises, the reported results can be misleading. One example of its impact is an analysis of 283 Cochrane reviews by Kirkham et al, who report that 34% of reviews contained at least one trial with high suspicion of outcome reporting bias for the primary outcome.[12] Sensitivity analysis of these reviews revealed that the treatment effect estimate was reduced by 20% or more in 23% of them. After adjustment for outcome reporting bias, 19% of meta-analyses with a statistically significant result became non-significant, and 26% would have overestimated the treatment effect by 20% or more. Such distortion can affect policy making and clinical decision making and may ultimately result in harm to patients.
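
To illustrate the mechanism, the brief simulation below is a minimal sketch (not taken from this article or from Kirkham et al); the number of trials, number of outcomes per trial, standard error, and fixed-effect pooling are all illustrative assumptions. It shows how a pooled estimate drifts away from a true null effect when each trial reports only its most favorable outcome rather than a prespecified one.

    # Minimal sketch (illustrative assumptions, not the authors' analysis):
    # each trial measures several outcomes whose true effect is zero, but
    # reports only the most favorable one; the pooled estimate is inflated.
    import numpy as np

    rng = np.random.default_rng(0)

    n_trials = 50           # trials contributing to a hypothetical meta-analysis
    outcomes_per_trial = 5  # outcomes measured within each trial
    se = 0.15               # assumed standard error of each effect estimate

    # True effect is zero for every outcome; estimates vary only by chance.
    estimates = rng.normal(loc=0.0, scale=se, size=(n_trials, outcomes_per_trial))
    ses = np.full(n_trials, se)

    def pooled_fixed_effect(effects, ses):
        """Inverse-variance (fixed-effect) pooled estimate."""
        w = 1.0 / ses**2
        return np.sum(w * effects) / np.sum(w)

    # Unbiased reporting: each trial reports its prespecified first outcome.
    unbiased = pooled_fixed_effect(estimates[:, 0], ses)

    # Selective reporting: each trial reports only its most favorable outcome.
    selective = pooled_fixed_effect(estimates.max(axis=1), ses)

    print(f"Pooled effect, prespecified outcome reported:   {unbiased:+.3f}")
    print(f"Pooled effect, most favorable outcome reported: {selective:+.3f}")

Even though every outcome's true effect is zero in this sketch, pooling only the most favorable outcome from each trial yields a spuriously positive pooled estimate; quantifying and correcting for this kind of distortion is what sensitivity analyses for outcome reporting bias attempt to do.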

Transparency and attention to detail in research design, specification of outcomes, analysis, reporting, and dissemination are critical to “minding the gap,” regardless of study design or the level and type of funding. This article, together with previous Science in Spine articles, describes key components of such transparency in conducting research, with a focus on outcome reporting.

 
References

1 Song F, Eastwood AJ, Gilbody S, Duley L, Sutton AJ. Publication and related biases. Health Technol Assess 2000; 4 (10) 1-115
2 Song F, Parekh S, Hooper L, et al. Dissemination and publication of research findings: an updated review of related biases. Health Technol Assess 2010; 14 (8) iii, ix–xi, 1-193
3 Dwan K, Altman DG, Cresswell L, Blundell M, Gamble CL, Williamson PR. Comparison of protocols and registry entries to published reports for randomised controlled trials. Cochrane Database Syst Rev 2011; (1) MR000031
4 Fu R, Selph S, McDonagh M, et al. Effectiveness and harms of recombinant human bone morphogenetic protein-2 in spine fusion: a systematic review and meta-analysis. Ann Intern Med 2013; 158 (12) 890-902
5 Simmonds MC, Brown JV, Heirs MK, et al. Safety and effectiveness of recombinant human bone morphogenetic protein-2 for spinal fusion: a meta-analysis of individual-participant data. Ann Intern Med 2013; 158 (12) 877-889
6 Kuntz RE. The changing structure of industry-sponsored clinical research: pioneering data sharing and transparency. Ann Intern Med 2013; 158 (12) 914-915
7 Resnick D, Bozic KJ. Meta-analysis of trials of recombinant human bone morphogenetic protein-2: what should spine surgeons and their patients do with this information? Ann Intern Med 2013; 158 (12) 912-913
8 AllTrials. All Trials Registered, All Trials Reported; 2013. Available at: http://www.alltrials.net/blog/
9 Ross JS, Gross CP, Krumholz HM. Promoting transparency in pharmaceutical industry-sponsored research. Am J Public Health 2012; 102 (1) 72-80
10 Ross JS, Krumholz HM. Ushering in a new era of open science through data sharing: the wall must come down. JAMA 2013; 309 (13) 1355-1356
11 Zarin D. Benefits of Sharing Clinical Research Data. Presented at the Institute of Medicine Workshop on Sharing Clinical Research Data; Washington, DC; October 4–5, 2012. Available at: www.iom.edu/Reports/2013/Sharing-Clinical-Research-Data.aspx
12 Kirkham JJ, Dwan KM, Altman DG, et al. The impact of outcome reporting bias in randomised controlled trials on a cohort of systematic reviews. BMJ 2010; 340: c365
13 Dwan K, Gamble C, Williamson PR, Kirkham JJ; Reporting Bias Group. Systematic review of the empirical evidence of study publication bias and outcome reporting bias - an updated review. PLoS ONE 2013; 8 (7) e66844
14 Raich AL, Skelly AC. Asking the right question: specifying your study question. Evid Based Spine Care J 2013; 4 (2) 68-71
15 Rios LP, Ye C, Thabane L. Association between framing of the research question using the PICOT format and reporting quality of randomized controlled trials. BMC Med Res Methodol 2010; 10: 11
16 Lee MJ, Norvell DC, Dettori JR, Skelly AC, Chapman JR, eds. SMART Handbook for Spine Clinical Research. New York: Thieme; 2013
17 Methods Guide for Effectiveness and Comparative Effectiveness Reviews. AHRQ Publication No. 10(12)-EHC063-EF. Rockville, MD; 2012. Available at: www.effectivehealthcare.ahrq.gov
18 Whitlock EP, Lopez SA, Chang S, Helfand M, Eder M, Floyd N. Identifying, selecting, and refining topics. Methods Guide for Effectiveness and Comparative Effectiveness Reviews. Rockville, MD: Agency for Healthcare Research and Quality; 2009. Available at: http://www.effectivehealthcare.ahrq.gov/index.cfm/search-for-guides-reviews-and-reports/?pageaction=displayproduct&productid=318
19 Chan AW, Hróbjartsson A, Haahr MT, Gøtzsche PC, Altman DG. Empirical evidence for selective reporting of outcomes in randomized trials: comparison of protocols to published articles. JAMA 2004; 291 (20) 2457-2465
20 Dwan K, Gamble C, Kolamunnage-Dona R, Mohammed S, Powell C, Williamson PR. Assessing the potential for outcome reporting bias in a review: a tutorial. Trials 2010; 11: 52