Appl Clin Inform 2018; 09(01): 072-081
DOI: 10.1055/s-0037-1621702
Research Article
Schattauer GmbH Stuttgart

Exploring Data Quality Management within Clinical Trials

Lauren Houston, Yasmine Probst, Ping Yu, Allison Martin
Funding: None.
Publication History
17 August 2017
06 December 2017
Publication Date: 31 January 2018 (online)

Abstract

Background Clinical trials are an important research method for improving medical knowledge and patient care. Multiple international and national guidelines stipulate the need for data quality and assurance. Many strategies and interventions have been developed to reduce errors in trials, including standard operating procedures, personnel training, data monitoring, and design of case report forms. However, these guidelines are nonspecific about the nature and extent of the methods required.

Objective This article gathers information about the data quality tools and procedures currently used within Australian clinical trial sites, with the aim of developing standard data quality monitoring procedures to ensure data integrity.

Methods Relevant information about data quality management methods and procedures, error levels, data monitoring, and staff training and development was collected. Staff members from 142 clinical trials listed on the National Health and Medical Research Council (NHMRC) clinical trials Web site were invited to complete a short, self-reported, semiquantitative, anonymous online survey.

Results Twenty (14%) clinical trials completed the survey. Results from the survey indicate that procedures to ensure data quality vary among clinical trial sites. Centralized monitoring (65%) was the most common procedure to ensure high-quality data. Ten (50%) trials reported having a data management plan in place, and two sites utilized an error acceptance level to minimize discrepancies, set at <5% and 5 to 10%, respectively. The quantity of data variables checked (10–100%), the frequency of monitoring visits (once a month to annually), and the types of variables audited (100% of data, critical data only, or critical and noncritical data) varied among respondents. The average time spent on staff training was 11.58 hours per person over a 12-month period, and the type of training was diverse.
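To make the reported error acceptance levels concrete, the sketch below shows one way a site might compute an audit error rate from a source data verification check and test it against a threshold such as the <5% level mentioned above. The function names, field counts, and default threshold are illustrative assumptions, not procedures described by the surveyed trials.

```python
# Minimal sketch (assumptions, not the surveyed sites' actual procedures):
# compute the error rate from a source data verification audit and compare
# it with a site-defined acceptance level (e.g., <5%).

def error_rate(discrepant_fields: int, fields_verified: int) -> float:
    """Proportion of audited data fields that disagree with source documents."""
    if fields_verified == 0:
        raise ValueError("No fields were verified")
    return discrepant_fields / fields_verified


def within_acceptance(discrepant_fields: int, fields_verified: int,
                      threshold: float = 0.05) -> bool:
    """True if the audit error rate falls below the acceptance level."""
    return error_rate(discrepant_fields, fields_verified) < threshold


if __name__ == "__main__":
    # Hypothetical audit: 12 discrepancies found in 400 verified fields (3.0%).
    rate = error_rate(12, 400)
    print(f"Error rate: {rate:.1%}, within acceptance: {within_acceptance(12, 400)}")
```

Under these assumptions, a 3.0% error rate would pass a <5% acceptance level but would sit within the 5 to 10% band used by the second site only if fewer discrepancies were tolerated.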

Conclusion Clinical trial sites are implementing ad hoc, pragmatic methods to ensure data quality. The findings highlight the need for further research into "standard practice," focusing on the development and implementation of publicly available data quality monitoring procedures.

Authors' Contributions

L.H. conceptualized and formulated the research question, designed the study, performed the study, evaluated the data, drafted the initial manuscript, revised the manuscript, and approved the final manuscript as submitted. Y.P., P.Y., and A.M. made substantial contributions to the study design, analysis, and interpretation of the data, critically reviewed the manuscript, and approved the final manuscript as submitted.


Protection of Human and Animal Subjects

This study was performed in compliance with the World Medical Association Declaration of Helsinki on Ethical Principles for Medical Research Involving Human Subjects and was approved by the University of Wollongong Human Research Ethics Committee (HE16/131).


 