Methods Inf Med 2012; 51(02): 122-130
DOI: 10.3414/ME10-01-0066
Original Articles
Schattauer GmbH

Health Services Research Evaluation Principles

Broadening a General Framework for Evaluating Health Information Technology
P. S. Sockolow
1   College of Nursing and Health Professions, Drexel University, Philadelphia, PA, USA
P. R. Crawford
2   Division of Health Sciences Informatics, School of Medicine, The Johns Hopkins University, Baltimore, MD, USA
3   Department of International Health, Bloomberg School of Public Health, The Johns Hopkins University, Baltimore, MD, USA
H. P. Lehmann
2   Division of Health Sciences Informatics, School of Medicine, The Johns Hopkins University, Baltimore, MD, USA
4   Department of Health Policy and Management, Bloomberg School of Public Health, The Johns Hopkins University, Baltimore, MD, USA
5   Department of Biostatistics, Bloomberg School of Public Health, The Johns Hopkins University, Baltimore, MD, USA

Publication History

received: 09 September 2010

accepted: 04 February 2011

Publication Date:
19 January 2018 (online)

Summary

Background: Our forthcoming national experiment in increased health information technology (HIT) adoption, funded by the American Recovery and Reinvestment Act of 2009, will require a comprehensive approach to evaluating HIT. The quality of HIT evaluation studies to date limits the generalizability of findings and the depth of lessons learned, revealing a need for broader evaluation frameworks.

Objective: To develop an informatics evaluation framework for health information technology (HIT) that integrates components of health services research (HSR) evaluation and informatics evaluation, addressing identified shortcomings in available HIT evaluation frameworks.

Method: A systematic literature review updated and expanded the exhaustive review by Ammenwerth and de Keizer (AdK). From the retained studies, evaluation criteria were elicited and organized into classes within a framework. The resulting Health Information Technology Research-based Evaluation Framework (HITREF) was used to guide construction of a clinician satisfaction survey, multi-dimensional analysis of the data, and interpretation of the findings in an evaluation of a vanguard community health care EHR.

Results: The updated review identified 128 electronic health record (EHR) evaluation studies and seven evaluation criteria not in AdK: EHR Selection/Development/Training; Patient Privacy Concerns; Unintended Consequences/Benefits; Functionality; Patient Satisfaction with EHR; Barriers/Facilitators to Adoption; and Patient Satisfaction with Care. HITREF was used productively and proved to be a complete evaluation framework, encompassing all themes that emerged.

Conclusions: We recommend that future EHR evaluators consider adding a complete, research-based HIT evaluation framework, such as HITREF, to their suite of evaluation tools to monitor HIT challenges as the federal government strives to increase HIT adoption.

 
References

  • 1 Health Information Technology CMS Information Related to the Economic Recovery Act of 2009. Available at http://www.cms.hhs.gov/Recovery/11_HealthIT.asp Accessed Aug 4 2009
  • 2 Ammenwerth E, Gräber S, Herrmann G, Bürkle T, König J. Evaluation of health information systems - problems and challenges. Int J Med Inform 2003; 71 (02-03) 125-135.
  • 3 Stoop AP, Berg M. Integrating quantitative and qualitative methods in patient care information system evaluation: guidance for the organizational decision maker. Methods Inf Med 2003; 42 (04) 458-462.
  • 4 Han YY, Carcillo JA, Venkataraman ST, Clark RS, Watson RS, Nguyen TC. et al Unexpected increased mortality after implementation of a commercially sold computerized physician order entry system. Pediatrics 2005; 116 (06) 1506-1512.
  • 5 Del Beccaro MA, Jeffries HE, Eisenberg MA, Harry ED. Computerized provider order entry implementation: no association with increased mortality rates in an intensive care unit. Pediatrics 2006; 118 (01) 290-295.
  • 6 Ammenwerth E, Talmon J, Ash JS, Bates DW, Beuscart-Zephir MC, Duhamel A. et al Impact of CPOE on mortality rates--contradictory findings, important messages. Methods Inf Med 2006; 45 (06) 586-593.
  • 7 Yusof MM, Papazafeiropoulou A, Paul RJ, Stergioulas LK. Investigating evaluation frameworks for health information systems. Int J Med Inform 2008; 77 (06) 377-385.
  • 8 Currie LM. Evaluation frameworks for nursing informatics. Int J Med Inform 2005; 74 (11-12) Nursing Informatics Special Issue 908-916.
  • 9 Davis FD, Bagozzi RP, Warshaw PR. User Acceptance of Computer Technology: A Comparison of Two Theoretical Models. Management Science 1989; 35 (08) 982.
  • 10 Labkoff SE, Yasnoff WA. A framework for systematic evaluation of health information infrastructure progress in communities. Journal of Biomedical Informatics 2007; 40 (02) 100-105.
  • 11 Sittig DF, Shiffman RN, Leonard K, Friedman C, Rudolph B, Hripcsak G. et al A draft framework for measuring progress towards the development of a National Health Information Infrastructure. BMC Med Inform Decis Mak 2005; 5 (01) 14.
  • 12 DeLone WH, McLean ER. Information systems success: The quest for the dependent variable. Inf Sys Res 1992; 3: 60-95.
  • 13 Goodhue DL, Thompson RL. Task-Technology Fit and Individual Performance. MIS Quarterly 1995; 19 (02) 213.
  • 14 Horsky J, Kaufman DR, Oppenheim MI, Patel VL. A framework for analyzing the cognitive complexity of computer-assisted clinical ordering. J Biomed Inform 2003; 36 (01-02) 4-22.
  • 15 Kaplan B. Addressing organizational issues into the evaluation of medical systems. JAMIA 1997; 4 (02) 94-101.
  • 16 Winkelman WJ, Leonard KJ. Overcoming structural constraints to patient utilization of electronic medical records: a critical review and proposal for an evaluation framework. JAMIA 2004; 11 (02) 151-61.
  • 17 Green CJ, Moehr J. Performance evaluation frameworks for vertically integrated health care systems: Shifting paradigms in Canada. JAMIA 2000; 7: 315
  • 18 Lau F, Hagens S, Muttitt S. A proposed benefits evaluation framework for health information systems in Canada. Healthc Q 2007; 10 (01) 112.
  • 19 Kukafka R, Johnson SB, Linfante A, Allegrante JP. Grounding a new information technology implementation framework in behavioral science: a systematic analysis of the literature on IT use. J Biomed Inform 2003; 36 (03) 218-227.
  • 20 Holden RJ, Karsh BT. A review of medical error reporting system design considerations and a proposed cross-level systems research framework. Hum. Factors 2007; 49 (02) 257-276.
  • 21 Aqil A, Lippeveld T. RHINO Network: PRISM Framework. Available at http://www.rhinonet.org/tiki-index.php?page=PRISM%20Framework Accessed July 1 2010
  • 22 Stead WW, Haynes RB, Fuller S, Friedman CP, Travis LE, Beck JR. et al Designing medical informatics research and library - resource projects to increase what is learned. JAMIA 1994; 1 (01) 28-33.
  • 23 Chiasson M, Reddy M, Kaplan B, Davidson E. Expanding multi-disciplinary approaches to healthcare information technologies: What does information systems offer medical informatics? Int J Med Inform 2007; 76 (Suppl. 01) S89-S97.
  • 24 Yusof MM, Kuljis J, Papazafeiropoulou A, Stergioulas LK. An evaluation framework for Health Information Systems: human, organization and technology-fit factors (HOT-fit). Int J Med Inform 2008; 77 (06) 386-398.
  • 25 Kazanjian A, Green CJ. Beyond effectiveness: the evaluation of information systems using a comprehensive health technology assessment framework. Comput Biol Med 2002; 32 (03) 165-177.
  • 26 Scott I, Campbell D. Health services research: what is it and what does it offer?. Intern Med J 2002; 32 (03) 91-99.
  • 27 Institute of Medicine Division of Health Care Services. Health Services Research: Report of a Study. 1979 p 112.
  • 28 Kukafka R. et al Issues and opportunities in public health informatics: A panel discussion. J Public Health Manag Pract 2001; 7 (06) 31-42.
  • 29 Mandl KD, Lee TH. Integrating medical informatics and health services research: the need for dual training at the clinical health systems and policy levels. JAMIA 2002; 9 (02) 127-132.
  • 30 UBC Library: Information Page for Cochrane Central Register of Controlled Trials (CENTRAL). Available at http://toby.library.ubc.ca/resources/infopage.cfm?id=619 Accessed Mar 6 2008
  • 31 Ammenwerth E, de Keizer N. An inventory of evaluation studies of information technology in health care - Trends in evaluation research 1982-2002. Methods Inf Med 2005; 44 (01) 44-56.
  • 32 The Cochrane Collaboration - Cochrane Review structure. Available at http://www.cochrane.org/reviews/revstruc.htm Accessed Mar 6 2008
  • 33 Urquhart C. An encounter with grounded theory: tackling the practical and philosophical issues. In: Qualitative research in IS: issues and trends. Hershey, PA, USA: IGI Publishing; 2001. p 104.
  • 34 Tan K, Dear P, Newell SJ. Clinical decision support systems for neonatal care. Cochrane Database Syst Rev 2007; (03).
  • 35 Rossi PH, Lipsey MW, Freeman HE. Evaluation: a systematic approach. 7th ed. Thousand Oaks, CA: Sage Publications; 2004.
  • 36 ISO/DTR 20514. Health Informatics - Electronic Health Record - Definition, Scope, and Context. 2004.
  • 37 Donabedian A. Evaluating the Quality of Medical Care. Milbank Q 2005; 83 (04) 691-729.
  • 38 Burns N, Grove SK. Practice of Nursing Research. 4th ed. Philadelphia: W. B. Saunders; 2001.
  • 39 Sockolow PS, Weiner JP, Bowles KH, Lehmann HP. A new instrument for measuring clinician satisfaction with electronic health records. CIN; in press.
  • 40 Sockolow PS. et al Advice for decision makers based on an electronic health record evaluation at a Program for All-inclusive Care for Elders site. Appl Clin Inf 2011; 2 (01) 18-38.
  • 41 A diffusion of innovations model of physician order entry. Proceedings / AMIA Annual Symposium; 2001.
  • 42 Harrison MI, Koppel R, Bar-Lev S. Unintended Consequences of Information Technologies in Health Care An Interactive Sociotechnical Analysis. JAMIA 2007; 14 (05) 542-549.
  • 43 Ammenwerth E. Personal communication. 2010 Feb 1.
  • 44 Nøhr C. et al Development, implementation and diffusion of EHR systems in Denmark. Int J Med Inform 2005; 74 (02-04) 229-234.
  • 45 Kaplan B, Brennan PF. Consumer informatics supporting patients as co-producers of quality. J Am Med Inform Assoc 2001; 8 (04) 309-316.
  • 46 Friedman CP. "Smallball" evaluation: a prescription for studying community-based information interventions. J Med Libr Assoc 2005; 93 (04) Suppl S43-S48.
  • 47 Board of Regents of the University of Wisconsin System. Program Development and Evaluation. 2005 Available at http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html Accessed Dec 2010