Appl Clin Inform 2013; 04(02): 212-224
DOI: 10.4338/ACI-2012-12-RA-0053
Research Article
Schattauer GmbH

Diagnostic Performance of Electronic Syndromic Surveillance Systems in Acute Care

A Systematic Review
M. Kashiouris
1   Division of Pulmonary and Critical Care Medicine, Department of Internal Medicine, Mayo Clinic, Rochester, Minnesota, USA
2   M.E.T.R.I.C., Mayo Clinic, Rochester, Minnesota, USA
,
J.C. O’Horo
1   Division of Pulmonary and Critical Care Medicine, Department of Internal Medicine, Mayo Clinic, Rochester, Minnesota, USA
2   M.E.T.R.I.C., Mayo Clinic, Rochester, Minnesota, USA
,
B.W. Pickering
2   M.E.T.R.I.C., Mayo Clinic, Rochester, Minnesota, USA
3   Department of Anesthesiology, Division of Critical Care Medicine, Mayo Clinic, Rochester, Minnesota, USA
,
V. Herasevich
2   M.E.T.R.I.C., Mayo Clinic, Rochester, Minnesota, USA
3   Department of Anesthesiology, Division of Critical Care Medicine, Mayo Clinic, Rochester, Minnesota, USA

Correspondence to:

Vitaly Herasevich MD PhD
Department of Anesthesiology,
Division of Critical Care Medicine
Mayo Clinic College of Medicine 200 First Street SW
Rochester, MN 55905
Phone: 507-255-7002

Publication History

Received: 13 December 2012

Accepted: 29 April 2013

Publication Date:
19 December 2017 (online)

 

Summary

Context: Healthcare Electronic Syndromic Surveillance (ESS) is the systematic collection, analysis, and interpretation of ongoing clinical data, with subsequent dissemination of results to aid clinical decision-making.

Objective: To evaluate, classify and analyze the diagnostic performance, strengths and limitations of existing acute care ESS systems.

Data Sources: All available studies in the Ovid MEDLINE, Ovid EMBASE, CINAHL, and Scopus databases, from January 1972 through the first week of September 2012.

Study Selection: Prospective and retrospective trials examining the diagnostic performance of inpatient ESS and providing objective diagnostic data, including sensitivity, specificity, and positive and negative predictive values.

Data Extraction: Two independent reviewers extracted diagnostic performance data on ESS systems, including clinical area, number of decision points, sensitivity, and specificity. Positive and negative likelihood ratios were calculated for each healthcare ESS system. A likelihood matrix summarizing the performance of the various ESS systems was created.
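As an illustration of the likelihood-ratio arithmetic behind this extraction step, the sketch below applies the standard formulas (LR+ = sensitivity / (1 − specificity); LR− = (1 − sensitivity) / specificity) to hypothetical values. It is not the review's actual extraction code, and the example numbers are assumptions.

```python
# Minimal sketch of the standard likelihood-ratio calculation.
# Illustrates the arithmetic only; not the review's extraction code.

def likelihood_ratios(sensitivity: float, specificity: float) -> tuple[float, float]:
    """Return (LR+, LR-) from sensitivity and specificity given as proportions (0-1)."""
    lr_positive = sensitivity / (1.0 - specificity)   # LR+ = sens / (1 - spec)
    lr_negative = (1.0 - sensitivity) / specificity   # LR- = (1 - sens) / spec
    return lr_positive, lr_negative

# Example: a hypothetical ESS alert with 90% sensitivity and 85% specificity
print(likelihood_ratios(0.90, 0.85))  # -> (6.0, 0.117...)
```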

Results: The described search strategy yielded 1,639 articles. Of these, 1,497 were excluded based on abstract information. After full-text review, abstraction, and arbitration with a third reviewer, 33 studies met inclusion criteria, reporting 102,611 ESS decision points. The I² statistic was high (98.8%), precluding meta-analysis. Performance was variable, with sensitivities ranging from 21% to 100% and specificities ranging from 5% to 100%.
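For context on the heterogeneity measure cited above, the sketch below computes I² from Cochran's Q using the conventional definition, I² = max(0, (Q − df) / Q) × 100%. The effect estimates and variances are hypothetical, and the routine only illustrates the definition; it is not the analysis performed in the review.

```python
# Illustrative computation of the I^2 heterogeneity statistic from Cochran's Q.
# Hypothetical inputs; not the review's actual meta-analytic routine.
import numpy as np

def i_squared(effects: np.ndarray, variances: np.ndarray) -> float:
    """I^2 (%) from per-study effect estimates and within-study variances."""
    weights = 1.0 / variances                        # inverse-variance weights
    pooled = np.sum(weights * effects) / np.sum(weights)
    q = np.sum(weights * (effects - pooled) ** 2)    # Cochran's Q
    df = len(effects) - 1
    return max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0

# Hypothetical log-diagnostic-odds-ratio estimates from five studies
effects = np.array([2.1, 3.5, 1.2, 4.0, 2.8])
variances = np.array([0.05, 0.08, 0.04, 0.10, 0.06])
print(f"I^2 = {i_squared(effects, variances):.1f}%")
```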

Conclusions: There is significant heterogeneity in the diagnostic performance of the available ESS implementations in acute care, stemming from the wide spectrum of clinical entities and ESS systems studied. Based on these results, we introduce a conceptual framework using a likelihood ratio matrix for the evaluation and meaningful application of future frontline clinical decision support systems.
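As one hedged illustration of how a likelihood ratio matrix might be applied, the sketch below places a hypothetical ESS system into a quadrant using conventional rule-of-thumb thresholds (LR+ ≥ 10 to rule in, LR− ≤ 0.1 to rule out). The thresholds and quadrant labels are assumptions for illustration, not cutoffs specified by the review.

```python
# Illustrative placement of an ESS system on a likelihood-ratio matrix.
# Thresholds (LR+ >= 10 "rules in", LR- <= 0.1 "rules out") are conventional
# rules of thumb, not necessarily the cutoffs used in the review.

def matrix_quadrant(lr_positive: float, lr_negative: float) -> str:
    rules_in = lr_positive >= 10.0    # strong evidence for the condition when the alert fires
    rules_out = lr_negative <= 0.1    # strong evidence against when no alert fires
    if rules_in and rules_out:
        return "confirms and excludes (ideal screening and diagnostic tool)"
    if rules_in:
        return "confirms only (useful as a diagnostic alert)"
    if rules_out:
        return "excludes only (useful as a screening filter)"
    return "limited stand-alone value (requires clinical context)"

# Hypothetical sepsis-alert system with LR+ = 6.0 and LR- = 0.12
print(matrix_quadrant(6.0, 0.12))  # -> "limited stand-alone value (requires clinical context)"
```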

Citation: Kashiouris M, O’Horo JC, Pickering BW, Herasevich V. Diagnostic performance of electronic syndromic surveillance systems in acute care – a systematic review. Appl Clin Inf 2013; 4: 212–224

http://dx.doi.org/10.4338/ACI-2012-12-RA-0053


Conflict of Interest Statement

The authors report no research grants or conflicts of interest for this study.

