Keywords: monitoring and surveillance - public health - patient records - influenza, human - automation
Background and Significance
The category of pneumonia and influenza is a top ten leading cause of death in the United States.[1] Each year, influenza infection results in up to 710,000 hospitalizations, 95,000 intensive care unit admissions, and 27,000 deaths nationwide.[2] [3] Influenza-associated hospitalization (IAH) is an essential metric for hospital operations and infection prevention activities and, as of 2017, is reportable to public health authorities in at least nine states in the United States, including our state of Ohio.[4] The Ohio Department of Health (ODH) generally defines IAH as a patient with a clinically compatible illness who has a positive influenza test collected within 14 days before or 3 days after admission to an inpatient location of an acute care hospital.[5]
Surveillance for both hospital-associated and community-acquired infections has traditionally been the responsibility of hospitals' departments of infection prevention. Measurement and public reporting of IAH are particularly burdensome for hospital infection preventionists (IPs) during times of high influenza activity. Incomplete reporting of notifiable communicable diseases to public health is a well-established problem.[6] [7] [8] While computerized surveillance software and electronic laboratory reporting (ELR) to public health have greatly enhanced surveillance practice over the last several years, the data quality issues associated with person-dependent case finding remain.[9] [10]
The Cleveland Clinic is a multinational health system consisting of 11 hospitals and 5 free-standing emergency departments in the state of Ohio in the United States. The major limitation to ELR for IAH at the Cleveland Clinic is the inability to transmit cases involving a positive influenza test result during an emergency department encounter that results in subsequent admission to the hospital. Our ELR messages rely on patient status data at the instant of the test result; thus, the ELR system cannot detect an emergency department patient who is admitted after an influenza-positive test result. These emergency department patients represent a large proportion of total IAH and must be reported individually by IPs.
Documentation in the electronic medical record (EMR) recorded as part of routine clinical practice can be successfully leveraged to automate influenza detection.[11] [12] Existing literature establishing the superiority of electronic surveillance for influenza has primarily involved syndromic surveillance as opposed to laboratory-confirmed influenza.[13] [14] Defining a gold standard against which to assess the performance of a novel surveillance system is challenging. Previous research has described joining data from disparate information systems and medical record review for this purpose.[15] [16] Recent research has illustrated the utility of electronic infection surveillance and data visualization for informing decision making.[17]
Objectives
Our objectives were to develop a fully automated surveillance system for laboratory-confirmed IAH in our multihospital health system, to evaluate the performance of the automated system during the 2018 to 2019 influenza season at eight hospitals by comparing its sensitivity and positive predictive value (PPV) to those of the manual surveillance system, and to estimate the time and cost savings associated with reliance on the automated surveillance system.
Methods
Manual Surveillance System
At the Cleveland Clinic health system, IPs at each hospital manually detect and record
cases of IAH using commercial surveillance software (TheraDoc, Premier, Inc., Charlotte,
North Carolina, United States). During business hours, IPs review all positive influenza
tests from hospital and emergency department encounters and, if the patient meets
the ODH case definition for IAH, create a Notifiable Disease Document in TheraDoc.
The case is then reported to public health in accordance with the State Administrative
Code by manually keying required patient data into the State's Web-based Ohio Disease
Reporting System (ODRS). All test results for influenza are reviewable in TheraDoc
except for one point-of-care molecular test used in the emergency department at one
hospital. All hospitals in our health system use TheraDoc, creating a system-wide
database for manually recorded IAH. System-wide manual surveillance summaries are
generated using the Notifiable Disease Document reporting functionality of TheraDoc.
For this study, 24 IPs at 11 hospitals in our health system were asked the following
question by email: “During peak activity, approximately and on average, how many minutes
per day do you spend on inpatient influenza surveillance?” In this survey, “peak influenza
activity” was not objectively defined.
Automated Surveillance System
All hospitals in our health system use an interoperable EMR (EPIC, Epic Systems Corporation,
Verona, Wisconsin, United States) and share a home-grown universal data repository
called the Enterprise Data Vault (EDV). We programmed a Teradata (Teradata Corporation,
San Diego, California, United States) database to prospectively report cases of IAH
from EDV. The health system's Enterprise Analytics Division developed EDV to make the clinical and laboratory information recorded in our EMR, as well as International Classification of Diseases, 10th Revision (ICD-10) data, accessible to clinicians and information systems practitioners.
Our database joined all positive molecular influenza test data to associated hospital and free-standing emergency department encounters. Data fields included influenza test results and encounter dates and locations, including unit of discharge. The database, automatically updated daily with extracts from EPIC, was set to feed a Tableau (Tableau Software, Seattle, Washington, United States) dashboard that summarized and reported the total number of IAH to hospital stakeholders weekly via email subscription. Cases in this automated system included all patients with positive influenza tests collected within the first three calendar days of an inpatient or observation hospital encounter or during an emergency department encounter that resulted in admission to any of our acute care hospitals.
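To make the inclusion rule concrete, the following is a minimal Python sketch of the logic described above; the record structures, field names, and helper function are illustrative assumptions, not our production Teradata code:

from datetime import date, timedelta
from dataclasses import dataclass

@dataclass
class Encounter:
    patient_id: str
    encounter_type: str          # "inpatient", "observation", or "emergency"
    start: date                  # first calendar day of the encounter
    resulted_in_admission: bool  # relevant for emergency encounters

@dataclass
class FluTest:
    patient_id: str
    collected: date
    result: str

def is_automated_case(test: FluTest, enc: Encounter) -> bool:
    # Positive molecular influenza test joined to the encounter.
    if "positive" not in test.result.lower():
        return False
    if enc.encounter_type in ("inpatient", "observation"):
        # Collected within the first three calendar days of the encounter.
        return enc.start <= test.collected <= enc.start + timedelta(days=2)
    if enc.encounter_type == "emergency":
        # Emergency encounter that resulted in admission to an acute care hospital.
        return enc.resulted_in_admission
    return False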
Surveillance System Comparison
We designed our working definition of a “true case” to closely approximate the ODH case definition for IAH. For this project, a true case was defined as any patient admitted to an inpatient ward of an acute care hospital who had a positive influenza test in the EMR collected within 14 days before or 3 days after that admission.
Patients with a positive influenza test joined to an encounter resulting in discharge from an inpatient hospital ward in EDV were considered true cases, as were patients with a positive influenza test collected during an emergency department encounter resulting in transfer to an acute care hospital. We considered this link in EDV between a positive test and an inpatient ward encounter to be equivalent to a medical record review. One IP performed medical record review of the remaining possible cases to determine whether they met our case definition of IAH ([Fig. 1]) and to determine why they were not detected by the automated system.
Fig. 1 Process flow for ascertaining whether patients met the definition of a true case
of influenza-associated hospitalization.
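Expressed as a date comparison, the working true-case window reduces to the following sketch (a hypothetical helper for illustration, not the review process itself):

from datetime import date, timedelta

def meets_true_case_window(test_collected: date, admission: date) -> bool:
    # Positive influenza test collected within 14 days before or 3 days after
    # admission to an inpatient ward of an acute care hospital.
    return admission - timedelta(days=14) <= test_collected <= admission + timedelta(days=3)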
To establish a gold standard of true cases of IAH against which to assess the performance of the surveillance systems, a list of all possible cases was generated by linking records by patient name and date of hospitalization from four independent information systems. Possible cases included all records from eight of our hospitals between September 1, 2018 and April 15, 2019 in the following four sources (a conceptual sketch of the linkage follows the list):
The automated system (Teradata-EDV query).
The manual system (TheraDoc) Notifiable Disease Documents for IAH.
Patients admitted with an influenza-related diagnosis (ICD-10 codes).
The ODRS IAH reports.
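Conceptually, the master list is a union of these four sources deduplicated on patient name and hospitalization date. A minimal sketch of that linkage is shown below; the file names and column names are illustrative assumptions (the actual linkage was performed in SAS Enterprise Guide and Excel):

import pandas as pd

# Hypothetical exports from the four sources; paths and columns are assumptions.
sources = {
    "automated": "edv_query.csv",
    "manual": "theradoc_notifiable_disease.csv",
    "icd10": "billing_influenza_dx.csv",
    "odrs": "odrs_iah_reports.csv",
}

frames = []
for name, path in sources.items():
    df = pd.read_csv(path, parse_dates=["hospitalization_date"])
    df["source"] = name
    frames.append(df[["patient_name", "hospitalization_date", "source"]])

# One row per distinct patient/hospitalization, listing the sources in which
# the record appeared.
possible_cases = (
    pd.concat(frames)
    .groupby(["patient_name", "hospitalization_date"], as_index=False)["source"]
    .agg(lambda s: ", ".join(sorted(set(s))))
)
print(len(possible_cases), "distinct possible cases")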
Eight of the 11 hospitals in our health system were selected for this analysis because a list of patients was available from all four databases. IPs at 3 of the 11 hospitals did not routinely enter cases of IAH directly into ODRS during the study period. Because of the unreliability of IP-entered cases of IAH in ODRS at those three hospitals, we excluded them from our study.
We queried our billing information system independently for a list of patients admitted
during the study period with an admission diagnosis related to influenza virus infection.
The following ICD-10 codes were included in the query: J09.X1, J10.00, J10.1, J11.00,
J11.1, and Z87.09.
IPs at each hospital generated a list of IAHs that they reported to ODRS between September
1, 2018 and April 30, 2019. ODRS data extraction is based on date of report, so cases
were excluded if the hospitalization date occurred after April 15, 2019.
Based on guidelines from the United States Centers for Disease Control and Prevention for evaluating the performance of public health surveillance systems,[18] we calculated the sensitivity and PPV of our manual and automated IAH surveillance systems. The sensitivity was calculated as the percent of true cases detected. PPV was the percent of all cases detected by each system that met our case definition for IAH.
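In symbols, with TP denoting detected true cases, FN true cases a system missed, and FP detected records that did not meet the case definition, these definitions correspond to:

\[
\text{Sensitivity} = \frac{TP}{TP + FN} \times 100\%,
\qquad
\text{PPV} = \frac{TP}{TP + FP} \times 100\%
\]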
Data were joined and managed using SAS Enterprise Guide version 7.15 (SAS Institute,
Cary, North Carolina, United States) and Microsoft Excel (Microsoft Corp, Redmond,
Washington, United States). The incidence of IAH reported by each surveillance system
was calculated using MedCalc for Windows, version 15.0 (MedCalc Software, Ostend,
Belgium).
Results
A list of 1,031 distinct patients with possible IAH was generated from the four databases.
There were 646 total patients reported by the manual system and 877 total patients
reported by the automated system from 8 hospitals ([Table 1 ]). As illustrated in [Table 1 ], the percent of cases in each system that were true IAH (PPV) varied between hospitals,
but the range of variation was narrower for the manual system compared with the automated
system (83–100 vs. 64–100, respectively).
Table 1
Number of patients detected by two surveillance systems for IAH that did (true case) and did not (not true case) meet our case definition for IAH, and PPV, eight hospitals, September 1, 2018 to April 15, 2019

              Automated system                              Manual system
              True cases   Not true cases   Total           True cases   Not true cases   Total
              N (%)        N (%)                            N (%)        N (%)
Hospital 1    70 (64)      39 (36)          109             57 (86)      9 (14)           66
Hospital 2    73 (87)      11 (13)          84              69 (84)      13 (16)          82
Hospital 3    172 (86)     29 (14)          201             121 (83)     25 (17)          146
Hospital 4    41 (100)     0 (0)            41              42 (91)      4 (9)            46
Hospital 5    173 (100)    0 (0)            173             133 (97)     4 (3)            137
Hospital 6    97 (92)      8 (8)            105             103 (89)     13 (11)          116
Hospital 7    86 (98)      2 (2)            88              5 (100)      0 (0)            5
Hospital 8    62 (82)      14 (18)          76              47 (98)      1 (2)            48
Total         774 (88)     103 (12)         877             577 (89)     69 (11)          646
PPV           88.3%                                         89.3%

Abbreviations: IAH, influenza-associated hospitalization; PPV, positive predictive value.
We established 844 true cases of IAH. An admission diagnosis related to influenza infection was present in 688 (82%) true cases. Thirty-five (4%) true cases with an influenza-related admission diagnosis appeared in none of the other three databases. There were 767 hospitalizations reported to ODRS, of which 690 (90%) were true cases.
The distribution of cases reported in each surveillance system by whether they met the case definition is shown in [Table 2]. The sensitivity and PPV of the manual system were 68.4 and 89.3%, respectively, while those of the automated system were 91.7 and 88.3%. The cumulative incidence of true IAH was 1.03 per 100 admissions by the automated system and 0.76 per 100 admissions by the manual system (rate ratio: 1.34, 95% confidence interval: 1.20–1.49).
Table 2
Number of patients in two surveillance systems that did and did not meet the case definition of influenza-associated hospitalization, eight hospitals, September 1, 2018 to April 15, 2019

                                    True case    Not a true case
                                    N (%)        N
Detected by automated system        774 (92)     103
Not detected by automated system    70 (8)       74,496
Automated system sensitivity        91.7%
Detected by manual system           577 (68)     69
Not detected by manual system       267 (32)     74,530
Manual system sensitivity           68.4%
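As a check, the sensitivities and the rate ratio follow directly from the counts in Table 2 (the implied denominator of 75,443 admissions is the sum of all four cells for either system). The confidence interval is reproduced here with the standard log-normal approximation for a ratio of counts over a common denominator, which we assume approximates the MedCalc calculation:

\[
\text{Sensitivity}_{\text{automated}} = \frac{774}{774 + 70} = 91.7\%,
\qquad
\text{Sensitivity}_{\text{manual}} = \frac{577}{577 + 267} = 68.4\%
\]
\[
\text{RR} = \frac{774 / 75{,}443}{577 / 75{,}443} = \frac{774}{577} \approx 1.34,
\qquad
95\%\ \text{CI} = \exp\!\left(\ln\text{RR} \pm 1.96\sqrt{\tfrac{1}{774} + \tfrac{1}{577}}\right) \approx (1.20,\ 1.49)
\]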
From the 10 IPs who responded to our inquiry (survey response rate: 42%), we calculated that an average of 82 minutes per day per IP was spent on manual IAH surveillance and reporting during peak influenza activity (range: 30–180 minutes).
Discussion
We successfully developed and implemented a fully automated, highly sensitive surveillance system for IAH that detected significantly more true cases of IAH than were recorded manually. As illustrated in [Table 1], the proportion of manually recorded cases that are true IAH varies between hospitals in our health care system. Centralized surveillance leveraging documentation in the EMR, as has been described for device-associated infection denominators,[19] frees IP time, is reliable, and reduces interhospital variability in case detection.
Our findings are consistent with those of others who have found that technological solutions for influenza surveillance may outperform manual methods. Automated surveillance systems developed at one health care system can potentially be replicated at other institutions.[20] [21] With the vast majority of acute care hospitals having now adopted EMRs,[22] replication of this database development at other health care systems is feasible.
Of the true cases not detected by the automated system, 58 (83%) were patients who had a positive test collected at an outpatient visit, were sent home, and were later admitted within 14 days. It is quite possible that some of these 58 patients were admitted for issues not related to influenza infection and therefore did not meet the actual ODH case definition. Nine (13%) influenza test results did not appear discretely in the EMR. Six of those nine patients had positive point-of-care rapid influenza antigen tests that are no longer in use within our health care system and have been replaced by molecular influenza tests. Our Teradata query searched for the word “positive” in the test result component. Because the point-of-care rapid tests did not contain the word “positive” in the result component, our system did not detect them. That these missed tests are no longer in use carries implications for the future performance of the surveillance system. Because the tests that were missed have been replaced by molecular tests that the system does detect, we anticipate that the sensitivity and PPV of the automated system will improve after our study period.
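A minimal sketch of this failure mode (the result strings are hypothetical; the production logic lives in the Teradata query, not in Python):

def looks_positive(result_component: str) -> bool:
    # Flag a result only if the word "positive" appears in the result component,
    # mirroring the string match the query relied on.
    return "positive" in result_component.lower()

print(looks_positive("Influenza A RNA: POSITIVE"))   # True - detected
print(looks_positive("Influenza A Ag: Detected"))    # False - a result worded this
                                                     # way would have been missed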
We cannot be certain that additional true cases of IAH did not remain hidden from
our effort to establish a master list of all possible cases, resulting in an overestimation
of the sensitivities of both systems. For instance, we would not have detected a patient
with a positive influenza test during an outpatient encounter who was later admitted
and discharged from a hospital outside of normal business hours and did not have an
influenza-related diagnosis. In this study, only 13 (2%) true cases missed by the automated system lacked a diagnosis of influenza, so the prevalence of still-hidden cases is likely low.
Limitations of relying on computerized surveillance include extract intervals and technical downtime. Our database extracts from the EMR at 24-hour intervals, so during normal operation, there is a maximum delay of 24 hours before IAH cases are reported. If the database fails to update at the normal interval, reporting of cases through the dashboard might be delayed further. However, the computerized component of manual surveillance subjects it to this same limitation. The sum of these limitations to fully automated surveillance for IAH is no greater than the issues of reliability associated with person-dependent case finding and record creation.
Cost Savings
The majority of IP time related to IAH surveillance is now spent manually keying patient data into ODRS. Historically, each of the 30 IPs in our health system was responsible for keying reportable IAH into ODRS. Automated IAH database development enabled us to centralize the clerical task of reporting IAH to public health to a single individual. With an average of 82 IP minutes per day saved during the 4 weeks of peak influenza activity, this project resulted in an estimated infection prevention payroll redirection of $32,880 through those four intensive weeks of case reporting. With the adoption of automated surveillance for IAH, IP time can be redirected to more clinically meaningful activities during a time of the year when in-person consultation is particularly important.
Conclusion
Surveillance for high-volume infections requiring low cognitive effort, such as IAH,
can be entirely automated with greater sensitivity than is practical through manual
surveillance. Elimination of interrater variability, a lodestar of public health surveillance,
can be achieved by removing the limitations inherent to person-dependent case finding
and reporting.
Clinical Relevance Statement
While accurate and timely surveillance for IAH provides essential intelligence for
hospital operations, it may also inform clinical practice decisions including those
related to empiric antiviral treatment and chemoprophylaxis. Leveraging technology
to improve the efficiency and accuracy of surveillance for IAH has the potential to
impact lives by informing risk as it relates to infectious disease activity. Unburdening
infection preventionists from clerical tasks frees up their time for more valuable
patient care activities.
Multiple Choice Questions
1. How did the authors describe the performance of their automated surveillance system
for influenza-associated hospitalization relative to manual surveillance?
Correct Answer: The correct answer is choice a. The authors calculated the sensitivity of their automated
system to be 91.7% while that of the manual system was 68.4%.
2. Which of the following sources of information were used to create a master list
of cases of IAH against which to assess the performance of the surveillance systems?
Emergency room roster of patients with influenza.
Patients hospitalized with ICD-10 codes related to influenza.
School absenteeism reports.
Health department reports of influenza-like illness.
Correct Answer: The correct answer is b. True cases of IAH were ascertained by reviewing the medical
records of patients in the automated surveillance system, the manual surveillance
system, the Department of Health Disease Reporting System, and a list of patients
with ICD-10 codes related to influenza virus infection.