Appl Clin Inform 2019; 10(01): 001-009
DOI: 10.1055/s-0038-1676587
Research Article
Georg Thieme Verlag KG Stuttgart · New York

CDS in a Learning Health Care System: Identifying Physicians' Reasons for Rejection of Best-Practice Recommendations in Pneumonia through Computerized Clinical Decision Support

Barbara E. Jones
1   VA Salt Lake City IDEAS Center, VA Salt Lake City Healthcare System, Salt Lake City, Utah, United States
2   Division of Pulmonary and Critical Care Medicine, University of Utah, Salt Lake City, Utah, United States
,
Dave S. Collingridge
3   Intermountain Healthcare, Murray, Utah, United States
,
Caroline G. Vines
3   Intermountain Healthcare, Murray, Utah, United States
,
Herman Post
4   Homer Warner Center for Informatics, Intermountain Healthcare, Murray, Utah, United States
,
John Holmen
4   Homer Warner Center for Informatics, Intermountain Healthcare, Murray, Utah, United States
,
Todd L. Allen
5   Department of Emergency Medicine, Intermountain Healthcare, Murray, Utah, United States
,
Peter Haug
4   Homer Warner Center for Informatics, Intermountain Healthcare, Murray, Utah, United States
,
Charlene R. Weir
6   Department of Biomedical Informatics, University of Utah, Salt Lake City, Utah, United States
,
Nathan C. Dean
7   Division of Pulmonary and Critical Care Medicine, Intermountain Healthcare and University of Utah, Murray, Utah, United States
Funding This work was supported by Intermountain Healthcare and the Intermountain Research and Medical Foundation. The Research Electronic Data Capture (REDCap) tool is funded by a grant from the National Institutes of Health (CTSA 3UL1RR025764–02). Dr. Jones is funded by a career development award from the Veterans Affairs Health Services Research & Development (# IK2HX001908).

Address for correspondence

Barbara E. Jones, MD, MSc
VA Salt Lake City IDEAS Center, VA Salt Lake City Healthcare System
500 Foothill Drive Building 2, Salt Lake City, UT 84148-0002
United States   

Publication History

Received: 30 August 2018

Accepted: 09 November 2018

Publication Date:
02 January 2019 (online)

 

Abstract

Background Local implementation of guidelines for pneumonia care is strongly recommended, but the context of care that affects implementation is poorly understood. In a learning health care system, computerized clinical decision support (CDS) provides an opportunity to both improve and track practice, providing insights into the implementation process.

Objectives This article examines physician interactions with a CDS to identify reasons for rejection of guideline recommendations.

Methods We implemented a multicenter bedside CDS for the emergency department management of pneumonia that integrated patient data with guideline-based recommendations. We examined the frequency of adoption versus rejection of recommendations for site-of-care and antibiotic selection. We analyzed free-text responses provided by physicians explaining their clinical reasoning for rejection, using concept mapping and thematic analysis.

Results Among 1,722 patient episodes, physicians rejected recommendations to send a patient home in 24%, leaving text in 53%; reasons for rejection of the recommendations included additional or alternative diagnoses beyond pneumonia, and comorbidities or signs of physiologic derangement contributing to risk of outpatient failure that were not processed by the CDS. Physicians rejected broad-spectrum antibiotic recommendations in 10%, leaving text in 76%; differences in pathogen risk assessment, additional patient information, concern about antibiotic properties, and admitting physician preferences were given as reasons for rejection.

Conclusion While adoption of CDS recommendations for pneumonia was high, physicians rejecting recommendations frequently provided feedback, reporting alternative diagnoses, additional individual patient characteristics, and provider preferences as major reasons for rejection. CDS that collects user feedback is feasible and can contribute to a learning health system.



Background and Significance

As the leading infectious cause of death in the United States, pneumonia is a major target for quality improvement. Timely and accurate decision-making surrounding diagnosis, site of care, and treatment with antibiotics is crucial to optimize outcomes and typically occurs through a complex integration of information from the patient and electronic health record (EHR) ([Fig. 1]). Adherence to best-practice guidelines for the management of pneumonia has been associated with improved outcomes.[1] [2] [3] Local adaptation and implementation of best-practice guidelines is thus a grade I recommendation.[4] [5] However, adherence to guidelines is low, and widespread variation in practice and outcomes exists,[6] [7] [8] [9] [10] which may be due to provider uncertainty in the guidelines.[11] The best approaches to adaptation of evidence-based practice across health care systems are not well defined, and differences in settings, patient populations, and providers may create contextual challenges to standardizing practice.[12]

Fig. 1 Workflow.

Computerized clinical decision support (CDS) embedded in the EHR is a promising way to implement best practices reliably and sustainably across a health care system,[13] although few CDS tools for pneumonia have successfully impacted practice or outcomes.[14] [15] [16] We implemented a computerized CDS for pneumonia, “ePneumonia,” across four emergency departments (EDs). In an ecological pre–post study design, we found that implementation of ePneumonia was associated with a reduction of 30-day mortality, increase in high-risk hospitalizations, and increase in first-line antibiotics use compared with three control EDs that used paper-based guidelines without the CDS (previously reported[17]). However, we found no reduction in hospitalizations of low-risk patients, which we had anticipated based upon other implementation efforts[18] and examination of baseline practice patterns.[19] Computerized CDS offers the opportunity to probe more deeply into the rejection or adoption of best-practice recommendations, generate important feedback for CDS improvement, and enable us to learn from and engage physicians. We thus sought to examine physician interactions with the CDS to identify factors driving rejection or adoption of CDS recommendations.



Objectives

The aims of this study were to:

  1. Track the adoption versus rejection of best-practice guideline recommendations provided to ED physicians through their interaction with a bedside CDS.

  2. Examine clinical reasons provided by physicians for rejecting best-practice recommendations.



Methods

Setting and Intervention

Intermountain Healthcare comprises 22 hospitals in Utah and Idaho and has been a pioneer in the development of clinically oriented electronic health data systems and CDS.[20] In 1998, a paper-based pneumonia guideline was implemented with moderate success in all EDs.[21] [22] In 2011, at four of seven EDs in the urban central region of Salt Lake City, we implemented a computerized CDS tool that integrates individual patient data with guideline-based recommendations at the point of care for ED physicians caring for patients with suspected pneumonia.



Description of CDS Design and Implementation Process

ePneumonia is integrated into the physician workflow at a critical point in decision-making, when the physician has typically completed an initial evaluation of the patient and is synthesizing the patient's history, physical examination, and results of laboratory and radiographic tests to form a diagnosis and treatment plan ([Fig. 1]). The tool initially screens all ED patients with chest imaging for evidence of pneumonia based upon a Bayesian analysis of clinical data, including natural language processing of chest imaging reports, vital signs, chief complaint, and laboratory values.[23] If the estimated likelihood of pneumonia exceeds 40%, the provider is alerted through the electronic ED patient tracking board, which displays continuously updated information on ED patient census and status on every ED workstation computer. Additionally, the physician can initiate ePneumonia independently of the screening tool through a desktop icon. If he/she confirms suspected pneumonia, ePneumonia proceeds to extract patient age and comorbidities, vital signs, nursing assessments, laboratory values, prior microbiology data, and radiographic evidence of pleural effusions and multilobar infiltrates. Using this information, ePneumonia automatically calculates a 30-day mortality risk estimate,[24] and identifies hospital admission criteria,[19] severe community-acquired pneumonia criteria,[25] and risk factors for infection with resistant organisms (“health care-associated pneumonia”).[4] The tool integrates the patient data, risk assessments, and guideline-based management recommendations, including site of care (intensive care unit [ICU], hospital ward, or outpatient), diagnostic studies, and antibiotic selection, onto four sequential screens.
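The screening step can be illustrated with a minimal sketch. Only the 40% alert threshold comes from the text; the finding names, likelihood ratios, and baseline prevalence below are hypothetical, and the production tool uses a richer Bayesian model with natural language processing of imaging reports, not this simplified naive Bayes update.

```python
# Hypothetical likelihood ratios for pneumonia findings; the production
# system's Bayesian network and NLP of imaging reports are far richer.
LIKELIHOOD_RATIOS = {
    "infiltrate_on_imaging": 8.0,   # NLP-flagged chest imaging report
    "fever": 2.0,
    "cough_chief_complaint": 1.5,
    "elevated_wbc": 1.8,
}

PRIOR_PROB = 0.05        # assumed baseline prevalence among screened ED patients
ALERT_THRESHOLD = 0.40   # the tool alerts when estimated likelihood exceeds 40%

def pneumonia_probability(findings):
    """Naive Bayes update: prior odds multiplied by the LR of each present finding."""
    odds = PRIOR_PROB / (1 - PRIOR_PROB)
    for f in findings:
        odds *= LIKELIHOOD_RATIOS.get(f, 1.0)
    return odds / (1 + odds)

def should_alert(findings):
    """Push an alert to the ED tracking board when the threshold is crossed."""
    return pneumonia_probability(findings) > ALERT_THRESHOLD

print(round(pneumonia_probability(
    ["infiltrate_on_imaging", "fever", "cough_chief_complaint"]), 2))  # 0.56
print(should_alert(["infiltrate_on_imaging", "fever"]))                # True
print(should_alert(["cough_chief_complaint"]))                         # False
```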

All decisions and orders for patient care remain under the control and responsibility of the ED provider, who is fully supported regardless of whether he/she accepts or rejects any of the recommendations. When a recommendation is rejected, ePneumonia asks the physician to provide a reason. Common reasons for rejection of recommendations identified in the literature[6] [26] [27] are offered in a structured dropdown menu; however, a text box is also available for physicians to leave unstructured text as either an addition or alternative to the prespecified responses.
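A minimal record of one such provider–CDS interaction, with both the structured dropdown picks and the unstructured text, might look like the sketch below. The class and field names are illustrative, not the production schema.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical record of one physician-CDS interaction; field names are
# invented for illustration, not taken from ePneumonia's actual data model.
@dataclass
class RecommendationResponse:
    encounter_id: str
    recommendation: str                 # e.g., "antibiotics:broad_spectrum"
    adopted: bool
    structured_reasons: list = field(default_factory=list)  # dropdown selections
    free_text: Optional[str] = None                         # unstructured rationale

resp = RecommendationResponse(
    encounter_id="enc-001",
    recommendation="antibiotics:broad_spectrum",
    adopted=False,
    structured_reasons=["medication_allergy"],
    free_text="Cephalosporin reaction documented last admission.",
)
print(resp.adopted, resp.structured_reasons)
```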

We collaborated with clinician stakeholders during the development of ePneumonia through outreach to the four participating EDs, reviewing standard management of pneumonia with ED physicians and meeting with clinical leadership. We conducted preliminary testing of the tool in one of the EDs during December 2010 to May 2011, including face-to-face visits with physicians during clinical shifts to receive feedback to improve the design. ePneumonia was implemented into routine clinical care at four hospital EDs in May 2011. A usability survey of 72 physicians with experience using the CDS was conducted in November 2011, in which physicians reported high satisfaction, specifically in its usefulness for antibiotic recommendations and site-of-care decisions ([Supplementary Material], available in the online version).



Participants

We identified all cases in which providers used ePneumonia during the period of December 1, 2011 through November 30, 2012. Providers could use the tool multiple times for the same patient to refresh the clinical data; when this occurred, we examined results only from the final iteration. Cases in which the tool was launched but not completed to a treatment recommendation were excluded.



Measurements

For each provider–CDS interaction, we measured rejection or adoption of two major recommendations: (1) site of care (hospitalization to ICU or medical ward, or discharge from the ED), and (2) antibiotic selection. Physicians could proceed to the site-of-care recommendation but then stop before receiving the antibiotic selection recommendation; for these cases, we evaluated the site-of-care recommendations only. When the physician rejected the tool recommendation, we collected structured data of prespecified reasons from the provided dropdown menu selections, as well as free-text entries entered by the physician. Levels of reported satisfaction with the CDS were also measured among those physicians who completed the postimplementation usability survey ([Supplementary Material: Survey Methods and Results], available in the online version).



Analysis

Comparisons between proportions of cases with agreement versus disagreement were tested for significance using Fisher's exact and chi-square tests, as appropriate. We used generalized estimating equations (GEE) analysis to examine the relationships between repeated measures of physician agreement (yes vs. no) with site-of-care and antibiotic recommendations and the following provider variables: age, gender, years worked, attending or resident status, number of encounters, and usability of ePneumonia based on survey responses ([Supplementary Material], available in the online version).
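The proportion comparisons can be sketched with standard contingency-table tests from `scipy`. The counts below are hypothetical: the Results report only the 95% vs. 89% antibiotic-recommendation adoption rates, not per-facility denominators.

```python
from scipy.stats import chi2_contingency, fisher_exact

# Hypothetical adoption counts for two facilities (the paper reports 95% vs.
# 89% adoption; per-facility denominators are not given, so 800 is assumed).
#             adopted  rejected
facility_a = [760, 40]    # 95% adoption
facility_b = [712, 88]    # 89% adoption
table = [facility_a, facility_b]

# Chi-square test of independence on the 2x2 table.
chi2, p_chi2, dof, expected = chi2_contingency(table)

# Fisher's exact test on the same table (preferred for small cell counts).
odds_ratio, p_fisher = fisher_exact(table)

print(f"chi-square p = {p_chi2:.4g}")
print(f"Fisher's exact p = {p_fisher:.4g}")
```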

We analyzed the unstructured text left by physicians rejecting recommendations by applying concept mapping, a mixed methods analysis that combines group sorting with multidimensional scaling.[28] Two clinician investigators (N.D. and B.J.) reviewed each entry, merged, and parsed the constructs to ensure similar length and level of abstraction. Then, 15 practicing clinicians including physicians and nurse practitioners independently participated in a Delphi-type card sorting exercise, in which those cards that represented a similar concept were sorted together. We applied multidimensional scaling to the card sorting data by computing average distances between concepts and generated cluster trees according to average distances based upon complete linkage. N.D. and B.J. then identified themes representative of each cluster.
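The aggregation of card sorts into average distances and the complete-linkage clustering step can be sketched as follows. The cards, sorters, and pile assignments are invented for illustration, and the multidimensional scaling step is omitted; only the averaged-distance and complete-linkage ideas come from the text.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Hypothetical card sorts: each sorter assigns every card (a parsed free-text
# construct) to a pile; cards sorted together are treated as similar.
cards = ["hypoxia", "on vasopressors", "lung cancer", "CHF exacerbation",
         "homeless", "no primary care"]
sorts = [  # one pile label per card, per sorter
    [0, 0, 1, 1, 2, 2],
    [0, 0, 1, 1, 2, 2],
    [0, 0, 0, 1, 2, 2],
]

# Distance between two cards = fraction of sorters who separated them.
n = len(cards)
distance = np.zeros((n, n))
for piles in sorts:
    for i in range(n):
        for j in range(n):
            if piles[i] != piles[j]:
                distance[i, j] += 1.0 / len(sorts)

# Complete-linkage clustering on the averaged distances, as in the paper.
tree = linkage(squareform(distance, checks=False), method="complete")
clusters = fcluster(tree, t=3, criterion="maxclust")
print(dict(zip(cards, clusters)))
```

Cutting the tree at three clusters groups the severity cards, the alternative-diagnosis cards, and the social-risk cards, mirroring how themes were assigned to clusters.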

The study was approved by the Intermountain Institutional Review Board (IRB #1017598). Implied consent was obtained from all surveyed physicians by completion of the survey and from clinicians participating in the card-sorting exercise by their participation, and both were approved by the IRB; waiver of patient consent for CDS data collection was approved by the IRB. All statistical analyses were performed using SPSS (Version 19.0, IBM SPSS Statistics for Windows, Armonk, New York, United States) or Stata MP (Version 14.1, StataCorp, College Station, Texas, United States); the card-sorting exercise and concept mapping analysis were performed using the X-Sort software (http://xsortapp.com). Data analysis code is available upon request.



Results

The CDS provided site-of-care recommendations for 1,722 patient encounters, and antibiotic recommendations for 1,507 during the study period. Adoption of antibiotic recommendations was slightly higher at Intermountain Medical Center (95% vs. 89% at lowest-agreeing facility, p < 0.01); we found no significant between-hospital differences in adoption of site-of-care recommendations.

Physicians rejected site-of-care recommendations in 16% of visits, of which 84% reflected a provider's decision to place the patient in a higher-acuity site of care than recommended. Physicians rejected the outpatient recommendation in 24% of visits. Among the prespecified reasons for rejecting recommendations, physicians most commonly selected greater severity of illness than determined by the CDS, and uncontrolled comorbidities requiring hospitalization ([Table 1]). Providers rejecting site-of-care recommendations left free-text reasons for rejection in 53% of cases. Concept mapping and multidimensional scaling ([Fig. 2]) revealed two reasons for rejection of the outpatient recommendation that overlapped with prespecified constructs—severity of illness not extracted by the tool and clinical comorbidities—and three new themes: alternative or additional diagnoses, risks for outpatient failure due to functional status, and risk of outpatient failure due to social comorbidities such as homelessness and lack of access to care ([Table 1]).

Fig. 2 Concept mapping of site-of-care deviations.
Table 1 Reasons for rejection of best-practice recommendations. Values are percent (N) of disagreements.

Site of care (N = 266, 16% of all cases)

 Prespecified reasons for rejection:
  Patient sicker than estimated: 56% (148)
  Clinical judgment: 22% (58)
  Uncontrolled comorbid illnesses: 17% (46)
  Patient less sick than estimated: 15% (43)
  Patient preference: 6% (17)
  Oxygen requirements not appreciated by tool: 5% (12)
  No caregiver support: 4% (10)
  Critical care needs not appreciated by tool: 3% (8)
  Failed outpatient therapy: 3% (8)
  Can't tolerate PO meds: 1% (3)
  Pregnancy: 0.4% (1)
  Provider left free-text entry: 53% (141)

 Additional themes from provider free-text entries:
  Alternative/additional diagnoses: 21% (58)
  Risk of outpatient failure, functional status: 4% (12)
  Risk of outpatient failure, social comorbidities: 2% (6)

Antibiotics (N = 104, 7% of all cases)

 Prespecified reasons for rejection:
  Medication allergy: 38% (39)
  Immune compromised: 10% (10)
  Risk factors for MRSA not identified by tool: 3% (3)
  Risk factors for anaerobes not identified by tool: 3% (3)
  Provider left free-text entry: 75% (78)

 Additional themes from provider free-text entries:
  Previous treatments (antibiotics, interventions): 20% (20)
  Preferences of admitting physician: 11% (11)
  Differences in pathogen risk assessment: 9% (9)

Abbreviations: MRSA, methicillin-resistant Staphylococcus aureus; PO, per os (oral).


Physicians rejected antibiotic recommendations in only 7% of cases. Medication allergies not extracted by the CDS were the most common prespecified reasons for rejection ([Table 1]). The most common antibiotic recommendation rejected was to prescribe broad-spectrum antibiotics for patients meeting the criteria for health care-associated pneumonia, with a rejection rate of 10%; 76% of these providers left free-text reasons for deviating. Concept mapping and multidimensional scaling ([Fig. 3]) also surfaced properties of the antibiotics, such as allergy potential, as reasons for rejection. New themes identified included additional patient history including previous treatments, preferences of the admitting hospital physician, and differences in pathogen risk assessment ([Table 1]).

Fig. 3 Concept mapping of antibiotic deviations.

Adoption of the site-of-care recommendation was positively associated with older physician age, but negatively associated with perception of the tool's usefulness ([Table 2]). In contrast, adoption of the antibiotic recommendations was associated with younger physician age and positively associated with perception of the tool's usefulness ([Table 3]). We found no significant relationships between adoption and other provider characteristics, including gender, utilization, attending versus resident status, or other survey responses of satisfaction with the CDS.

Table 2 Physician characteristics and adoption of site-of-care recommendation

  Age ≤ 40 (reference): OR 1
  Age 41–50: OR 1.3 (95% CI 0.8–2.3), p = 0.26
  Age 51–60: OR 2.7 (95% CI 1.3–5.5), p = 0.007
  Age > 60: OR 4.1 (95% CI 1.5–5.5), p = 0.007
  Years worked in ED: OR 0.95 (95% CI 0.75–0.99), p = 0.009
  Overall tool usefulness (1–5): OR 0.80 (95% CI 0.67–0.96), p = 0.016
  Usefulness for ordering diagnostic studies (1–5): OR 0.87 (95% CI 0.75–0.99), p = 0.009
  Experience of technical difficulties (1–5): OR 0.77 (95% CI 0.66–0.91), p = 0.002

Abbreviations: CI, confidence interval; ED, emergency department; GEE, generalized estimating equations; OR, odds ratio.

Note: GEE regression model of N = 1,293 interactions and 58 physicians. Additional variables in the model found not to be significant included provider gender, number of tool uses per physician, reported usefulness of screening tool, and reported impact on clinical activity.


Table 3 Physician characteristics versus adoption of antibiotic recommendation

  Age ≤ 40 (reference): OR 1
  Age 41–50: OR 0.22 (95% CI 0.11–0.45), p < 0.001
  Age 51–60: OR 0.30 (95% CI 0.11–0.84), p = 0.02
  Age > 60: OR 0.26 (95% CI 0.10–0.65), p = 0.004
  Overall tool usefulness (1–5): OR 2.0 (95% CI 1.4–2.8), p < 0.001

Abbreviations: CI, confidence interval; ED, emergency department; GEE, generalized estimating equations; OR, odds ratio.

Note: GEE regression model of N = 1,293 interactions, 58 physicians. Additional variables in the model included provider gender, years worked in ED, reported usefulness of screening tool, usefulness for ordering diagnostic studies, and usability of tool.




Discussion

In a four-hospital ED implementation of a computerized CDS for pneumonia that improved care processes, we found that adoption of recommendations was high, but physicians who rejected best-practice recommendations in practice provided new reasons for rejection not previously highlighted in the literature, including alternative diagnoses and additional patient information used for decision-making but not extracted by the tool, which led to uncertainty in the recommendations. Examining physician–CDS interactions was feasible and provided insights into the implementation and adaptation of evidence-based practice in pneumonia.

Effective implementation of innovations in a complex system requires the ability of users to constantly reinvent and adapt innovations to different contexts.[29] Effective adaptation of CDS can be supported through the participation by users at the individual patient level to generate continuous feedback.[30] CDS designs are never perfect at the time of implementation and require some form of surveillance, but system-level monitoring can be time-consuming and costly.[31] By studying the interaction between physicians and a computerized CDS for pneumonia during its implementation, we sought to develop a sustainable approach to understand the adaptation process for continuous learning and CDS improvement.

CDS tools have been increasingly proposed as a way to improve processes in acute health care settings.[32] The ED is an ideal setting where CDS may enhance decision-making, specifically by reducing cognitive errors that can occur under time pressure and information overload.[33] However, a recent review of computerized CDS in the ED concluded that few studies have demonstrated a positive impact.[34] A lack of high-quality intervention studies may explain this failure to observe effectiveness, but it may also be a function of CDS implementation, which can require substantial outreach and understanding of barriers to CDS utilization.[35] For pneumonia, implementation of guideline-based CDS may be further challenged by provider uncertainty in the applicability or appropriateness of those guidelines, or in the information used by CDS to generate individual recommendations. The importance of usability in CDS is widely recognized,[36] [37] as is the potential for qualitative research to identify barriers to use of CDS.[38] In order for implementation of CDS to be successful, systems must be adapted locally, and they must be built to empower clinicians to provide feedback for continued development and learning.[39]

While we found a high rate of adoption of best-practice recommendations among physicians using our CDS, we also found that physicians who rejected recommendations were motivated to leave unstructured text entries in most cases. We had designed the CDS to integrate well into the workflow of a busy ED provider; thus, we took care to provide “quick” options from the dropdown menu. We were surprised by the number of physicians who contributed free-text entries given their time constraints, often giving information with themes that overlapped with those from the prespecified constructs. This resulted in a dialogue between the physician and the CDS and its creators, allowing the physician to teach the tool exceptions to its rules and enabling us to refine our design. The dialogue may also heighten a provider's awareness when objective data integrated by the tool are inconsistent with his/her assessment.

Some reasons for rejection were expected, and related to discordance in information used by physicians versus the CDS. Failure to identify allergies was the most common reason for rejection of the antibiotic recommendation, due to unreliable allergy information within the EHR. The EHR often either missed relevant antibiotic allergies, or more frequently provided incorrect information such as classifying nausea from amoxicillin as a penicillin allergy. Thus, providers had to manually revise allergy information at the time of the recommendation, leading to some discordance. We have since worked to improve the CDS' extraction of accurate allergy information.

We found that 84% of rejections of the site-of-care decision resulted in a higher acuity of care. This is consistent with our previously reported finding that the CDS increased high-risk hospitalizations, but did not significantly reduce low-risk hospitalizations. Failure to capture the full complexity and context of individual patients is a known limitation of both CDS and pneumonia guidelines, and our finding that physicians used additional information beyond the tool to assess severity was consistent with other studies.[12] [40] Physicians consider a multitude of other factors when choosing site of care, and tools that predict 30-day mortality or ICU needs are not a perfect surrogate for the estimate the physician truly has to make: the risk of failure if a patient receives the counterfactual of the treatment under consideration (i.e., if the patient is discharged home). Social comorbidities such as homelessness and mental illness are crucial to decision-making but are difficult to glean from structured data; this is a known blind spot for computerized CDS. Applying more clinically relevant risk prediction and developing approaches such as natural language processing of clinical documents to identify unstructured data such as social risk factors[41] [42] can improve the CDS' ability to better align with physician decision-making, and are the subjects of future work.

Other reasons for rejection represented new findings, which our thematic analysis of text revealed. We identified alternative diagnoses as an additional important reason to reject guideline recommendations not previously highlighted in the literature. Diagnostic uncertainty in pneumonia could be caused by many factors: several other diagnoses can present similarly, microbiologic confirmation is rarely identified, and accurate diagnosis can be challenged by time pressure and desire to treat.[43] [44] The diagnosis of pneumonia often changes during a hospital course.[45] CDS may have an important role in supporting clinicians' uncertainty, by synthesizing disconfirming evidence, suggesting alternative diagnoses, or recommending additional diagnostic testing. This may better align with decision-making, help reduce anchoring, and enhance providers' ability to consider alternative diagnoses throughout the clinical course.

Our study has some limitations. The CDS required voluntary participation by providers; thus, our results represent feedback from only those physicians who used it. Our examination of provider characteristics suggested a relationship between provider age and guideline adherence for site-of-care and antibiotic use. However, additional provider and patient characteristics in a larger sample of physicians would better inform these relationships. Our focus was on capturing provider-reported reasons for deviating from guidelines, so we did not examine patterns of disagreements by patient characteristics, as previous studies have done.[12] Reported reasons for rejecting may not always reflect the “true” reasons that physicians might deviate. Individuals are often not aware of factors affecting decision-making such as cognitive load, uncertainty, bias, economic or time pressures, or practice norms.[46] Responses may also be influenced by social desirability.[47] Citing patient factors as reasons for rejecting guidelines may be a more socially desirable and conscious response than reporting personal uncertainty. Our future work is directed at examining both reported and unreported patient and provider characteristics that influence physician behavior.

Our results have stimulated ongoing, continuous improvements of ePneumonia in our system. Since our evaluation, we have incorporated additional clinical data that providers reported using to make decisions, including continuous updates of patient data throughout the encounter, efforts to improve the allergy assessment, more information contributing to illness severity, and more accurate pathogen risk assessment. Utilization of ePneumonia increased from 63% during the study period to 90% over the following year with subsequent iterations. Future work includes the adaptation of ePneumonia to hospital EDs in our system, including rural and urgent care settings, exploration of CDS design that provides diagnosis support, and automating the collection of qualitative data from the interaction. Our study demonstrates the feasibility of CDS to engage physicians at the bedside, empower them to share their clinical and CDS experiences, and leverage those experiences to improve CDS design and pneumonia care across the system.



Conclusion

In a study aimed at understanding reasons for rejection of best-practice recommendations for pneumonia during the implementation of a computerized CDS, we found that the majority of physicians provided reasons for rejection using unstructured text. Thematic analysis of the text data revealed new reasons for rejection not previously highlighted in the literature, including alternative diagnoses to pneumonia and additional patient information used for decision-making but not extracted by the tool. CDS implementation that promotes and examines physician–CDS interactions is feasible and provides insights into the implementation and adaptation process across a health care system.



Clinical Relevance Statement

During implementation of a computerized clinical decision support (CDS) tool for pneumonia, we found that the majority of physicians rejecting best-practice recommendations in practice provided reasons for rejection using unstructured text. Thematic analysis of the text data revealed alternative diagnoses to pneumonia and additional patient information used for decision-making not specified in best-practice guidelines. CDS that engages physicians in a dialogue is feasible and provides insights into the implementation and adaptation of pneumonia guidelines across a learning health care system.



Multiple Choice Questions

  1. Which of the following statements about pneumonia is true?

    • There is little evidence to guide pneumonia diagnosis and management.

    • Pneumonia diagnosis and management is very straightforward, with minimal variation.

    • Pneumonia management demonstrates widespread variation in antibiotic selection and site-of-care decisions.

    • Clinical decision support tools have dramatically reduced variation and improved practice and outcomes for patients with pneumonia.

    Correct Answer: The correct answer is option c. While evidence-based guidelines are associated with improved outcomes, widespread variation in antibiotic selection and site-of-care decisions exists. Clinical decision support tools are a promising way to reduce the gap between evidence and practice, but they have not been consistently shown to dramatically impact practice.

  2. Which of the following statements about the results from this study is true?

    • Physicians rejected recommendations from the CDS most of the time.

    • Physicians who rejected recommendations from the CDS left reasons for rejection.

    • Physicians found the CDS difficult to use.

    • Physicians used the CDS for all pneumonia patients.

    Correct Answer: The correct answer is option b. Physicians often agreed with recommendations, but when they rejected recommendations from the CDS, they left reasons for rejection most of the time.



Conflict of Interest

None declared.

Protection of Human and Animal Subjects

The study was performed in compliance with the World Medical Association Declaration of Helsinki on Ethical Principles for Medical Research Involving Human Subjects, and was reviewed and approved by the Intermountain Healthcare Institutional Review Board (IRB #1017598). Implied consent was obtained from all surveyed physicians by completion of the survey, and was approved by the IRB; waiver of consent was approved by the IRB for tool data collection.


Supplementary Material

  • References

  • 1 Micek ST, Lang A, Fuller BM, Hampton NB, Kollef MH. Clinical implications for patients treated inappropriately for community-acquired pneumonia in the emergency department. BMC Infect Dis 2014; 14: 61
  • 2 Frei CR, Attridge RT, Mortensen EM, et al. Guideline-concordant antibiotic use and survival among patients with community-acquired pneumonia admitted to the intensive care unit. Clin Ther 2010; 32 (02) 293-299
  • 3 Aliberti S, Faverio P, Blasi F. Hospital admission decision for patients with community-acquired pneumonia. Curr Infect Dis Rep 2013; 15 (02) 167-176
  • 4 Mandell LA, Wunderink RG, Anzueto A, et al; Infectious Diseases Society of America; American Thoracic Society. Infectious Diseases Society of America/American Thoracic Society consensus guidelines on the management of community-acquired pneumonia in adults. Clin Infect Dis 2007; 44 (Suppl. 02) S27-S72
  • 5 Bunce AE, Gold R, Davis JV, et al. “Salt in the Wound”: safety net clinician perspectives on performance feedback derived from EHR data. J Ambul Care Manage 2017; 40 (01) 26-35
  • 6 Capp R, Chang Y, Brown DF. Effective antibiotic treatment prescribed by emergency physicians in patients admitted to the intensive care unit with severe sepsis or septic shock: where is the gap? J Emerg Med 2011; 41 (06) 573-580
  • 7 Jenkins TC, Stella SA, Cervantes L, et al. Targets for antibiotic and healthcare resource stewardship in inpatient community-acquired pneumonia: a comparison of management practices with National Guideline Recommendations. Infection 2013; 41 (01) 135-144
  • 8 Jones BE, Brown KA, Jones MM, et al. Variation in empiric coverage versus detection of methicillin-resistant Staphylococcus aureus and Pseudomonas aeruginosa in hospitalizations for community-onset pneumonia across 128 US Veterans Affairs medical centers. Infect Control Hosp Epidemiol 2017; 38 (08) 937-944
  • 9 Busby J, Purdy S, Hollingworth W. A systematic review of the magnitude and cause of geographic variation in unplanned hospital admission rates and length of stay for ambulatory care sensitive conditions. BMC Health Serv Res 2015; 15: 324
  • 10 Dean NC, Jones JP, Aronsky D, et al. Hospital admission decision for patients with community-acquired pneumonia: variability among physicians in an emergency department. Ann Emerg Med 2012; 59 (01) 35-41
  • 11 Eddy DM. Variations in physician practice: the role of uncertainty. Health Aff (Millwood) 1984; 3 (02) 74-89
  • 12 Halm EA, Atlas SJ, Borowsky LH, et al. Understanding physician adherence with a pneumonia practice guideline: effects of patient, system, and physician factors. Arch Intern Med 2000; 160 (01) 98-104
  • 13 Committee on the Learning Health Care System in America, Institute of Medicine; Smith M, Saunders R, Stuckhardt L, McGinnis JM, eds. Best Care at Lower Cost: The Path to Continuously Learning Health Care in America. Washington (DC): National Academies Press (US); 2013
  • 14 Garg AX, Adhikari NK, McDonald H, et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA 2005; 293 (10) 1223-1238
  • 15 Vines C, Dean NC. Technology implementation impacting the outcomes of patients with CAP. Semin Respir Crit Care Med 2012; 33 (03) 292-297
  • 16 Kellermann AL, Jones SS. What it will take to achieve the as-yet-unfulfilled promises of health information technology. Health Aff (Millwood) 2013; 32 (01) 63-68
  • 17 Dean NC, Jones BE, Jones JP, et al. Impact of an electronic clinical decision support tool for emergency department patients with pneumonia. Ann Emerg Med 2015; 66 (05) 511-520
  • 18 Marrie TJ, Lau CY, Wheeler SL, Wong CJ, Vandervoort MK, Feagan BG. A controlled trial of a critical pathway for treatment of community-acquired pneumonia. CAPITAL Study Investigators. Community-Acquired Pneumonia Intervention Trial Assessing Levofloxacin. JAMA 2000; 283 (06) 749-755
  • 19 Jones BE, Jones JP, Vines CG, Dean NC. Validating hospital admission criteria for decision support in pneumonia. BMC Pulm Med 2014; 14: 149
  • 20 Evans RS, Lloyd JF, Pierce LA. Clinical use of an enterprise data warehouse. AMIA Annu Symp Proc 2012; 2012: 189-198
  • 21 Dean NC, Silver MP, Bateman KA, James B, Hadlock CJ, Hale D. Decreased mortality after implementation of a treatment guideline for community-acquired pneumonia. Am J Med 2001; 110 (06) 451-457
  • 22 Dean NC, Suchyta MR, Bateman KA, Aronsky D, Hadlock CJ. Implementation of admission decision support for community-acquired pneumonia. Chest 2000; 117 (05) 1368-1377
  • 23 Lanspa MJ, Jones BE, Brown SM, Dean NC. Mortality, morbidity, and disease severity of patients with aspiration pneumonia. J Hosp Med 2013; 8 (02) 83-90
  • 24 Jones BE, Jones J, Bewick T, et al. CURB-65 pneumonia severity assessment adapted for electronic decision support. Chest 2011; 140 (01) 156-163
  • 25 Brown SM, Jones BE, Jephson AR, Dean NC. Validation of the Infectious Disease Society of America/American Thoracic Society 2007 guidelines for severe community-acquired pneumonia. Crit Care Med 2009; 37 (12) 3010-3016
  • 26 Aujesky D, McCausland JB, Whittle J, Obrosky DS, Yealy DM, Fine MJ. Reasons why emergency department providers do not rely on the pneumonia severity index to determine the initial site of treatment for patients with pneumonia. Clin Infect Dis 2009; 49 (10) e100-e108
  • 27 Schouten JA, Hulscher ME, Natsch S, Kullberg BJ, van der Meer JWM, Grol RPTM. Barriers to optimal antibiotic use for community-acquired pneumonia at hospitals: a qualitative study. Qual Saf Health Care 2007; 16 (02) 143-149
  • 28 Trochim W, Kane M. Concept mapping: an introduction to structured conceptualization in health care. Int J Qual Health Care 2005; 17 (03) 187-191
  • 29 Rogers EM. Diffusion of Innovations. New York: Free Press; 2003
  • 30 Yoshida E, Fei S, Bavuso K, Lagor C, Maviglia S. The value of monitoring clinical decision support interventions. Appl Clin Inform 2018; 9 (01) 163-173
  • 31 Kassakian SZ, Yackel TR, Gorman PN, Dorr DA. Clinical decisions support malfunctions in a commercial electronic health record. Appl Clin Inform 2017; 8 (03) 910-923
  • 32 Sahota N, Lloyd R, Ramakrishna A, et al; CCDSS Systematic Review Team. Computerized clinical decision support systems for acute care management: a decision-maker-researcher partnership systematic review of effects on process of care and patient outcomes. Implement Sci 2011; 6: 91
  • 33 Coiera E. Technology, cognition and error. BMJ Qual Saf 2015; 24 (07) 417-422
  • 34 Bennett P, Hardiker NR. The use of computerized clinical decision support systems in emergency care: a substantive review of the literature. J Am Med Inform Assoc 2017; 24 (03) 655-668
  • 35 Ballard DW, Vemula R, Chettipally UK, et al; KP CREST Network Investigators. Optimizing clinical decision support in the electronic health record. Clinical characteristics associated with the use of a decision tool for disposition of ED patients with pulmonary embolism. Appl Clin Inform 2016; 7 (03) 883-898
  • 36 Zhang J, Walji MF. TURF: toward a unified framework of EHR usability. J Biomed Inform 2011; 44 (06) 1056-1067
  • 37 Graham TA, Kushniruk AW, Bullard MJ, Holroyd BR, Meurer DP, Rowe BH. How usability of a web-based clinical decision support system has the potential to contribute to adverse medical events. AMIA Annu Symp Proc 2008; 2008: 257-261
  • 38 Miller A, Moon B, Anders S, Walden R, Brown S, Montella D. Integrating computerized clinical decision support systems into clinical work: a meta-synthesis of qualitative research. Int J Med Inform 2015; 84 (12) 1009-1018
  • 39 Werth GR, Connelly DP. Continuous quality improvement and medical informatics: the convergent synergy. Proc Annu Symp Comput Appl Med Care 1992; 631-635
  • 40 Chow AL, Lye DC, Arah OA. Patient and physician predictors of patient receipt of therapies recommended by a computerized decision support system when initially prescribed broad-spectrum antibiotics: a cohort study. J Am Med Inform Assoc 2016; 23 (e1): e58-e70
  • 41 Gundlapalli AV, Carter ME, Divita G, et al. Extracting concepts related to homelessness from the free text of VA electronic medical records. AMIA Annu Symp Proc 2014; 2014: 589-598
  • 42 Gundlapalli AV, Redd A, Carter M, et al. Validating a strategy for psychosocial phenotyping using a large corpus of clinical text. J Am Med Inform Assoc 2013; 20 (e2): e355-e364
  • 43 Welker JA, Huston M, McCue JD. Antibiotic timing and errors in diagnosing pneumonia. Arch Intern Med 2008; 168 (04) 351-356
  • 44 Kanwar M, Brar N, Khatib R, Fakih MG. Misdiagnosis of community-acquired pneumonia and inappropriate utilization of antibiotics: side effects of the 4-h antibiotic administration rule. Chest 2007; 131 (06) 1865-1869
  • 45 Jones BE, Jones M, Xi Z, et al. The “Working” Diagnosis: Changes in the Pneumonia Diagnosis Among Hospitalized Veterans. Society for Medical Decision-Making Annual Conference; 2015. Available at: https://smdm.confex.com/smdm/2015mo/webprogram/Paper9321.html. Accessed November 28, 2018
  • 46 Nisbett RE, Wilson TD. Telling more than we can know: verbal reports on mental processes. Psychol Rev 1977; 84 (03) 231-259
  • 47 Bradburn NM, Sudman S, Wansink B. Asking Questions: The Definitive Guide to Questionnaire Design–For Market Research, Political Polls, and Social and Health Questionnaires. Revised edition. San Francisco, CA: Jossey-Bass; 2004

Address for correspondence

Barbara E. Jones, MD, MSc
VA Salt Lake City IDEAS Center, VA Salt Lake City Healthcare System
500 Foothill Drive Building 2, Salt Lake City, UT 84148-0002
United States   


Fig. 1 Workflow.
Fig. 2 Revision. Concept mapping site-of-care deviations.
Fig. 3 Revision. Concept mapping antibiotic deviations.