DOI: 10.1055/a-2040-0578
Human-Centered Design of a Clinical Decision Support for Anemia Screening in Children with Inflammatory Bowel Disease
- Abstract
- Background and Significance
- Objective
- Methods
- Results
- Discussion
- Conclusion
- Clinical Relevance Statement
- Multiple-Choice Questions
- References
Abstract
Background Inflammatory bowel disease (IBD) commonly leads to iron deficiency anemia (IDA). Rates of screening and treatment of IDA are often low. A clinical decision support system (CDSS) embedded in an electronic health record could improve adherence to evidence-based care. Rates of CDSS adoption are often low due to poor usability and fit with work processes. One solution is to use human-centered design (HCD), which designs CDSS based on identified user needs and context of use and evaluates prototypes for usefulness and usability.
Objectives This study aimed to use HCD to design a CDSS tool called the IBD Anemia Diagnosis Tool (IADx).
Methods Interviews with IBD practitioners informed the creation of a process map of anemia care, which an interdisciplinary team applying HCD principles used to create a prototype CDSS. The prototype was iteratively tested with "Think Aloud" usability evaluation with clinicians as well as semi-structured interviews, a survey, and observations. Feedback was coded and informed redesign.
Results Process mapping showed that IADx should function at in-person encounters and asynchronous laboratory review. Clinicians desired full automation of clinical information acquisition (e.g., laboratory trends) and analysis (e.g., calculation of iron deficit), less automation of clinical decision selection (e.g., laboratory ordering), and no automation of action implementation (e.g., signing medication orders). Providers preferred an interruptive alert over a noninterruptive reminder.
Conclusion Providers preferred an interruptive alert, perhaps due to the low likelihood of noticing a noninterruptive advisory. High levels of desire for automation of information acquisition and analysis with less automation of decision selection and action may be generalizable to other CDSSs designed for chronic disease management. This underlines the ways in which CDSSs have the potential to augment rather than replace provider cognitive work.
Keywords
clinical decision support systems - usability testing - human-centered design - clinical workflows
Background and Significance
Inflammatory bowel disease (IBD) affects up to 300,000 children in the United States[1] and is characterized by gut inflammation leading to bloody diarrhea, weight loss, and multisystemic complications.[2] The most common IBD complication is iron deficiency anemia (IDA),[3] which worsens quality of life[4] and developmental outcomes in children.[5] Guideline-based IBD IDA care involves annual screening for anemia and treatment with iron when deficiency is present, but adherence to guidelines is often poor.[6] [7]
Clinical decision support systems (CDSSs) integrated into electronic health records (EHRs) provide an opportunity to deliver evidence-based recommendations that are well integrated into systems of care.[8] [9] [10] [11] Human-centered design (HCD) has been applied to the development of CDSS[12] [13] [14] [15] to improve integration of CDSS tools into clinical workflows and to improve usability with the goals of improving patient safety, enhancing clinical outcomes, and improving process efficiency, among other aims.[12] [13] [16] [17] HCD involves in-depth analysis of work systems and care processes[12] [16] [18] to inform design of a CDSS prototype, which undergoes iterative redesign using human factors (HF) principles,[12] [19] [20] taking into account the five “rights” of clinical decision support (CDS).[21] Key gaps in the research of HCD of CDSS include methods of translating requirements into build and demonstration of how HCD contributes to design.[14] [22]
Objective
The objective was to use HCD methods to evaluate baseline care practices to inform the iterative design of a CDSS that was integrated into clinical workflows and designed to improve IBD IDA care, the IBD Anemia Diagnosis Tool or IADx.
Methods
Organizational Setting
This study was conducted in the Pediatric Gastroenterology Division at Johns Hopkins University School of Medicine.[23] At the time of the study, the division had 12 full-time faculty, 6 fellows, and 1 nurse practitioner who provided care to 500 children with IBD. All research was conducted with approval of the institutional review board. An interdisciplinary team was assembled including IBD experts (M.O.-H., S.H.), EHR software developers (S.D.M. and A.M.), health informaticians (S.D.M., H.P.L., Z.M., J.H.G., and P.N.), and an HF engineer (A.P.G.). [Table 1] summarizes the methodology used to carry out the HCD and formative evaluation of IADx.
Abbreviations: CDSS, clinical decision support system; SUS, system usability scale.[34]
Process Mapping
To determine the approach to anemia screening by clinicians, semi-structured interviews were performed with a junior and senior IBD clinician (S.D.M. and M.O.-H.) in the Pediatric Gastroenterology Division using an interview guide (see [Supplementary Material A], available in the online version). The interviewer used responses to draft a process map using draw.io,[24] which was validated in a follow-up interview.
IADx Prototype Design
A 1-hour virtual design meeting was held[25] with health informaticians and clinicians (S.D.M., H.P.L., and A.M.) in which the process map was used to identify tasks that could be supported by CDSS. The group selected an intervention involving an interruptive alert with a linked order set. A prototype was designed on paper and shown to senior IBD researchers (M.O.-H. and S.H.) for feedback. Over a series of eight additional 30-minute design sessions, S.D.M. and A.M. built a functioning EHR-based CDSS prototype in the EpicCare Ambulatory EHR.[26] Conditional rules were created to identify patients with IBD. Evidence-based laboratory thresholds[27] were used to determine whether or not patients had IDA, and Health Maintenance Plans in Epic were linked to laboratory monitoring tests. When the patient was due for at least one laboratory, or had IDA within the prior year and no iron prescription, a best practice advisory (BPA) with a linked order set containing preselected laboratories and medications was presented to the clinician.
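The triggering logic described above can be sketched as follows. This is an illustrative reconstruction, not the actual Epic build: the field names, laboratory thresholds, and one-year screening window are assumptions for the sake of the example.

```python
# Hypothetical sketch of the IADx BPA trigger logic; thresholds and
# field names are illustrative, not the study's actual Epic rules.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class PatientRecord:
    has_ibd: bool
    last_screen: Optional[date]   # most recent anemia screening labs
    hemoglobin: Optional[float]   # g/dL
    ferritin: Optional[float]     # ng/mL
    on_iron: bool                 # active iron prescription

def has_ida(p: PatientRecord, hgb_cutoff: float = 11.5,
            ferritin_cutoff: float = 30.0) -> bool:
    """Illustrative IDA rule; real evidence-based cutoffs are age/sex specific."""
    return (p.hemoglobin is not None and p.hemoglobin < hgb_cutoff
            and p.ferritin is not None and p.ferritin < ferritin_cutoff)

def bpa_should_fire(p: PatientRecord, today: date) -> bool:
    """Fire when screening labs are due, or when recent IDA is untreated."""
    if not p.has_ibd:
        return False
    labs_due = p.last_screen is None or (today - p.last_screen).days > 365
    untreated_ida = has_ida(p) and not p.on_iron
    return labs_due or untreated_ida
```

In this sketch, the two firing conditions (screening overdue; untreated IDA) mirror the two clinical steps the process map identified: screening and treatment initiation.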
Simulation-Based Usability Evaluation of the IADx Prototype
Simulation-based evaluation was performed to iteratively refine the IADx prototype with clinician end-users using a "Think Aloud" technique in which users verbalized thoughts while using IADx, a short postexperimental survey, and a semi-structured interview (see [Table 1]; [Supplementary Materials B, C, and D], available in the online version).[28] The feedback was recorded, transcribed, and analyzed for common themes, which were incorporated into the redesign.[8] [29]
Participants
Participants were recruited via email from active providers with purposive sampling to include a variety of roles and experience levels. No compensation was offered.
Clinical Scenarios and Tasks
IADx was tested in Epic TST at the Johns Hopkins University Simulation Center using a “computer on wheels” workstation. A member of the study team (Z.M.) sat next to the participant to guide the session, conduct observations using an observation guide, perform semi-structured interviews, and record responses (see [Supplementary Materials B, C, and D], available in the online version).
Each end-user tested two simulated in-person office visits and two asynchronous In Basket encounters to review laboratory results. Simulated patient records varied by (1) status of anemia screening (screened vs. unscreened), (2) presence of IDA (IDA vs. no IDA), (3) presence of iron treatment (treated IDA vs. untreated IDA), and (4) interruptive vs. noninterruptive alert. Order of encounters was randomized using a modified Latin square method.[30]
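The modified Latin square approach cited above counterbalances encounter order so that order effects do not confound the comparison. The study's exact modification is described in reference 30; the sketch below is a standard balanced Latin square construction for an even number of conditions, shown only to illustrate the idea.

```python
# Illustrative balanced Latin square for ordering the four simulated
# encounters; a simplified stand-in for the study's cited method.
def balanced_latin_square(n: int) -> list[list[int]]:
    """For even n, return n condition orders (one per participant slot):
    each condition appears once per serial position, and each ordered
    pair of adjacent conditions occurs exactly once across rows."""
    # Zig-zag pattern 0, 1, n-1, 2, n-2, ... balances first-order carryover.
    pattern, lo, hi = [0], 1, n - 1
    while len(pattern) < n:
        pattern.append(lo)
        lo += 1
        if len(pattern) < n:
            pattern.append(hi)
            hi -= 1
    return [[(start + p) % n for p in pattern] for start in range(n)]
```

For four encounters this yields orders such as `[0, 1, 3, 2]` and `[1, 2, 0, 3]`, with every condition occupying every position exactly once.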
Measures and Data Collection
Audio and video of the participant and computer screen were recorded using B-Line Sim Capture,[31] and other data were collected on REDCap.[32] [33] After completion of all four scenarios by each participant, the System Usability Scale (SUS)[34] was administered and semi-structured interviews were performed (see [Supplementary Material D], available in the online version).
Data Analysis for Iterative Redesign of IADx
After each session, audio recordings were transcribed by study staff and independently coded (S.D.M. and Z.M.) according to a priori defined categories of usability, visibility, workflow, content, understandability, practical usefulness, medical usefulness, and navigation (see [Supplementary Material E], available in the online version).[8] Coders met to reconcile differences. The coded data were interpreted using HF principles, including the five "rights" of CDS,[21] with an HF engineer (A.P.G.), and incorporated into tool redesign. Across each of the two rounds of IADx redesign, participants were recruited until all categories of "Think Aloud" feedback achieved theoretical saturation.[35] Task completion and perceived difficulty rates were compared between rounds by Fisher's exact test, and SUS scores by t-test, using R version 3.5.2.
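The between-round comparison of completion rates can be illustrated as follows. The study used R; this sketch uses a self-contained Python implementation of the two-sided Fisher's exact test for a 2x2 table, and the counts are made up for illustration.

```python
# Self-contained two-sided Fisher's exact test for a 2x2 table
# [[a, b], [c, d]], computed by summing hypergeometric probabilities
# no larger than that of the observed table. Counts below are invented.
from math import comb

def fisher_exact_2x2(a: int, b: int, c: int, d: int) -> float:
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def p_table(x: int) -> float:
        # Hypergeometric probability of a table with x in the top-left cell.
        return comb(row1, x) * comb(n - row1, col1 - x) / comb(n, col1)
    p_obs = p_table(a)
    lo = max(0, col1 - (n - row1))
    hi = min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Hypothetical task-completion counts: [completed, not completed] per round.
p_value = fisher_exact_2x2(10, 2, 12, 0)
```

With these invented counts the test returns p ≈ 0.48, i.e., no significant difference between rounds at this (tiny) sample size.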
Results
Process Map
Screening occurred in the context of clinic visits, wherein the provider had to remember to consider screening, order laboratories, and follow up on results. Iron treatment was initiated on review of laboratory tests during clinic visits or In Basket encounters ([Fig. 1]) and required calculation of dosage, ordering of follow-up laboratories, and patient/family education.


IADx Design
During the initial design meetings, IADx was created to act at the steps of screening for IDA, interpretation of laboratory results, and ordering of iron and repeat IDA screening for patients found to have IDA ([Fig. 1]). The initial IADx prototype displayed laboratories for which the patient was due and linked to an order set containing laboratory orders, iron orders, and patient instructions, which were preselected and editable by the user.
“Think Aloud” Usability Testing Results
Six providers participated across two rounds of testing. Participant characteristics except for years in practice were similar between rounds ([Table 2]).
Abbreviation: BPA, best practice alert.
[Table 3] summarizes "Think Aloud" and semi-structured interview comments across two rounds of testing with related design choices, HF principles, and process implications. [Fig. 2A and B] contain screenshots of IADx with highlighted HF redesign principles. Coded provider feedback was incorporated into iterative redesign. In round one, providers desired to "see a trend" of laboratories and normal values to make treatment decisions, so laboratory trends and normal ranges were added. End-users raised concerns about redundant and confusing features in the tool, including laboratory due dates, so these were removed. Providers reported that they preferred intravenous (IV) iron over oral iron to treat IDA and desired auto-calculation of the IV iron treatment dose.
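One widely used iron-deficit calculation is the Ganzoni formula; the text does not state which formula IADx built in, so the sketch below is illustrative only, with a pediatric iron-stores estimate that is an assumption rather than the study's parameter.

```python
def ganzoni_iron_deficit(weight_kg: float, hb_actual: float,
                         hb_target: float = 13.0,
                         stores_mg_per_kg: float = 15.0) -> float:
    """Total iron deficit (mg) via the Ganzoni formula (illustrative;
    the exact calculation built into IADx is not specified in the text).
    The factor 2.4 converts Hb (g/dL) x weight (kg) into mg of elemental
    iron; iron stores in children are often estimated at ~15 mg/kg."""
    hb_deficit_mg = weight_kg * (hb_target - hb_actual) * 2.4
    stores_mg = weight_kg * stores_mg_per_kg
    return hb_deficit_mg + stores_mg

# Example: a 40 kg child with Hb 9 g/dL and a 13 g/dL target.
deficit = ganzoni_iron_deficit(40, 9.0)  # 40*4*2.4 + 40*15 = 984 mg
```

Automating a calculation of this shape is what the round-one feedback ("It's confusing for providers to do that equation") asked the tool to take over.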


| End-user feedback (round) | Usability dimension | Design choice or feature | Human factors design principle | Potential process implications |
|---|---|---|---|---|
| "I cannot remember every [normal value] off the top of my head." (1) | Usability | Include normal value ranges. | Dependence on memory[51] | Facilitates provider information analysis of laboratory data by reducing the need to memorize or look up normal laboratory values. |
| "So here's an iron order in the smartset?" (1) | Visibility | Make iron orders more visible. | Enhance visibility[52]; automation of decision selection[46] | Facilitates provider decision selection by making iron orders more prominent. |
| "It says some blood is due in December 2018, which hasn't yet happened." (1) | Content | Remove dates laboratories are due. | Unnecessary information[51] | Reduces time spent using the tool. |
| "It's confusing for providers to do that equation." (1) | Practical usefulness | Automatically calculate iron treatment dose by weight. | Build in calculators[51]; automation of information analysis[46] | Facilitates provider information analysis by automating a complex calculation of iron treatment dose. |
| "See 2-3 [hemoglobin] values before that so you can see a trend." (1) | Medical usefulness | Include trends for laboratories. | Patient information summary[51]; automation of information acquisition[46] | Facilitates provider information analysis and decision selection by providing laboratory information in the context of prior trends. |
| "So it looks like he is on fer-in-sol so in this circumstance…I would definitely transition him to the IV iron." (1) | Medical usefulness | Do not auto-select medication orders. | Automation of action implementation[46] | Preserves provider autonomy by including options of iron medications to order, but leaving the ultimate decision of whether or not to order these up to the provider. |
| "Here I see there's normals. I didn't notice that before." (2) | Visibility | Make normal value chart more easily visible. | Information order and layout[51] | Facilitates provider information analysis of laboratory data by making the normal value chart more prominent. |
| "Why do I have to click that and then have a comment box enter?" (2) | Usability | Remove the comment box and acknowledge reason. | Minimize workload[53] | Reduces time spent using the tool. |
| "The pop-up really brings your attention to the anemia." (2) | Visibility | Use alert reminder. | Alert timing[51] | Increases likelihood of provider awareness of anemia in patient population. |
| "It would be helpful if that took me to the alert." (2) | Workflow | Link to the BPA section from the alert. | Alert timing[51] | Reduces time spent using the tool. |
| "I presume they are not on iron at this time, right?" (2) | Understandability | Include whether or not patient is on iron in the BPA. | Match between system and world[52]; automation of information acquisition[46] | Facilitates provider information analysis by automating information acquisition. |
| "It would also be nice if the treatment plan would actually show up here." (2) | Navigation | Include functioning link to iron infusion order set. | Number of steps[51]; consistency and standards[52] | Reduces time spent using the tool by embedding links to related order sets. |

Abbreviations: BPA, best practice alert; IV, intravenous.
Comments from round two affirmed that end-users felt a pop-up alert would be preferred as it "really brings your attention to the anemia," and that the pop-up should fire at the time of order entry. "Think Aloud" feedback showed that providers preferred a more streamlined CDSS without a comment box in the pop-up alert but with a link to pertinent order sets. In addition to laboratory trends, providers requested data on iron prescriptions, which were added. See [Table 3] and [Fig. 2B] for examples.
SUS scores were similar across rounds with mean (standard deviation) scores of 77.5 (5.0) in round one and 75.0 (7.5) in round two, with no major change in score on any individual question. Most activities were completed easily or with little difficulty except placement of laboratory orders, which was difficult 33% of the time in round one and 0% in round two.
Discussion
Here, we present the HCD of IADx from development of a process map through iterative prototyping with formative usability evaluation to create a usable CDSS tool. We show how HF principles can be incorporated into CDSS creation at each stage of prototype development[22] within a commercial EHR.[12] [36]
Lessons Learned
- A striking result of the "Think Aloud" feedback was the provider preference for an interruptive alert. Prior studies have shown that interruptive alerts are frequently disabled[37] [38] due to inappropriate alert triggering, poor usability, or poor fit with the workflow,[39] with recommendations to avoid interruptive alerts for nonemergent CDSS.[40] [41] There is some evidence that interruptive alerts do cause more change in provider behavior than noninterruptive alerts,[39] giving credence to the idea that a noninterruptive IADx would commonly be ignored.[42] Provider preference may partly be explained by the fact that users were offered only the options of an interruptive or a noninterruptive BPA. A noninterruptive alert could be preferred, as in the vitamin D soft-stop reminder at order entry in Hendrickson et al,[43] or the dynamic order set to impact inpatient blood utilization in Ikoma et al.[42] Alternatively, providers may tolerate this interruptive alert because it was customized to local need, occurred at the right time in the workflow, and contained sufficient information for decision making.[21] [44] [45]
- "Think Aloud" testing showed a provider preference for high levels of automation of information acquisition and analysis, some automation of decision selection (laboratories but not medications), and no automation of action implementation.[46] In the case of IADx, the low risk of laboratory ordering and the higher risk of iron medication may have contributed to this preference. Other studies on HCD of CDSS tools similarly showed that providers valued auto-retrieval of key data and prepopulation of available clinical information[16] but preferred that ultimate decisions about what to do with the automatically gathered and analyzed information be left to the end-user.[12]
- Using HF principles to interpret end-user feedback for iterative redesign of IADx resulted in a tool that put everything needed for a decision on a single screen[16]: medication history in the linked BPA, a patient information summary, laboratory trends with normal ranges, and built-in calculators. We would have liked to format the laboratory trends and normal ranges in a more visually elegant tabular display, but limitations of the EHR environment precluded this. A future IADx iteration could be implemented using web-based tools, including CDS Hooks[47] and SMART on FHIR,[48] to allow for scaling across sites and improved flexibility in the presentation of patient-level data pertinent to decision making.
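To make the CDS Hooks idea concrete: a CDS Hooks service returns JSON "cards" that an EHR renders at a hook point such as order entry. The field names below follow the CDS Hooks specification, but the content, suggestion, and launch URL are hypothetical, not part of the study.

```python
# Hedged sketch of a CDS Hooks card a future IADx service might return.
# Card field names follow the CDS Hooks spec; all content is illustrative.
iadx_card = {
    "summary": "Anemia screening labs due for this patient with IBD",
    "indicator": "warning",  # spec values: "info" | "warning" | "critical"
    "detail": ("No screening labs in the past 12 months. Consider ordering "
               "labs and reviewing the hemoglobin trend before treating."),
    "source": {"label": "IADx (IBD Anemia Diagnosis Tool)"},
    "suggestions": [
        {
            "label": "Order anemia screening labs",
            # Real suggestions would carry FHIR resources (e.g., a
            # ServiceRequest) in "actions"; omitted in this sketch.
            "actions": [],
        }
    ],
    "links": [
        {
            "label": "Open IADx SMART app",
            "url": "https://example.org/iadx",  # hypothetical launch URL
            "type": "smart",
        }
    ],
}

# A CDS Hooks service response wraps its cards in a "cards" array.
response = {"cards": [iadx_card]}
```

Because the card format is EHR-agnostic, the same service could drive alerts in multiple institutions' EHRs, which is the scaling advantage noted above.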
Limitations
This single-center study created a tool in response to local factors, which may limit its generalizability. Due to time and funding constraints, the baseline workflow evaluation incorporated interviews with only two physician practitioners rather than more in-depth methodology,[49] [50] potentially missing opportunities for interventions other than an interruptive alert. The small number of volunteer participants may introduce self-selection bias toward end-users interested in technology. The study focused on feedback from provider end-users, missing input from other members of the care team. This study was limited to the formative design of a CDSS rather than the outcomes of implementation, which are critically needed to steer HCD methodology toward producing not just usable, workflow-integrated tools but ones that improve real-world clinical outcomes. Many of these limitations can be addressed in a future multicenter CDSS development and implementation effort.
Conclusion
This study presents the HCD of IADx, a tool to improve IDA care in children with IBD. The principles of high levels of automation of information acquisition and analysis but more limited automation of decision selection and action implementation may be broadly applicable to the development of other CDSS tools. In some cases, locally developed CDSS that is well-integrated into existing workflows may be usefully deployed as an interruptive alert when preferred by providers.
Clinical Relevance Statement
HCD helps develop CDSS tools that are integrated into clinical workflows. Practitioners want CDSS that automates information acquisition and analysis, but prefer that clinical decisions be left to the provider. CDSS that is well integrated into the workflow can in some cases be implemented as an interruptive alert, even for nonemergent clinical processes.
Multiple-Choice Questions
1. What are the benefits of human-centered design of clinical decision support systems?
a. Increased clinical throughput.
b. Improved integration into clinical workflows.
c. Shortened software development lifecycle.
d. Reduced requirement for clinical support staff.
Correct Answer: The correct answer is option b. Explanation: Human-centered design (HCD) has been applied to the development of clinical decision support systems (CDSS) as it brings a focus to people, tasks, and systems and seeks to incorporate human factors principles into design. HCD has been shown to improve integration of CDSS tools into clinical workflows and to improve usability. HCD involves in-depth analysis of work systems and care processes to understand cognition and tasks. This evaluation informs the design requirements and development of a CDSS prototype, which undergoes iterative redesign using HF principles by a multidisciplinary team, incorporating feedback from end-users on CDSS design characteristics.
2. In the case of IADx, which cognitive processes did practitioners want to be addressed by the clinical decision support system?
a. Information analysis
b. Action implementation
c. Machine learning
d. Process redesign
Correct Answer: The correct answer is option a. Explanation: Clinician end-users universally desired that the clinical decision support system (CDSS) would automate information acquisition and information analysis. Providers were somewhat split about whether the CDSS should automate decision selection, with a preference to preselect laboratories for subsequent provider signature but not medications. Providers did not want automation of action implementation by the CDSS, such as having the system actually place orders.
Conflict of Interest
The authors have no financial or personal relationships with people or organizations that could inappropriately influence or bias the objectivity of the submitted content. The authors have no conflicts of interest.
Protection of Human and Animal Subjects
The study was performed in compliance with the World Medical Association Declaration of Helsinki on Ethical Principles for Medical Research Involving Human Subjects, and was reviewed by Johns Hopkins University Institutional Review Board.
References
- 1 Kappelman MD, Moore KR, Allen JK, Cook SF. Recent trends in the prevalence of Crohn's disease and ulcerative colitis in a commercially insured US population. Dig Dis Sci 2013; 58 (02) 519-525
- 2 Pigneur B, Seksik P, Viola S. et al. Natural history of Crohn's disease: comparison between childhood- and adult-onset disease. Inflamm Bowel Dis 2010; 16 (06) 953-961
- 3 Goodhand JR, Kamperidis N, Rao A. et al. Prevalence and management of anemia in children, adolescents, and adults with inflammatory bowel disease. Inflamm Bowel Dis 2012; 18 (03) 513-519
- 4 Danese S, Hoffman C, Vel S. et al. Anaemia from a patient perspective in inflammatory bowel disease: results from the European Federation of Crohn's and Ulcerative Colitis Association's online survey. Eur J Gastroenterol Hepatol 2014; 26 (12) 1385-1391
- 5 McCann JC, Ames BN. An overview of evidence for a causal relation between iron deficiency during development and deficits in cognitive or behavioral function. Am J Clin Nutr 2007; 85 (04) 931-945
- 6 Miller SD, Cuffari C, Akhuemonkhan E, Guerrerio AL, Lehmann H, Hutfless S. Anemia screening, prevalence, and treatment in pediatric inflammatory bowel disease in the United States, 2010–2014. Pediatr Gastroenterol Hepatol Nutr 2019; 22 (02) 152-161
- 7 Miller SD, Lehmann H, Murphy Z. et al. User centered digital development of a decision support tool to improve anemia care in pediatric inflammatory bowel disease. J Pediatr Gastroenterol Nutr 2019; 69: S331
- 8 Richardson S, Mishuris R, O'Connell A. et al. “Think aloud” and “Near live” usability testing of two complex clinical decision support tools. Int J Med Inform 2017; 106: 1-8
- 9 Bright TJ, Wong A, Dhurjati R. et al. Effect of clinical decision-support systems: a systematic review. Ann Intern Med 2012; 157 (01) 29-43
- 10 Osheroff JA, Teich JM, Levick D. et al. Improving Outcomes with Clinical Decision Support: An Implementer's Guide. 2nd ed. Chicago, IL: HIMSS Publishing; 2012
- 11 Sutton RT, Pincock D, Baumgart DC, Sadowski DC, Fedorak RN, Kroeker KI. An overview of clinical decision support systems: benefits, risks, and strategies for success. NPJ Digit Med 2020; 3 (01) 17
- 12 Carayon P, Hoonakker P, Hundt AS. et al. Application of human factors to improve usability of clinical decision support for diagnostic decision-making: a scenario-based simulation study. BMJ Qual Saf 2019; 29 (04) 329-340
- 13 Hoonakker PLT, Carayon P, Salwei ME. et al. The design of PE dx, a CDS to support pulmonary embolism diagnosis in the ED. Stud Health Technol Inform 2019; 265: 134-140
- 14 Beuscart-Zéphir MC, Borycki E, Carayon P, Jaspers MWM, Pelayo S. Evolution of human factors research and studies of health information technologies: the role of patient safety. Yearb Med Inform 2013; 8: 67-77
- 15 Brunner J, Chuang E, Goldzweig C, Cain CL, Sugar C, Yano EM. User-centered design to improve clinical decision support in primary care. Int J Med Inform 2017; 104: 56-64
- 16 Savoy A, Militello LG, Patel H. et al. A cognitive systems engineering design approach to improve the usability of electronic order forms for medical consultation. J Biomed Inform 2018; 85: 138-148
- 17 Savoy A, Patel H, Flanagan ME, Daggy JK, Russ AL, Weiner M. Comparative usability evaluation of consultation order templates in a simulated primary care environment. Appl Ergon 2018; 73: 22-32
- 18 Thursky KA, Mahemoff M. User-centered design techniques for a computerised antibiotic decision support system in an intensive care unit. Int J Med Inform 2007; 76 (10) 760-768
- 19 Chokshi SK, Mann DM. Innovating from within: a process model for user-centered digital development in academic medical centers. JMIR Human Factors 2018; 5 (04) e11048
- 20 Kilsdonk E, Peute LWP, Riezebos RJ, Kremer LC, Jaspers MWM. Uncovering healthcare practitioners' information processing using the think-aloud method: From paper-based guideline to clinical decision support system. Int J Med Inform 2016; 86: 10-19
- 21 Campbell R. The five “rights” of clinical decision support. J AHIMA 2013; 84 (10) 42-47 , quiz 48
- 22 Carayon P, Hoonakker P. Human factors and usability for health information technology: old and new challenges. Yearb Med Inform 2019; 28 (01) 71-77
- 23 About us | Johns Hopkins Children's Center. Johns Hopkins Medicine Web site. Accessed Jan 30, 2020 at: https://www.hopkinsmedicine.org/johns-hopkins-childrens-center/about-us/index.html
- 24 Draw.io. Draw.io Web site. Accessed August 21, 2020 at: https://app.diagrams.net/
- 25 Zoom. Zoom.us Web site. Updated 2020. Accessed August 21, 2020 at: https://zoom.us/
- 26 EpicCare. Epic.com Web site. Accessed August 21, 2020 at: https://www.epic.com/software
- 27 Dignass AU, Gasche C, Bettenworth D. et al; European Crohn's and Colitis Organisation [ECCO]. European consensus on the diagnosis and management of iron deficiency and anaemia in inflammatory bowel diseases. J Crohn's Colitis 2015; 9 (03) 211-222
- 28 Rose AF, Schnipper JL, Park ER, Poon EG, Li Q, Middleton B. Using qualitative studies to improve the usability of an EMR. J Biomed Inform 2005; 38 (01) 51-60
- 29 Kilsdonk E, Peute LW, Jaspers MWM. Factors influencing implementation success of guideline-based clinical decision support systems: a systematic review and gaps analysis. Int J Med Inform 2017; 98: 56-64
- 30 Rojas B, White RF. The modified latin square. J R Stat Soc B 1957; 19 (02) 305-317
- 31 B-line medical sim capture. Accessed January 30, 2020 at: https://www.blinemedical.com/simcapture.html
- 32 REDCap. Accessed January 30, 2020 at: https://www.project-redcap.org/
- 33 Galaxy - UX2100 usability badge. Accessed January 30, 2020 at: https://galaxy.epic.com/?#Browse/page=8400!68!240!3542166
- 34 Brooke J. SUS-A quick and dirty usability scale. Usability evaluation in industry. CRC Press 1996; 189: 4-7
- 35 Corbin J, Strauss A. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. 3rd ed. Thousand Oaks, CA: SAGE Publications; 2008
- 36 Kwan JL, Lo L, Ferguson J. et al. Computerised clinical decision support systems and absolute improvements in care: meta-analysis of controlled clinical trials. BMJ 2020; 370: m3216
- 37 Backman R, Bayliss S, Moore D, Litchfield I. Clinical reminder alert fatigue in healthcare: a systematic literature review protocol using qualitative evidence. Syst Rev 2017; 6 (01) 255
- 38 Ancker JS, Edwards A, Nosal S, Hauser D, Mauer E, Kaushal R. with the HITEC Investigators. Effects of workload, work complexity, and repeated alerts on alert fatigue in a clinical decision support system. BMC Med Inform Decis Mak 2017; 17 (01) 36
- 39 Olakotan O, Mohd Yusof M, Ezat Wan Puteh S. A systematic review on CDSS alert appropriateness. Stud Health Technol Inform 2020; 270: 906-910
- 40 Ash JS, Berg M, Coiera E. Some unintended consequences of information technology in health care: the nature of patient care information system-related errors. J Am Med Inform Assoc 2004; 11 (02) 104-112
- 41 Chaparro JD, Beus JM, Dziorny AC. et al. Clinical decision support stewardship: Best practices and techniques to monitor and improve interruptive alerts. Appl Clin Inform 2022; 13 (03) 560-568
- 42 Ikoma S, Furukawa M, Busuttil A. et al. Optimizing inpatient blood utilization using real-time clinical decision support. Appl Clin Inform 2021; 12 (01) 49-56
- 43 Hendrickson CD, McLemore MF, Dahir KM. et al. Is the climb worth the view? the savings/alert ratio for reducing vitamin D testing. Appl Clin Inform 2020; 11 (01) 160-165
- 44 Baysari M. Finding fault with the default alert. AHRQ PSNet Patient Safety Network Web site. Updated 2013. Accessed May 27, 2021 at: https://psnet.ahrq.gov/web-mm/finding-fault-default-alert
- 45 Gao E, Radparvar I, Dieu H, Ross MK. User experience design for adoption of asthma clinical decision support tools. Appl Clin Inform 2022; 13 (04) 971-982
- 46 Parasuraman R, Sheridan TB, Wickens CD. A model for types and levels of human interaction with automation. IEEE Trans Syst Man Cybern A Syst Hum 2000; 30 (03) 286-297
- 47 CDS hooks. CDSHooks.org Web site. Accessed November 6, 2022 at: https://cds-hooks.org/
- 48 HL7 smart on FHIR. HL7 Smart on FHIR Web site. Accessed November 6, 2022 at: https://www.hl7.org/fhir/smart-app-launch/
- 49 Holden RJ, Carayon P, Gurses AP. et al. SEIPS 2.0: a human factors framework for studying and improving the work of healthcare professionals and patients. Ergonomics 2013; 56 (11) 1669-1686
- 50 Rice H, Garabedian PM, Shear K. et al. Clinical decision support for fall prevention: defining end-user needs. Appl Clin Inform 2022; 13 (03) 647-655
- 51 Wiklund ME, Kendler J, Hochberg L, Weinger MB. Technical Basis for User Interface Design of Health IT. Grant/Contract Reports (NISTGCR). 2015. Gaithersburg, MD National Institute of Standards and Technology [online]. Accessed March 9, 2023 at: https://doi.org/10.6028/NIST.GCR.15-996
- 52 Zhang J, Johnson TR, Patel VL, Paige DL, Kubose T. Using usability heuristics to evaluate patient safety of medical devices. J Biomed Inform 2003; 36 (1–2): 23-30
- 53 Scapin DL, Bastien JMC. Ergonomic criteria for evaluating the ergonomic quality of interactive systems. Behav Inf Technol 1997; 16 (05) 220-231
Publication History
Received: 15 November 2022
Accepted: 17 February 2023
Accepted Manuscript online:
21 February 2023
Article published online:
10 May 2023
© 2023. Thieme. All rights reserved.
Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany
References
- 1 Kappelman MD, Moore KR, Allen JK, Cook SF. Recent trends in the prevalence of Crohn's disease and ulcerative colitis in a commercially insured US population. Dig Dis Sci 2013; 58 (02) 519-525
- 2 Pigneur B, Seksik P, Viola S. et al. Natural history of Crohn's disease: comparison between childhood- and adult-onset disease. Inflamm Bowel Dis 2010; 16 (06) 953-961
- 3 Goodhand JR, Kamperidis N, Rao A. et al. Prevalence and management of anemia in children, adolescents, and adults with inflammatory bowel disease. Inflamm Bowel Dis 2012; 18 (03) 513-519
- 4 Danese S, Hoffman C, Vel S. et al. Anaemia from a patient perspective in inflammatory bowel disease: results from the European Federation of Crohn's and Ulcerative Colitis Association's online survey. Eur J Gastroenterol Hepatol 2014; 26 (12) 1385-1391
- 5 McCann JC, Ames BN. An overview of evidence for a causal relation between iron deficiency during development and deficits in cognitive or behavioral function. Am J Clin Nutr 2007; 85 (04) 931-945
- 6 Miller SD, Cuffari C, Akhuemonkhan E, Guerrerio AL, Lehmann H, Hutfless S. Anemia screening, prevalence, and treatment in pediatric inflammatory bowel disease in the United States, 2010–2014. Pediatr Gastroenterol Hepatol Nutr 2019; 22 (02) 152-161
- 7 Miller SD, Lehmann H, Murphy Z. et al. User centered digital development of a decision support tool to improve anemia care in pediatric inflammatory bowel disease. J Pediatr Gastroenterol Nutr 2019; 69: S331
- 8 Richardson S, Mishuris R, O'Connell A. et al. “Think aloud” and “Near live” usability testing of two complex clinical decision support tools. Int J Med Inform 2017; 106: 1-8
- 9 Bright TJ, Wong A, Dhurjati R. et al. Effect of clinical decision-support systems: a systematic review. Ann Intern Med 2012; 157 (01) 29-43
- 10 Osheroff JA, Teich JM, Levick D. et al. Improving Outcomes with Clinical Decision Support: An Implementer's Guide. 2nd ed. Chicago, IL: HIMSS Publishing; 2012
- 11 Sutton RT, Pincock D, Baumgart DC, Sadowski DC, Fedorak RN, Kroeker KI. An overview of clinical decision support systems: benefits, risks, and strategies for success. NPJ Digit Med 2020; 3 (01) 17
- 12 Carayon P, Hoonakker P, Hundt AS. et al. Application of human factors to improve usability of clinical decision support for diagnostic decision-making: a scenario-based simulation study. BMJ Qual Saf 2019; 29 (04) 329-340
- 13 Hoonakker PLT, Carayon P, Salwei ME. et al. The design of PE dx, a CDS to support pulmonary embolism diagnosis in the ED. Stud Health Technol Inform 2019; 265: 134-140
- 14 Beuscart-Zéphir MC, Borycki E, Carayon P, Jaspers MWM, Pelayo S. Evolution of human factors research and studies of health information technologies: the role of patient safety. Yearb Med Inform 2013; 8: 67-77
- 15 Brunner J, Chuang E, Goldzweig C, Cain CL, Sugar C, Yano EM. User-centered design to improve clinical decision support in primary care. Int J Med Inform 2017; 104: 56-64
- 16 Savoy A, Militello LG, Patel H. et al. A cognitive systems engineering design approach to improve the usability of electronic order forms for medical consultation. J Biomed Inform 2018; 85: 138-148
- 17 Savoy A, Patel H, Flanagan ME, Daggy JK, Russ AL, Weiner M. Comparative usability evaluation of consultation order templates in a simulated primary care environment. Appl Ergon 2018; 73: 22-32
- 18 Thursky KA, Mahemoff M. User-centered design techniques for a computerised antibiotic decision support system in an intensive care unit. Int J Med Inform 2007; 76 (10) 760-768
- 19 Chokshi SK, Mann DM. Innovating from within: a process model for user-centered digital development in academic medical centers. JMIR Human Factors 2018; 5 (04) e11048
- 20 Kilsdonk E, Peute LWP, Riezebos RJ, Kremer LC, Jaspers MWM. Uncovering healthcare practitioners' information processing using the think-aloud method: From paper-based guideline to clinical decision support system. Int J Med Inform 2016; 86: 10-19
- 21 Campbell R. The five “rights” of clinical decision support. J AHIMA 2013; 84 (10) 42-47 , quiz 48
- 22 Carayon P, Hoonakker P. Human factors and usability for health information technology: old and new challenges. Yearb Med Inform 2019; 28 (01) 71-77
- 23 About us | Johns Hopkins Children's Center. Johns Hopkins Medicine Web site. Accessed Jan 30, 2020 at: https://www.hopkinsmedicine.org/johns-hopkins-childrens-center/about-us/index.html
- 24 Draw.io. Draw.io Web site. Accessed August 21, 2020 at: https://app.diagrams.net/
- 25 Zoom. Zoom.us Web site. Updated 2020. Accessed August 21, 2020 at: https://zoom.us/
- 26 EpicCare. Epic.com Web site. Accessed August 21, 2020 at: https://www.epic.com/software
- 27 Dignass AU, Gasche C, Bettenworth D. et al; European Crohn's and Colitis Organisation [ECCO]. European consensus on the diagnosis and management of iron deficiency and anaemia in inflammatory bowel diseases. J Crohn's Colitis 2015; 9 (03) 211-222
- 28 Rose AF, Schnipper JL, Park ER, Poon EG, Li Q, Middleton B. Using qualitative studies to improve the usability of an EMR. J Biomed Inform 2005; 38 (01) 51-60
- 29 Kilsdonk E, Peute LW, Jaspers MWM. Factors influencing implementation success of guideline-based clinical decision support systems: a systematic review and gaps analysis. Int J Med Inform 2017; 98: 56-64
- 30 Rojas B, White RF. The modified Latin square. J R Stat Soc B 1957; 19 (02) 305-317
- 31 B-Line Medical SimCapture. B-Line Medical Web site. Accessed January 30, 2020 at: https://www.blinemedical.com/simcapture.html
- 32 REDCap. REDCap Web site. Accessed January 30, 2020 at: https://www.project-redcap.org/
- 33 Galaxy - UX2100 usability badge. Accessed January 30, 2020 at: https://galaxy.epic.com/?#Browse/page=8400!68!240!3542166
- 34 Brooke J. SUS: a quick and dirty usability scale. In: Usability Evaluation in Industry. CRC Press; 1996: 189-194
- 35 Corbin J, Strauss A. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. 3rd ed. Thousand Oaks, CA: SAGE Publications; 2008
- 36 Kwan JL, Lo L, Ferguson J. et al. Computerised clinical decision support systems and absolute improvements in care: meta-analysis of controlled clinical trials. BMJ 2020; 370: m3216
- 37 Backman R, Bayliss S, Moore D, Litchfield I. Clinical reminder alert fatigue in healthcare: a systematic literature review protocol using qualitative evidence. Syst Rev 2017; 6 (01) 255
- 38 Ancker JS, Edwards A, Nosal S, Hauser D, Mauer E, Kaushal R. with the HITEC Investigators. Effects of workload, work complexity, and repeated alerts on alert fatigue in a clinical decision support system. BMC Med Inform Decis Mak 2017; 17 (01) 36
- 39 Olakotan O, Mohd Yusof M, Ezat Wan Puteh S. A systematic review on CDSS alert appropriateness. Stud Health Technol Inform 2020; 270: 906-910
- 40 Ash JS, Berg M, Coiera E. Some unintended consequences of information technology in health care: the nature of patient care information system-related errors. J Am Med Inform Assoc 2004; 11 (02) 104-112
- 41 Chaparro JD, Beus JM, Dziorny AC. et al. Clinical decision support stewardship: Best practices and techniques to monitor and improve interruptive alerts. Appl Clin Inform 2022; 13 (03) 560-568
- 42 Ikoma S, Furukawa M, Busuttil A. et al. Optimizing inpatient blood utilization using real-time clinical decision support. Appl Clin Inform 2021; 12 (01) 49-56
- 43 Hendrickson CD, McLemore MF, Dahir KM. et al. Is the climb worth the view? the savings/alert ratio for reducing vitamin D testing. Appl Clin Inform 2020; 11 (01) 160-165
- 44 Baysari M. Finding fault with the default alert. AHRQ PSNet Patient Safety Network Web site. Updated 2013. Accessed May 27, 2021 at: https://psnet.ahrq.gov/web-mm/finding-fault-default-alert
- 45 Gao E, Radparvar I, Dieu H, Ross MK. User experience design for adoption of asthma clinical decision support tools. Appl Clin Inform 2022; 13 (04) 971-982
- 46 Parasuraman R, Sheridan TB, Wickens CD. A model for types and levels of human interaction with automation. IEEE Trans Syst Man Cybern A Syst Hum 2000; 30 (03) 286-297
- 47 CDS Hooks. CDS Hooks Web site. Accessed November 6, 2022 at: https://cds-hooks.org/
- 48 HL7 SMART on FHIR. HL7 SMART on FHIR Web site. Accessed November 6, 2022 at: https://www.hl7.org/fhir/smart-app-launch/
- 49 Holden RJ, Carayon P, Gurses AP. et al. SEIPS 2.0: a human factors framework for studying and improving the work of healthcare professionals and patients. Ergonomics 2013; 56 (11) 1669-1686
- 50 Rice H, Garabedian PM, Shear K. et al. Clinical decision support for fall prevention: defining end-user needs. Appl Clin Inform 2022; 13 (03) 647-655
- 51 Wiklund ME, Kendler J, Hochberg L, Weinger MB. Technical Basis for User Interface Design of Health IT. Grant/Contract Reports (NISTGCR). 2015. Gaithersburg, MD National Institute of Standards and Technology [online]. Accessed March 9, 2023 at: https://doi.org/10.6028/NIST.GCR.15-996
- 52 Zhang J, Johnson TR, Patel VL, Paige DL, Kubose T. Using usability heuristics to evaluate patient safety of medical devices. J Biomed Inform 2003; 36 (1–2): 23-30
- 53 Scapin DL, Bastien JMC. Ergonomic criteria for evaluating the ergonomic quality of interactive systems. Behav Inf Technol 1997; 16 (05) 220-231