DOI: 10.1055/a-2068-6699
Beyond Information Design: Designing Health Care Dashboards for Evidence-Driven Decision-Making
With health care systems experiencing a deluge of exponentially expanding performance measures,[1] [2] dashboards (graphical reports of essential data relevant to a particular objective or process, such as the World Health Organization's Coronavirus Dashboard)[3] have become a common way to efficiently consolidate and monitor large numbers of clinical performance and related health care measures across multiple domains. As a result, overcrowded, ineffective dashboards abound.[4]
Much has been written about how to design more visually pleasing, navigable, and interpretable dashboards (known in human factors research as “information design”). Information design, however, assumes dashboard designers know what information needs to be presented and to whom. Further, dashboards assume a certain level of numeracy and graph literacy of their consumers to be effective.[5] Various frameworks to aid in information design have been proposed,[6] such as an ontology of performance summary display[7] and the BEhavior and Acceptance fRamework (BEAR)[8] for the design of clinical decision support systems, which consolidates the propositions of four frameworks (including, e.g., the Human, Organization, and Technology-fit framework [HOT-fit][9] and the Unified Theory of Acceptance and Use of Technology [UTAUT][10]) and 10 literature reviews to provide a comprehensive view of the factors needed in successfully designing and implementing clinical decision support systems and information dashboards. Frameworks such as these provide a comprehensive panorama of the domain of information design and implementation that researchers can use for expanding generalizable knowledge; however, such frameworks can be overwhelming and unwieldy for the field designer trying to solve a concrete problem for a health care practice by means of a dashboard. What is needed is a straightforward procedure or set of rules for identifying the content to be presented on a dashboard that will yield the most benefit for the problem in question. The literature on performance metric development can yield useful insight on this matter.
Hysong et al[11] proposed asking three simple questions to help decision-makers select appropriate quality improvement and performance metrics:
- What is the purpose of the metric?
- Who is the consumer (or audience) of the metric?
- Who is the intended subject, that is, who is being evaluated in this metric?
Just as lacking clear answers to these questions can hinder appropriate performance metric generation and selection, these three factors—unclear purpose, unclear or wrong consumer, and wrong subject—can pose barriers to successful dashboard design and implementation. Below we describe these in more detail and present a case example illustrating the use and benefits of this framework for dashboard design.
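To make the framework concrete, the following is a minimal sketch (ours, not the cited authors') that encodes the three questions as a pre-design checklist in Python; the class and field names are illustrative assumptions, not an established schema.

```python
# Illustrative sketch only: the three framework questions as a pre-design
# checklist. Names and types are our assumptions, not a published schema.
from dataclasses import dataclass


@dataclass
class DashboardSpec:
    purpose: str   # e.g., "developmental feedback", "quality improvement"
    consumer: str  # e.g., "unit manager", "facility leadership", "patient"
    subject: str   # e.g., "nursing unit", "facility", "region"


def unanswered_questions(spec: DashboardSpec) -> list:
    """Return the framework questions this spec leaves unanswered."""
    gaps = []
    if not spec.purpose:
        gaps.append("What is the purpose of the dashboard?")
    if not spec.consumer:
        gaps.append("Who is the consumer (or audience)?")
    if not spec.subject:
        gaps.append("Who or what is the subject, and at what level of analysis?")
    return gaps


# A spec with no declared consumer fails the checklist:
spec = DashboardSpec(purpose="quality improvement", consumer="", subject="nursing unit")
print(unanswered_questions(spec))  # ['Who is the consumer (or audience)?']
```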
The Purpose–Subject–Consumer Framework
Clear Purpose
What purpose is the dashboard intended to serve? Is it intended, for example, for high-level monitoring to alert the user that further investigation is needed? Or is it intended as a direct source of feedback for the user, with the expectation that data on the dashboard should directly lead to behavior change? Whether automotive or clinical, the purpose of any dashboard is to provide real-time, actionable data that can be used to change behavior. Building on the automotive example, a car dashboard's purpose is to supply critical status information for vehicle operation (e.g., remaining fuel, tire pressure, current temperature). Additionally, dashboards activate warning lights when there is vehicle trouble (e.g., low fuel, dangerously low tire pressure, overheating). These factors all change in real time, requiring immediate action should a certain threshold be crossed. In the context of clinical performance, the purpose of the dashboard should be equally clear. For example, different features are needed for a clinical performance dashboard used for developmental feedback versus one designed to emphasize quality improvement or resource planning. When designing for developmental feedback purposes, best practices in audit and feedback research suggest the need for a clear comparator, information about velocity and goal setting for future periods, and, ideally, "correct solution" information (i.e., information about what behavior to change to improve performance in the upcoming period).[12] In contrast, a dashboard designed for quality improvement purposes is more likely to require features displaying trends and statistical process control limits, with less detail and emphasis on cross-facility comparators or goal setting. In either case, information presented on a single screen should serve a single higher purpose, not merely be a collection of unrelated measures.[13]
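To illustrate one of the quality-improvement features named above, the sketch below computes statistical process control limits using a standard individuals (XmR) chart; this is a generic technique, not the specific method of any dashboard discussed here, and the monthly rates are fabricated for illustration.

```python
# Illustrative XmR (individuals/moving-range) control limits for a QI
# dashboard trend view. The data are fabricated; 2.66 is the standard
# XmR chart constant (3 / d2, with d2 = 1.128 for moving ranges of 2).
monthly_rates = [0.82, 0.85, 0.81, 0.84, 0.79, 0.83, 0.86, 0.80]

mean = sum(monthly_rates) / len(monthly_rates)
moving_ranges = [abs(b - a) for a, b in zip(monthly_rates, monthly_rates[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

ucl = mean + 2.66 * avg_mr  # upper control limit
lcl = mean - 2.66 * avg_mr  # lower control limit

# The dashboard would plot the trend with these limits and flag months
# outside them as signals worth investigating (special-cause variation).
flagged = [i for i, r in enumerate(monthly_rates) if not lcl <= r <= ucl]
print(f"mean={mean:.3f}  LCL={lcl:.3f}  UCL={ucl:.3f}  flagged={flagged}")
```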
Clear and Correct Consumer
Equally important to its purpose is the audience the dashboard is intended to serve: who will be consuming the information presented? Even when referring to the same phenomenon or purpose, clinicians of different varieties, practice leaders, and patients have very different decisions to make and, thus, have different information needs. For example, a dashboard presenting facility access data for patients may need next-available appointment data displayed so patients can decide whether the available care is sufficiently timely for their health needs. In contrast, facility leaders may instead require third next-available appointment data on their dashboard to determine access trends and identify where to effectively target resources to ensure timely access.[14]
Additionally, effectively addressing a given consumer's or audience's needs also requires adapting the way the data in question are presented to said consumer. For example, facility leaders engaging with appointment availability data might require graphs showing trendlines over time for available appointments as well as statistical process limits. This information helps those users quickly identify major changes in available appointments. Patients, however, who only want to find the quickest and nearest facility at which to make an appointment, would likely be confused by trend lines and control limits. They might instead benefit from a geographical rather than a temporal presentation of the same appointment availability data. In short, different consumers have distinct questions they might ask of the same data and have distinct levels of and variation in their numeracy and proficiency with graphs; designers should thus present information in a style that allows the expected consumer to quickly interpret and effectively use data.
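As a hedged sketch of this idea, the snippet below maps consumer types to different presentations of the same appointment-availability data; the consumer labels and view fields are hypothetical, chosen only to mirror the example above.

```python
# Hypothetical sketch: one dataset, different presentations per consumer.
# Consumer labels and view fields are illustrative, not a real API.
def view_config(consumer: str) -> dict:
    if consumer == "facility_leader":
        # Temporal view: trend line plus control limits to spot shifts
        # in third next-available appointment times.
        return {"chart": "line", "x": "week",
                "y": "third_next_available_days",
                "overlays": ["trend", "control_limits"]}
    if consumer == "patient":
        # Geographic view: where can I be seen soonest, and how close is it?
        return {"chart": "map", "marker": "clinic",
                "label": "next_available_appointment"}
    raise ValueError(f"no view defined for consumer type: {consumer!r}")


print(view_config("patient"))
```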
Systematically designed dashboards generally succeed at identifying a single type of consumer for a given dashboard, such as clinical teams trying to make decisions about their patients.[15] [16] [17] However, sometimes a single dashboard tries to serve too many consumer types, perhaps because the true intended consumer has not been clarified.[18] In our example, a generic dashboard might display metrics that provide actionable information to neither facility managers nor patients. Indeed, providing only a single dashboard might even decrease access to care by both placing informational barriers in front of patients seeking health care and raising informational barriers for administrators trying to effectively deploy resources to improve access.
Clear and Correct Subject
Finally, about whom or what is the dashboard displaying information? Dashboards must display focused information about the correct person, population, or phenomenon at the correct level of analysis. Determining the correct subject and its corresponding granularity often depends on the dashboard's purpose and consumer. Yet data are often presented at too abstract or too granular a level for the intended purpose. This can sometimes be driven by the available data: for example, in the early days of clinical performance measurement, dashboards and reports of medical center clinical performance in the Veterans Health Administration (VHA) were reported at the facility level, yet the underlying data for such measures were powered only for regional-level aggregation.[19] This resulted in reports with exaggerated performance trends due to small sample sizes at any given facility. Today, with the use of automated querying and natural language processing, most clinical performance measures are based on the entire population of interest, facilitating presentation of clinical performance data at the proper level of granularity (e.g., region, facility, or clinical team).[20]
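The sketch below illustrates the granularity point with fabricated data: a facility-level rate is reported only when its denominator is adequate, otherwise the display falls back to the regional aggregate. The threshold and field names are our assumptions for illustration.

```python
# Illustrative sketch: report at the level the sample supports. Facilities
# with too few cases roll up to the regional rate instead of displaying a
# noisy facility-level estimate. Data, threshold, and names are fabricated.
import pandas as pd

cases = pd.DataFrame({
    "region":   ["R1", "R1", "R1", "R2", "R2"],
    "facility": ["F1", "F1", "F2", "F3", "F3"],
    "met":      [1, 0, 1, 1, 1],  # 1 = performance measure met
})

MIN_N = 2  # minimum denominator for a stand-alone facility rate

by_facility = cases.groupby(["region", "facility"])["met"].agg(["mean", "count"])
by_region = cases.groupby("region")["met"].mean()

# Use the facility rate only where the denominator is adequate;
# otherwise substitute the regional aggregate.
report = by_facility.apply(
    lambda row: row["mean"] if row["count"] >= MIN_N else by_region[row.name[0]],
    axis=1,
)
print(report)  # F2 (n=1) shows R1's regional rate rather than its own
```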
Case Example: Designing Dashboards for Inpatient Nurse Staffing Decisions
As an example of how to overcome these barriers, we present a case describing the design of a dashboard intended to guide staffing decisions (purpose) by nursing management (consumer) by providing information about nurse workload, nurse staffing, and nursing-sensitive patient outcomes at the unit and hospital levels (subject). The dashboard design was part of a larger study of nurse staffing patterns in the VHA.[21]
Consistent with information design and usability frameworks such as BEAR[8] and UTAUT,[10] which posit that the user perspective is central to designing a more usable and acceptable interface, we conducted interviews before commencing design to clarify why different users require nurse workload information on a dashboard. Interviewees reported using data to improve care quality and streamline care delivery and to plan resources needed for adequate care, such as unit staffing and supplies; findings from these interviews regarding purpose are reported elsewhere.[22] Despite all being managers, nursing leaders at the unit, facility, and regional levels may require very distinct information types and presentations to do their jobs. Based on user feedback, we created a dashboard with views tailored to each consumer's organizational role. Upon creating an initial dashboard prototype, we conducted two rounds of follow-up interviews with a subset of our initial interviewees to assess the usability of the prototype from the perspective of each consumer type. Interviewees answered questions about their initial impressions as well as task efficiency (how efficiently the user can complete the tasks of interest on the dashboard) and effectiveness (how useful the visualizations/reports are at helping the user answer questions [and make decisions] about nurse staffing; see [Supplementary Material S1], available in the online version). Our case example reports the details of this latter set of interviews, focusing on the subject and consumer dimensions of the framework.
Participants
We interviewed 10 potential consumers of our dashboard representing the following groups: nursing unit manager (n = 1), service/care line (department) leadership (n = 3), facility leadership (n = 2), regional network leadership (n = 2), and national leadership (n = 2).
Procedure
We iteratively developed two versions of the dashboard, with each iteration evaluated through a round of interviews. For each round, consumers viewed a prototype dashboard display and engaged in a think-aloud activity in which they voiced their impressions of the display in real time as they worked with the dashboard, either directly or with the interviewer's assistance.[23] Consumers were asked to indicate which features or elements were most useful, identify extraneous items on the dashboard, and comment on how the data (subject matter) were represented on the display, paying special attention to the display's efficiency and effectiveness at helping them accomplish their intended goals (purpose). Interviews were recorded and analyzed by three members of the research team (two research coordinators with backgrounds in public health and an industrial/organizational psychologist with expertise in qualitative and think-aloud methods), using qualitative rapid analysis.[24]
Findings
Participants from all audience groups reported liking the display format, particularly the second iteration; they found it “easy to read” and were able to see information quickly. Everything being “all in one place” (as opposed to having to search multiple, separate sources for the same information) was also mentioned by interviewees of all organizational levels as a useful feature, suggesting that the dashboard's purpose had been adequately fulfilled:
Interviewer: How does this display compare to other currently available resources?
Participant: I think it's definitely a little more uh user friendly… And easier to understand. And. And it's all in one place. So that's really key. You know, I usually have to go to five or six different places to find data. That's really … the problem. So one, having it, having it all in one place is very helpful.
– Facility-level participant.
Unit- and department-level consumers' preferred subject was the nursing unit: these consumers requested measures such as outcomes per nursing unit, staffing needs for a given unit, and unit staffing configuration. The preferred subject of consumers working at the facility level or higher was the facility: they liked being able to see facility-level trends with the ability to drill down to the unit level and the ability to compare facility-level metrics. This suggested, at minimum, the need for drill-down/up functionality on the dashboard to accommodate display of data about different subjects for the different consumers expected to engage with the dashboard. Consumers from four of the five organizational levels commented on the need for real-time information. However, within-facility consumers (e.g., unit, service line) were much more concerned with within-facility census information (e.g., unit-level staffing and patient information, comparisons), whereas consumers at the facility level and higher were much more concerned with longer-term trends and the broader picture, including more accountability-oriented metrics, such as patient safety indicator performance and staff. [Figs 1] and [2] show examples of the same data in our dashboard as presented to different consumer types. We plan to present the dashboard and our interview findings to the VHA Office of Nursing Informatics (ONI) to identify areas of alignment with future tools. ONI collaborates with internal and external stakeholders to optimize nursing documentation throughout the VHA health care system.
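A minimal sketch of the drill-down/up behavior these findings suggest appears below; the dataset and column names are hypothetical stand-ins for the study dashboard's actual schema.

```python
# Hypothetical sketch of drill-down/up: facility-level view by default,
# unit-level drill-down for within-facility consumers. Column names and
# values are fabricated, not the study dashboard's schema.
from typing import Optional

import pandas as pd

staffing = pd.DataFrame({
    "facility": ["F1", "F1", "F1", "F2"],
    "unit":     ["ICU", "MedSurg", "ICU", "ICU"],
    "nurse_hours_per_patient_day": [11.2, 6.8, 10.9, 12.1],
})


def staffing_view(level: str, facility: Optional[str] = None) -> pd.DataFrame:
    """Return staffing aggregated at the requested level of the hierarchy."""
    if level == "facility":
        return staffing.groupby("facility", as_index=False).mean(numeric_only=True)
    if level == "unit" and facility is not None:  # drill down into one facility
        return (staffing[staffing["facility"] == facility]
                .groupby("unit", as_index=False)
                .mean(numeric_only=True))
    raise ValueError("unsupported level, or drill-down requested without a facility")


print(staffing_view("facility"))    # view for facility-level and higher consumers
print(staffing_view("unit", "F1"))  # drill-down view for unit/service-line consumers
```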
Discussion
Although our interviews found some areas of common ground among all five user types (e.g., the ability to see everything in one place, ease of readability, and the need for real-time information), each user type had unique needs (e.g., between-facility vs. within-facility comparisons for national- and facility-level users, respectively) that highlighted the importance of tailoring dashboard displays for different user types and the benefits of assessing user type–specific purposes for optimal dashboard design.
Our data were limited by the fact that, although we interviewed users from five different levels of the organization, all participants worked in the Veterans Affairs system. This limits our ability to generalize about what is needed in other health care systems.
General Discussion
Dashboards, when designed, implemented, and used correctly, can be invaluable tools for monitoring important quality trends and troubleshooting problems before they arise. Information design research can speak to how best to display data on a dashboard. However, many dashboards are created without resolving fundamental questions about their purpose, consumers, and subjects, obfuscating rather than clarifying information and thereby undermining their intended purpose. This can sometimes stem from relying on default layouts and offerings from off-the-shelf products, which are often designed to provide one-size-fits-all solutions. Health care faces unique challenges in that the same data about the same subject(s) can be used for multiple purposes and by multiple consumers. For dashboards to achieve their potential, designers must invest critical, clear, upfront thought into how best to meet their users' needs rather than relying on the default design trend of the moment. Upfront clarity about the specific consumer the dashboard intends to serve, the purpose the dashboard intends to accomplish, and the subject of the data to be presented greatly facilitates decisions about the information design of the dashboard. The result is a more meaningful and actionable tool that helps a wide variety of consumers across the health care system make wise decisions to deliver and receive high-quality care.
Conflict of Interest
None declared.
References
- 1 Agency for Healthcare Research and Quality. National Quality Measures Clearinghouse. 2010. Accessed May 17, 2010 at: http://www.qualitymeasures.ahrq.gov
- 2 Hysong SJ, Francis J, Petersen LA. Motivating and engaging frontline providers in measuring and improving team clinical performance. BMJ Qual Saf 2019; 28 (05) 405-411
- 3 World Health Organization. WHO Coronavirus (COVID-19) Dashboard [Web Page]. 2023. Accessed May 8, 2023 at: https://covid19.who.int/
- 4 Yigitbasioglu OM, Velcu O. A review of dashboards in performance management: implications for design and research. Int J Account Inf Syst 2012; 13 (01) 41-59
- 5 Lopez KD, Wilkie DJ, Yao Y, et al. Nurses' numeracy and graphical literacy: informing studies of clinical decision support interfaces. J Nurs Care Qual 2016; 31 (02) 124-130
- 6 Sedrakyan G, Mannens E, Verbert K. Guiding the choice of learning dashboard visualizations: linking dashboard design and data visualization concepts. J Comput Lang 2019; 50: 19-38
- 7 Lee D, Panicker V, Gross C, Zhang J, Landis-Lewis Z. What was visualized? A method for describing content of performance summary displays in feedback interventions. BMC Med Res Methodol 2020; 20 (01) 90
- 8 Camacho J, Zanoletti-Mannello M, Landis-Lewis Z, Kane-Gill SL, Boyce RD. A conceptual framework to study the implementation of clinical decision support systems (BEAR): literature review and concept mapping. J Med Internet Res 2020; 22 (08) e18388
- 9 Yusof MM, Kuljis J, Papazafeiropoulou A, Stergioulas LK. An evaluation framework for health information systems: human, organization and technology-fit factors (HOT-fit). Int J Med Inform 2008; 77 (06) 386-398
- 10 Venkatesh V, Morris MG, Davis GB, Davis FD. User acceptance of information technology: toward a unified view. Manage Inf Syst Q 2003; 27 (03) 425-478
- 11 Hysong SJ, O'Mahen P, Profit J, Petersen LA. Purpose, subject, and consumer: Comment on “Perceived burden due to registrations for quality monitoring and improvement in hospitals: a mixed methods study”. Int J Health Policy Manag 2022; 11 (04) 539-543
- 12 Hysong SJ, Kell HJ, Petersen LA, Campbell BA, Trautner BW. Theory-based and evidence-based design of audit and feedback programmes: examples from two clinical intervention studies. BMJ Qual Saf 2017; 26 (04) 323-334
- 13 Agency for Healthcare Research and Quality. Data Visualization Best Practices for Primary Care Quality Improvement (QI) Dashboards. Rockville, MD: Author; 2018. Accessed May 8, 2023 at: https://www.ahrq.gov/evidencenow/tools/dashboard-best-practice.html
- 14 Kaboli PJ, Miake-Lye IM, Ruser C, et al. Sequelae of an evidence-based approach to management for access to care in the Veterans Health Administration. Med Care 2019; 57 (03) S213-S220
- 15 Foster M, Albanese C, Chen Q, et al. Heart failure dashboard design and validation to improve care of veterans. Appl Clin Inform 2020; 11 (01) 153-159
- 16 Simpao AF, Ahumada LM, Larru Martinez B, et al. Design and implementation of a visual analytics electronic antibiogram within an electronic health record system at a tertiary pediatric hospital. Appl Clin Inform 2018; 9 (01) 37-45
- 17 Hester G, Lang T, Madsen L, Tambyraja R, Zenker P. Timely data for targeted quality improvement interventions: use of a visual analytics dashboard for bronchiolitis. Appl Clin Inform 2019; 10 (01) 168-174
- 18 Froese M-E, Tory M. Lessons learned from designing visualization dashboards. IEEE Comput Graph Appl 2016; 36 (02) 83-89
- 19 Office of Quality and Performance. FY 2006 Technical Manual for the VHA Performance Measurement System including JCAHO Hospital Core Measures. Veterans Health Administration [Intranet]. 2006. Accessed September 13, 2006 at: http://vaww.oqp.med.va.gov/oqp_services/performance_measurement/uploads/web_performance_measures/2006_perf_meas/FY06%20Tech%20Manual%20Q4%206-06.doc
- 20 Office of Analytics and Performance Integration. Electronic Technical Manual for the VHA Performance Measurement System. Veterans Health Administration [Intranet]. 2023. Accessed May 8, 2023 at: https://pm.rtp.med.va.gov/ReportServer/Pages/ReportViewer.aspx?/Performance%20Reports/Measure%20Management/MeasureCatalog
- 21 Petersen LA. Improving the Measurement of VA Facility Performance to Foster a Learning Healthcare System. US Department of Veterans Affairs Health Services Research & Development Service; 2015. Accessed May 8, 2023 at: https://www.hsrd.research.va.gov/research/abstracts.cfm?Project_ID=2141705745
- 22 Wong JJ, SoRelle RP, Yang C, et al. Nurse leader perceptions of data in the Veterans Health Administration: a qualitative evaluation. Comput Inform Nurs 2023;
- 23 Chipman SF, Schraagen JM, Shalin VL. Introduction to cognitive task analysis. In: Schraagen JM, Chipman SF, Shalin VL. eds. Cognitive Task Analysis. Mahwah, NJ: Lawrence Erlbaum Associates; 2000: 3-23
- 24 Gale RC, Wu J, Erhardt T, et al. Comparison of rapid vs in-depth qualitative analytic methods from a process evaluation of academic detailing in the Veterans Health Administration. Implement Sci 2019; 14 (01) 11
Publication History
Received: 01 November 2022
Accepted: 30 March 2023
Accepted Manuscript online: 04 April 2023
Article published online: 14 June 2023
© 2023. Thieme. All rights reserved.
Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany