Appl Clin Inform 2018; 09(03): 511-518
DOI: 10.1055/s-0038-1666842
Research Article
Georg Thieme Verlag KG Stuttgart · New York

The Development of Heuristics for Evaluation of Dashboard Visualizations

Dawn Dowding
1  Division of Nursing, Midwifery and Social Work, School of Health Sciences, University of Manchester, Manchester, United Kingdom
Jacqueline A. Merrill
2  Department of Biomedical Informatics, Columbia University, New York, New York, United States
3  School of Nursing, Columbia University, New York, New York, United States
Funding The research reported here was supported by the Agency for Healthcare Research and Quality (U.S.) under award number R21HS023855. The content is solely the responsibility of the authors and does not necessarily represent the official views of the Agency for Healthcare Research and Quality.
Publication History

Received: 15 March 2018

Accepted: 21 May 2018

Publication Date: 11 July 2018 (online)


Background Heuristic evaluation is used in human–computer interaction studies to assess the usability of information systems. Nielsen's widely used heuristics, first developed in 1990, are appropriate for general usability but do not specifically address usability in systems that produce information visualizations.

Objective This article develops a heuristic evaluation checklist that can be used to evaluate systems that produce information visualizations. Principles from Nielsen's heuristics were combined with heuristic principles developed by prior researchers specifically to evaluate information visualization.

Methods We used the nominal group technique to determine the final set of heuristics. The combined set of existing usability principles and associated factors was distributed via email to a group of 12 informatics experts from a range of health care disciplines. Respondents were asked to rate each factor on its importance as an evaluation heuristic for visualization systems on a scale from 1 (definitely do not include) to 10 (definitely include). The distribution of scores for each item was calculated, and a median score of ≥8 represented consensus for inclusion in the final checklist.
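The median-based consensus rule described above can be sketched as a short computation. The factor names, ratings, and threshold variable below are purely illustrative and are not the study's actual data:

```python
from statistics import median

# Hypothetical panel ratings (1-10) for two candidate usability factors;
# in the study, ten experts returned ratings for each factor.
ratings = {
    "Visibility of system status": [9, 8, 10, 8, 9, 7, 8, 9, 10, 8],
    "Use of real-world metaphors": [5, 6, 4, 7, 5, 6, 8, 5, 4, 6],
}

CONSENSUS_THRESHOLD = 8  # median score >= 8 -> include in the final checklist

for factor, scores in ratings.items():
    m = median(scores)
    decision = "include" if m >= CONSENSUS_THRESHOLD else "exclude"
    print(f"{factor}: median = {m} -> {decision}")
```

With these illustrative ratings, the first factor reaches consensus (median 8.5) and the second does not (median 5.5).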

Results Ten of 12 experts responded with rankings and written comments. The final checklist consists of 10 usability principles (7 general and 3 specific to information visualization) substantiated by 49 usability factors. Three nursing informatics experts then used the checklist to evaluate a vital sign dashboard developed for home care nurses, using a task list designed to explore the full functionality of the dashboard. The experts used the checklist without difficulty, and indicated that it covered all major usability problems encountered during task completion.

Conclusion The growing capacity to generate and electronically process health data suggests that data visualization will be increasingly important. A checklist of usability heuristics for evaluating information visualization systems can contribute to assuring high quality in electronic data systems developed for health care.

Protection of Human and Animal Subjects

The study protocol was reviewed and approved by the Institutional Review Boards at Columbia University and the Visiting Nurse Service of New York.

Supplementary Material