CC BY-NC-ND 4.0 · Appl Clin Inform 2022; 13(01): 67-79
DOI: 10.1055/s-0041-1740919
Research Article

Agile, Easily Applicable, and Useful eHealth Usability Evaluations: Systematic Review and Expert-Validation

Irina Sinabell
1   Department of Biomedical Computer Science and Mechatronics, Institute of Medical Informatics, UMIT, Private University of Health Sciences, Medical Informatics and Technology, Hall in Tirol, Austria
,
Elske Ammenwerth
1   Department of Biomedical Computer Science and Mechatronics, Institute of Medical Informatics, UMIT, Private University of Health Sciences, Medical Informatics and Technology, Hall in Tirol, Austria

Abstract

Background Electronic health (eHealth) usability evaluations of rapidly developed eHealth systems are difficult to accomplish because traditional usability evaluation methods require substantial time in preparation and implementation. This illustrates the growing need for fast, flexible, and cost-effective methods to evaluate the usability of eHealth systems. To address this demand, the present study systematically identified and expert-validated rapidly deployable eHealth usability evaluation methods.

Objective Identification and prioritization of eHealth usability evaluation methods suitable for agile, easily applicable, and useful eHealth usability evaluations.

Methods The study design comprised a systematic iterative approach in which expert knowledge was contrasted with findings from the literature. Forty-three eHealth usability evaluation methods were systematically identified and assessed regarding their ease of applicability and usefulness through semi-structured interviews with 10 European usability experts and a systematic literature review. The most appropriate eHealth usability evaluation methods were selected stepwise based on the experts' judgements of their ease of applicability and usefulness.

Results Of the 43 eHealth usability evaluation methods identified as candidates for agile, easily applicable, and useful eHealth usability evaluations, 10 were recommended by the experts based on their usefulness for rapid eHealth usability evaluations. The three most frequently recommended eHealth usability evaluation methods were Remote User Testing, Expert Review, and the Rapid Iterative Test and Evaluation Method. Eleven usability evaluation methods, such as Retrospective Testing, were not recommended for use in rapid eHealth usability evaluations.

Conclusion We conducted a systematic review and expert-validation to identify rapidly deployable eHealth usability evaluation methods. The comprehensive and evidence-based prioritization of eHealth usability evaluation methods supports faster usability evaluations and thus contributes to the ease of use of emerging eHealth systems.



Background and Significance

The term eHealth refers to the use of information and communication technologies necessary for the operability of health systems.[1] [2] [3] Some eHealth systems are aimed at medical staff for medical decision support,[4] [5] [6] while others are aimed at patients for their personal welfare.[7] [8] [9] eHealth systems, such as health information systems, are becoming more prevalent due to the rapid development of information and communication technologies[10] and have the potential to improve health care.[11] [12] [13] There have been reports on critical issues related to the successful implementation of eHealth systems,[11] including the lack of customizability and usability.[14] Increased usability may lead to increased patient safety.[11] [15] Safe and usable eHealth systems are crucial in health care because failures of the system can result in death or injury to the patients being treated.[15] The usability of health information systems has thus become an important concern worldwide[14] [16] because usability problems of eHealth systems can put patients at risk of harm.[17]

Usability is considered one of the crucial requirements of eHealth systems[18] because the usefulness of these systems to end users (medical staff or patients) is an essential component in the development of health information systems.[10] ISO 9241–11:2018 defines usability as “the extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.”[19] Usability focuses on functional aspects[20] and aims to assess the level of a system's effectiveness and efficiency.[21] Effectiveness and efficiency are part of the performance of a system.[10] Effectiveness relates to the “accuracy and completeness with which users achieve specific goals”[19] and includes, for example, the informativeness and understandability of the system.[22] Efficiency relates to “resources used in relation to the results achieved”[19] and includes, for instance, readability and reachability of the system.[22] To improve the effectiveness and efficiency of an eHealth system, usability evaluations are implemented.

Usability evaluations of eHealth systems have enormous value for patient benefit.[23] To obtain this benefit and evaluate the usability of eHealth systems, both traditionally well-established expert-based and user-based usability evaluation methods can be applied. With expert-based usability evaluation methods, evaluators inspect the usability of eHealth systems using heuristics.[24] User-based usability evaluations are utilized to observe users' interaction with software[24] and are usually realized as usability tests. Usability tests are considered a key component of user-centered design[25] to evaluate health information technology.[26] User-centered design involves the prospective end users, such as medical staff or patients, in all steps of the development process.[8] The needs of the end users are also considered by user experience, which is more broadly defined as the "user's perceptions and responses that result from the use and/or anticipated use of a system, product or service."[19] User experience considers users' perceptions while interacting with software and extends to users' feelings and emotional responses.[27] The requirements of eHealth systems change quickly due to customer and user needs. To address these needs and adapt quickly to these changes, efforts have been made to introduce iterative design and refinement of systems through agile software development.[16]

Agile software development enables rapid software delivery, demand for which is constantly on the rise.[28] Traditional usability evaluation methods are difficult to reconcile with agile software development,[29] as they require substantial time in preparation and implementation.[30] However, fast user feedback is crucial for eHealth systems developed using agile software development.[31] To allow the incorporation of user feedback in agile software development, there is a growing need for fast, flexibly applicable, and cost-effective usability evaluation methods.[32] This creates a need for easily applicable and useful eHealth usability evaluation methods that facilitate agile eHealth usability evaluations.

Several approaches already exist for rapidly applicable usability evaluations integrated into agile software development.[29] [33] [34] [35] [36] These approaches include extreme usability[37] or extremely rapid usability testing.[30] Easily applicable usability evaluation methods as such were introduced by Jakob Nielsen in the late 1980s.[38] At that time, Nielsen coined the term discount usability engineering,[38] which refers not only to usability evaluation methods that are simple to apply at low cost[31] but also focuses on rapid iteration to obtain user feedback.[38] This thinking behind discount usability is still reflected in many current approaches, including agile user experience,[39] agile user-centered design,[40] or light-weight user-centered design.[41]

The idea behind discount usability engineering fits the field of health care perfectly, where pressure to reduce costs is ubiquitous.[42] However, there are only a few discount usability engineering approaches appropriate for evaluating eHealth systems, such as low-cost rapid usability testing[14] or rapid usability evaluation.[43] Due to the high complexity of eHealth systems,[44] end users should be involved early in their development. In addition to the early integration of end users, awareness of the context in which eHealth systems are evaluated is essential.[45] The stage of the software lifecycle (which comprises stages such as requirements engineering, design, and evaluation[35])[46] and the context of eHealth[24] affect the choice of the appropriate eHealth usability evaluation method. The context of eHealth refers to the different types of eHealth systems available, such as health information systems, electronic health records, or web sites for online patient information. A variety of evaluation methods can be considered to inspect or test usability,[47] but there are only a few easily applicable and useful eHealth usability evaluation methods suitable for agile eHealth usability evaluations in health care.



Objectives

To address the demand for easily applicable and useful eHealth usability evaluation methods that support faster usability evaluations, we systematically identified and expert-validated rapidly deployable eHealth usability evaluation methods. Our objective was to identify and prioritize eHealth usability evaluation methods suitable for agile, easily applicable, and useful eHealth usability evaluations.



Methods

The study design comprised an iterative approach in which expert knowledge was contrasted with findings from the literature. To achieve the objectives, we set up a process with two main steps: (1) a literature review and (2) interviews with experts.

Systematic Literature Review on eHealth Usability Evaluation Methods

Step one comprised the development of a literature-based list of eHealth usability evaluation methods. We conducted a systematic literature review within the thematic areas of eHealth, usability, and agility. In this context, we define agility as the possibility to quickly implement eHealth usability evaluations to facilitate rapid software delivery. The search questions were: (1) Which eHealth usability evaluation methods exist? (2) Which usability evaluation methods can be rapidly deployed to facilitate agile eHealth usability evaluations? These search questions guided the selection of the search terms. We combined search terms from the thematic areas of eHealth, usability, and agility ([Table 1]). We selected the following databases for the search: ACM Digital Library, IEEE Xplore, and Medline (via PubMed). To consider emergent eHealth usability evaluation methods not published in peer-reviewed literature, we complemented our search with reviews of gray literature from Google Scholar (first 30 pages of results). We adjusted the search terms to the particularities of each database's search engine. For the literature search in Medline (via PubMed), we used Medical Subject Heading (MeSH) terms such as telemedicine, medical informatics, user-centered design, and user-computer interface.

Table 1 Search terms used to identify eHealth usability evaluation methods

• eHealth: eHealth, telemedicine, telemonitoring, telehealth, mHealth, "mobile health", "electronic health", health, "medical informatics", "clinical informatics", medical, "medical computer science", or "health information technology"; combined (AND) with: evaluation, framework, model, approach, process, processes, concept, testing, development, or engineering

• Usability: usability, "user-centered design", "human computer interaction", or "usability testing"

• Agility: agile, extreme, rapid, fast, and iterative

Note: Mobile health (mHealth) represents a context of eHealth that deals with mHealth systems aimed, for instance, at patients for their personal welfare.
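
As an illustration only (our own sketch, not part of the study protocol), the thematic areas in Table 1 can be assembled into a generic boolean query string; the helper function below is hypothetical, and the resulting query would still be adapted to each database's syntax (e.g., MeSH terms for Medline):

```python
# Illustrative sketch: assemble a generic boolean query from the thematic
# areas in Table 1. Not the study's actual search scripts; database-specific
# syntax (e.g., MeSH terms in PubMed) still requires manual adjustment.
ehealth = ['eHealth', 'telemedicine', 'telemonitoring', 'telehealth', 'mHealth',
           '"mobile health"', '"electronic health"', 'health',
           '"medical informatics"', '"clinical informatics"', 'medical',
           '"medical computer science"', '"health information technology"']
evaluation = ['evaluation', 'framework', 'model', 'approach', 'process',
              'processes', 'concept', 'testing', 'development', 'engineering']
usability = ['usability', '"user-centered design"',
             '"human computer interaction"', '"usability testing"']
agility = ['agile', 'extreme', 'rapid', 'fast', 'iterative']

def any_of(terms):
    """OR-combine the terms of one thematic area, wrapped in parentheses."""
    return '(' + ' OR '.join(terms) + ')'

# Thematic areas are AND-combined, mirroring the structure of Table 1.
query = ' AND '.join(any_of(area) for area in (ehealth, evaluation, usability, agility))
print(query)
```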


In total, 3,981 findings from peer-reviewed and non-peer-reviewed literature were retrieved ([Fig. 1]). To be included in the review process, a paper had to contain a description of an eHealth usability evaluation method, report on agility as the possibility to quickly implement eHealth usability evaluations, and refer to the applicability of eHealth usability evaluation methods with emphasis on the evaluation stage of the software lifecycle ([Table 2]). The search was limited to English-language papers published from January 2008 to June 2019. Since our study focuses on obtaining descriptions of eHealth usability evaluation methods, we excluded papers that were experience reports, conference posters, or presentations. After removing duplicates (n = 324), we downloaded all findings into the Zotero reference manager to review them for possible inclusion (n = 3,657). We analyzed the papers in two steps: (1) screening of the papers against the inclusion criteria based on title and abstract and (2) reading the full text of those papers that matched our inclusion criteria. We kept only papers that report on eHealth usability evaluation methods suitable for agile, easily applicable, and useful eHealth usability evaluation. During the review process, we considered an eHealth usability evaluation method suitable if it had been theoretically or practically integrated into an agile, easily applicable, and useful eHealth usability evaluation. From the included papers (n = 287), we extracted a list of 29 eHealth usability evaluation methods (see details in [Fig. 2]). This list was complemented by an ongoing manual search in peer-reviewed journals, conducted in parallel with the interviews, to include up-to-date literature (n = 42). Further, we applied a snowballing approach and examined the selected findings for further relevant literature.
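
The screening counts reported above can be re-traced arithmetically; this minimal sketch only verifies the numbers from the PRISMA flow ([Fig. 1]):

```python
# Re-tracing the screening numbers reported above (PRISMA flow, Fig. 1).
records_retrieved = 3981          # peer-reviewed and gray literature
duplicates_removed = 324
screened = records_retrieved - duplicates_removed
assert screened == 3657           # records reviewed for possible inclusion

included_full_text = 287          # papers kept after two-step screening
methods_extracted = 29            # eHealth usability evaluation methods
```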

Fig. 1 Flow chart of literature review according to the PRISMA statement. We searched ACM Digital Library, Google Scholar, IEEE Xplore, and Medline (via PubMed) (ordered alphabetically).
Fig. 2 Model behind iterative development of the expert-based prioritization of eHealth usability evaluation methods. Originating from the literature-based list of eHealth usability evaluation methods, the iterative refinement of prioritized eHealth usability evaluation methods is visualized.
Table 2 Inclusion and exclusion criteria of the systematic literature review on eHealth usability evaluation methods

Inclusion criteria:

• Relevance to the three main thematic areas for this paper: (1) eHealth, (2) usability, and (3) agility

• Description of an eHealth usability evaluation method, i.e., a method, model, approach, process, or concept that can be rapidly deployed to facilitate rapid software delivery

• Applicability of the eHealth usability evaluation method with emphasis on the evaluation stage of the software lifecycle

• Peer-reviewed papers as well as non-peer-reviewed papers

Exclusion criteria:

• Papers not published in English

• Papers not focusing on the evaluation of eHealth systems

• No description of existing eHealth usability evaluation methods suitable for agile, easily applicable, and useful eHealth usability evaluations

• No indicators that the eHealth usability evaluation methods can be rapidly deployed

• Papers that were experience reports, conference posters, or presentations

• Papers published before 2008


Expert-Based Iterative Validation of eHealth Usability Evaluation Methods

Step two comprised the validation of the extracted list of 29 eHealth usability evaluation methods by experts. We did this iteratively by continuously assessing and validating each eHealth usability evaluation method identified in step one through interviews with 10 experts ([Fig. 2]). We performed five iterations, with interviews of two different experts in each iteration. During each iteration, we (1) successively conducted the two interviews and (2) related the experts' statements on eHealth usability evaluation methods to the literature identified in step one to possibly add further rapidly deployable eHealth usability evaluation methods. Contrasting the experts' statements with the literature ensured that the most recent eHealth usability evaluation methods were considered. Implementing the expert interviews iteratively was motivated by agile software development, with its rapid delivery cycles and the user feedback gathered in each iteration.

The expert interviews were finished after five iterations, as saturation of results had already been achieved during iteration four ([Fig. 3]). We defined saturation as the point at which the number of altered, newly added, recommended, and not recommended eHealth usability evaluation methods strongly decreased, which occurred after iteration four. During iteration four, only one eHealth usability evaluation method was recommended; in iteration five, only one eHealth usability evaluation method was newly added and recommended by experts (see also [Fig. 4]).

Fig. 3 Saturation of information content of the expert interviews.
Fig. 4 Iterative prioritization and refinement of eHealth usability evaluation methods displayed for each iteration. Visualization of eHealth usability evaluation methods that were altered, newly added, recommended, and not recommended in each iteration (ordered alphabetically).
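
To make the stopping rule concrete, the following Python sketch is our own schematic illustration; the per-iteration change counts are invented stand-ins for the values visualized in [Fig. 3], and the interview step is a hypothetical placeholder for the manual interviews:

```python
# Illustrative sketch of the saturation-based stopping rule; not study code.

def conduct_interview_pair(iteration):
    """Hypothetical stand-in for two expert interviews; returns the number
    of methods altered, newly added, recommended, or not recommended."""
    invented_change_counts = {1: 13, 2: 10, 3: 5, 4: 1, 5: 1}
    return invented_change_counts[iteration]

def run_interviews(max_iterations=5, threshold=1):
    history = []
    for iteration in range(1, max_iterations + 1):
        history.append(conduct_interview_pair(iteration))
        # Saturation was observed in iteration four; one further iteration
        # was run to confirm that the change count stayed low.
        if len(history) >= 2 and all(n <= threshold for n in history[-2:]):
            break
    return history

print(run_interviews())  # -> [13, 10, 5, 1, 1]
```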

The 10 interviews with the usability experts were conducted in March and April 2020. We identified the usability experts via professional associations. For the selection of experts, we considered the following criteria: (1) a record of at least 10 years' experience in the field of usability, user experience, and/or agile software development and (2) occupation as a usability engineer; professional for usability or user experience; experience consultant; user experience architect or designer; or usability, interaction, or product designer. We kept the inclusion criteria as broad as possible to avoid excluding potentially qualified experts. When approaching the experts, we clarified the subject of the interview and roughly outlined the questions we wanted to discuss. To ensure that the approached experts had expertise working with eHealth systems and were familiar with a variety of eHealth usability evaluation methods, we informed them in advance that we wanted to obtain their opinion on rapidly deployable eHealth usability evaluation methods that facilitate agile eHealth usability evaluations. Overall, we invited 20 experts, of whom 10 agreed to participate in the semi-structured online interviews, conducted via videoconference or by phone.

We used an interview guideline designed for a half-hour conversation, consisting of the two subject areas below:

  1. Expert's opinion on rapidly deployable eHealth usability evaluation methods suitable for agile, easily applicable, and useful eHealth usability evaluations. Question 1: Which rapidly deployable eHealth usability evaluation method would you recommend (or not recommend) to conduct agile, easily applicable, and useful eHealth usability evaluations? Question 2: How would you set up the eHealth usability evaluation? Question 3: Assuming that it is possible to combine two or more eHealth usability evaluation methods, which eHealth usability evaluation methods would you combine and why?

  2. Expert's opinion on the list of 29 eHealth usability evaluation methods identified in the literature search (step one). We shared the list with the experts, either visually as a file or orally, before or during the interview. Question: Which of these eHealth usability evaluation methods would you recommend to rapidly evaluate an eHealth system? Which would you not recommend? Why or why not?

We analyzed the transcribed interviews by combining inductive and deductive content analysis. To achieve this, we used the literature-based list of eHealth usability evaluation methods to predefine eHealth usability evaluation methods recommended and not recommended by experts (deductive content analysis) and used the interview transcripts to postdefine eHealth usability evaluation methods recommended and not recommended by experts (inductive content analysis). The analysis consisted of the two steps below:

  1. We counted how often each eHealth usability evaluation method was recommended and not recommended by the experts. We documented the number of experts' recommendations (as well as non-recommendations) for each eHealth usability evaluation method. For example, Remote User Testing was recommended nine times by experts.

  2. We summarized eHealth usability evaluation methods that use the same methodology. We did this for both recommended and not recommended eHealth usability evaluation methods. For example, Asynchronous Usability Testing is an automated usability test that is recorded and performed without an evaluator.[21] Unmoderated Usability Testing is likewise performed automatically without an evaluator.[48] Since these two eHealth usability evaluation methods can be regarded as equivalent, we summarized them under the term Unmoderated Usability Testing.

All remaining eHealth usability evaluation methods, which were neither recommended nor not recommended and thus not discussed in more detail by the experts, were categorized as potentially useful eHealth usability evaluation methods (see the sketch below for a schematic view of this tallying and categorization).
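
The following Python sketch is our own schematic illustration of these analysis steps and of the residual "potentially useful" category; the judgement data are invented, except that Remote User Testing carries the nine recommendations reported in the Results:

```python
from collections import Counter

# Step 1: tally recommendations and non-recommendations per method.
# Judgements are invented for illustration, except Remote User Testing
# (recommended nine times according to the Results). In the study, a
# method could receive both kinds of judgements (see Fig. 6).
judgements = ([('Remote User Testing', 'recommended')] * 9
              + [('Expert Review', 'recommended')] * 2
              + [('Retrospective Testing', 'not recommended')] * 3)
counts = Counter(judgements)

# Categorization: methods the experts never judged are "potentially useful".
all_methods = {'Remote User Testing', 'Expert Review',
               'Retrospective Testing', 'Cognitive Walkthrough'}
recommended = {m for (m, j) in counts if j == 'recommended'}
not_recommended = {m for (m, j) in counts if j == 'not recommended'}
potentially_useful = all_methods - recommended - not_recommended

print(sorted(potentially_useful))  # -> ['Cognitive Walkthrough']
```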



Results

After the literature review and expert interviews, we arrived at a final prioritization of 10 recommended eHealth usability evaluation methods, 22 potentially useful eHealth usability evaluation methods, and 11 not recommended eHealth usability evaluation methods. Not recommended eHealth usability evaluation methods are those that the experts advised against using for rapid deployments.

We took care to include experts with diverse sector affiliations to gain different views from their professional experience. Most experts (six of 10) were employed in research and development. The remaining experts were equally split between industry and civil service. The median usability experience of the chosen experts was 16 years; the most experienced expert had 25 years of usability experience.

The interviews were recorded and transcribed, yielding 29,799 words in total. The interviews lasted a median of 34 minutes.

Based on the literature review, we extracted a list of 29 eHealth usability evaluation methods that provided the basis for the iterative prioritization of the eHealth usability evaluation methods ([Fig. 2], left). [Fig. 4] shows the iterative prioritization and refinement of the eHealth usability evaluation methods for each iteration (iteration one to five) in detail.

Iteration 1

We started the initial round with the literature-based list of 29 eHealth usability evaluation methods. The experts stated that the combination of Cognitive Walkthrough (pretest) and Shadowing is too cumbersome to implement and not suitable for rapid software delivery; the combination was therefore altered and simplified to Shadowing. Following both experts' suggestions, we summarized Retrospective Cognitive Walkthrough and Retrospective Peer Discovery under the generic term Retrospective Testing. We finished iteration one with 37 eHealth usability evaluation methods: 29 methods, minus one (because two methods were summarized and simplified to Retrospective Testing), plus nine newly added methods.



Iteration 2

Six eHealth usability evaluation methods were not recommended by the experts. Following both experts' suggestions, Think Aloud and Questionnaires were summarized into one eHealth usability evaluation method (referred to as "Think Aloud combined with Questionnaire"). In addition, both experts suggested the joint consideration of Card Sorting and Storyboard (referred to as "Card Sorting as well as Storyboard"). We finished iteration two with 40 eHealth usability evaluation methods: 37 methods, minus two (due to the summarizations just mentioned), plus five newly added methods.



Iteration 3

Two eHealth usability evaluation methods (Synchronous Usability Testing and Unmoderated Usability Testing) were newly added following both experts' suggestions. In total, three eHealth usability evaluation methods were not recommended. We finished iteration three with 42 eHealth usability evaluation methods: 40 methods plus two newly added methods.



Iteration 4

Feature Inspection was recommended by both experts, although this method originates more from the field of user experience and is used in early stages of the software lifecycle.[49] We finished iteration four with 42 eHealth usability evaluation methods, as no methods were newly added or altered.



Iteration 5

Crowd Testing was newly added and recommended by experts because the evaluation can be achieved “automatically under real conditions which is useful to evaluate eHealth systems aimed at patients.” We finished iteration five with 43 eHealth usability evaluation methods ([Fig. 5]).

Fig. 5 Final prioritization of eHealth usability evaluation methods. For easier readability, recommended eHealth usability evaluation methods are ordered by the number of experts' choices (for more details see [Fig. 6]). The same was done for not recommended eHealth usability evaluation methods. All other eHealth usability evaluation methods are arranged alphabetically.
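
The method counts reported across the five iterations can be reconciled arithmetically; this minimal sketch re-traces the numbers given above and the final three-way split:

```python
# Re-tracing the method counts reported for Iterations 1-5.
count = 29                 # literature-based starting list
deltas = [(-1, 9),         # iteration 1: one summarization, nine added -> 37
          (-2, 5),         # iteration 2: two summarizations, five added -> 40
          (0, 2),          # iteration 3: two added -> 42
          (0, 0),          # iteration 4: no change -> 42
          (0, 1)]          # iteration 5: Crowd Testing added -> 43
for removed, added in deltas:
    count += removed + added
# Final prioritization: 10 recommended + 22 potentially useful
# + 11 not recommended = 43 methods.
assert count == 43 == 10 + 22 + 11
```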


Recommended eHealth Usability Evaluation Methods

The three most frequently recommended eHealth usability evaluation methods are Remote User Testing, Expert Review, and the Rapid Iterative Test and Evaluation Method ([Fig. 6]). Remote User Testing is recommended by the experts due to its simplified technical setup; its uncomplicated "technical environment, working at distance" was considered beneficial. Expert Review is recommended because it "is always a quick choice to accomplish a usability evaluation in health care." The Rapid Iterative Test and Evaluation Method is the third most recommended method because "prospective users can be quickly involved, which is an important precondition for developing user-friendly eHealth systems."

Fig. 6 eHealth usability evaluation methods according to the number of experts' choices. The numbers originate from the documented counts of experts' recommendations (as well as non-recommendations) for each eHealth usability evaluation method.

Descriptions of all 10 recommended eHealth usability evaluation methods can be found in the appendix ([Supplementary Appendix Table A], available in the online version).



Potentially Useful eHealth Usability Evaluation Methods

The experts neither recommended nor advised against Perspective-Based Inspection, Consistency Inspection, Standards Inspection, and Formal Usability Inspection as suitable for agile eHealth usability evaluations; all of these can be applied early in the software lifecycle. Descriptions of all 22 potentially useful eHealth usability evaluation methods are listed in the appendix ([Supplementary Appendix Table B], available in the online version).



Not Recommended eHealth Usability Evaluation Methods

[Fig. 6] shows that Retrospective Testing is the most frequently not recommended eHealth usability evaluation method, followed by Focus Group, Unmoderated Usability Testing, and Questionnaires. Experts do not recommend Retrospective Testing because the evaluation is done twice; they noted that this "must be more effort" and that the resulting gain in quality is "dearly bought." Focus Groups are not recommended because "the effort in implementation and preparation is quite high." Unmoderated Usability Testing is not recommended because technical faults can occur during the evaluation and the evaluator is not able to intervene, which hampers the implementation of the evaluation. Questionnaires are not recommended because a large number of test participants representative of prospective users is needed to gain statistically valid results; the experts stated that this "effort is simply too great." Descriptions of the 11 not recommended eHealth usability evaluation methods can be found in the appendix ([Supplementary Appendix Table C], available in the online version).



Combinations of eHealth Usability Evaluation Methods

The experts suggested that combining Remote User Testing with Think Aloud or Interview to rapidly evaluate eHealth systems would be highly useful, because insights into participants' thought processes are difficult to gain solely from observing task performance during an eHealth usability evaluation. The experts stated that Retrospective Testing can be combined with Think Aloud or Eye Tracking to gain deeper insights into participants' thought processes or eye movements. However, this combination does not increase the usefulness of Retrospective Testing for agile eHealth usability evaluations, because the experts agreed that Think Aloud or Eye Tracking increases the effort, especially if the evaluation is done twice. The experts suggested combining Think Aloud with Questionnaire as a way to enrich insufficiently informative qualitative results with quantitative results. Nevertheless, the experts do not recommend conducting Questionnaires for agile eHealth usability evaluations, since many test participants are required to achieve reliable quantitative results.

The experts' quotes on recommended eHealth usability evaluation methods and not recommended eHealth usability evaluation methods are documented in detail in the appendixes ([Supplementary Appendix Tables A] and [C], available in the online version).



Discussion

This study aimed to identify and prioritize eHealth usability evaluation methods suitable for agile, easily applicable, and useful eHealth usability evaluations. The results show that a variety of eHealth usability evaluation methods are deployable to evaluate eHealth systems. Based on the expert interviews, 10 eHealth usability evaluation methods were recommended for evaluating eHealth systems, while 11 were not recommended. A further 22 eHealth usability evaluation methods are potentially useful for evaluating eHealth systems but were not specifically commented on by the experts.

Overall, we identified 43 eHealth usability evaluation methods. The systematically identified and expert-validated eHealth usability evaluation methods are useful for rapidly evaluating eHealth systems aimed either at medical staff or at patients. Both usability professionals and non-usability professionals can use the systematically identified and prioritized eHealth usability evaluation methods to facilitate agile, easily applicable, and useful eHealth usability evaluations. The categorization into recommended, potentially useful, and not recommended eHealth usability evaluation methods helps usability professionals and non-usability professionals choose an eHealth usability evaluation method appropriate for agile eHealth usability evaluations. This fosters usability evaluations in health care that are easy to realize and can be performed rapidly.

Some eHealth usability evaluation methods exist that are suitable for rapid evaluations of eHealth systems for medical decision support.[50] A prior study recommended Questionnaires as the most appropriate eHealth usability evaluation method for evaluating electronic health records, compared with Heuristic Evaluation, Cognitive Walkthrough, Usability Testing, and Remote Usability Testing.[50] Although Questionnaires are the most prevalent eHealth usability evaluation method,[23] our results showed that experts do not recommend Questionnaires for agile eHealth usability evaluations because, as one reason, they assessed the scoring systems of Questionnaires to be cumbersome. This finding is supported by recent research that criticized the complex scoring systems of Questionnaires[51] and showed that Questionnaires easily overlook important information about user interpretation of information.[5] Recent research further showed that automatically feasible usability evaluations are not used to develop eHealth systems,[23] which is consistent with the fact that Unmoderated Usability Testing was strongly not recommended by the experts. Combining rapidly deployable eHealth usability evaluation methods to enrich usability findings may be necessary to accomplish eHealth usability evaluations with medical staff quickly, since medical staff have limited time available.[52] This is consistent with our findings, since the experts recommended combining eHealth usability evaluation methods to support faster eHealth usability evaluations. The experts did not recommend using Retrospective Testing combined with Think Aloud or Eye Tracking for agile eHealth usability evaluations. Recent research showed that Eye Tracking has not gained acceptance for the evaluation of mHealth systems[23] because, as one reason, test participants found the recording of eye movements distracting.[53] [54] Additionally, Eye Tracking added little benefit to Retrospective Think Aloud.[54] Cognitive Walkthrough was suggested by the experts as a potentially useful eHealth usability evaluation method. Research supports this suggestion, since Cognitive Walkthrough has been frequently used to evaluate eHealth systems,[55] [56] [57] although it was criticized in 2013 for the effort and time required for its implementation.[32] This led to the creation of simplified versions of Cognitive Walkthrough, such as Cognitive Jogthrough[58] or Streamlined Cognitive Walkthrough.[32]

Limitations

Since usability research is a rapidly evolving field, we included gray literature to incorporate emerging eHealth usability evaluation methods. We obtained more relevant findings from Google Scholar than from the databases ACM Digital Library, IEEE Xplore, and Medline (via PubMed). Since Google Scholar is a metasearch engine that indexes databases from several other publishers, we received papers that were not covered by the other databases, such as a paper on discount user-centered eHealth design,[42] a usability toolkit addressing the evaluation of electronic health records,[50] and a description of Cooperative Usability Testing in the field of eHealth.[59] Given the systematic literature search conducted in this study, we believe that we found most of the relevant papers, and we further included relevant papers via a snowballing approach. One limitation of our study, however, is that we cannot confirm this with absolute certainty. We included papers that emphasize the evaluation stage of the software lifecycle. Nevertheless, we did not explicitly restrict the literature search to the evaluation stage, because some eHealth usability evaluation methods, such as Cognitive Walkthrough, can be applied in different stages of the software lifecycle, such as evaluation, requirements engineering, and design ([Supplementary Appendix Table B], available in the online version). A further limitation is that the literature search was predominantly performed and interpreted by the first author. To avoid bias, the results were frequently discussed with the second author. Based on the literature review, we extracted a list of 29 eHealth usability evaluation methods. We performed the expert-based prioritization iteratively to continuously assess and validate each eHealth usability evaluation method. Due to this iterative approach, the results of one iteration affected the results of the subsequent iteration, which may have influenced the final prioritization of recommended and not recommended eHealth usability evaluation methods; this represents a further limitation of our study. We performed five iterations, with interviews of two different experts in each iteration. The interviews were limited to around half an hour due to the experts' full schedules. We addressed this limitation by relating the experts' statements on eHealth usability evaluation methods to the literature identified in step one to possibly add further rapidly deployable eHealth usability evaluation methods. As saturation of results had already been achieved during iteration four, we finished the expert interviews after five iterations. Generalizing the experts' suggestions from this study is difficult, because the choice of an appropriate eHealth usability evaluation method is also affected by aspects such as the context of eHealth and the stage of the software lifecycle.



Future Work

Further research is needed to support the selection of an appropriate eHealth usability evaluation method with regard to the context of eHealth. We are currently developing a decision tree to address this need. Based on the results of this study, we aim to develop a toolbox consisting of the prioritized eHealth usability evaluation methods. Toolboxes describing usability evaluation methods already exist[50] [60] [61]; however, our intended toolbox will build on the systematic identification and evidence-based prioritization of eHealth usability evaluation methods suitable for agile eHealth usability evaluations, extended with information on the strengths and weaknesses of each method. Some of the systematically identified and prioritized eHealth usability evaluation methods, such as Assessing Cognitive Workload[62] and Cooperative Usability Testing,[63] were theoretically conceived but have not yet been practically applied in eHealth. This study therefore suggests that there is an increasing need to share knowledge and to identify eHealth usability evaluation methods that have been applied and tested in practice. Future work will investigate whether the systematically identified and expert-validated eHealth usability evaluation methods can be used to evaluate eHealth systems quickly and easily in health care. For this purpose, a case study is planned as a follow-up to this study.



Conclusion

We conducted a systematic review and expert-validation to identify rapidly deployable eHealth usability evaluation methods. The systematic identification and evidence-based prioritization of eHealth usability evaluation methods supports faster eHealth usability evaluations and thus contributes to the ease of use of emerging eHealth systems. We aim to provide a toolbox consisting of the eHealth usability evaluation methods identified in this study. Future work will comprise the development of a toolbox that includes further information on the eHealth usability evaluation methods beyond that presented in the tables in the appendixes ([Supplementary Appendix Tables A] to [C], available in the online version). To achieve this, we aim to set up method cards for each eHealth usability evaluation method, indicating connections and similarities between different eHealth usability evaluation methods.



Clinical Relevance Statement

This study offers an expert-validated prioritization of eHealth usability evaluation methods that can be used to quickly evaluate eHealth systems. Medical staff with different professional backgrounds (e.g., medical computer scientists or health care professionals) can utilize the systematically identified and prioritized eHealth usability evaluation methods to perform an agile eHealth usability evaluation. The evidence-based prioritization into 10 recommended eHealth usability evaluation methods, 22 potentially useful eHealth usability evaluation methods, and 11 not recommended eHealth usability evaluation methods provides an indication as to which eHealth usability evaluation method is suitable for conducting an agile eHealth usability evaluation.



Multiple Choice Questions

  1. What do the systematically identified and prioritized eHealth usability evaluation methods focus on?

    a. Large-scale usability evaluations aimed at evaluating safety-critical eHealth systems.

    b. eHealth user experience methods that can be especially applied at early stages of the software lifecycle.

    c. Rapidly deployable eHealth usability evaluation methods to support faster usability evaluations.

    d. None of the above.

    Correct Answer: The correct answer is option c. This study offers systematically identified and prioritized rapidly deployable eHealth usability evaluation methods to foster usability evaluations in health care that are easy to realize and can be performed quickly.

  2. For the evaluation of which medical device or software can the expert-validated prioritization of eHealth usability evaluation methods be utilized?

    a. Surgical robots supporting operations on patients.

    b. eHealth systems, for instance in clinical practice.

    c. Serious games for mental health disorders.

    d. None of the above.

    Correct Answer: The correct answer is option b. The prioritized rapidly deployable eHealth usability evaluation methods can be used to evaluate eHealth systems that are the object of rapid software delivery, such as health information systems, electronic health records, or web sites for online patient information.



Conflict of Interest

None declared.

Acknowledgments

We acknowledge the experts' participation, their sharing of comprehensive experience, and the time they devoted to realizing this study.

Protection of Human and Animal Subjects

Ethical approval was not required for this study.



References

  • 1 Lau F, Kuziemsky CE, eds. Handbook of eHealth Evaluation: An Evidence-Based Approach. Victoria, British Columbia, Canada: University of Victoria; 2016
  • 2 International Telecommunication Union (ITU). Standardization in e-health. Accessed August 9, 2021 at: https://www.itu.int/itunews/issue/2003/06/standardization.html
  • 3 World Health Organization (WHO). eHealth. Accessed August 9, 2021 at: https://www.euro.who.int__data/assets/pdf_file/0010/261694/6.-eHealth,-Factsheet-for-European-Parliament.pdf
  • 4 Bockhacker M, Syrek H, Elstermann von Elster M, Schmitt S, Roehl H. Evaluating usability of a touchless image viewer in the operating room. Appl Clin Inform 2020; 11 (01) 88-94
  • 5 Richter Lagha R, Burningham Z, Sauer BC. et al. Usability testing a potentially inappropriate medication dashboard: a core component of the dashboard development process. Appl Clin Inform 2020; 11 (04) 528-534
  • 6 Orenstein EW, Boudreaux J, Rollins M. et al. Formative usability testing reduces severe blood product ordering errors. Appl Clin Inform 2019; 10 (05) 981-990
  • 7 Hron JD, Parsons CR, Williams LA, Harper MB, Bourgeois FC. Rapid implementation of an inpatient telehealth program during the COVID-19 pandemic. Appl Clin Inform 2020; 11 (03) 452-459
  • 8 Rödle W, Wimmer S, Zahn J. et al. User-centered development of an online platform for drug dosing recommendations in pediatrics. Appl Clin Inform 2019; 10 (04) 570-579
  • 9 Farzandipour M, Nabovati E, Heidarzadeh Arani M, Akbari H, Sharif R, Anvari S. Enhancing asthma patients' self-management through smartphone-based application: design, usability evaluation, and educational intervention. Appl Clin Inform 2019; 10 (05) 870-878
  • 10 Klaassen B, van Beijnum BJ, Hermens HJ. Usability in telemedicine systems-a literature survey. Int J Med Inform 2016; 93: 57-69
  • 11 Kushniruk AW, Borycki EM. Low-cost rapid usability engineering: designing and customizing usable healthcare information systems. Healthc Q 2006; 9 (04) 98-102
  • 12 Ali H, Cole A, Panos G. Transforming patient hospital experience through smart technologies. In: Proceedings of the International Conference on Human-Computer Interaction; HCII '20. Copenhagen, Denmark: Springer International Publishing; 2020: 203-215
  • 13 Coppersmith NA, Sarkar IN, Chen ES. Quality informatics: the convergence of healthcare data, analytics, and clinical excellence. Appl Clin Inform 2019; 10 (02) 272-277
  • 14 Kushniruk A, Borycki E. Low-cost rapid usability testing: its application in both product development and system implementation. Stud Health Technol Inform 2017; 234: 195-200
  • 15 Price M, Weber J, Bellwood P, Diemert S, Habibi R. Evaluation of eHealth system usability and safety. In: Handbook of eHealth Evaluation: An Evidence-Based Approach. Victoria, British Columbia, Canada: University of Victoria; 2016: 337-350
  • 16 Kushniruk AW, Borycki EM. Integrating low-cost rapid usability testing into agile system development of healthcare IT: a methodological perspective. Stud Health Technol Inform 2015; 210: 200-204
  • 17 Marcilly R, Schiro J, Beuscart-Zéphir MC, Magrabi F. Building usability knowledge for health information technology: a usability-oriented analysis of incident reports. Appl Clin Inform 2019; 10 (03) 395-408
  • 18 Broekhuis M, van Velsen L, Hermens H. Assessing usability of eHealth technology: a comparison of usability benchmarking instruments. Int J Med Inform 2019; 128: 24-31
  • 19 International Organization for Standardization (ISO). Ergonomics of human-system interaction - Part 11: usability: definitions and concepts. Accessed July 26, 2021 at: https://www.iso.org/obp/ui/#iso:std:iso:9241:-11:ed-2:v1:en
  • 20 Paelke V, Röcker C. User interfaces for cyber-physical systems: challenges and possible approaches. In: Proceedings of the 6th International Conference on Design, User Experience, and Usability; DUXU '17. Vancouver, BC, Canada: Springer International Publishing; 2017; 10288: 75-85
  • 21 Bastien JM. Usability testing: a review of some methodological and technical aspects of the method. Int J Med Inform 2010; 79 (04) e18-e23
  • 22 Speicher M, Both A, Gaedke M. INUIT: the interface usability instrument. In: Proceedings of the 6th International Conference on Design, User Experience, and Usability; DUXU '17. Vancouver, BC, Canada: Springer International Publishing; 2017; 10288: 256-268
  • 23 Maramba I, Chatterjee A, Newman C. Methods of usability testing in the development of eHealth applications: a scoping review. Int J Med Inform 2019; 126: 95-104
  • 24 Jaspers MW. A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence. Int J Med Inform 2009; 78 (05) 340-353
  • 25 Thompson KE, Rozanski EP, Haake AR. Here, there, anywhere: remote usability testing that works. In: Proceedings of the 5th Conference on Information Technology Education; CITC5 '04. Salt Lake City, UT, USA: ACM; 2004: 132-137
  • 26 Wysocki T, Pierce J, Caldwell C. et al. A web-based coping intervention by and for parents of very young children with type 1 diabetes: user-centered design. JMIR Diabetes 2018; 3 (04) e16
  • 27 González Sánchez JL, Padilla Zea N, Gutiérrez FL. From usability to playability: introduction to player-centred video game development process. In: Proceedings of the International Conference on Human Centered Design; HCD '09. San Diego, CA, USA: Springer; 2009; 5619: 65-74
  • 28 Dubey A, Singi K, Kaulgud V. Personas and redundancies in crowdsourced testing. In: Proceedings of the 12th International Conference on Global Software Engineering; ICGSE '17. Buenos Aires, Argentina: IEEE; 2017: 76-80
  • 29 Silva T, Silveira M, Maurer F. Usability evaluation practices within agile development. In: Proceedings of the 48th Hawaii International Conference on System Sciences. HICSS '15. Kauai, HI, USA: IEEE; 2015: 5133-5142
  • 30 Pawson M, Greenberg S. Extremely rapid usability testing. J Usability Stud 2009; 4: 124-135
  • 31 Kane D. Finding a place for discount usability engineering in agile development: throwing down the gauntlet. In: Proceedings of the Agile Development Conference; ADC '03. Salt Lake City, UT, USA: IEEE; 2003: 40-46
  • 32 Grigoreanu V, Mohanna M. Informal Cognitive Walkthroughs (ICW): paring down and pairing up for an agile world. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems CHI '13. Paris, France: ACM; 2013: 3093-3096
  • 33 Marques AB, Figueiredo R, Amorin W, Rabelo J, Barbosa SDJ, Conte T. Do usability and agility combine?: investigating the adoption of usability modeling in an agile software project in the industry. In: Proceedings of the 17th Brazilian Symposium on Human Factors in Computing Systems; IHC '18. Belém, Brazil: ACM; 2018: 1-11
  • 34 Cavichi de Freitas R, Rodrigues LA, Marques da Cunha A. AGILUS: a method for integrating usability evaluations on agile software development. In: Proceedings of the International Conference on Human-Computer Interaction; HCI '16. Cham, Switzerland: Springer International Publishing; 2016; 9731: 545-552
  • 35 Magües D, Castro J, Acuna S. HCI usability techniques in agile development. In: Proceedings of the International Conference on Automatica; ICA-ACCA '16. Curico, Chile: IEEE; 2016: 1-7
  • 36 Butt S, Onn A, Butt M, Inam N, Butt S. Incorporation of usability evaluation methods in agile software model. In: Proceedings of the 17th IEEE International Multi Topic Conference; INMIC '14. Karachi, Pakistan: IEEE; 2014: 193-199
  • 37 Federoff M, Villamor C, Miller L. et al. Extreme usability: adapting research approaches for agile development. In: Proceedings of the 26th Annual CHI Conference Extended Abstracts on Human Factors in Computing Systems; CHI EA '08. Florence, Italy: ACM; 2008: 2269-2272
  • 38 Nielsen J. Discount usability: 20 years. Accessed September 13, 2019 at: https://www.nngroup.com/articles/discount-usability-20-years/
  • 39 Schwartz L. Agile-user experience design: an agile and user-centered process?. In: Proceedings of the 8th International Conference on Software Engineering Advances; ICSEA '13. Venice, Italy: IARIA; 2013: 346-351
  • 40 Isa W, Lokman A, Aris S. et al. Engineering rural informatics using agile user centered design. In: Proceedings of the 2nd International Conference on Information and Communication Technology; ICoICT '14. Bandung, Indonesia: IEEE; 2014: 367-372
  • 41 Constantine LL, Lockwood LAD. Usage-centered software engineering: an agile approach to integrating users, user interfaces, and usability into software engineering practice. In: Proceedings of the 25th International Conference on Software Engineering; ICSE '03. Portland, OR, USA: IEEE; 2003: 746-747
  • 42 Verhoeven F, van Gemert-Pijnen J. Discount user-centered e-health design: a quick-but-not-dirty method. In: Proceedings of the Symposium of the Austrian HCI and Usability Engineering Group; USAB '10. Klagenfurt, Austria: Springer; 2010; 638: 101-123
  • 43 Russ AL, Baker DA, Fahner WJ. et al. A rapid usability evaluation (RUE) method for health information technology. AMIA Annu Symp Proc 2010; 2010: 702-706
  • 44 European Union Agency for Network and Information Security (ENISA). Security and Resilience in eHealth: Security Challenges and Risks. Iraklio, Greece: ENISA; 2015: 1-48
  • 45 Kuziemsky CE, Kushniruk A. Context mediated usability testing. Stud Health Technol Inform 2014; 205: 905-909
  • 46 Øvad T, Larsen L. The prevalence of UX design in agile development processes in industry. In: Proceedings of the Agile Conference; AGILE '15. Washington, DC, USA: IEEE; 2015: 40-49
  • 47 Duh HB-L, Tan GCB, Chen VH. Usability evaluation for mobile device: a comparison of laboratory and field tests. In: Proceedings of the 8th Conference on Human-Computer Interaction with Mobile Devices and Services; MobileHCI '06. Helsinki, Finland: ACM; 2006: 181-186
  • 48 Whitenton K. Tools for unmoderated usability testing. Accessed April 28, 2021 at: https://www.nngroup.com/articles/unmoderated-user-testing-tools/
  • 49 Nielsen J. Usability inspection methods. In: Proceedings of the Conference Companion on Human Factors in Computing Systems; CHI '94. Boston, MA, USA: ACM; 1994: 413-414
  • 50 Johnson CM, Johnston D, Crowley PK. et al. EHR Usability Toolkit: A Background Report on Usability and Electronic Health Records. Accessed August 8, 2021 at: https://digital.ahrq.gov/sites/default/files/docs/citation/EHR_Usability_Toolkit_Background_Report.pdf
  • 51 Davis R, Gardner J, Schnall R. A review of usability evaluation methods and their use for testing eHealth HIV interventions. Curr HIV/AIDS Rep 2020; 17 (03) 203-218
  • 52 Wu DTY, Vennemeyer S, Brown K. et al. Usability testing of an interactive dashboard for surgical quality improvement in a large congenital heart center. Appl Clin Inform 2019; 10 (05) 859-869
  • 53 Elbabour F, Alhadreti O, Mayhew P. Eye tracking in retrospective think-aloud usability testing: is there added value?. J Usability Stud 2017; 12 (03) 95-110
  • 54 Elling S, Lentz L, de Jong M. Retrospective think-aloud method: using eye movements as an extra cue for participants' verbalizations. In: Proceedings of the 29th Annual CHI Conference on Human Factors in Computing Systems; SIGCHI '11. Vancouver, BC, Canada: ACM; 2011: 1161-1170
  • 55 Khajouei R, Zahiri Esfahani M, Jahani Y. Comparison of heuristic and cognitive walkthrough usability evaluation methods for evaluating health information systems. J Am Med Inform Assoc 2017; 24 (e1): e55-e60
  • 56 Georgsson M, Staggers N, Årsand E, Kushniruk A. Employing a user-centered cognitive walkthrough to evaluate a mHealth diabetes self-management application: a case study and beginning method validation. J Biomed Inform 2019; 91: 1-15
  • 57 Ghalibaf AK, Jangi M, Habibi MRM, Zangouei S, Khajouei R. Usability evaluation of obstetrics and gynecology information system using cognitive walkthrough method. Electron Physician 2018; 10 (04) 6682-6688
  • 58 Rowley DE, Rhoades DG. The cognitive jogthrough: a fast-paced user interface evaluation procedure. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; CHI '92. Monterey, CA, USA: ACM; 1992: 389-395
  • 59 Andersen SB, Rasmussen CK, Frøkjær E. Bringing content understanding into usability testing in complex application domains - a case study in eHealth. In: Proceedings of the 6th International Conference on Design, User Experience, and Usability; DUXU '17. Vancouver, BC, Canada: Springer International Publishing; 2017; 10288: 327-341
  • 60 The User Experience Professionals' Association (UXPA). Usability body of knowledge. Accessed August 9, 2021 at: http://www.usabilitybok.org/methods
  • 61 University of Minnesota Duluth (UMD). Usability evaluation toolbox. Accessed August 15, 2021 at: https://www.d.umn.edu/itss/training/online/usability/toolbox.html
  • 62 Maguire M. Methods to support human-centred design. Int J Hum Comput Stud 2001; 55 (04) 587-634
  • 63 Frøkjær E, Hornbæk K. Cooperative usability testing. In: Proceedings of CHI Extended Abstracts on Human Factors in Computing Systems; CHI EA '05. Portland, OR, USA: ACM; 2005: 1383-1386
  • 64 Power C, Petrie H, Mitchell R. A framework for remote user evaluation of accessibility and usability of websites. In: Proceedings of the 5th International Conference on Universal Access in Human-Computer Interaction; UAHCI '09. San Diego, CA, USA: Springer; 2009; 5614: 594-601
  • 65 Hammontree M, Weiler P, Nayak N. Remote usability testing. Interactions 1994; 1 (03) 21-25
  • 66 Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform 2004; 37 (01) 56-76
  • 67 Medlock MC, Wixon D, McGee M, Welsh D. The rapid iterative test and evaluation method. In: Cost-Justifying Usability: An Update for the Internet Age. 2nd ed. Elsevier; 2005: 489-517
  • 68 Wilson C. User Interface Inspection Methods: A User-Centered Design Method. Amsterdam, The Netherlands: Elsevier; 2013
  • 69 Laemmler R. Guerrilla Testing: Mutig und rasch zu Resultaten. Accessed November 2, 2020 at: https://www.testingtime.com/blog/guerrilla-testing/
  • 70 Firmenich S, Garrido A, Grigera J, Rivero JM, Rossi G. Usability improvement through A/B testing and refactoring. Softw Qual J 2018; 27 (01) 203-240
  • 71 Guaiani F, Muccini H. Crowd and laboratory testing, can they co-exist? An exploratory study. In: Proceedings of the 2nd International Workshop on CrowdSourcing in Software Engineering; CSI-SE '15. Florence, Italy: IEEE; 2015: 32-37
  • 72 Zogaj S, Bretschneider U, Leimeister JM. Managing crowdsourced software testing: a case study based insight on the challenges of a crowdsourcing intermediary. J Bus Econ 2014; 84 (03) 375-405
  • 73 Downey LL. Group usability testing: evolution in usability techniques. J Usability Stud 2007; 2 (03) 133-144
  • 74 Sears A. Heuristic walkthroughs: finding the problems without the noise. Int J Hum Comput Interact 1997; 9 (03) 213-234
  • 75 Nielsen L, Madsen S. The usability expert's fear of agility: an empirical study of global trends and emerging practices. In: Proceedings of the 7th Nordic Conference on Human-Computer Interaction; NordiCHI '12. Copenhagen, Denmark: ACM; 2012: 261-264
  • 76 Höysniemi J, Hämäläinen P, Turkki L. Using peer tutoring in evaluating the usability of a physically interactive computer game with children. Interact Comput 2003; 15 (02) 203-255
  • 77 Khalayli N, Nyhus S, Hamnes K, Terum T. Persona based rapid usability kick-off. In: Proceedings of CHI Extended Abstracts on Human Factors in Computing Systems; CHI EA '07. San Jose, CA, USA: ACM; 2007: 1771-1776
  • 78 Usability in Germany eV (UIG). Fokusgruppe. Accessed November 9, 2020 at: https://www.usability-in-germany.de/definition/fokusgruppe
  • 79 Rosenbaum S, Cockton G, Coyne K, Muller M, Rauch T. Focus groups in HCI. In: Proceedings of CHI Extended Abstracts on Human Factors in Computing Systems; CHI EA '02. Minneapolis, MN, USA: ACM; 2002: 702-703
  • 80 Usability in Germany eV (UIG). Fragebogen. Accessed November 9, 2020 at: https://www.usability-in-germany.de/definition/fragebogen
  • 81 Krause R. Storyboards help visualize UX ideas. Nielsen Norman Group. Accessed November 9, 2020 at: https://www.nngroup.com/articles/storyboards-visualize-ideas/
  • 82 Piyush J, Sanjay KD, Ajay R. Software usability evaluation method. Int J Adv Res Comput Eng Technol 2012; 1 (02) 28-33
  • 83 Holzinger A, Slany WXP. + UE → XU Praktische Erfahrungen mit eXtreme Usability. Informatik-Spektrum 2006; 29 (02) 91-97
  • 84 Borycki E, Senathirajah Y, Kushniruk AW. The future of mobile usability, workflow and safety testing. Stud Health Technol Inform 2017; 245: 15-19
  • 85 Holzinger A. Usability engineering methods for software developers. Commun ACM 2005; 48 (01) 71-74

Address for correspondence

Irina Sinabell, MSc
Department of Biomedical Computer Science and Mechatronics, Institute of Medical Informatics, UMIT, Private University of Health Sciences, Medical Informatics and Technology
Eduard-Wallnöfer-Zentrum 1, 6060 Hall in Tirol
Austria   

Publication History

Received: 24 May 2021

Accepted: 20 October 2021

Article published online:
09 March 2022

© 2022. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/)

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

  • References

  • 1 Lau F, Kuziemsky CE. eds. Handbook of eHealth Evaluation: An Evidence-Based Approach. Victoria, British Columbia, Canada: University of Victoria; 2016
  • 2 International Telecommunication Union (ITU). Standardization in e-health. Accessed August 9, 2021 at: https://www.itu.int/itunews/issue/2003/06/standardization.html
  • 3 World Health Organization (WHO). eHealth. Accessed August 9, 2021 at: https://www.euro.who.int/__data/assets/pdf_file/0010/261694/6.-eHealth,-Factsheet-for-European-Parliament.pdf
  • 4 Bockhacker M, Syrek H, Elstermann von Elster M, Schmitt S, Roehl H. Evaluating usability of a touchless image viewer in the operating room. Appl Clin Inform 2020; 11 (01) 88-94
  • 5 Richter Lagha R, Burningham Z, Sauer BC. et al. Usability testing a potentially inappropriate medication dashboard: a core component of the dashboard development process. Appl Clin Inform 2020; 11 (04) 528-534
  • 6 Orenstein EW, Boudreaux J, Rollins M. et al. Formative usability testing reduces severe blood product ordering errors. Appl Clin Inform 2019; 10 (05) 981-990
  • 7 Hron JD, Parsons CR, Williams LA, Harper MB, Bourgeois FC. Rapid implementation of an inpatient telehealth program during the COVID-19 pandemic. Appl Clin Inform 2020; 11 (03) 452-459
  • 8 Rödle W, Wimmer S, Zahn J. et al. User-centered development of an online platform for drug dosing recommendations in pediatrics. Appl Clin Inform 2019; 10 (04) 570-579
  • 9 Farzandipour M, Nabovati E, Heidarzadeh Arani M, Akbari H, Sharif R, Anvari S. Enhancing asthma patients' self-management through smartphone-based application: design, usability evaluation, and educational intervention. Appl Clin Inform 2019; 10 (05) 870-878
  • 10 Klaassen B, van Beijnum BJ, Hermens HJ. Usability in telemedicine systems - a literature survey. Int J Med Inform 2016; 93: 57-69
  • 11 Kushniruk AW, Borycki EM. Low-cost rapid usability engineering: designing and customizing usable healthcare information systems. Healthc Q 2006; 9 (04) 98-102
  • 12 Ali H, Cole A, Panos G. Transforming patient hospital experience through smart technologies. In: Proceedings of the International Conference on Human-Computer Interaction; HCII '20. Copenhagen, Denmark: Springer International Publishing; 2020: 203-215
  • 13 Coppersmith NA, Sarkar IN, Chen ES. Quality informatics: the convergence of healthcare data, analytics, and clinical excellence. Appl Clin Inform 2019; 10 (02) 272-277
  • 14 Kushniruk A, Borycki E. Low-cost rapid usability testing: its application in both product development and system implementation. Stud Health Technol Inform 2017; 234: 195-200
  • 15 Price M, Weber J, Bellwood P, Diemert S, Habibi R. Evaluation of eHealth system usability and safety. In: Handbook of eHealth Evaluation: An Evidence-Based Approach. Victoria, British Columbia, Canada: University of Victoria; 2016: 337-350
  • 16 Kushniruk AW, Borycki EM. Integrating low-cost rapid usability testing into agile system development of healthcare IT: a methodological perspective. Stud Health Technol Inform 2015; 210: 200-204
  • 17 Marcilly R, Schiro J, Beuscart-Zéphir MC, Magrabi F. Building usability knowledge for health information technology: a usability-oriented analysis of incident reports. Appl Clin Inform 2019; 10 (03) 395-408
  • 18 Broekhuis M, van Velsen L, Hermens H. Assessing usability of eHealth technology: a comparison of usability benchmarking instruments. Int J Med Inform 2019; 128: 24-31
  • 19 International Organization for Standardization (ISO). Ergonomics of human-system interaction - Part 11: usability: definitions and concepts. Accessed July 26, 2021 at: https://www.iso.org/obp/ui/#iso:std:iso:9241:-11:ed-2:v1:en
  • 20 Paelke V, Röcker C. User interfaces for cyber-physical systems: challenges and possible approaches. In: Proceedings of the 6th International Conference on Design, User Experience, and Usability; DUXU '17. Vancouver, BC, Canada: Springer International Publishing; 2017; 10288: 75-85
  • 21 Bastien JM. Usability testing: a review of some methodological and technical aspects of the method. Int J Med Inform 2010; 79 (04) e18-e23
  • 22 Speicher M, Both A, Gaedke M. INUIT: the interface usability instrument. In: Proceedings of the 6th International Conference on Design, User Experience, and Usability; DUXU '17. Vancouver, BC, Canada: Springer International Publishing; 2017; 10288: 256-268
  • 23 Maramba I, Chatterjee A, Newman C. Methods of usability testing in the development of eHealth applications: a scoping review. Int J Med Inform 2019; 126: 95-104
  • 24 Jaspers MW. A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence. Int J Med Inform 2009; 78 (05) 340-353
  • 25 Thompson KE, Rozanski EP, Haake AR. Here, there, anywhere: remote usability testing that works. In: Proceedings of the 5th Conference on Information Technology Education; CITC5 '04. Salt Lake City, UT, USA: ACM; 2004: 132-137
  • 26 Wysocki T, Pierce J, Caldwell C. et al. A web-based coping intervention by and for parents of very young children with type 1 diabetes: user-centered design. JMIR Diabetes 2018; 3 (04) e16
  • 27 González Sánchez JL, Padilla Zea N, Gutiérrez FL. From usability to playability: introduction to player-centred video game development process. In: Proceedings of the International Conference on Human Centered Design; HCD '09. San Diego, CA, USA: Springer; 2009; 5619: 65-74
  • 28 Dubey A, Singi K, Kaulgud V. Personas and redundancies in crowdsourced testing. In: Proceedings of the 12th International Conference on Global Software Engineering; ICGSE '17. Buenos Aires, Argentina: IEEE; 2017: 76-80
  • 29 Silva T, Silveira M, Maurer F. Usability evaluation practices within agile development. In: Proceedings of the 48th Hawaii International Conference on System Sciences. HICSS '15. Kauai, HI, USA: IEEE; 2015: 5133-5142
  • 30 Pawson M, Greenberg S. Extremely rapid usability testing. J Usability Stud 2009; 4: 124-135
  • 31 Kane D. Finding a place for discount usability engineering in agile development: throwing down the gauntlet. In: Proceedings of the Agile Development Conference; ADC '03. Salt Lake City, UT, USA: IEEE; 2003: 40-46
  • 32 Grigoreanu V, Mohanna M. Informal Cognitive Walkthroughs (ICW): paring down and pairing up for an agile world. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems CHI '13. Paris, France: ACM; 2013: 3093-3096
  • 33 Marques AB, Figueiredo R, Amorin W, Rabelo J, Barbosa SDJ, Conte T. Do usability and agility combine?: investigating the adoption of usability modeling in an agile software project in the industry. In: Proceedings of the 17th Brazilian Symposium on Human Factors in Computing Systems; IHC '18. Belém, Brazil: ACM; 2018: 1-11
  • 34 Cavichi de Freitas R, Rodrigues LA, Marques da Cunha A. AGILUS: a method for integrating usability evaluations on agile software development. In: Proceedings of the International Conference on Human-Computer Interaction; HCI '16. Cham, Switzerland: Springer International Publishing; 2016; 9731: 545-552
  • 35 Magües D, Castro J, Acuna S. HCI usability techniques in agile development. In: Proceedings of the International Conference on Automatica; ICA-ACCA '16. Curico, Chile: IEEE; 2016: 1-7
  • 36 Butt S, Onn A, Butt M, Inam N, Butt S. Incorporation of usability evaluation methods in agile software model. In: Proceedings of the 17th IEEE International Multi Topic Conference; INMIC '14. Karachi, Pakistan: IEEE; 2014: 193-199
  • 37 Federoff M, Villamor C, Miller L. et al. Extreme usability: adapting research approaches for agile development. In: Proceedings of the 26th Annual CHI Conference Extended Abstracts on Human Factors in Computing Systems; CHI EA '08. Florence, Italy: ACM; 2008: 2269-2272
  • 38 Nielsen J. Discount usability: 20 years. Accessed September 13, 2019 at: https://www.nngroup.com/articles/discount-usability-20-years/
  • 39 Schwartz L. Agile-user experience design: an agile and user-centered process?. In: Proceedings of the 8th International Conference on Software Engineering Advances; ICSEA '13. Venice, Italy: IARIA; 2013: 346-351
  • 40 Isa W, Lokman A, Aris S. et al. Engineering rural informatics using agile user centered design. In: Proceedings of the 2nd International Conference on Information and Communication Technology; ICoICT '14. Bandung, Indonesia: IEEE; 2014: 367-372
  • 41 Constantine LL, Lockwood LAD. Usage-centered software engineering: an agile approach to integrating users, user interfaces, and usability into software engineering practice. In: Proceedings of the 25th International Conference on Software Engineering; ICSE '03. Portland, OR, USA: IEEE; 2003: 746-747
  • 42 Verhoeven F, van Gemert-Pijnen J. Discount user-centered e-health design: a quick-but-not-dirty method. In: Proceedings of the Symposium of the Austrian HCI and Usability Engineering Group; USAB '10. Klagenfurt, Austria: Springer; 2010; 638: 101-123
  • 43 Russ AL, Baker DA, Fahner WJ. et al. A rapid usability evaluation (RUE) method for health information technology. AMIA Annu Symp Proc 2010; 2010: 702-706
  • 44 European Union Agency for Network and Information Security (ENISA). Security and Resilience in eHealth: Security Challenges and Risks. Iraklio, Greece: ENISA; 2015: 1-48
  • 45 Kuziemsky CE, Kushniruk A. Context mediated usability testing. Stud Health Technol Inform 2014; 205: 905-909
  • 46 Øvad T, Larsen L. The prevalence of UX design in agile development processes in industry. In: Proceedings of the Agile Conference; AGILE '15. Washington, DC, USA: IEEE; 2015: 40-49
  • 47 Duh HB-L, Tan GCB, Chen VH. Usability evaluation for mobile device: a comparison of laboratory and field tests. In: Proceedings of the 8th Conference on Human-Computer Interaction with Mobile Devices and Services; MobileHCI '06. Helsinki, Finland: ACM; 2006: 181-186
  • 48 Whitenton K. Tools for unmoderated usability testing. Accessed April 28, 2021 at: https://www.nngroup.com/articles/unmoderated-user-testing-tools/
  • 49 Nielsen J. Usability inspection methods. In: Proceedings of the Conference Companion on Human Factors in Computing Systems; CHI '94. Boston, MA, USA: ACM; 1994: 413-414
  • 50 Johnson CM, Johnston D, Crowley PK. et al. EHR Usability Toolkit: A Background Report on Usability and Electronic Health Records. Accessed August 8, 2021 at: https://digital.ahrq.gov/sites/default/files/docs/citation/EHR_Usability_Toolkit_Background_Report.pdf
  • 51 Davis R, Gardner J, Schnall R. A review of usability evaluation methods and their use for testing eHealth HIV interventions. Curr HIV/AIDS Rep 2020; 17 (03) 203-218
  • 52 Wu DTY, Vennemeyer S, Brown K. et al. Usability testing of an interactive dashboard for surgical quality improvement in a large congenital heart center. Appl Clin Inform 2019; 10 (05) 859-869
  • 53 Elbabour F, Alhadreti O, Mayhew P. Eye tracking in retrospective think-aloud usability testing: is there added value?. J Usability Stud 2017; 12 (03) 95-110
  • 54 Elling S, Lentz L, de Jong M. Retrospective think-aloud method: using eye movements as an extra cue for participants' verbalizations. In: Proceedings of the 29th Annual CHI Conference on Human Factors in Computing Systems; SIGCHI '11. Vancouver, BC, Canada: ACM; 2011: 1161-1170
  • 55 Khajouei R, Zahiri Esfahani M, Jahani Y. Comparison of heuristic and cognitive walkthrough usability evaluation methods for evaluating health information systems. J Am Med Inform Assoc 2017; 24 (e1): e55-e60
  • 56 Georgsson M, Staggers N, Årsand E, Kushniruk A. Employing a user-centered cognitive walkthrough to evaluate a mHealth diabetes self-management application: a case study and beginning method validation. J Biomed Inform 2019; 91: 1-15
  • 57 Ghalibaf AK, Jangi M, Habibi MRM, Zangouei S, Khajouei R. Usability evaluation of obstetrics and gynecology information system using cognitive walkthrough method. Electron Physician 2018; 10 (04) 6682-6688
  • 58 Rowley DE, Rhoades DG. The cognitive jogthrough: a fast-paced user interface evaluation procedure. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; CHI '92. Monterey, CA, USA: ACM; 1992: 389-395
  • 59 Andersen SB, Rasmussen CK, Frøkjær E. Bringing content understanding into usability testing in complex application domains - a case study in eHealth. In: Proceedings of the 6th International Conference on Design, User Experience, and Usability; DUXU '17. Vancouver, BC, Canada: Springer International Publishing; 2017; 10288: 327-341
  • 60 The User Experience Professionals' Association (UXPA). Usability body of knowledge. Accessed August 9, 2021 at: http://www.usabilitybok.org/methods
  • 61 University of Minnesota Duluth (UMD). Usability evaluation toolbox. Accessed August 15, 2021 at: https://www.d.umn.edu/itss/training/online/usability/toolbox.html
  • 62 Maguire M. Methods to support human-centred design. Int J Hum Comput Stud 2001; 55 (04) 587-634
  • 63 Frøkjær E, Hornbæk K. Cooperative usability testing. In: Proceedings of CHI Extended Abstracts on Human Factors in Computing Systems; CHI EA '05. Portland, OR, USA: ACM; 2005: 1383-1386
  • 64 Power C, Petrie H, Mitchell R. A framework for remote user evaluation of accessibility and usability of websites. In: Proceedings of the 5th International Conference on Universal Access in Human-Computer Interaction; UAHCI '09. San Diego, CA, USA: Springer; 2009; 5614: 594-601
  • 65 Hammontree M, Weiler P, Nayak N. Remote usability testing. Interactions 1994; 1 (03) 21-25
  • 66 Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform 2004; 37 (01) 56-76
  • 67 Medlock MC, Wixon D, McGee M, Welsh D. The rapid iterative test and evaluation method. In: Cost-Justifying Usability: An Update for the Internet Age. 2nd ed. Elsevier; 2005: 489-517
  • 68 Wilson C. User Interface Inspection Methods: A User-Centered Design Method. Amsterdam, The Netherlands: Elsevier; 2013
  • 69 Laemmler R. Guerrilla Testing: Mutig und rasch zu Resultaten [Guerrilla testing: getting to results boldly and quickly]. Accessed November 2, 2020 at: https://www.testingtime.com/blog/guerrilla-testing/
  • 70 Firmenich S, Garrido A, Grigera J, Rivero JM, Rossi G. Usability improvement through A/B testing and refactoring. Softw Qual J 2018; 27 (01) 203-240
  • 71 Guaiani F, Muccini H. Crowd and laboratory testing, can they co-exist? An exploratory study. In: Proceedings of the 2nd International Workshop on CrowdSourcing in Software Engineering; CSI-SE '15. Florence, Italy: IEEE; 2015: 32-37
  • 72 Zogaj S, Bretschneider U, Leimeister JM. Managing crowdsourced software testing: a case study based insight on the challenges of a crowdsourcing intermediary. J Bus Econ 2014; 84 (03) 375-405
  • 73 Downey LL. Group usability testing: evolution in usability techniques. J Usability Stud 2007; 2 (03) 133-144
  • 74 Sears A. Heuristic walkthroughs: finding the problems without the noise. Int J Hum Comput Interact 1997; 9 (03) 213-234
  • 75 Nielsen L, Madsen S. The usability expert's fear of agility: an empirical study of global trends and emerging practices. In: Proceedings of the 7th Nordic Conference on Human-Computer Interaction; NordiCHI '12. Copenhagen, Denmark: ACM; 2012: 261-264
  • 76 Höysniemi J, Hämäläinen P, Turkki L. Using peer tutoring in evaluating the usability of a physically interactive computer game with children. Interact Comput 2003; 15 (02) 203-255
  • 77 Khalayli N, Nyhus S, Hamnes K, Terum T. Persona based rapid usability kick-off. In: Proceedings of CHI Extended Abstracts on Human Factors in Computing Systems; CHI EA '07. San Jose, CA, USA: ACM; 2007: 1771-1776
  • 78 Usability in Germany eV (UIG). Fokusgruppe [Focus group]. Accessed November 9, 2020 at: https://www.usability-in-germany.de/definition/fokusgruppe
  • 79 Rosenbaum S, Cockton G, Coyne K, Muller M, Rauch T. Focus groups in HCI. In: Proceedings of CHI Extended Abstracts on Human Factors in Computing Systems; CHI EA '02. Minneapolis, MN, USA: ACM; 2002: 702-703
  • 80 Usability in Germany eV (UIG). Fragebogen [Questionnaire]. Accessed November 9, 2020 at: https://www.usability-in-germany.de/definition/fragebogen
  • 81 Krause R. Storyboards help visualize UX ideas. Nielsen Norman Group. Accessed November 9, 2020 at: https://www.nngroup.com/articles/storyboards-visualize-ideas/
  • 82 Piyush J, Sanjay KD, Ajay R. Software usability evaluation method. Int J Adv Res Comput Eng Technol 2012; 1 (02) 28-33
  • 83 Holzinger A, Slany W. XP + UE → XU: Praktische Erfahrungen mit eXtreme Usability [XP + UE → XU: practical experiences with eXtreme Usability]. Informatik-Spektrum 2006; 29 (02) 91-97
  • 84 Borycki E, Senathirajah Y, Kushniruk AW. The future of mobile usability, workflow and safety testing. Stud Health Technol Inform 2017; 245: 15-19
  • 85 Holzinger A. Usability engineering methods for software developers. Commun ACM 2005; 48 (01) 71-74

Fig. 1 Flow chart of the literature review according to the PRISMA statement. We searched ACM Digital Library, Google Scholar, IEEE Xplore, and Medline (via PubMed) (ordered alphabetically).
Fig. 2 Model behind the iterative development of the expert-based prioritization of eHealth usability evaluation methods. Starting from the literature-based list of eHealth usability evaluation methods, the iterative refinement of the prioritized methods is visualized.
Fig. 3 Saturation of information content across the expert interviews.
Fig. 4 Iterative prioritization and refinement of eHealth usability evaluation methods, displayed for each iteration. Visualization of the methods that were altered, newly added, recommended, and not recommended in each iteration (ordered alphabetically).
Fig. 5 Final prioritization of eHealth usability evaluation methods. For easier readability, the recommended methods are ordered by the number of experts choosing them (for more details, see [Fig. 6]); the same ordering applies to the methods that were not recommended. All other methods are arranged alphabetically.
Fig. 6 eHealth usability evaluation methods by number of experts' choices. The number originates from the documented number of expert recommendations (and non-recommendations) for each method.