Open Access
CC BY-NC-ND 4.0 · Endosc Int Open 2024; 12(12): E1465-E1475
DOI: 10.1055/a-2465-7283
Review

Validity evidence for endoscopic ultrasound competency assessment tools: Systematic review

Authors

  • Alessandra Ceccacci

    1   Department of Medicine, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
  • Harneet Hothi

    2   Temerty Faculty of Medicine, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
  • Rishad Khan

    3   Division of Gastroenterology and Hepatology, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
  • Nikko Gimpaya

    4   Scarborough Health Network Research Institute, Scarborough Health Network, Scarborough, Canada (Ringgold ID: RIN507265)
  • Brian P.H. Chan

    5   Division of Gastroenterology, Scarborough Health Network, Scarborough, Canada (Ringgold ID: RIN507265)
    1   Department of Medicine, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
    3   Division of Gastroenterology and Hepatology, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
  • Nauzer Forbes

    6   Division of Gastroenterology and Hepatology, University of Calgary, Calgary, Canada (Ringgold ID: RIN2129)
  • Paul James

    7   Division of Gastroenterology, University Health Network, Toronto, Canada (Ringgold ID: RIN7989)
    1   Department of Medicine, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
    3   Division of Gastroenterology and Hepatology, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
  • Daniel Jeffry Low

    5   Division of Gastroenterology, Scarborough Health Network, Scarborough, Canada (Ringgold ID: RIN507265)
    1   Department of Medicine, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
    3   Division of Gastroenterology and Hepatology, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
  • Jeffrey Mosko

    8   Division of Gastroenterology, St Michael's Hospital, Toronto, Canada (Ringgold ID: RIN10071)
    1   Department of Medicine, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
    3   Division of Gastroenterology and Hepatology, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
    9   Li Ka Shing Knowledge Institute, Unity Health Toronto, Toronto, Canada (Ringgold ID: RIN508783)
  • Elaine T. Yeung

    5   Division of Gastroenterology, Scarborough Health Network, Scarborough, Canada (Ringgold ID: RIN507265)
    1   Department of Medicine, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
    3   Division of Gastroenterology and Hepatology, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
  • Catharine M. Walsh

    10   Division of Gastroenterology, Hepatology, and Nutrition, and the Research and Learning Institutes, The Hospital for Sick Children, Toronto, Canada (Ringgold ID: RIN7979)
    11   Department of Pediatrics and the Wilson Centre, University of Toronto Temerty Faculty of Medicine, Toronto, Canada (Ringgold ID: RIN12366)
  • Samir C. Grover

    1   Department of Medicine, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
    5   Division of Gastroenterology, Scarborough Health Network, Scarborough, Canada (Ringgold ID: RIN507265)
    3   Division of Gastroenterology and Hepatology, University of Toronto, Toronto, Canada (Ringgold ID: RIN7938)
    4   Scarborough Health Network Research Institute, Scarborough Health Network, Scarborough, Canada (Ringgold ID: RIN507265)

Abstract

Background and study aims Competent endoscopic ultrasound (EUS) performance requires a combination of technical, cognitive, and non-technical skills. Direct observation assessment tools can be employed to enhance learning and ascertain clinical competence; however, there is a need to systematically evaluate validity evidence supporting their use. We aimed to evaluate the validity evidence of competency assessment tools for EUS and examine their educational utility.

Methods We systematically searched five databases and gray literature for studies investigating EUS competency assessment tools from inception to May 2023. Data on validity evidence across five domains (content, response process, internal structure, relations to other variables, and consequences) were extracted and graded (maximum score 15). We evaluated educational utility using the Accreditation Council for Graduate Medical Education framework and methodological quality using the Medical Education Research Study Quality Instrument (MERSQI).

Results From 2081 records, we identified five EUS assessment tools from 10 studies. All tools are formative assessments intended to guide learning, with four employed in clinical settings. Validity evidence scores ranged from 3 to 12. The EUS and ERCP Skills Assessment Tool (TEESAT), Global Assessment of Performance and Skills in EUS (GAPS-EUS), and the EUS Assessment Tool (EUSAT) had the strongest validity evidence with scores of 12, 10, and 10, respectively. Overall educational utility was high given ease of tool use. MERSQI scores ranged from 9.5 to 12 (maximum score 13.5).

Conclusions The TEESAT, GAPS-EUS, and EUSAT demonstrate strong validity evidence for formative assessment of EUS and are easily implemented in educational settings to monitor progress and support learning.



Publication History

Submitted: 19 October 2024

Accepted: 05 November 2024

Accepted Manuscript online: 11 November 2024

Article published online: 17 December 2024

© 2024. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed, or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/).

Georg Thieme Verlag KG
Oswald-Hesse-Straße 50, 70469 Stuttgart, Germany