J Reconstr Microsurg 2023; 39(08): 589-600
DOI: 10.1055/a-2003-7689
Original Article

Evaluation of a Microsurgery Training Curriculum

Anita Cuteanu
1   Department of Arts and Sciences, University College London, Bloomsbury, London, United Kingdom
,
Agathe Hellich
1   Department of Arts and Sciences, University College London, Bloomsbury, London, United Kingdom
,
Alba Le Cardinal
1   Department of Arts and Sciences, University College London, Bloomsbury, London, United Kingdom
,
Maeve Thomas
1   Department of Arts and Sciences, University College London, Bloomsbury, London, United Kingdom
,
Anna Valchanova
1   Department of Arts and Sciences, University College London, Bloomsbury, London, United Kingdom
,
Sital Vara
3   The Griffin Institute, Northwick Park and St Mark's Hospital, Harrow, United Kingdom
,
Gwynn Horbury
3   The Griffin Institute, Northwick Park and St Mark's Hospital, Harrow, United Kingdom
,
Walaa Ghamrawi
3   The Griffin Institute, Northwick Park and St Mark's Hospital, Harrow, United Kingdom
,
Naim Slim
2   Surgical Unit, Yeovil District Hospital NHS Foundation Trust, Yeovil, Somerset, United Kingdom
,
Nader Francis
2   Surgical Unit, Yeovil District Hospital NHS Foundation Trust, Yeovil, Somerset, United Kingdom
3   The Griffin Institute, Northwick Park and St Mark's Hospital, Harrow, United Kingdom
,
On behalf of the Microsurgery Training Group at The Griffin Institute
Funding None.

Abstract

Background Microsurgery is one of the most challenging areas of surgery, with a steep learning curve. To address this educational need, microsurgery curricula have been developed and validated, the majority of which focus on technical skills only. The aim of this study was to report the evaluation of a well-established curriculum using the Kirkpatrick model.

Methods A training curriculum was delivered over 5 days between 2017 and 2020, focusing on (1) microscopic field manipulation; (2) knot tying and nondominant hand usage; (3) 3-D models/anastomosis; and (4) tissue experience. The Kirkpatrick model was applied to evaluate the curriculum at four levels: (1) participants' feedback; (2) skills development, using a validated objective assessment tool (Global Assessment Score form) with CUSUM charts constructed to model proficiency gain; and (3) and (4) skill retention and long-term impact.
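
For readers unfamiliar with CUSUM learning-curve charts, the sketch below is a minimal illustration in Python, not the authors' analysis code; it assumes a binary success/failure outcome per attempt and arbitrarily chosen acceptable and unacceptable failure rates (p0, p1).

```python
# Minimal CUSUM learning-curve sketch (illustrative only; not the study's code).
# Assumptions: one binary outcome per attempt (1 = failure, 0 = success) and
# notional acceptable/unacceptable failure rates p0 and p1.
import numpy as np

def cusum_trace(outcomes, p0=0.10, p1=0.30):
    """Return the cumulative CUSUM score after each attempt."""
    penalty = np.log(p1 / p0)             # added when an attempt fails
    credit = np.log((1 - p0) / (1 - p1))  # subtracted when an attempt succeeds
    score, trace = 0.0, []
    for failed in outcomes:
        score += penalty if failed else -credit
        trace.append(score)
    return trace

# Example: failures become rarer with successive attempts, so the curve
# rises early and then trends downward as proficiency is gained.
attempts = [1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0]
print([round(s, 2) for s in cusum_trace(attempts)])
```

In practice, proficiency is usually declared when the plotted score crosses a predefined decision boundary; the boundary values here would be assumptions rather than those used in the study.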

Results In total, 155 participants undertook the curriculum, totaling 5,425 hours of training. More than 75% of participants rated the course as excellent, with the remainder rating it as good. All participants agreed that the curriculum met their expectations and said they would recommend it. Anastomosis attainment scores improved significantly between days 1 to 3 (median score 4) and days 4 to 5 (median score 5) (W = 494.5, p = 0.00170). The frequency of errors decreased with successive attempts (chi-square = 9.81, p = 0.00174). The steepest learning curve was in the anastomosis and patency domains, which required an average of 11 attempts to reach proficiency. In total, 88.5% of survey respondents reported being able to apply the skills learnt, and 76.9% had applied them within 6 months. Key areas for improvement were identified from this evaluation, and actions to address them were implemented in subsequent programs.
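
As an illustration only, the hypothetical Python sketch below shows how between-period score comparisons and error-frequency trends of the kind reported above could be tested, here with a rank-sum test and a chi-square test from SciPy; the data are invented and the sketch does not reproduce the study's analysis.

```python
# Illustrative statistics sketch with invented data (not the study dataset).
from scipy.stats import mannwhitneyu, chi2_contingency

# Hypothetical attainment scores for days 1-3 versus days 4-5
early = [3, 4, 4, 3, 4, 4, 3, 4]
late = [5, 4, 5, 5, 4, 5, 5, 5]
stat, p = mannwhitneyu(early, late, alternative="two-sided")
print(f"rank-sum statistic = {stat:.1f}, p = {p:.4f}")

# Hypothetical error counts across three successive blocks of attempts
# (rows: with errors / without errors; columns: attempt blocks)
table = [[30, 20, 12], [70, 80, 88]]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.2f} (dof = {dof}), p = {p:.4f}")
```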

Conclusion Robust curriculum evaluation can be applied to microsurgery training, demonstrating its efficacy in reducing surgical errors and improving overall technical skills in ways that can extend to clinical practice. It also allows the identification of areas for improvement, driving the refinement of training programs.

Statement

The following study demonstrates a novel and comprehensive approach to evaluating a curriculum, which can be applied to other specialties and courses. It demonstrates the curriculum's efficacy in reducing surgical errors and improving overall technical skills in ways that can extend to clinical practice. It also allows the identification of areas for improvement, which can drive the refinement of training programs.


Microsurgery Training Group at The Griffin Institute:


Soha Sajid, Rishi Pandya, Kirstie Forbes, Janak Bechar, Phillip Brown, Sanil Ajwani, Anthony Simons, Harmony Ubhi, Julia Street, Etienne Botha, Hassan Assiri, Alastair Henry, Natalie Redgrave, R. Llewellyn Thomas, Khemanand Maharaj, James Higginson, Sahiba Singh, Zeynep Ünlüer, Josephine Xu, Joanna Miles, Stergios Doumas, Daanesh Zakai, Omar Mirza, Thomas Pepper, Farid Froghi, Sami Ramadan, Shruthi Reddy, Michael Gallagher, Robert Slade, Prabath Kumarasinghe, Kishor A Choudhari, Monica Alexandra Ramirez, Rebecca Exley, Hany Hashesh, W.H. Schreuder, Ciara Fox, JV Williams, R J Pilkington, Kishan Ubayasiri, Robert J MacFarlane, Arkoumanis Panagiotis.




Publication History

Received: 14 March 2022

Accepted: 30 November 2022

Accepted Manuscript online: 23 December 2022

Article published online: 28 February 2023

© 2023. Thieme. All rights reserved.

Thieme Medical Publishers, Inc.
333 Seventh Avenue, 18th Floor, New York, NY 10001, USA

 