CC BY-NC-ND 4.0 · Indian J Plast Surg 2019; 52(02): 216-221
DOI: 10.1055/s-0039-1695658
Original Article
Association of Plastic Surgeons of India

Objective Assessment of Microsurgery Competency—In Search of a Validated Tool

Sheeja Rajan
1   Department of Plastic Surgery, Government Medical College, Kozhikode, Kerala, India
2   MCI Regional Centre for Medical Education Technology, Kozhikode, Kerala, India
,
Ranjith Sathyan
1   Department of Plastic Surgery, Government Medical College, Kozhikode, Kerala, India
,
L. S. Sreelesh
1   Department of Plastic Surgery, Government Medical College, Kozhikode, Kerala, India
,
Anu Anto Kallerey
3   Taluk Hospital Nadapuram, Kozhikode, Kerala, India
,
Aarathy Antharjanam
4   Taluk Hospital Pazhayangadi, Kannur, Kerala, India
,
Raj Sumitha
1   Department of Plastic Surgery, Government Medical College, Kozhikode, Kerala, India
,
Jinchu Sundar
1   Department of Plastic Surgery, Government Medical College, Kozhikode, Kerala, India
,
Ronnie Johnson John
1   Department of Plastic Surgery, Government Medical College, Kozhikode, Kerala, India
,
S. Soumya
1   Department of Plastic Surgery, Government Medical College, Kozhikode, Kerala, India

Address for correspondence

Sheeja Rajan T M, MS, DLO, MCh
DNB Plastic Surgery, Department of Plastic Surgery, Government Medical College
Kozhikode, Kerala
India   

Publication History

Publication Date:
16 September 2019 (online)

 

Abstract

Microsurgical skill acquisition is an integral component of training in plastic surgery. Current microsurgical training is based on the subjective Halstedian model. An ideal microsurgery assessment tool should be able to deconstruct all the subskills of microsurgery and assess them objectively and reliably. For our study, to analyze the feasibility, reliability, and validity of microsurgery skill assessment, a video-based objective structured assessment of technical skill tool was chosen. Two blinded experts evaluated videos of six residents performing microsurgical anastomosis for arteriovenous fistula surgery, yielding 40 scored assessments. The generic Reznick's global rating scale (GRS) and the University of Western Ontario microsurgical skills acquisition/assessment (UWOMSA) instrument were used as checklists. Correlation coefficients of 0.75 to 0.80 (UWOMSA) and 0.71 to 0.77 (GRS) for intrarater and interrater reliability showed that the assessment tools were reliable. Convergent validity of the UWOMSA tool with the prevalidated GRS tool showed good agreement. The mean improvement of scores with years of residency was measured with analysis of variance. Both UWOMSA (p-value: 0.034) and GRS (p-value: 0.037) demonstrated significant improvement in scores from postgraduate year 1 (PGY1) to PGY2 and a less marked improvement from PGY2 to PGY3. We conclude that objective assessment of microsurgical skills in an actual clinical setting is feasible. Tools like UWOMSA are valid and reliable for microsurgery assessment and provide feedback to chart the progression of learning. Acceptance and validation of such objective assessments will help to improve training and bring uniformity to microsurgery education.



Introduction

Microsurgery is a routine and indispensable aspect of plastic and reconstructive surgery. It is the backbone of procedures ranging from basic hand trauma repair to recent advances such as allotransplantation of the face or hand. It is therefore not an exaggeration to consider microsurgery an essential skill to be acquired by all aspiring plastic surgeons.

Proficiency in the requisite technical skills is necessary for good surgical performance by a trainee in the operating room. Microsurgery has traditionally been taught in most centers using the Halstedian apprenticeship model,[1] with trainees first assisting their mentors and then proceeding to perform steps of the surgery under supervision. Satisfactory performance was judged by the number of surgeries assisted or performed, assessment by seniors, or simply completion of the course. Such assessments are neither reproducible nor criterion based.[2] [3] [4] [5] None of these methods can actually measure the competence or technical skills of a trainee; at best they offer the subjective general opinion of a mentor, which is not quantifiable. Therefore, it is imperative to introduce new objective assessment modalities for microsurgery with adequate description of their feasibility, reliability, and validity in the actual clinical setting.[6] For our study, we chose a video-based model of objective structured assessment of technical skills (OSATS), introduced by Martin et al.[7] OSATS is a well-recognized medical education technology tool in which learners perform structured tasks under direct observation while being evaluated with the help of a checklist. Qualitative and quantitative video analysis has been applied to resident training in multiple surgical specialties. Studies by Goldenberg and Grantcharov,[8] Mota et al,[9] and Hu et al[10] have all shown the potential benefits of video-based education in surgical training. Herrera-Almario et al,[11] however, have stated that self-assessment by video-based learning can improve surgical skills only when supplemented with surgical demonstration and feedback from mentors.

The primary aim of our study was to demonstrate the feasibility of objective assessment of microsurgical skills in residents performing arteriovenous fistula creation for renal dialysis. We also assessed the validity of two assessment tools in arteriovenous fistula microvascular anastomosis.



Methodology

This was a cross-sectional study. Six plastic surgery residents at our institution, two each from postgraduate year (PGY) 1 (n = 2), PGY2 (n = 2), and PGY3 (n = 2), constituted the study population. Study tools consisted of 10 surgical videos and two assessment scales. Ten patients undergoing arteriovenous fistula creation using a modified Cimino-Brescia technique were selected after obtaining appropriate consent from the patients and ethical clearance from the institution. Video recordings of the residents performing the surgery were made using a 13-megapixel mobile camera. The videos captured hand movements, instrument handling, and suture handling, as well as the details of the suturing technique. Patient identity was masked with false numbers, and the videos were muted during evaluation. The videos were rated by two experts who were blinded to the residents' identity and year of training. The experts were postgraduate teachers with more than 8 years of experience in microsurgery.

Evaluation of the videos was done using two assessment scales, namely a generic prevalidated global rating scale (GRS) and the procedure-specific University of Western Ontario microsurgical skills acquisition/assessment (UWOMSA) scale. In terms of medical education technology, this assessment can be considered a video-modified OSATS.[12] [13] [14] The GRS[15] ([Supplementary Material 1]; available online only) has eight behaviorally anchored criteria, viz. respect for tissues, time and motion, instrument handling, suture handling, flow of operation, knowledge of procedure, final product, and overall performance. Five-point Likert scores are assigned using the scoring chart, which has indicators for the competence required for each score.

The UWOMSA scale ([Supplementary Material 2]; available online only), developed by Temple and Ross,[16] is a dedicated tool for microsurgery assessment. It is a simple tool comprising items considered significant for ensuring vascular patency after microsurgical anastomosis. The scale has two parts, each scored on five-point Likert scales. The first part is the knot-tying module, assessing the quality of the knot, efficiency, and handling. The second part is the anastomosis module, comprising vessel preparation, suturing, and final product assessments. Under each subcompetency, three anchors are given to increase objectivity; for example, a score of 1 is assigned if a candidate forgets to perform vessel dilatation or adventitial stripping, places the needle inaccurately, or takes a back-wall stitch, whereas clean adventitial stripping, gentle technique, or a patent anastomosis with evenly placed sutures merits the maximum score of 5.
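For illustration only, the structure of a single UWOMSA rating can be sketched as below; the field names are ours and not part of the published instrument, and the six items simply mirror the two modules described above, each scored on a 1-to-5 Likert scale.

```python
# Hypothetical sketch of one UWOMSA rating; field names are illustrative.
from dataclasses import dataclass, fields

LIKERT_MIN, LIKERT_MAX = 1, 5  # five-point Likert scale


@dataclass
class UWOMSARating:
    # Knot-tying module: quality of the knot, efficiency, handling
    knot_quality: int
    efficiency: int
    handling: int
    # Anastomosis module: vessel preparation, suturing, final product
    vessel_preparation: int
    suturing: int
    final_product: int

    def __post_init__(self):
        for f in fields(self):
            value = getattr(self, f.name)
            if not LIKERT_MIN <= value <= LIKERT_MAX:
                raise ValueError(f"{f.name} must lie between {LIKERT_MIN} and {LIKERT_MAX}")

    @property
    def knot_tying_total(self) -> int:
        return self.knot_quality + self.efficiency + self.handling

    @property
    def anastomosis_total(self) -> int:
        return self.vessel_preparation + self.suturing + self.final_product


# Example: a strong performance with a patent, evenly sutured anastomosis
rating = UWOMSARating(4, 3, 4, 5, 4, 5)
print(rating.knot_tying_total, rating.anastomosis_total)  # 11 14
```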

Each assessor rated each video twice, with a mean interval of 3 days between the two assessments to reduce recall bias. A total of 40 assessments (10 videos × 2 assessors × 2 ratings) were thus obtained and evaluated.



Results and Analysis

Statistical analysis of the results was done using the SPSS package (version 16). Reliability of the task-specific UWOMSA score and its agreement with the generic GRS were tested. Since the UWOMSA is not a validated scale, we analyzed the correlation of its findings with the prevalidated GRS. This is a standard statistical exercise, but it should be kept in mind that the UWOMSA is a specific tool for the assessment of microsurgery, whereas the GRS is a generic scale that can be used in several surgical scenarios. Average measures of the intraclass correlation coefficient were obtained for the following (a computational sketch follows the list):

  • Intrarater reliability (consistency of scoring by the same assessor at different periods of time; [Table 1]).

  • Interrater reliability (agreement between the two assessors; [Table 2]).

  • Correlation coefficient between the UWOMSA and GRS ratings (agreement between the two scales; [Table 3]).
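The following is a simplified, illustrative sketch of these agreement computations; the score vectors are invented for demonstration and plain Pearson correlations stand in for the average-measures intraclass coefficients reported in the tables.

```python
# Illustrative agreement computations (invented scores, not the study data).
import numpy as np
from scipy.stats import pearsonr

# Hypothetical UWOMSA knot-tying totals for the 10 videos
rater1_pass1 = np.array([ 9, 11, 12, 10, 13, 14,  8, 12, 13, 11])
rater1_pass2 = np.array([10, 11, 12,  9, 13, 13,  8, 12, 14, 11])  # same rater, 3 days later
rater2_pass1 = np.array([ 9, 12, 11, 10, 14, 13,  9, 11, 13, 12])  # second blinded rater
grs_totals   = np.array([24, 28, 30, 26, 33, 34, 22, 29, 33, 28])  # generic GRS for the same videos

intrarater_r, _ = pearsonr(rater1_pass1, rater1_pass2)  # consistency of one rater over time
interrater_r, _ = pearsonr(rater1_pass1, rater2_pass1)  # agreement between the two raters
convergent_r, _ = pearsonr(rater1_pass1, grs_totals)    # agreement between UWOMSA and GRS
print(f"intrarater={intrarater_r:.2f} interrater={interrater_r:.2f} UWOMSA-GRS={convergent_r:.2f}")
```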

Table 1 Intrarater reliability

Tool                         Correlation coefficient   Inference
UWOMSA knot-tying module     0.75                      Good agreement
UWOMSA anastomosis module    0.53                      Moderate agreement
Global rating scale          0.71                      Good agreement

Table 2 Interrater reliability

Tool                         Correlation coefficient   Inference
UWOMSA knot-tying module     0.80                      Excellent agreement
UWOMSA anastomosis module    0.56                      Moderate agreement
Global rating scale          0.77                      Good agreement

Table 3 Convergent validity of UWOMSA with the global rating scale

Tool                         Correlation coefficient   Inference
UWOMSA knot-tying module     0.71                      Good agreement
UWOMSA anastomosis module    0.74                      Good agreement

Descriptive statistics of the mean scores and differences between UWOMSA and GRS were assessed ([Figs. 1] [2] [3]). Analysis of variance (ANOVA) was used to assess the difference in scores between PGY1, PGY2, and PGY3 of training ([Table 4]).

Fig. 1 Changes in mean scores of UWOMSA knot tying module (KTM) with years of postgraduate training.
Fig. 2 Changes in mean scores of UWOMSA anastomosis module (AM) with years of postgraduate training.
Fig. 3 Changes in mean scores of GRS with years of postgraduate training.
Table 4 Analysis of variance (ANOVA) of mean scores with years of postgraduate training

Test                         p-Value
UWOMSA knot-tying module     0.034
UWOMSA anastomosis module    0.121
Global rating scale          0.037

In broad terms, validity refers to the extent to which a scale measures what it intends to measure; in the case of the UWOMSA, it refers to the extent to which the scale measures microsurgical skill. To determine the convergent validity of the UWOMSA, each surgical video was also assessed with the generic GRS, a seminal scale that serves as a standard for assessing suturing procedures and several other surgical skills. In our study, an intraclass correlation coefficient was used to assess the correlation between the UWOMSA and GRS scales.

Reliability Assessment

Reliability describes the ability of a test to produce the same results on repeated trials.[17] Agreement between the two raters (interrater reliability) and within a single rater at two different points of time (intrarater reliability) were both measured by Pearson's intraclass correlation coefficient. A correlation coefficient of 0.61 to 0.80 was considered good agreement, and values between 0.80 and 1.00 were considered excellent agreement.[18]
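The interpretation bands applied in this study can be expressed as a small helper; a minimal sketch is shown below, with the boundary value of 0.80 treated as excellent to match [Table 2], and coefficients below 0.61 reported as moderate agreement as in [Tables 1] and [2].

```python
def agreement_category(coefficient):
    """Map a correlation coefficient to the agreement bands used in this study."""
    if coefficient >= 0.80:
        return "excellent agreement"   # 0.80-1.00
    if coefficient >= 0.61:
        return "good agreement"        # 0.61-0.80
    return "moderate agreement"        # below 0.61, e.g. 0.53 and 0.56

for r in (0.80, 0.77, 0.75, 0.71, 0.56, 0.53):
    print(r, agreement_category(r))
```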

The intrarater reliability ([Table 1]) of the GRS showed good agreement, with an intraclass Pearson's correlation coefficient of 0.71. The intrarater reliability of the UWOMSA showed good agreement, with an intraclass correlation coefficient of 0.75 for the knot-tying module, but only moderate agreement of 0.53 for the anastomosis module. The interrater reliability ([Table 2]) of the GRS was in good agreement, with an intraclass correlation coefficient of 0.77. The interrater reliability of the UWOMSA showed excellent agreement, with an intraclass correlation coefficient of 0.80 for the knot-tying module, and moderate agreement of 0.56 for the anastomosis module. These results show that the UWOMSA is a reliable tool for the assessment of microsurgical skills, although in our study reliability was higher for knot-tying skills than for anastomotic skills.



Validity Assessment

As per the broad definition, validity refers to the extent to which an assessment tool measures what it is intended to measure.[17] [18] For the UWOMSA, it refers to whether the scale does indeed measure microsurgical skills adequately. This requires determining the criterion validity of the UWOMSA in two steps. First, each video was rated with Reznick's[3] GRS, which is a prevalidated, generic scale and an accepted standard for performance assessment in several surgical procedures. Correlation between the GRS and UWOMSA scales was then determined using Pearson's correlation coefficient. Good agreement between the generic GRS and the specific UWOMSA was obtained for both the knot-tying and anastomosis modules, with correlation coefficients of 0.71 and 0.74, respectively ([Table 3]). This shows that the UWOMSA is a valid tool for microsurgery skill assessment.



Assessment of Progression of Training

ANOVA was used to assess the difference between residents' scores across their years of training; a p-value of less than 0.05 was considered significant. The p-values obtained were 0.034 for the UWOMSA knot-tying module and 0.037 for the GRS, both statistically significant ([Table 4]), whereas the p-value for the UWOMSA anastomosis module was 0.121. Mean scores of the UWOMSA ([Figs. 1] [2]) and GRS ([Fig. 3]) showed a significant improvement from PGY1 to PGY2 and a less appreciable change from PGY2 to PGY3. We were thus able to chart the progression of our postgraduate trainees and objectively measure the increment in scores across the years of training in all our residents.
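A minimal sketch of this comparison is given below; the group scores are invented for illustration, and scipy's one-way ANOVA stands in for the analysis summarized in [Table 4].

```python
# Illustrative one-way ANOVA across years of training (invented scores).
from scipy.stats import f_oneway

pgy1 = [8, 9, 9, 10, 8, 9]       # hypothetical UWOMSA knot-tying totals, PGY1
pgy2 = [12, 11, 13, 12, 12, 13]  # PGY2
pgy3 = [13, 12, 14, 13, 13, 14]  # PGY3

f_stat, p_value = f_oneway(pgy1, pgy2, pgy3)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # p < 0.05 indicates a significant difference
```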



Discussion

Microvascular surgery involves the repair of very small caliber blood vessels of 1 to 4 mm diameter or less.[19] It involves a "unique set of surgical principles" and can be considered technically more demanding than routine surgical practice.[2] Microsurgery demands finer hand movements and precise dexterity in combination with stereoscopic vision and visuospatial skills.[2] [20] [21] Needless to say, microsurgery has a steep learning curve, and successful outcome correlates with surgical experience. This exemplifies the educational theory of Dreyfus,[22] [23] in which skill acquisition occurs in a stepwise manner from novice to expert level. Though it is commonly acknowledged that the hallmark of a good surgeon is a combination of cognitive ability and dexterity, recent research has shown that there is no correlation between the two.[2] [24] It is therefore imperative that all plastic surgery teaching units have a curriculum and objective assessment methods aligned for evaluation of both surgical skills and theoretical knowledge.

Ramachandran et al[2] and Ghanem et al[25] have critically analyzed the literature on objective assessment tools for microsurgical competency. Standardized microsurgical tests were pioneered by Starkes et al[26] on a low-fidelity bench model. Grober et al[12] [13] and Hong et al[27] have experimented with sophisticated objective assessment techniques for stereoscopic visual acuity and hand motion analysis. More recently, the focus has shifted to workplace-based assessments, wherein the tools have to be adapted for real-time assessment within the operating theater.

Kalu et al[28] have categorized subjective and objective types of assessment for microsurgical skills. An ideal tool for assessing microsurgical skills should be "based on objective structured criteria, inexpensive, and acceptable to stakeholders."[2] [29] It must deconstruct the skills of microsurgery, namely:

  • Dexterity (steadiness of hands, flow of movements, finesse of surgery).

  • Visuospatial ability (e.g., tissue dissection, instrument handling, suture placement, knot tying).

  • Operative flow (e.g., steps, movements, speed).

  • Judgment (e.g., timely irrigation of the field, patency tests, control of bleeding in the field).

In 2009, Chan et al[30] developed the structured assessment of microsurgical skills (SAMS) in a clinical setting. More recently, in 2011, Temple and Ross[16] developed the UWOMSA tool as a bench training model on chicken leg artery, but it is finding application in microsurgery assessment during real surgical scenarios as well. The scale has two separate modules for knot tying and anastomosis, with behavioral anchors for the items considered important for successful microvascular anastomosis. Both SAMS and UWOMSA are good tools that combine three forms of objective assessment: a rating scale, an error list, and a summative rating. However, there is no consensus on a standard scale to be used in microsurgery practice today. Moreover, high scores on objective assessment ensure proficiency in the technical skill of the learner but may not by themselves correlate with professional success in microsurgery.

So why do we need objective assessment of microsurgical skills? Grober et al[12] [13] have stated that operating theaters may no longer provide the ideal atmosphere to foster the skills of a new, inexperienced surgeon, owing to ethical issues as well as time and resource constraints. Though multihour microsurgery courses do improve microsurgical skills, the participants are only given a "certificate of completion of course" rather than a "certificate of competence."[2] Furthermore, the longevity of the skills acquired and their translation to the clinical setting will depend upon the time lapse between the course and clinical performance. In our study, we considered a score of 3 or more in each item on the Likert scale as the level of attaining competence.[26] Apart from certification, evidence shows that objective assessment of microsurgery helps to assess the degree of improvement of trainees, mark career progression, and ensure exacting standards of training.[31] [32] [33]
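A minimal sketch of this competence criterion, assuming the item scores for a single assessment are available as a list, is:

```python
# Hedged sketch of the competence cutoff described above: every Likert item
# must score 3 or more for the performance to be considered competent.
def is_competent(item_scores):
    return all(score >= 3 for score in item_scores)

print(is_competent([4, 3, 5, 3, 4, 5]))  # True
print(is_competent([4, 2, 5, 3, 4, 5]))  # False: one item falls below 3
```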

Atkins et al[34] listed several sophisticated assessment methods, such as hand motion analysis and physiologic study of the anastomosed vessel, but these are expensive techniques that cannot be transferred to an actual clinical setting in a teaching unit. In our study, we attempted objective assessment of our residents using the simple UWOMSA tool during surgery. The feasibility, reliability, and validity of such a tool had to be assessed first. We found that such an assessment is feasible even in a setting with basic infrastructure. Good agreement with the prevalidated GRS, an accepted tool in a variety of surgical settings, showed that the UWOMSA is a valid tool to assess microsurgery skills. Reliability of the UWOMSA was assessed by two blinded microsurgery experts, repeatedly at different points of time, with the GRS as control. Good and excellent agreement in intrarater and interrater scores showed that the UWOMSA, especially its knot-tying module, was reliable for microsurgery assessment. There was only moderate agreement in scoring of the anastomosis module, which warrants further study. We feel that modification of the anastomosis module by adding more subcompetencies would help to improve the reliability of the UWOMSA. The UWOMSA also lacks "judgment" as a descriptive parameter compared with the SAMS tool.

We were also able to chart the progression of our postgraduate trainees over the years of training and measure a marked improvement in scores between the first and second years. This probably indicates that basic microsurgical skill acquisition happens during this period, with refinement of the skill occurring later in training. Selber et al[35] applied the SAMS tool and were able to demonstrate a similar improvement in the performance of their microsurgery fellows.

Kaufman et al[36] stated that mastery of a psychomotor skill passes through cognitive, associative, and autonomous phases. Acquisition of such a skill demands that the learner is periodically provided with the "knowledge of results," or feedback, so that they can improve their performance.[37] Our study shows that assessment with the UWOMSA can fulfil such a formative role, whereby the quantitative measurement of microsurgical skills and the degree of improvement of our residents can be documented and used for appropriate feedback on the progress of learning. We believe that the acceptance and validation of such an assessment in all teaching units will be the first step towards global standardization of training and the development of a uniform curriculum for microsurgery education.[38] [39] [40] This study is even more relevant considering the changing trends of medical education in India. The new Graduate Medical Regulations 2019 for the MBBS course are also built on "competency based medical education," which requires certification of predefined skills of the student at each level of training.



Conclusion

On the basis of our results, we conclude that the objective assessment of microsurgical skills in an actual clinical setting is feasible. The UWOMSA is a valid, reliable, and feasible tool for assessment of microsurgical performance. Formative assessments with such a standardized tool will be useful to chart progression of learning and become a valuable source of feedback to learners. Acceptance and wider adoption of such objective assessment tools will help to improve resident education and bring uniformity to microsurgery training across centers.



Conflict of Interest

None declared.

Supplementary Material

References

  • 1 Barnes RW, Lang NP, Whiteside MF. Halstedian technique revisited. Innovations in teaching surgical skills. Ann Surg 1989; 210 (01) 118-121
  • 2 Ramachandran S, Ghanem AM, Myers SR. Assessment of microsurgery competency-where are we now?. Microsurgery 2013; 33 (05) 406-415
  • 3 Reznick RK, MacRae H. Teaching surgical skills--changes in the wind. N Engl J Med 2006; 355 (25) 2664-2669
  • 4 Doyle JD, Webber EM, Sidhu RS. A universal global rating scale for the evaluation of technical skills in the operating room. Am J Surg 2007; 193 (05) 551-555 , discussion 555
  • 5 Saleh GM, Voyatzis G, Hance J, Ratnasothy J, Darzi A. Evaluating surgical dexterity during corneal suturing. Arch Ophthalmol 2006; 124 (09) 1263-1266
  • 6 Balasundaram I, Aggarwal R, Darzi LA. Development of a training curriculum for microsurgery. Br J Oral Maxillofac Surg 2010; 48 (08) 598-606
  • 7 Martin JA, Regehr G, Reznick R. et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg 1997; 84 (02) 273-278
  • 8 Goldenberg MG, Grantcharov TP. Video-analysis for the assessment of practical skill. Tijdschrift voor Urologie 2016; 6 (08) 128-136
  • 9 Mota P, Carvalho N, Carvalho-Dias E, João Costa M, Correia-Pinto J, Lima E. Video-based surgical learning: improving trainee education and preparation for surgery. J Surg Educ 2018; 75 (03) 828-835
  • 10 Hu YY, Mazer LM, Yule SJ. et al. Complementing operating room teaching with video-based coaching. JAMA Surg 2017; 152 (04) 318-325
  • 11 Herrera-Almario GE, Kirk K, Guerrero VT, Jeong K, Kim S, Hamad GG. The effect of video review of resident laparoscopic surgical skills measured by self- and external assessment. Am J Surg 2016; 211 (02) 315-320
  • 12 Grober ED, Hamstra SJ, Wanzel KR. et al. Laboratory based training in urological microsurgery with bench model simulators: a randomized controlled trial evaluating the durability of technical skill. J Urol 2004; 172 (01) 378-381
  • 13 Grober ED, Hamstra SJ, Wanzel KR. et al. Validation of novel and objective measures of microsurgical skill: hand-motion analysis and stereoscopic visual acuity. Microsurgery 2003; 23 (04) 317-322
  • 14 Ezra DG, Aggarwal R, Michaelides M. et al. Skills acquisition and assessment after a microsurgical skills course for ophthalmology residents. Ophthalmology 2009; 116 (02) 257-262
  • 15 Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative “bench station” examination. Am J Surg 1997; 173 (03) 226-230
  • 16 Temple CL, Ross DC. A new, validated instrument to evaluate competency in microsurgery: the University of Western Ontario Microsurgical Skills Acquisition/Assessment instrument [outcomes article]. Plast Reconstr Surg 2011; 127 (01) 215-222
  • 17 Carmines ED, Zeller RA. eds. Reliability and Validity Assessment (Quantitative Applications in the Social Sciences). Thousand Oaks, CA: Sage Publications; 1979
  • 18 Byrt T. How good is that agreement?. Epidemiology 1996; 7 (05) 561
  • 19 Lee S, Frank DH, Choi SY. Historical review of small and microvascular vessel surgery. Ann Plast Surg 1983; 11 (01) 53-62
  • 20 Wanzel KR, Hamstra SJ, Caminiti MF, Anastakis DJ, Grober ED, Reznick RK. Visual-spatial ability correlates with efficiency of hand motion and successful surgical performance. Surgery 2003; 134 (05) 750-757
  • 21 Murdoch JR, Bainbridge LC, Fisher SG, Webster MH. Can a simple test of visual-motor skill predict the performance of microsurgeons?. J R Coll Surg Edinb 1994; 39 (03) 150-152
  • 22 Dreyfus SE. The five-stage model of adult skill acquisition. Bull Sci Technol Soc 2004; 24 (03) 177-181
  • 23 Dreyfus HL, Dreyfus SE, Athanasiou T. Mind Over Machine: The Power of Human Intuition and Expertise in the Era of the Computer. New York, NY: The Free Press; 1986
  • 24 Scott DJ, Valentine RJ, Bergen PC. et al. Evaluating surgical competency with the American Board of Surgery In-Training Examination, skill testing, and intraoperative assessment. Surgery 2000; 128 (04) 613-622
  • 25 Ghanem AM, Hachach-Haram N, Leung CC, Myers SR. A systematic review of evidence for education and training interventions in microsurgery. Arch Plast Surg 2013; 40 (04) 312-319
  • 26 Starkes JL, Payk I, Hodges NJ. Developing a standardized test for the assessment of suturing skill in novice microsurgeons. Microsurgery 1998; 18 (01) 19-22
  • 27 Hong JW, Kim YS, Lee WJ, Hong HJ, Roh TS, Song SY. Evaluation of the efficacy of microsurgical practice through time factor added protocol: microsurgical training using nonvital material. J Craniofac Surg 2010; 21 (03) 876-881
  • 28 Kalu PU, Atkins J, Baker D, Green CJ, Butler PE. How do we assess microsurgical skill?. Microsurgery 2005; 25 (01) 25-29
  • 29 Aggarwal R, Grantcharov T, Moorthy K, Milland T, Darzi A. Toward feasible, valid, and reliable video-based assessments of technical surgical skills in the operating room. Ann Surg 2008; 247 (02) 372-379
  • 30 Chan W, Niranjan N, Ramakrishnan V. Structured assessment of microsurgery skills in the clinical setting. J Plast Reconstr Aesthet Surg 2010; 63 (08) 1329-1334
  • 31 Dumestre D, Yeung JK, Temple-Oberle C. Evidence-based microsurgical skills acquisition series part 2: validated assessment instruments--a systematic review. J Surg Educ 2015; 72 (01) 80-89
  • 32 van Hove PD, Tuijthof GJ, Verdaasdonk EG, Stassen LP, Dankelman J. Objective assessment of technical surgical skills. Br J Surg 2010; 97 (07) 972-987
  • 33 Insel A, Carofino B, Leger R, Arciero R, Mazzocca AD. The development of an objective model to assess arthroscopic performance. J Bone Joint Surg Am 2009; 91 (09) 2287-2295
  • 34 Atkins JL, Kalu PU, Lannon DA, Green CJ, Butler PE. Training in microsurgical skills: does course-based learning deliver?. Microsurgery 2005; 25 (06) 481-485
  • 35 Selber JC, Chang EI, Liu J. et al. Tracking the learning curve in microsurgical skill acquisition. Plast Reconstr Surg 2012; 130 (04) 550e-557e
  • 36 Kaufman HH, Wiegand RL, Tunick RH. Teaching surgeons to operate--principles of psychomotor skills training. Acta Neurochir (Wien) 1987; 87 (01) (02) 1-7
  • 37 Pandey VA, Wolfe JH, Black SA, Cairols M, Liapis CD, Bergqvist D. European Board of Vascular Surgery. Self-assessment of technical skill in surgery: the need for expert feedback. Ann R Coll Surg Engl 2008; 90 (04) 286-290
  • 38 Leung CC, Ghanem AM, Tos P, Ionac M, Froschauer S, Myers SR. Towards a global understanding and standardisation of education and training in microsurgery. Arch Plast Surg 2013; 40 (04) 304-311
  • 39 Satterwhite T, Son J, Carey J. et al. Microsurgery education in residency training: validating an online curriculum. Ann Plast Surg 2012; 68 (04) 410-414
  • 40 Tolba RH, Czigány Z, Osorio Lujan S. et al. Defining standards in experimental microsurgical training: recommendations of the European Society for Surgical Research (ESSR) and the International Society for Experimental Microsurgery (ISEM). Eur Surg Res 2017; 58 (05) (06) 246-262
