CC BY-NC-ND 4.0 · Journal of Academic Ophthalmology 2017; 09(01): e21-e25
DOI: 10.1055/s-0037-1607238
Research Article
Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

Cataract Video Coaching: Surgical Curriculum Enhancement in a U.S. Residency Program

Steven H. Tucker, Jeremy K. Jones, Maria M. Aaron, Yousuf M. Khalifa

Department of Ophthalmology, Emory Eye Center, Emory University, Atlanta, Georgia

Address for correspondence

Yousuf M. Khalifa, MD
Department of Ophthalmology, Emory Eye Center
Emory University School of Medicine, Grady Memorial Hospital, 1365B Clifton Road NE, Suite BT401A, Atlanta, GA 30322

Publication History

16 November 2016

24 July 2017

Publication Date:
10 October 2017 (online)

 

Abstract

The aim of this study was to examine the perceived utility of a video-coaching curriculum in cataract surgery training. The study took place in a conference room at the Emory University School of Medicine and was designed as an evaluation study using questionnaires completed after each resident's presentation. A curriculum was developed in which a resident presented surgical cases to a group of students, residents, and faculty. All participants completed a survey focused on video coaching, performance, and an Objective Structured Assessment of Technical Skill (OSATS) evaluation. Thirteen presenting residents, 99 observing residents, and 35 faculty provided responses for 12 video-coaching sessions. The average OSATS score was lower for presenting residents (3.32) than for observing residents (4.14) and faculty (4.20) (p < 0.01). All 13 presenting residents, all 99 observing residents, and all 35 faculty found benefit in video coaching, with the subcategories of avoiding errors and overall performance rated as the most beneficial. All presenting and observing residents felt comfortable presenting cases, with none preferring an alternative setting. A formal surgical video-coaching curriculum in ophthalmology is a useful adjunct to traditional surgical curricula. There was consensus that the curriculum was beneficial for cataract surgery preparation, all participants were comfortable taking part, and none preferred an alternative curriculum.


Introduction

Medical and surgical training has long followed an apprenticeship model with the historical teaching of “See one, do one, teach one.”[1] [2] As medicine has advanced, both technologically and in the ability to measure patient outcomes, there has been a push toward competency-based assessment. Starting in 1998, the Accreditation Council for Graduate Medical Education (ACGME) developed six areas of competency that residents must meet: patient care, medical knowledge, practice-based learning, interpersonal and communication skills, professionalism, and systems-based practice. The ACGME has also recently instituted the CLER (Clinical Learning Environment Review) Pathways to Excellence. Rather than simply counting cases, programs are now required to monitor training outcomes, and expectations on topics such as patient safety, health care quality, and supervision have increased.[3] [4] [5] [6]

More than 3 million cataract surgeries are performed in the United States annually,[7] and cataract surgery is the intraocular surgery most commonly performed by residents during residency training. Currently, the minimum number of cataract surgeries required by the ACGME is 86, and most programs easily exceed this number.[8] After the ACGME mandate, many studies evaluated resident outcomes, methods of teaching cataract surgery, and the cost of resident educational modalities.[3] [9] [10] [11] [12] [13] Phacoemulsification cataract surgery has a demonstrable learning curve, with complications decreasing and efficiency improving as experience grows.[14] Newer training and learning methodologies include wet laboratory curricula,[9] [12] virtual reality training,[15] [16] surgical simulators such as EyeSi, and tools to evaluate progression,[3] [10] [11] all with the goal of having surgeons master the learning curve earlier with fewer complications. When discussing these new modalities, cost has become an increasingly important consideration to ensure feasible implementation, sustainability, and a positive net benefit.[13]

One area of training that many other surgical fields, including general and orthopedic surgery, have adopted is personalized video-based feedback to improve surgical performance.[17] [18] Reflective surgical practice has long been effective in enhancing education and has been used at conferences, presentations, and research forums.[19] Furthermore, video review often allows root cause analysis of the habits or mistakes that lead to system errors.[20] In a recent general surgery study, comprehensive surgical coaching enhanced surgical training and led to superior skill acquisition compared with conventional training.[21]

Learning through simulators, operating room experience, and other methods may be enhanced by a formal curriculum, including objective assessment of actual surgical cases, debriefing, feedback from multiple levels of surgeons, and self-reflection. The aim of this study was to develop, implement, and evaluate a video-coaching curriculum for cataract surgery in a U.S. residency program.

Methods

A group of experienced faculty members met to develop a 6-month pilot video-coaching curriculum for the Ophthalmology Residency Program at Emory University in Atlanta, Georgia.

The developed curriculum included video conferences every 2 weeks, running from January 2016 through June 2016. At each session, one or two senior residents presented a video of a surgical case he or she had performed with a faculty surgeon, demonstrating a specific topic. The cases were chosen by the resident in consultation with the faculty surgeon. Topics focused on either an intraoperative complication or a complex case requiring advanced techniques. The sessions were 1 hour long and conducted in an open format with the video played at one to two times real-time speed. The video was projected on a 10 × 8 ft screen so that all participants could view the surgical footage. The resident and faculty physician went through the case step by step with frequent pauses for discussion. Attendees included faculty physicians and individuals at all levels of training, such as resident physicians (PGY2–4) and medical students. The faculty mentor ensured that all important aspects of the case were reviewed.

In addition to the resident physician describing his or her thought process on certain steps, the faculty physician provided tips and pitfalls to avoid. Questions were frequently posed to the residents and students in the audience, with input from all faculty physicians present. The environment was designed to be collaborative, nonjudgmental, and educational.

Each operating resident, attending faculty surgeon, and observing resident filled out a post-session questionnaire ([supplementary Figs. S1] and [S2]). The questionnaires included questions on video coaching, the trainee's performance, the mentor's style, reasons for stalls, and an Objective Structured Assessment of Technical Skill (OSATS) evaluation of the surgery. Likert scales provided quantitative answers for the majority of the questions, with a few questions permitting qualitative feedback. The questionnaire responses were not made available to the presenting resident, the audience, or the attending physicians, and the results had no bearing on judging the resident's surgical proficiency. This was made clear to all in attendance at the beginning of each session.
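For illustration only, the sketch below shows one way a single post-session response could be encoded for analysis. It is a hypothetical model assuming 5-point Likert items for the six OSATS categories listed in Table 1 and the 4-point overall performance scale of Table 3; the actual item wording appears in supplementary Figs. S1 and S2 and is not reproduced here.

```python
from dataclasses import dataclass, field
from statistics import mean

# Six OSATS categories, each assumed to be rated on a 1-5 Likert scale (see Table 1).
OSATS_CATEGORIES = [
    "respect_for_tissue",
    "time_and_motion",
    "instrument_knowledge_and_handling",
    "flow_of_operation",
    "use_of_assistants",
    "knowledge_of_specific_procedure",
]

@dataclass
class SessionResponse:
    role: str                   # "presenting resident", "observing resident", or "faculty"
    osats: dict                 # category name -> 1-5 Likert rating
    overall_performance: int    # 1 = excellent ... 4 = unsatisfactory (Table 3 scale)
    perceived_benefits: set = field(default_factory=set)  # e.g., {"avoiding errors"}

    def osats_average(self) -> float:
        """Unweighted mean across the six OSATS categories."""
        return mean(self.osats[c] for c in OSATS_CATEGORIES)

# Hypothetical example response
resp = SessionResponse(
    role="observing resident",
    osats={c: 4 for c in OSATS_CATEGORIES},
    overall_performance=1,
    perceived_benefits={"avoiding errors", "overall performance"},
)
print(resp.osats_average())  # mean of the six category ratings (4 for this example)
```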

The questionnaire response data were entered into an Excel spreadsheet, and statistical analyses compared observing resident, presenting resident, and faculty attendee questionnaires. Analyses included means and standard deviations, correlation coefficients, and Student's t-tests.
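A minimal sketch of the group comparison described above is shown below, assuming an unpaired two-sample Student's t-test on per-respondent average OSATS scores; the arrays are hypothetical placeholders rather than the study data, and availability of numpy and scipy is assumed.

```python
import numpy as np
from scipy import stats

# Hypothetical per-respondent average OSATS scores (1-5 scale), one value per questionnaire.
# The actual analysis compared 13 presenting-resident, 99 observing-resident, and 35 faculty responses.
presenting = np.array([3.2, 3.5, 3.0, 3.3, 3.6, 3.1, 3.4, 3.2, 3.5, 3.3, 3.0, 3.6, 3.4])
observing = np.random.default_rng(0).normal(loc=4.14, scale=0.4, size=99).clip(1, 5)

def compare_groups(a: np.ndarray, b: np.ndarray) -> dict:
    """Means, standard deviations, and a two-sample Student's t-test for two groups."""
    t, p = stats.ttest_ind(a, b)  # classic Student's t-test (equal variances assumed)
    return {
        "mean_a": a.mean(), "sd_a": a.std(ddof=1),
        "mean_b": b.mean(), "sd_b": b.std(ddof=1),
        "t": t, "p": p,
    }

print(compare_groups(presenting, observing))
```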



Results

Through the 6-month pilot study, 6 senior resident physicians presented a total of 24 cases during 12 sessions. The 12 sessions yielded 13 presenting resident questionnaire responses, 99 observing resident responses, 2 medical student responses, and 35 faculty responses. Four faculty members participated, with at least one present at each session to provide faculty responses.

[Table 1] presents the OSATS scores, including scores for each individual category and the overall average score for each participating group. Presenting residents had a lower average score of 3.32 compared with both observing residents (4.14, p < 0.001) and faculty physicians (4.20, p < 0.001). There was no significant difference between observing resident and faculty scores.

Table 1

Post-session Objective Structured Assessment of Technical Skill (OSATS) scores

Category                                Presenting residents   Observing residents   Faculty
Respect for tissue                      3.31                   4.09                  4.19
Time and motion                         3.00                   4.04                  4.09
Knowledge and handling of instrument    3.54                   4.26                  4.26
Flow of operation                       3.23                   4.08                  4.13
Use of assistants                       3.54                   4.18                  4.28
Knowledge of specific procedure         3.31                   4.18                  4.27
Average score                           3.32                   4.14                  4.20

Notes: p-Values are <0.001 for presenting residents versus observing residents and presenting residents versus faculty. p-Value is 0.18 and is not statistically significant for observing residents versus faculty.
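As a consistency note (an observation, not a method stated by the authors), the "Average score" column in Table 1 matches the unweighted mean of the six category scores:

```python
from statistics import mean

# Category scores from Table 1, in row order: respect for tissue, time and motion,
# knowledge and handling of instrument, flow of operation, use of assistants,
# knowledge of specific procedure.
table1 = {
    "Presenting residents": [3.31, 3.00, 3.54, 3.23, 3.54, 3.31],  # reported average 3.32
    "Observing residents":  [4.09, 4.04, 4.26, 4.08, 4.18, 4.18],  # reported average 4.14
    "Faculty":              [4.19, 4.09, 4.26, 4.13, 4.28, 4.27],  # reported average 4.20
}

for group, scores in table1.items():
    print(group, round(mean(scores), 2))  # prints 3.32, 4.14, and 4.2
```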


The perceived benefits of the video-coaching curriculum are shown in [Table 2]: 100% of residents and faculty found the curriculum beneficial. The majority of participants in each group found benefit in recall of technical cues (11 of 13 presenting residents, 76 of 99 observing residents, 29 of 35 faculty), avoiding errors (12 of 13 presenting residents, 81 of 99 observing residents, 31 of 35 faculty), flow of performance (9 of 13 presenting residents, 50 of 99 observing residents, 27 of 35 faculty), recall of procedural steps (8 of 13 presenting residents, 56 of 99 observing residents, 26 of 35 faculty), and overall performance (12 of 13 presenting residents, 66 of 99 observing residents, 31 of 35 faculty). The majority of presenting residents and faculty found benefit in time of task (7 of 13 and 28 of 35, respectively), whereas only a minority of observing residents (36 of 99) perceived this as a benefit.

Table 2

Benefits of video coaching

Benefit                      Presenting residents (13)   Observing residents (99)   Faculty (35)
Overall benefit              13 (100%)                   99 (100%)                  35 (100%)
Recall of technical cues     11 (85%)                    76 (77%)                   29 (83%)
Time of task                 7 (53%)                     36 (36%)                   28 (80%)
Avoiding errors              12 (92%)                    81 (82%)                   31 (89%)
Flow of performance          9 (69%)                     50 (51%)                   27 (77%)
Recall of procedural steps   8 (62%)                     56 (57%)                   26 (74%)
Overall performance          12 (92%)                    66 (67%)                   31 (89%)

[Table 3] presents the participants' overall performance ratings and the resulting statistical analysis. Observing residents and faculty had similar average scores (1.7 and 1.6, respectively), with the majority choosing excellent (54 of 99 and 20 of 35, respectively), while presenting residents overwhelmingly chose satisfactory (8 of 13), yielding a significantly less favorable average score of 2.5 (p < 0.001).

Table 3

Post-session overall performance

Rating                  Presenting residents (13)   Observing residents (99)   Faculty (35)
Excellent (1)           2 (15%)                     54 (55%)                   20 (57%)
Average (2)             2 (15%)                     22 (22%)                   12 (34%)
Satisfactory (3)        8 (62%)                     22 (22%)                   0 (0%)
Unsatisfactory (4)      1 (8%)                      1 (1%)                     3 (8.6%)
Average 4-point score   2.5                         1.7                        1.6

Notes: p-Values are <0.001 for presenting residents versus observing residents and presenting residents versus faculty. p-Value is 0.28 and is not statistically significant for observing residents versus faculty.


[Table 4] shows the comfort level of the various groups with residents performing different steps of cataract surgery autonomously: 100% of presenting residents (13 of 13) felt comfortable with each individual step of the surgical procedure except nucleus disassembly, where 85% (11 of 13) noted comfort. Attending physicians had varying comfort levels: 86% (30 of 35) expressed confidence for nucleus disassembly, quadrant removal, and cortical clean-up, 77% (27 of 35) for closure, 89% (31 of 35) for intraocular lens insertion, 91% (32 of 35) for capsulorrhexis, and 100% (35 of 35) for incisions.

Table 4

Comfort with resident performing task autonomously

Task                  Presenting residents (13)   Observing residents (99)   Faculty (35)
Incisions             13 (100%)                   75 (76%)                   35 (100%)
Capsulorrhexis        13 (100%)                   53 (54%)                   32 (91%)
Nucleus disassembly   11 (85%)                    49 (49%)                   30 (86%)
Quadrant removal      13 (100%)                   45 (45%)                   30 (86%)
Cortical clean-up     13 (100%)                   46 (46%)                   30 (86%)
IOL insertion         13 (100%)                   70 (71%)                   31 (89%)
Closure               13 (100%)                   65 (66%)                   27 (77%)

Abbreviation: IOL, intraocular lens.


Resident preferences regarding the video-coaching curriculum were also evaluated: 100% of both observing (99 of 99) and presenting (13 of 13) residents were comfortable presenting cases, and none desired an alternative setting. The majority of residents (10 of 13 presenting and 60 of 99 observing) felt the sessions were mentor driven.



Discussion

The video-coaching curriculum was developed and implemented over a 6-month trial period, during which 6 senior residents presented a total of 24 cases. All faculty and residents involved found benefit in the new curriculum, particularly for overall performance and avoiding errors. The curriculum allowed resident ownership of cases, self-reflection, teaching across experience levels, and objective evaluation of individual surgeries.

In traditional ophthalmology residency programs, the majority of operations occur in the final 12 months of a resident's training. While there has been improvement in integrating surgery earlier into residency programs, this video-coaching curriculum makes surgical terminology, techniques, and discussions available to beginning residents and students. It helps maintain interest in one of the core aspects of ophthalmology and helps prepare less experienced residents for their operating time. In addition, it enhances the primary surgical year of senior resident physicians.

Residents spend up to 25% of their time teaching other residents and students, and teaching is a core component of residency training.[22] This curriculum provides another opportunity for resident physicians to develop techniques for reviewing cases and teaching an audience. It also provides an opportunity for residents to take ownership of cases in a safe and structured forum. Unlike traditional surgical morbidity and mortality conferences, these sessions give the novice surgeon the opportunity to both teach and learn aspects of surgery that are only available through reflective practice. Faculty went to great lengths to keep the environment unintimidating, nonthreatening, and collegial to facilitate discussion of complications. The acquired data illustrate residents' comfort with presenting in this setting and their preference for no changes to the current curriculum.

In addition to its educational benefit, this learning experience fulfills several of the ACGME's institutional expectations for patient safety, health care quality, and supervision as required by the CLER Pathways to Excellence. By focusing on techniques to improve surgical skills and recognition of difficult surgical situations, the curriculum addresses operating room system issues as well as approaches to prevent further problems. Because direct feedback is provided on individual surgeries in this setting, residents have the opportunity to receive additional input on previous experiences in preparation for future surgeries.

When designing educational curricula, time and cost are important factors. Both resident and faculty physicians have limited time and resources, so educational sessions must be efficient. This forum allows a wide array of topics, from instruments to technique to near misses, to be discussed with a moderate-to-large group in 1 hour. While it requires at least one faculty member to facilitate the session with the senior resident, it provides surgical exposure that would typically require one-on-one attention.

This study also provided insight into resident physicians' ability to rate their own performance as well as that of their coresidents. In previous reviews, residents were more accurate at rating global indices of performance than specific tasks, and more senior residents gave more accurate assessments.[23] In this study, presenting residents rated their performances lower than faculty or observing residents did, scoring themselves lowest in the category “time and motion.” These results may reflect a more critical eye when evaluating one's own work, a desire for perfection, and/or a lack of confidence. These sessions can be used to help residents develop a sense of their progress and gain confidence that they are advancing as surgeons. As in other surgical fields, this curriculum may also lead to a faster learning curve and an improved resident surgical experience.

An additional benefit of this curriculum is the ability to probe a resident's surgical understanding without undermining the patient's confidence. Unlike in many other surgical fields, ophthalmic surgery often occurs with the patient awake, so it is frequently inappropriate to probe the resident's knowledge base intraoperatively because of patient concerns; this setting provides an opportunity for that learning modality.

This study has several limitations. While it provides a framework for a video-coaching curriculum that other residency programs can adopt, it does not directly compare outcomes in a controlled fashion. Further research could include a randomized controlled study evaluating outcomes after implementation of a similar curriculum. Additionally, the pilot sample size was small, with only 24 surgical case presentations, although a considerably larger number of corresponding questionnaires were completed. To widen the reach and utility of this program, a live feed for virtual observers or recorded sessions for dissemination may be beneficial and could be considered in the future.



Conclusion

A formal video-coaching curriculum in ophthalmology provides a residency program with a unique opportunity to help trainees at all levels. There was consensus among all participants that the curriculum was beneficial for cataract surgery preparation. Although presenting residents rated their own performance differently, all participants were comfortable taking part in the curriculum, and none preferred an alternative curriculum.



Financial Support

None.

Note

This article was presented as a poster presentation and awarded “Best in Session for Medical Education” at The American Academy of Ophthalmology Annual Meeting, Chicago, IL, October 15–16, 2016. This article was also presented as a poster presentation at The Association of University Professors of Ophthalmology Annual Meeting, San Diego, CA, January 27, 2017.


  • References

  • 1 Vozenilek J, Huff JS, Reznek M, Gordon JA. See one, do one, teach one: advanced technology in medical education. Acad Emerg Med 2004; 11 (11) 1149-1154
  • 2 Lenchus JD. End of the “see one, do one, teach one” era: the next generation of invasive bedside procedural instruction. J Am Osteopath Assoc 2010; 110 (06) 340-346
  • 3 Smith RJ, McCannel CA, Gordon LK. , et al. Evaluating teaching methods of cataract surgery: validation of an evaluation tool for assessing surgical technique of capsulorhexis. J Cataract Refract Surg 2012; 38 (05) 799-806
  • 4 Mills RP, Mannis MJ. ; American Board of Ophthalmology Program Directors' Task Force on Competencies. Report of the American Board of Ophthalmology task force on the competencies. Ophthalmology 2004; 111 (07) 1267-1268
  • 5 Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach 2007; 29 (07) 648-654
  • 6 Weiss KB, Wagner R, Nasca TJ. Development, testing, and implementation of the ACGME Clinical Learning Environment Review (CLER) Program. J Grad Med Educ 2012; 4 (03) 396-398
  • 7 Cullen KA, Hall MJ, Golosinskiy A. Ambulatory surgery in the United States, 2006. Natl Health Stat Rep 2009; (11) 1-25
  • 8 Accreditation Council for Graduate Medical Education. ACGME Program Requirements for Graduate Medical Education in Ophthalmology [Internet]. Available at: https://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/240_ophthalmology_07012014.pdf . Accessed March 5, 2017
  • 9 Daly MK, Gonzalez E, Siracuse-Lee D, Legutko PA. Efficacy of surgical simulator training versus traditional wet-lab training on operating room performance of ophthalmology residents during the capsulorhexis in cataract surgery. J Cataract Refract Surg 2013; 39 (11) 1734-1741
  • 10 Henderson BA, Ali R. Teaching and assessing competence in cataract surgery. Curr Opin Ophthalmol 2007; 18 (01) 27-31
  • 11 Puri S, Sikder S. Cataract surgical skill assessment tools. J Cataract Refract Surg 2014; 40 (04) 657-665
  • 12 Ezra DG, Aggarwal R, Michaelides M. , et al. Skills acquisition and assessment after a microsurgical skills course for ophthalmology residents. Ophthalmology 2009; 116 (02) 257-262
  • 13 Nandigam K, Soh J, Gensheimer WG, Ghazi A, Khalifa YM. Cost analysis of objective resident cataract surgery assessments. J Cataract Refract Surg 2015; 41 (05) 997-1003
  • 14 Randleman JB, Wolfe JD, Woodward M, Lynn MJ, Cherwek DH, Srivastava SK. The resident surgeon phacoemulsification learning curve. Arch Ophthalmol 2007; 125 (09) 1215-1219
  • 15 Thomsen AS, Kiilgaard JF, Kjaerbo H, la Cour M, Konge L. Simulation-based certification for cataract surgery. Acta Ophthalmol 2015; 93 (05) 416-421
  • 16 Henderson BA, Kim JY, Golnik KC. , et al. Evaluation of the virtual mentor cataract training program. Ophthalmology 2010; 117 (02) 253-258
  • 17 Karam MD, Thomas GW, Koehler DM. , et al. Surgical coaching from head-mounted video in the training of fluoroscopically guided articular fracture surgery. J Bone Joint Surg Am 2015; 97 (12) 1031-1039
  • 18 Vaughn CJ, Kim E, O'Sullivan P. , et al. Peer video review and feedback improve performance in basic surgical skills. Am J Surg 2016; 211 (02) 355-360
  • 19 Rehim SA, Chung KC. Educational video recording and editing for the hand surgeon. J Hand Surg Am 2015; 40 (05) 1048-1054
  • 20 Bonrath EM, Gordon LE, Grantcharov TP. Characterising “near miss” events in complex laparoscopic surgery through video analysis. BMJ Qual Saf 2015; 24 (08) 516-521
  • 21 Bonrath EM, Dedy NJ, Gordon LE, Grantcharov TP. Comprehensive surgical coaching enhances surgical skill in the operating room: a randomized controlled trial. Ann Surg 2015; 262 (02) 205-212
  • 22 Ryg PA, Hafler JP, Forster SH. The efficacy of residents as teachers in an ophthalmology module. J Surg Educ 2016; 73 (02) 323-328
  • 23 Casswell EJ, Salam T, Sullivan PM, Ezra DG. Ophthalmology trainees' self-assessment of cataract surgery. Br J Ophthalmol 2016; 100 (06) 766-771
