CC BY-NC-ND 4.0 · Journal of Academic Ophthalmology 2020; 12(02): e221-e233
DOI: 10.1055/s-0040-1718555
Research Article

The Utility of Virtual Reality Simulation in Cataract Surgery Training: A Systematic Review

Jeffrey F. McMahon, Grace Sun
Department of Ophthalmology, Weill Cornell Medicine, New York, New York
Funding This study was supported by an unrestricted grant from Research to Prevent Blindness. The sponsor or funding organization had no role in the design or conduct of this research.
 

Abstract

Introduction Cataract surgery is a fundamental intraocular procedure with a steep learning curve. Virtual reality simulation offers an opportunity to streamline this aspect of ophthalmic education by exposing trainees to operative techniques in a controlled setting.

Materials and Methods A systematic review of the PubMed database was conducted through December 2019 for English-language studies reporting on the use of virtual reality simulation in cataract surgery training, in order to assess its utility. Studies meeting inclusion criteria were examined for pertinent data: study design, number of subjects and live cases, simulator model, training regimen, surgical skills assessed, and overall outcomes.

Results Of the 41 analyzed studies, 15 investigated the impact of virtual reality simulation-based training on performance in live surgery or wet laboratories; 21 used simulation as a device for direct assessment of operative proficiency; 5 explored simulation-based training's effect on performance in simulated surgery. Thirty-seven studies employed an iteration of the Eyesi simulator, though methodologies varied widely and few randomized trials were available. The literature endorsed the validity of simulator-based assessment and the benefits of structured training on live complication rates, operative times, and self- and faculty-perceived competency, particularly in novice surgeons.

Discussion The literature surrounding simulation in cataract surgery training is characterized by significant heterogeneity in design. However, most works describe advantages that may outweigh the costs of implementation into training curricula. Collaborative efforts at establishing a structured, proficiency-based cataract surgery curriculum built around virtual reality and wet laboratory simulation have the potential to improve outcomes and enhance future surgical training.


Introduction

Cataract surgery is the most frequently performed surgical procedure in the United States.[1] Likewise, it is the surgery performed most often by ophthalmology residents during their training.[2] The literature describes a steep and lengthy associated learning curve that can extend beyond minimum caseloads required for accreditation.[3] As such, enhancing cataract surgery training is of utmost importance to academic programs aiming to produce skilled surgeons.

Meanwhile, simulation has played a role in medical training since the advent of computers and primitive graphics hardware,[4] with pertinent randomized controlled trials published as early as 1999.[5] In ophthalmology, surgical simulators such as the Eyesi (VRmagic, Mannheim, Germany), MicroVisTouch (ImmersiveTouch Inc., Chicago, IL), and PhacoVision (Melerit Medical, Linköping, Sweden) are intended to develop surgical technique by using microscopes with stereoscopic imaging and haptic feedback to portray a realistic operative environment.[6] Overall, surgical simulation is an attractive educational modality for its potential to enhance knowledge and skill acquisition and retention without risk to live patients.[7]

In 2014, a review of 10 works on simulation-based cataract surgery training detailed construct validity of simulator modules, subsequent performance in wet laboratories, and an association with lower complication rates in the one study utilizing a control group.[6] Wider integration of simulation within ophthalmic surgical training has since contributed to an influx of pertinent literature. Our scope comprises over 40 studies, many of which incorporate new methodology or evaluate previously unreviewed hypotheses pertaining to variable intraoperative conditions, curricular satisfaction, or predictive value for future live operative performance. The aim of this review is to update and more broadly evaluate the utility of virtual reality simulation (VRS) for educating residents to become proficient cataract surgeons, especially in comparison with more traditional teaching methods.

Materials and Methods

The PubMed computerized database was systematically searched to identify all literature reporting on the use of VRS in cataract surgery through the year 2019. Articles were obtained using a search of Medical Subject Headings and keyword terms and their respective combinations ([Table 1]). Full-text English-language studies were included in the analysis, while review articles, surgical technique guides, case reports, and comments were excluded.

Table 1

Database search for systematic review. Search terms were entered into PubMed search engine to identify English language studies through December 2019

Database: PubMed

Search terms (keyword): (“cataract extraction”[MeSH Terms] OR (“cataract”[All Fields] AND “extraction”[All Fields]) OR “cataract extraction”[All Fields] OR (“cataract”[All Fields] AND “surgery”[All Fields]) OR “cataract surgery”[All Fields]) AND (simulator[All Fields] OR “virtual reality”[MeSH Terms])
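Complex boolean strategies like the one in Table 1 are easier to audit and reproduce when assembled programmatically. The sketch below (Python; `build_query` is our own illustrative helper, not part of any PubMed client) only constructs the query string — actually executing it would require submitting it to PubMed, for example via the NCBI E-utilities, which is omitted here to keep the example offline.

```python
# Assemble the Table 1 PubMed query string. Illustrative sketch for
# reproducibility only; it builds the boolean expression and does not
# contact PubMed.

def build_query() -> str:
    # Synonyms and MeSH headings for the cataract surgery concept.
    cataract_block = " OR ".join([
        '"cataract extraction"[MeSH Terms]',
        '("cataract"[All Fields] AND "extraction"[All Fields])',
        '"cataract extraction"[All Fields]',
        '("cataract"[All Fields] AND "surgery"[All Fields])',
        '"cataract surgery"[All Fields]',
    ])
    # Terms for the simulation concept.
    simulation_block = " OR ".join([
        'simulator[All Fields]',
        '"virtual reality"[MeSH Terms]',
    ])
    # Intersect the two concepts, mirroring the published strategy.
    return f"({cataract_block}) AND ({simulation_block})"

print(build_query())
```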

The literature search is outlined in [Fig. 1]. An initial title search yielded a subset of candidate articles, which were then screened by abstract for relevance. Full text was reviewed for articles meeting criteria, and appropriate studies were retained. The title, abstract, and full-text selection process, with assessment of bias at the study level, was performed independently by study authors, with discrepancies discussed and resolved by mutual agreement.

Fig. 1 Flow diagram presenting the systematic review process used in this study.

Data Extraction and Analysis

Several metrics were obtained from each article to describe study characteristics: level of evidence per Oxford Center for Evidence-Based Medicine (OCEBM) criteria,[8] study design, number and identity of subjects, outcome variables, and overall findings.



Results

Study Characteristics

This systematic review comprised 41 published works: 5 level II studies (12%), 18 level III studies (44%), 17 level IV studies (41%), and 1 level V study (2%) per OCEBM criteria.[8] Six randomized controlled trials (1 nonblinded, 4 single-blinded, 1 double-blinded), 16 cohort studies (7 prospective, 9 retrospective), 16 cross-sectional studies, and 3 case series were included.
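The reported evidence-level distribution is simple to re-derive; a minimal Python check, using only the counts quoted above:

```python
# OCEBM evidence-level counts of the 41 included studies, as reported above.
levels = {"II": 5, "III": 18, "IV": 17, "V": 1}
total = sum(levels.values())

# Percentages round to the figures quoted in the text (12%, 44%, 41%, 2%).
pct = {lvl: round(100 * n / total) for lvl, n in levels.items()}
print(total, pct)  # 41 {'II': 12, 'III': 44, 'IV': 41, 'V': 2}
```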

The Eyesi, MicroVisTouch, and Sensimmer Virtual Phaco Trainer systems were used in 37, 1, and 1 studies, respectively. An additional study used a computer-based simulator developed in-house, and the final study was questionnaire-based and did not specify a requisite model.

The mean number of subjects per study was 33.6 ± 42.4 (median, 22; range, 3–265; n = 41); the majority were ophthalmology residents. However, more experienced surgeons and novice students were also involved, particularly in studies of construct validity that used level of surgical expertise as an independent variable.

Researchers often set out to answer similar questions pertaining to simulation and surgical performance, but with varied approaches to study design (training modules used, outcome variables tracked, etc.). Most often, studies compared real-world and/or simulated performance across differing training methods or baseline experience levels. With respect to real-world performance, 11 studies analyzed the impact of VRS on video-derived scores (Objective Structured Assessment of Cataract Surgical Skill [OSACSS][9] or Global Rating Assessment of Skills in Intraocular Surgery [GRASIS][10]) or on operative complications across 24,352 total cases (mean, 2,214; range, 21–17,831).
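As a quick consistency check of the case-load figures just quoted (a trivial calculation using only numbers stated above):

```python
# 11 studies contributed 24,352 live cases in total; the quoted mean of
# 2,214 cases per study follows directly.
total_cases, n_studies = 24_352, 11
mean_cases = round(total_cases / n_studies)
print(mean_cases)  # 2214
```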



Temporal Trends

The number of articles meeting inclusion criteria published in any given year varied from 1 (2008, 2009, 2010) to 8 (2013). [Fig. 2] depicts the count of reviewed articles published by year, stratified by category. The majority of articles were published after 2013.

Fig. 2 Article count by year, stratified by category.


Methodology

Literature meeting criteria for review examined the use of VRS training in cataract surgery primarily from one of three angles: the first, exploring association with surgical performance in the operating room or wet laboratories ([Table 2]; n = 15); the second, investigating VRS as an assessment tool for surgical competency ([Table 3]; n = 21); the third, evaluating the effect of VRS training on VRS performance ([Table 4]; n = 5). Among those approaching the first angle, some studies were specifically designed to investigate the effect of VRS versus other training methodologies (wet laboratories or no training, for example). Others were designed to track performance metrics both at baseline and after VRS training on an individual basis. The remainder provided qualitative data on participants' attitudes toward VRS and its perceived contribution to operating room preparedness. Among studies approaching the second angle, many aimed to establish construct validity of simulator devices. Others assessed the effect of VRS on simulator-scored performance, used it to predict future performance or validate other means of assessment, or investigated questions pertaining to operator or other characteristics as related to simulator-scored performance. Studies approaching the third angle focused on questions that could not ethically be studied on live patients, such as simulator task repetition and improvement, intermodule skills transfer, and nondominant hand use.

Table 2

Characteristics and findings of studies describing the impact of simulated cataract surgery module training on operative performance in live surgeries and wet laboratories (n = 15)

Author (y)

Design

Level of evidence (OCEBM)

Participants

n (live cases)

Simulator

Independent variable(s)

Outcome variable(s)

Skills/modules tested

Findings

Baxter et al (2013)[13]

CS

4

3 residents

903

Eyesi

Sim and wet laboratory training

Case load, complications, survey

Unspecified (∼50 h)

Complication rates of ∼1% over first 100 live cases

Belyea et al (2011)[14]

RCS

3

42 residents

592

Eyesi

Sim training (>2 h/y)

Phaco time, power, complications

CCC ± phacoemulsification

Reduced phacoemulsification time and power in VRS group

Daly et al (2013)[12]

RCT (single-blinded)

2

21 residents

21

Eyesi

Sim versus wet laboratory training

Video grading, satisfaction questionnaire

CCC

Comparable performance; shorter CCC time and more effective orientation to instruments in wet laboratory group

Ferris et al (2019)[21]

RCS

3

265 residents

17831

Eyesi

Sim training

PCR rate

Various

Reduced PCR rate in VRS group

Feudner et al (2009)[11]

RCT (single-blinded)

2

32 residents, 30 students

N/A (372 porcine)

Eyesi

Sim training

Video grading of wet laboratories, perceived usefulness

Forceps, antitremor, CCC

Improved wet laboratory capsulorhexis score in VRS group

Kloek et al (2014)[24]

RCS

3

23 residents, 8 faculty

N/A

Eyesi

Sim training

Resident comfort, faculty assessment, case load, % case turnover

CCC, hydrodissection, sculpting, quadrant removal

Increased case load, improved comfort with surgical steps and faculty assessment in VRS group

Lopez-Beauchamp et al (2019)[20]

RCS

3

29 residents

722

Eyesi

Sim training

Operative time, vitreous loss rate

CAT-A, CAT-B

Reduced operative time in VRS group

Lucas et al (2019)[19]

RCS

3

14 residents

140

Eyesi

Sim training

Complications (posterior capsule rupture, aphakia, nucleus fragment dislocation, extracapsular conversion)

CAT-C: CCC, divide and conquer, chopping, I/A, toric IOLs

Reduced total complications over first 10 live cases in VRS group

McCannel et al (2013)[16]

RCS

3

48 residents

1037

Eyesi

Sim training

Rate of errant CCC, trypan blue use

33-module CITC: antitremor, navigation, forceps, bimanual, CCC

Reduced errant CCC rate in VRS group

McCannel (2017)[17]

RCS

3

38 residents

1037

Eyesi

Sim training

Complications (vitreous loss, retained lens material, errant CCC)

33-module CITC: antitremor, navigation, forceps, bimanual, CCC

Comparable vitreous loss rates overall; increased vitreous loss unrelated to errant CCC in VRS group

Pokroy et al (2013)[15]

RCS

3

20 residents

1000

Eyesi

Sim training

PCR ± vitreous loss, operative time, attending survey of improved group performance

Unspecified (>6 h)

Reduced percentage of long cases (>40 minutes) among cases 10–50 in VRS group

Puri et al (2017)[25]

XS

5

116 residents

N/A

N/A

N/A

Self-perceived preparedness, competency

Unspecified (onsite availability)

Improved preparedness/competency among residents with supervised wet laboratory or VRS training

Roohipoor et al (2017)[23]

CS

4

30 residents

N/A

Eyesi

Sim scores

Case load, GRASIS score, need for help

Antitremor, bimanual, CCC, forceps, navigation

Correlation between VRS scores and future case load, total GRASIS score, reduced need for extra help

Staropoli et al (2018)[18]

RCS

3

22 residents

955

Eyesi

Sim training

Complications (PCR, vitreous prolapse, retained lens fragment, zonular dehiscence, endophthalmitis, IOL dislocation, return to OR), qualitative survey

CAT-A, CAT-B

Reduced rate of total complications, PCR, vitreous prolapse in VRS group

Thomsen et al (2017)[22]

CS

4

18 surgeons, varying experience

114

Eyesi

Sim training

Video-based OSACSS score

Navigation, antitremor, forceps, bimanual, CCC, divide and conquer

Improved OSACSS score among novice/intermediate surgeons after VRS training

Abbreviations: CCC, [continuous curvilinear] capsulorhexis; CS, case series; I/A, irrigation/aspiration; IOL, intraocular lens; OSACSS, objective structured assessment of cataract surgical skill; PCR, posterior capsular rupture; PCS/RCS, prospective/retrospective cohort study; RCT, randomized controlled trial; XS, cross-sectional study.


Table 3

Characteristics and findings of studies describing simulation technology as an assessment tool in cataract surgery training (n = 21)

Author (y)

Design

Level of evidence (OCEBM)

Participants

n (live cases)

Simulator

Independent variable(s)

Outcome variable(s)

Skills/modules tested

Findings

Balal et al (2019)[39]

XS

4

40 residents

120

Eyesi

Surgical experience

Eyesi-based PhacoTracking metrics (live surgeries)

CCC, phacoemulsification, I/A

Correlation between experience level and VRS performance (CCC, phacoemulsification, and I/A tasks: path length, number of movements, time)

Banerjee et al (2012)[34]

XS

4

8 residents

24

ImmersiveTouch

Live surgical scores

Sim scores

CCC

Correlation between VRS and live performance (CCC circularity, standard deviation of duration, number of forceps grabs)

Bozkurt-Oflaz et al (2018)[33]

XS

4

16 (7 novice residents, 6 intermediate, 3 faculty)

N/A

Eyesi

Surgical experience

Sim scores

Navigation, forceps, bimanual, antitremor, CCC

Correlation between experience level and VRS performance (baseline CCC score, improvement with repetition, nondominant hand CCC, mature cataract CCC)

Jacobsen et al (2019)[36]

XS

4

19 surgeons, varying experience

57

Eyesi

Live surgical performance (OSACSS)

Sim scores

Navigation, antitremor, forceps, bimanual, CCC, divide and conquer

Correlation between mean OSACSS score and VRS performance (total score)

Lam et al (2016)[32]

XS

4

16 (10 attendings, 6 residents)

N/A

In-house: PC, sim software, dual haptic devices

Surgical experience

Sim scores

Corneal incision, CCC, phacoemulsification, IOL implantation

Correlation between experience level and VRS performance (total scores, operative time, antitremor, antirupture, CCC, phacoemulsification, IOL implantation)

Mahr and Hodge (2008)[26]

XS

4

15 (12 residents, 3 experienced surgeons)

N/A

Eyesi

Surgical experience

Sim scores

Forceps, antitremor

Correlation between experience level and VRS performance (forceps, antitremor, operative time, out-of-tolerance %)

Modi et al (2015)[46]

XS

4

30 residents

N/A

Eyesi

Learning style (Kolb Inventory)

Sim scores

Forceps

No association between particular learning style and VRS performance (total score, odometer movement, cornea injury, lens injury, operative time)

Park et al (2011)[41]

PCS

3

21 (14 novice, 7 expert surgeons)

N/A

Eyesi

Surgical experience +/− distraction

Sim scores

Forceps

No change in VRS performance with distracting cognitive task; reduced rate of distractive cognitive task completion during VRS task

Park et al (2012)[44]

PCS

3

30 residents

N/A

Eyesi

Dominant versus nondominant hand use

Sim scores

Forceps

Reduced VRS performance with nondominant hand use (total score, operative time, lens injury); intra-user correlation between dominant and nondominant hand VRS performance

Podbielski et al (2012)[43]

PCS

3

18 residents

N/A

Eyesi

Hand- versus foot-activated forceps use

Sim scores, subjective preference

Dexterity, CCC

No association between hand versus foot forceps use and VRS performance; no preference for either modality

Pointdujour et al (2011)[42]

RCT (double-blinded)

2

18 (3 students, 3 optometrists, 6 residents, 6 experienced surgeons)

N/A

Eyesi

Placebo versus propranolol versus caffeine

Sim scores

Antitremor, forceps, CCC

Improved VRS performance with beta-blocker use among novice surgeons

Privett et al (2010)[27]

XS

4

23 (16 students/residents, 7 experienced surgeons)

N/A

Eyesi

Surgical experience

Sim scores

CCC

Correlation between experience level and VRS performance (total score, centering, corneal injury, spikes, loss of red reflex, roundness, operative time)

Saleh et al (2013)[38]

XS

4

18 residents

N/A

Eyesi

Attempt number

Sim scores (intra-subject variance)

CCC, cracking and chopping, navigation, bimanual, antitremor

Correlation between intra-novice task scores on first and second or first and third attempt

Selvander and Åsman (2011)[45]

XS

4

70 students

N/A

Eyesi

Stereoacuity level

Sim scores

Navigation, forceps, CCC

Correlation between novice stereoacuity and VRS performance (navigation, forceps)

Selvander and Asman (2013)[28]

XS

4

24 (7 surgeons, 17 students)

N/A

Eyesi

Surgical experience

Sim scores, OSACSS, OSATS

CCC, hydromaneuvers, divide and conquer, navigation, forceps, cracking and chopping

Correlation between experience level and VRS performance (overall scores, OSATS, OSACSS)

Selvander and Åsman (2013)[30]

XS

4

24 (7 surgeons, 17 students)

N/A

Eyesi

Surgical experience

Sim scores, OSACSS, OSATS

CCC, hydromaneuvers, divide and conquer

OSACSS provided superior discrimination of experience level compared with VRS scoring

Sikder et al (2015)[37]

PCS

3

40 residents

N/A

MicroVisTouch

Interim surgical experience

Sim scores

CCC

Improved VRS performance and reduced standard deviation after interim residency training

Spiteri et al (2014)[29]

XS

4

30 residents, varying experience

N/A

Eyesi

Surgical experience

Sim scores

Forceps, antitremor, CCC

Correlation between experience level and VRS performance (procedural modules); ceiling effect among intermediate/experienced surgeons (abstract modules)

Thomsen et al (2015)[31]

XS

4

42 (26 novice residents, 11 experienced cataract surgeons, 5 experienced vitreoretinal surgeons)

N/A

Eyesi

Surgical experience

Sim scores

Navigation, antitremor, forceps, bimanual, cracking and chopping, phacoemulsification, CCC, hydrodissection, divide and conquer, I/A, IOL insertion

Correlation between experience level and VRS performance; cataract surgeons did not significantly outperform vitreoretinal surgeons

Thomsen et al (2017)[35]

XS

4

11 surgeons, varying experience

33

Eyesi

Live surgical performance (motion-tracking score)

Sim scores

Navigation, antitremor, forceps, bimanual, CCC, divide and conquer

Correlation between VRS performance and live motion-tracking scores

Waqar et al (2011)[40]

PCS

3

7 experienced surgeons

N/A

Eyesi

Pre versus post live OR session

Sim scores

Forceps

Improved VRS performance (plateau total score, operative time) after day of live cases

Abbreviations: CCC, [continuous curvilinear] capsulorhexis; CS, case series; IOL, intraocular lens; OSACSS, objective structured assessment of cataract surgical skill; OSATS, objective structured assessment of technical surgical skills; PCS/RCS, prospective/retrospective cohort study; RCT, randomized controlled trial; XS, cross-sectional study.


Table 4

Characteristics and findings of studies describing the impact of simulated cataract surgery module training on operative performance as assessed by simulation technology (n = 5)

Author (y)

Design

Level of evidence (OCEBM)

Participants

n (live cases)

Simulator

Independent variable(s)

Outcome variable(s)

Skills/modules tested

Findings

Bergqvist et al (2014)[48]

RCT (nonblinded)

2

20 students, 2 surgeons

N/A

Eyesi

Sim training (varying)

Sim scores

CCC, hydromaneuver, divide and conquer

Correlation between extent of VRS training and VRS performance (overall score, capsule rupture/damage)

Gonzalez-Gonzalez et al (2016)[49]

PCS

3

14 (3 attendings, 11 residents)

N/A

Eyesi

Sim training (pre versus post), dominant versus nondominant hand usage, surgical experience

Sim scores, satisfaction questionnaire

CCC

Improved VRS performance (overall score, average radius, maximum radial extension, odometer, operative time) from baseline with dominant/nondominant hand use; reduced VRS performance with nondominant hand use at baseline with steeper rate of subsequent improvement; no significant differences noted between trainees and attendings on satisfaction questionnaire

Saleh et al (2013)[47]

PCS

3

16 residents

N/A

Eyesi

Sim training (pre versus post)

Sim scores

Navigation, antitremor, CCC, cracking and chopping

Improved VRS performance (total scores) from baseline after completion of VRS training

Selvander and Åsman (2012)[50]

RCT (single-blinded)

2

35 students

N/A

Eyesi

Sim training (capsulorhexis versus navigation-centric), user stereoacuity

Sim scores, OSACSS, OSATS

Navigation, CCC

Improved VRS performance and plateau with repetition (navigation score, CCC score, operative time, corneal damage); correlation between VRS performance (CCC score) and OSACSS; no significant skill transfer between navigation and CCC modules

Thomsen et al (2017)[51]

RCT (single-blinded)

2

12 residents, 3 attendings

N/A

Eyesi

Sim training (cataract)

Sim scores (vitreoretinal)

Navigation, forceps, vitrector, antitremor, bimanual, bimanual scissors, laser coagulation, posterior hyaloid, epiretinal membrane, internal limiting membrane peel, retinal detachment

Correlation between experience level and VRS performance on vitreoretinal modules; no association between VRS cataract pre-training and VRS performance on vitreoretinal modules

Abbreviations: CCC, [continuous curvilinear] capsulorhexis; CS, case series; OSACSS, objective structured assessment of cataract surgical skill; OSATS, objective structured assessment of technical surgical skills; PCS/RCS, prospective/retrospective cohort study; RCT, randomized controlled trial; XS, cross-sectional study.




Impact on Performance in Live Surgeries/Wet Laboratories ([Table 2])

Simulation versus Other Training Methodology: Wet Laboratory-Based, No Exposure, etc

As early as 2009, Feudner et al found prior VRS training to be associated with improved baseline wet laboratory capsulorhexis performance by residents and students.[11] Within 4 years, other groups noted comparable performance between VRS- and wet laboratory-trained residents at baseline[12] and improved outcomes compared with those reported in the general cataract surgery literature.[13] Thereafter, a common study methodology involved assessing live surgical performance among resident cohorts in years prior to and after implementing VRS training. Each of the following outcomes was found to be significantly improved in some or all such studies: number of cases performed, operative score as measured by OSACSS or GRASIS, mean/adjusted phacoemulsification time and power, total operative time, trypan blue use, and complication rates (posterior capsule rupture [PCR], nucleus fragment dislocation, extracapsular conversion, aphakia, errant continuous curvilinear capsulorhexis, vitreous loss, etc.).[14] [15] [16] [17] [18] [19] [20] [21]



Comparative Performance before and after Simulation Training

Thomsen et al assessed OSACSS scores before and after VRS training, finding improved performance by novice and intermediate surgeons, but not by expert colleagues.[22]



Prediction of Future Live Performance

Roohipoor et al found increased simulator score during early residency to be associated with increased future case load, primary surgery volume, and third-year GRASIS scores.[23]



Qualitative Assessment of Satisfaction and Preparedness

In their survey-based study, Kloek et al noted that VRS-trained residents felt more comfortable with surgical steps and were rated as better prepared by faculty.[24] Daly et al reported that residents deemed VRS- and wet laboratory-based curricula to be similarly helpful and realistic, but with different strengths.[12] Finally, Puri et al highlighted increased self-perceived competency among residents with access to either VRS or wet laboratory training in a large cohort spanning numerous training programs.[25]



Simulation as an Assessment Tool ([Table 3])

Concurrent and Construct Validity: Simulator Technique and Scoring

Other studies were conducted to establish concurrent and construct validity of VRS equipment and scoring. Most compared computed module performance across first-time operators and found expert/intermediate surgeons to outperform novices.[26] [27] [28] [29] [30] [31] [32] [33] Banerjee et al found simulator scores to correlate with prior live surgical performance,[34] while others described similar associations with additional stratification by experience level.[35] [36] Meanwhile, Sikder et al endorsed improvement in capsulorhexis scores from baseline after 6 months of residency training.[37] Saleh et al noted significant intrasubject variability in simulator task scores during early attempts by novice surgeons.[38]



Concurrent and Construct Validity: Other Assessment Tools

Balal et al successfully discriminated between live surgical videos of junior and senior surgeons using a motion tracking system correlated with previously validated simulator parameters.[39]



Other Simulator-Based Assessment

Multiple groups used VRS performance as measured by the simulator itself to quantify the impact of external conditions: fatigue after a day of live operative cases, distracting arithmetic tasks, beta-blocker use, and caffeine intake.[40] [41] [42] Preference for and safety of various operative approaches, such as hand- versus foot-operated forceps and nondominant hand surgery, were similarly assessed.[43] [44] Lastly, VRS performance was compared in the context of inherent user characteristics, including stereoacuity and Kolb Inventory learning style.[45] [46]



Effect of Simulator-Based Training on Simulator-Based Performance

Finally, other studies described significant improvement in trainees' VRS performance after varying extents of VRS training.[47] [48] Gonzalez-Gonzalez et al stratified by experience level and dominant/nondominant hand use during VRS capsulorhexis, noting the steepest improvement in the nondominant hand/novice subgroup.[49] Selvander and Åsman observed limited skill transfer between capsulorhexis and navigation modules among students,[50] whereas Thomsen et al described similarly limited transfer between prior VRS cataract surgery training and vitreoretinal modules ([Table 4]).[51]



Discussion

Limitations

The predominant limitations of this review include study heterogeneity and small sample sizes. Protocols ranged from merely providing trainees with VRS access to mandating completion of more rigorous module sequences in the accompanying Eyesi curriculum ([Table 5]). Outcome variables of interest also varied. Studies assessing VRS as a training modality measured outcomes of live surgeries and/or wet laboratories objectively (case load, complication rates, phacoemulsification/total operative time, phacoemulsification power) and subjectively (qualitative trainer/trainee surveys, video grading, OSACSS score, need for additional help). Similarly, those assessing VRS as a proficiency assessment tool relied predominantly on simulator-based and other scoring metrics, both objective (path length, number of movements, time, circularity, number of forceps grabs) and subjective (post-training satisfaction survey, OSACSS score). Certainly, studies associating VRS training with larger operative caseloads later in residency hint at a potential confounder for improved performance in virtual and live environments.

Table 5

Eyesi cataract curriculum

CAT-A introductory courses

 Anterior chamber navigation

 Intracapsular navigation

 Bimanual navigation

 Instruments

CAT-B beginner's courses

 Navigation and instruments

 Capsulorhexis

 Intracapsular tissue

 Stop and chop

 IOL insertion

CAT-C intermediate courses

 Capsulorhexis

 Divide and conquer

 Chopping

 Irrigation/aspiration

 Toric IOLs

CAT-D advanced courses

 Capsulorhexis errant tear

 Weak structures

 White cataracts

 Capsular plaques

 Varying cases

 Anterior vitrectomy

Abbreviation: IOL, intraocular lens.


Note: Module performance is calculated using pertinent scoring criteria: target achievement, efficiency, instrument handling, tissue treatment, etc. A “cataract challenge” is presented after every 60 minutes of training time in the beginner's, intermediate, and advanced courses.


The vast majority of reviewed studies (37) used the Eyesi as opposed to other devices (ImmersiveTouch, MicroVisTouch, or other in-house models). This reduces the generalizability of our conclusions to all simulators, which may boast different strengths. However, at present, over 70% of accredited residency programs possess at least one Eyesi; with 109 Eyesi simulators active at US accredited residency and fellowship programs, federal government-affiliated centers, and nongovernmental organizations, it is the VRS device most commonly used by prospective ophthalmologists training in the US today (Marshall Dial, e-mail communication, January 2020).

VRS training and evaluation in the analyzed studies placed greater emphasis on intraocular maneuvers as opposed to periocular manipulation at the conjunctiva, sclera, and cornea. By our literature search, these specific skills were not practiced or assessed in training modules, but are emphasized in newer virtual environments like the HelpMeSee simulator, which is used for manual small incision cataract surgery (MSICS) training. Translation of skills required for MSICS remains unstudied with respect to resident performance in other ophthalmic procedures.[52]

Lastly, our search was limited to literature with English full text available in the PubMed database. Few studies qualified as randomized controlled trials with a high level of evidence per OCEBM criteria. Unpublished literature was not included, raising concern for publication bias.



Validity and Implications

The concept of validity is particularly important in the medical education literature. Gallagher et al defined various subtypes important in the context of surgical education and performance evaluation.[53] Concurrent validity is based on whether a test produces results similar to another test purporting to measure the same construct, whereas construct validity rests in part on whether a test measures what is intended, based on its ability to differentiate experts from novices.

We found strong evidence for the concurrent and construct validity of Eyesi-based cataract surgery proficiency assessment. Simulator scores were correlated with OSACSS and other systems used during live cases, and VRS-naïve experienced surgeons universally outperformed VRS-naïve junior trainee counterparts. Novice surgeon status and nondominant hand use were associated with more dramatic early increases in module scores. This intuitively makes sense as initial scores were generally lower under these conditions and thus allowed “more room for improvement” before a ceiling effect was observed. We presume that VRS training may be more beneficial to skill development if undertaken earlier in one's career.



Negative Findings

Nonetheless, even studies providing compelling evidence for VRS training yielded varying degrees of improvement across discrete surgical maneuvers and outcomes. For example, Daly et al reported shorter capsulorhexis time after wet laboratory training.[12] Selvander and Åsman found that OSACSS scoring more effectively discriminated between surgical novices and experts in simulated phacoemulsification tasks.[30] Puri et al did not deem VRS availability to be beneficial to residents' perceived preparedness for live hydrodissection and sculpting, nor did surveyed residents endorse significantly reduced perceived difficulty of surgical steps after simulator use. Rather, they found the element of faculty supervision, discussion, and feedback to be most integral to development of proper techniques and comfort level.[25]



Practical Application: Cost Consideration and Alternatives

Arguably, the most instructive quality of this review is in informing whether training programs should invest in VRS technology to reap its potential benefits, both in a vacuum and in comparison with other training modalities.

A common counter-argument centers on expense: of the simulator device and its maintenance, of a physical training space, and of faculty instruction time diverted from other educational activities in a residency curriculum.[35] [54] However, Spiteri et al noted that reductions in the live operative learning curve may actually reduce the cost of training each resident.[29] Lowry et al modeled operating room savings associated with Eyesi use based on presumed shortened surgical times alone. Cost savings ranged widely but were closely associated with residency program size and duration of Eyesi use, in some cases offsetting a significant portion of the estimated $150,000 simulator purchase price.[55] Theoretical savings attributed to reduction in complication rates or need for human mentoring were not considered in this model but would likely offer additional benefit.[56]

More recently, Ferris et al's large study describing PCR rates of early career surgeons with and without Eyesi access also modeled savings based on reduced costs incurred by PCR and other complications, in addition to cost of medical negligence claims. The group estimated a 280-case reduction in total resident PCR complications per year across all sites with Eyesi training implementation, corresponding to £560,000 in savings to the United Kingdom National Health Service annually and recoupment of aggregated simulator sticker price within 4 years. Therefore, VRS training was found to be beneficial in terms of both quality optimization and cost limitation.[21]

Furthermore, the only randomized controlled trial comparing resident performance after VRS versus wet laboratory training highlighted equally realistic training environments as rated by study participants. The Eyesi was deemed a safe alternative that could also effectively predict future operating performance, thus calling attention to trainees who may require extra assistance early. Though the wet laboratory seems to offer more effective orientation to the surgical microscope and instruments, increased need for in-person mentorship and expense of machinery maintenance, instrumentation, and the eyes themselves present obstacles to training programs.[12] Meanwhile, VRS offers consistent, repeatable intraocular scenarios and immediate feedback without equal need for human supervision or recurrent material costs.



Overall

Multiple studies met inclusion criteria in each year since 2010, indicating a steady release of publications throughout the decade. Works investigated the utility of VRS in capacities related to training, assessment, or both—at times in the context of inherent operator characteristics or under varying conditions. The majority reported reduced operative time and complication rates, or improved self-perceived competency and peer evaluations with structured simulation-based training. From this review of the current literature, we posit that the ideal VRS-based cataract surgery curriculum would adhere to the following core principles:

  • Commence early with novice surgeons in the first year of residency training.[29] [49] [56]

  • Adopt a proficiency-based rather than time-based learning approach, by which the trainee practices under increasingly difficult conditions until performance exceeds pre-defined pass/fail levels.[16] [17] [29] [48] [50] [56]

  • Utilize spaced repetition and global score targets in a curriculum of diverse abstract and procedural tasks of increasing difficulty (antitremor, forceps manipulation, etc.).

  • Emphasize deliberate practice of targeted capsulorhexis and phacoemulsification tasks—deemed the most difficult by novice surgeons.[57]

  • Complement a concurrent, supervised wet laboratory curriculum providing additional experience with tissue manipulation and orientation to the live operating environment.[11] [12]

  • Provide ongoing reinforcement throughout residency training via immediate simulator-based feedback, self-assessment questionnaire,[24] and routine faculty evaluation.

Finally, given the overall heterogeneity of design and small size of prior studies, the creation of a large educational network for VRS-based cataract surgery training may offer additional value. A movement toward evidence-based standardization and evaluation of surgical curricula in multicenter trials utilizing VRS module scores, total simulation hours, self-assessment results, and other metrics could be coordinated through the American Academy of Ophthalmology as the basis for prospective studies. While VRS has seemingly asserted itself as a valuable component of the modern cataract surgery education program, broader collaboration has the potential to further optimize training, improving patient outcomes and enhancing cost-effectiveness.



Conflict of Interest

None declared.

  • References

  • 1 Meekins LC, Afshari NA. Ever-evolving technological advances in cataract surgery: can perfection be achieved? Curr Opin Ophthalmol 2012; 23 (01) 1-2
  • 2 Accreditation Council for Graduate Medical Education, Department of Applications and Data Analysis, Ophthalmology Case Logs: National Data Report. Published online 2018. Available at: https://acgme.org/Portals/0/PDFs/240_National_Report_Program_Version_2017-2018.pdf. Accessed October 1, 2019
  • 3 Randleman JB, Wolfe JD, Woodward M, Lynn MJ, Cherwek DH, Srivastava SK. The resident surgeon phacoemulsification learning curve. Arch Ophthalmol 2007; 125 (09) 1215-1219
  • 4 Lam CK, Sundaraj K, Sulaiman MN. A systematic review of phacoemulsification cataract surgery in virtual reality simulators. Medicina (Kaunas) 2013; 49 (01) 1-8
  • 5 Sutherland LM, Middleton PF, Anthony A. et al. Surgical simulation: a systematic review. Ann Surg 2006; 243 (03) 291-300
  • 6 Sikder S, Tuwairqi K, Al-Kahtani E, Myers WG, Banerjee P. Surgical simulators in cataract surgery training. Br J Ophthalmol 2014; 98 (02) 154-158
  • 7 Issenberg SB, McGaghie WC, Hart IR. et al. Simulation technology for health care professional skills training and assessment. JAMA 1999; 282 (09) 861-866
  • 8 OCEBM Levels of Evidence Working Group. The Oxford 2011 Levels of Evidence. Available at: http://www.cebm.net/index.aspx?o=5653. Accessed July 1, 2019
  • 9 Saleh GM, Gauba V, Mitra A, Litwin AS, Chung AKK, Benjamin L. Objective structured assessment of cataract surgical skill. Arch Ophthalmol 2007; 125 (03) 363-366
  • 10 Cremers SL, Lora AN, Ferrufino-Ponce ZK. Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Ophthalmology 2005; 112 (10) 1655-1660
  • 11 Feudner EM, Engel C, Neuhann IM, Petermeier K, Bartz-Schmidt K-U, Szurman P. Virtual reality training improves wet-lab performance of capsulorhexis: results of a randomized, controlled study. Graefes Arch Clin Exp Ophthalmol 2009; 247 (07) 955-963
  • 12 Daly MK, Gonzalez E, Siracuse-Lee D, Legutko PA. Efficacy of surgical simulator training versus traditional wet-lab training on operating room performance of ophthalmology residents during the capsulorhexis in cataract surgery. J Cataract Refract Surg 2013; 39 (11) 1734-1741
  • 13 Baxter JM, Lee R, Sharp JA, Foss AJ. Intensive Cataract Training Study Group. Intensive cataract training: a novel approach. Eye (Lond) 2013; 27 (06) 742-746
  • 14 Belyea DA, Brown SE, Rajjoub LZ. Influence of surgery simulator training on ophthalmology resident phacoemulsification performance. J Cataract Refract Surg 2011; 37 (10) 1756-1761
  • 15 Pokroy R, Du E, Alzaga A. et al. Impact of simulator training on resident cataract surgery. Graefes Arch Clin Exp Ophthalmol 2013; 251 (03) 777-781
  • 16 McCannel CA, Reed DC, Goldman DR. Ophthalmic surgery simulator training improves resident performance of capsulorhexis in the operating room. Ophthalmology 2013; 120 (12) 2456-2461
  • 17 McCannel CA. Continuous Curvilinear Capsulorhexis Training and Non-Rhexis Related Vitreous Loss: The Specificity of Virtual Reality Simulator Surgical Training (An American Ophthalmological Society Thesis). Trans Am Ophthalmol Soc 2017; 115: T2
  • 18 Staropoli PC, Gregori NZ, Junk AK. et al. Surgical simulation training reduces intraoperative cataract surgery complications among residents. Simul Healthc 2018; 13 (01) 11-15
  • 19 Lucas L, Schellini SA, Lottelli AC. Complications in the first 10 phacoemulsification cataract surgeries with and without prior simulator training. Arq Bras Oftalmol 2019; 82 (04) 289-294
  • 20 Lopez-Beauchamp C, Singh GA, Shin SY, Magone MT. Surgical simulator training reduces operative times in resident surgeons learning phacoemulsification cataract surgery. Am J Ophthalmol Case Rep 2019; 17: 100576 DOI: 10.1016/j.ajoc.2019.100576.
  • 21 Ferris JD, Donachie PH, Johnston RL, Barnes B, Olaitan M, Sparrow JM. Royal College of Ophthalmologists' National Ophthalmology Database study of cataract surgery: report 6. The impact of EyeSi virtual reality training on complications rates of cataract surgery performed by first and second year trainees. Br J Ophthalmol 2020; 104 (03) 324-329
  • 22 Thomsen ASS, Bach-Holm D, Kjærbo H. et al. Operating room performance improves after proficiency-based virtual reality cataract surgery training. Ophthalmology 2017; 124 (04) 524-531
  • 23 Roohipoor R, Yaseri M, Teymourpour A, Kloek C, Miller JB, Loewenstein JI. Early performance on an eye surgery simulator predicts subsequent resident surgical performance. J Surg Educ 2017; 74 (06) 1105-1115
  • 24 Kloek CE, Borboli-Gerogiannis S, Chang K. et al. A broadly applicable surgical teaching method: evaluation of a stepwise introduction to cataract surgery. J Surg Educ 2014; 71 (02) 169-175
  • 25 Puri S, Srikumaran D, Prescott C, Tian J, Sikder S. Assessment of resident training and preparedness for cataract surgery. J Cataract Refract Surg 2017; 43 (03) 364-368
  • 26 Mahr MA, Hodge DO. Construct validity of anterior segment anti-tremor and forceps surgical simulator training modules: attending versus resident surgeon performance. J Cataract Refract Surg 2008; 34 (06) 980-985
  • 27 Privett B, Greenlee E, Rogers G, Oetting TA. Construct validity of a surgical simulator as a valid model for capsulorhexis training. J Cataract Refract Surg 2010; 36 (11) 1835-1838
  • 28 Selvander M, Asman P. Cataract surgeons outperform medical students in Eyesi virtual reality cataract surgery: evidence for construct validity. Acta Ophthalmol 2013; 91 (05) 469-474
  • 29 Spiteri AV, Aggarwal R, Kersey TL. et al. Development of a virtual reality training curriculum for phacoemulsification surgery. Eye (Lond) 2014; 28 (01) 78-84
  • 30 Selvander M, Åsman P. Ready for OR or not? Human reader supplements Eyesi scoring in cataract surgical skills assessment. Clin Ophthalmol 2013; 7: 1973-1977
  • 31 Thomsen ASS, Kiilgaard JF, Kjaerbo H, la Cour M, Konge L. Simulation-based certification for cataract surgery. Acta Ophthalmol 2015; 93 (05) 416-421
  • 32 Lam CK, Sundaraj K, Sulaiman MN, Qamarruddin FA. Virtual phacoemulsification surgical simulation using visual guidance and performance parameters as a feasible proficiency assessment tool. BMC Ophthalmol 2016; 16: 88 DOI: 10.1186/s12886-016-0269-2.
  • 33 Bozkurt Oflaz A, Ekinci Köktekir B, Okudan S. Does cataract surgery simulation correlate with real-life experience? Turk J Ophthalmol 2018; 48 (03) 122-126
  • 34 Banerjee PP, Edward DP, Liang S. et al. Concurrent and face validity of a capsulorhexis simulation with respect to human patients. Stud Health Technol Inform 2012; 173: 35-41
  • 35 Thomsen ASS, Smith P, Subhi Y. et al. High correlation between performance on a virtual-reality simulator and real-life cataract surgery. Acta Ophthalmol 2017; 95 (03) 307-311
  • 36 Jacobsen MF, Konge L, Bach-Holm D. et al. Correlation of virtual reality performance with real-life cataract surgery performance. J Cataract Refract Surg 2019; 45 (09) 1246-1251
  • 37 Sikder S, Luo J, Banerjee PP. et al. The use of a virtual reality surgical simulator for cataract surgical skill assessment with 6 months of intervening operating room experience. Clin Ophthalmol 2015; 9: 141-149
  • 38 Saleh GM, Theodoraki K, Gillan S. et al. The development of a virtual reality training programme for ophthalmology: repeatability and reproducibility (part of the International Forum for Ophthalmic Simulation Studies). Eye (Lond) 2013; 27 (11) 1269-1274
  • 39 Balal S, Smith P, Bader T. et al. Computer analysis of individual cataract surgery segments in the operating room. Eye (Lond) 2019; 33 (02) 313-319
  • 40 Waqar S, Park J, Kersey TL, Modi N, Ong C, Sleep TJ. Assessment of fatigue in intraocular surgery: analysis using a virtual reality simulator. Graefes Arch Clin Exp Ophthalmol 2011; 249 (01) 77-81
  • 41 Park J, Waqar S, Kersey T, Modi N, Ong C, Sleep T. Effect of distraction on simulated anterior segment surgical performance. J Cataract Refract Surg 2011; 37 (08) 1517-1522
  • 42 Pointdujour R, Ahmad H, Liu M, Smith E, Lazzaro D. β-blockade affects simulator scores. Ophthalmology 2011; 118 (09) 1893-1893.e3
  • 43 Podbielski DW, Noble J, Gill HS, Sit M, Lam W-C. A comparison of hand- and foot-activated surgical tools in simulated ophthalmic surgery. Can J Ophthalmol 2012; 47 (05) 414-417
  • 44 Park J, Williams O, Waqar S, Modi N, Kersey T, Sleep T. Safety of nondominant-hand ophthalmic surgery. J Cataract Refract Surg 2012; 38 (12) 2112-2116
  • 45 Selvander M, Åsman P. Stereoacuity and intraocular surgical skill: effect of stereoacuity level on virtual reality intraocular surgical performance. J Cataract Refract Surg 2011; 37 (12) 2188-2193
  • 46 Modi N, Williams O, Swampillai AJ. et al. Learning styles and the prospective ophthalmologist. Med Teach 2015; 37 (04) 344-347
  • 47 Saleh GM, Lamparter J, Sullivan PM. et al. The international forum of ophthalmic simulation: developing a virtual reality training curriculum for ophthalmology. Br J Ophthalmol 2013; 97 (06) 789-792
  • 48 Bergqvist J, Person A, Vestergaard A, Grauslund J. Establishment of a validated training programme on the Eyesi cataract simulator. A prospective randomized study. Acta Ophthalmol 2014; 92 (07) 629-634
  • 49 Gonzalez-Gonzalez LA, Payal AR, Gonzalez-Monroy JE, Daly MK. Ophthalmic surgical simulation in training dexterity in dominant and nondominant hands: results from a pilot study. J Surg Educ 2016; 73 (04) 699-708
  • 50 Selvander M, Åsman P. Virtual reality cataract surgery training: learning curves and concurrent validity. Acta Ophthalmol 2012; 90 (05) 412-417
  • 51 Thomsen ASS, Kiilgaard JF, la Cour M, Brydges R, Konge L. Is there inter-procedural transfer of skills in intraocular surgery? A randomized controlled trial. Acta Ophthalmol 2017; 95 (08) 845-851
  • 52 Broyles JR, Glick P, Hu J, Lim Y-W. Cataract blindness and simulation-based training for cataract surgeons: an assessment of the HelpMeSee approach. Rand Health Q 2013; 3 (01) 7
  • 53 Gallagher AG, Ritter EM, Satava RM. Fundamental principles of validation, and reliability: rigorous science for the assessment of surgical education and training. Surg Endosc 2003; 17 (10) 1525-1529
  • 54 MacRae HM, Satterthwaite L, Reznick RK. Setting up a surgical skills center. World J Surg 2008; 32 (02) 189-195
  • 55 Lowry EA, Porco TC, Naseri A. Cost analysis of virtual-reality phacoemulsification simulation in ophthalmology training programs. J Cataract Refract Surg 2013; 39 (10) 1616-1617
  • 56 la Cour M, Thomsen ASS, Alberti M, Konge L. Simulators in the training of surgeons: is it worth the investment in money and time? 2018 Jules Gonin lecture of the Retina Research Foundation. Graefes Arch Clin Exp Ophthalmol 2019; 257 (05) 877-881
  • 57 Dooley IJ, O'Brien PD. Subjective difficulty of each stage of phacoemulsification cataract surgery performed by basic surgical trainees. J Cataract Refract Surg 2006; 32 (04) 604-608

Address for correspondence

James P. Winebrake, MD
Department of Ophthalmology, Weill Cornell Medicine
1305 York Avenue, New York, NY 10021

Publication History

Received: 30 April 2020

Accepted: 31 August 2020

Article published online:
30 October 2020

© 2020. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonDerivative-NonCommercial-License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/).

Thieme Medical Publishers
333 Seventh Avenue, New York, NY 10001, USA.


Fig. 1 Flow diagram presenting the systematic review process used in this study.
Fig. 2 Article count by year, stratified by category.