CC BY-NC-ND 4.0 · Journal of Academic Ophthalmology 2018; 10(01): e127-e132
DOI: 10.1055/s-0038-1668574
Research Article

The Pediatric Examination Assessment Rubric (PEAR): A Pilot Project

Michael Langue (1), Ingrid U. Scott (1,2), Ajay Soni (1)

(1) Department of Ophthalmology, Penn State Eye Center, Hershey, Pennsylvania
(2) Department of Public Health Sciences, Penn State College of Medicine, Hershey, Pennsylvania

Address for correspondence

Michael Langue, MD
Department of Ophthalmology, Penn State Eye Center
200 Campus Drive, Suite 800, Hershey, PA 17033

Publication History

22 April 2018
09 July 2018

Publication Date: 16 August 2018 (online)

Abstract

Purpose The pediatric ophthalmic examination is often considered a challenge to ophthalmologists at any level of training. New tools need to be developed and tested to enhance resident and fellow training in pediatric ophthalmology. To our knowledge, this pilot project introduces the first educational rubric designed specifically for a pediatric ophthalmic examination.

Methods Preliminary surveys were completed by 11 ophthalmology residents across all three postgraduate years to gauge comfort level with the pediatric ophthalmic examination. A one-page Pediatric Examination Assessment Rubric (PEAR) was developed and reviewed by 13 content experts (12 pediatric ophthalmologists and a lead developer of the Ophthalmic Clinical Evaluation Exercise [OCEX] tool) at eight academic institutions. Five educators from three academic institutions then used the rubric to evaluate six residents during a new strabismus evaluation. Postevaluation surveys were completed by both the five educators and the six residents.

Results Preliminary surveys showed that only 18.2% of residents felt their pediatric examination skills were good. Residents reported higher levels of frustration and less comfort with the pediatric examination than with an adult examination. Comments from the 13 experts were incorporated into the rubric to establish content validity. Postevaluation surveys showed that 60% of faculty and 100% of residents found the rubric to be very effective in providing feedback.

Conclusion In this pilot project, we established the need for more concrete educational tools in pediatric ophthalmology, created such a tool, established its content validity, and demonstrated its feasibility. The PEAR helps residents identify skills to target for improvement based on the quality of their pediatric ophthalmic examinations. At three academic institutions, the PEAR proved easy to use and a useful aid for training residents to perform the pediatric ophthalmic examination.


Introduction

Previous studies have shown that, compared with other subspecialties in ophthalmology, residents find the pediatric examination the most challenging, and that this difficulty is a primary reason residents decide against pursuing the subspecialty.[1] [2] Additional educational tools for resident training in pediatric ophthalmology are needed to address this challenge.

In recent years, the Accreditation Council for Graduate Medical Education (ACGME) mandated that all residency programs create tools to assess residents in six core competencies.[3] To address the patient care competency in ophthalmology, a single-page Ophthalmic Clinical Evaluation Exercise (OCEX) checklist was established.[4] [5] The checklist is used to evaluate an encounter between a resident and a patient, and focuses on four skill areas: the interview, the examination, communication and professionalism, and the presentation. While most of these skills generalize across all patient encounters, the examination portion of the checklist is not suited to a pediatric eye examination because it does not assess the critical skills of checking binocular sensory function, motor alignment, and visual acuity in children.

In this pilot project, we sought to create, validate, and test the feasibility of a standardized tool for assessing resident competency in performing a pediatric ophthalmic examination. Our hope is that this tool can be used by the attending to provide structured feedback to the trainee during the course of his or her rotation in pediatric ophthalmology, thus helping the trainee gain skills and confidence in examining children.

Materials and Methods

A preliminary survey was given to 11 residents in training. The survey was designed to capture residents' general impressions of their pediatric examination skills. The questions and response options of the preliminary survey are listed in [Table 1]. Using the OCEX tool as a reference, a one-page Pediatric Examination Assessment Rubric (PEAR) was created. A simple 3-point Likert scale was used, with ample space for the attending to record specific comments and suggestions for improvement. A faculty member completes the rubric while observing a resident–patient encounter.
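As a concrete illustration of this structure, the sketch below models a PEAR score sheet as a list of skills, each rated on the 3-point scale with room for free-text comments. This is a minimal sketch only: the field names and scale descriptors are illustrative assumptions, not the published rubric (which appears in [Fig. 1]); the skill names are taken from the survey items in [Table 1].

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative 3-point scale; the wording of the published PEAR
# descriptors may differ (see Fig. 1 for the actual rubric).
SCALE = {1: "Below expectations", 2: "Meets expectations", 3: "Exceeds expectations"}

@dataclass
class SkillScore:
    skill: str                    # e.g., "Stereoacuity" or "Alternate cover"
    score: Optional[int] = None   # 1-3 on the Likert scale; None if not applicable
    comment: str = ""             # evaluator's specific comments and suggestions

@dataclass
class PearSheet:
    resident: str
    evaluator: str
    items: List[SkillScore] = field(default_factory=list)

    def not_applicable(self) -> List[str]:
        """Skills left unscored (not applicable to this encounter)."""
        return [s.skill for s in self.items if s.score is None]

# Example: scoring two rubric rows during an observed encounter.
sheet = PearSheet(resident="ML", evaluator="AS", items=[
    SkillScore("Stereoacuity", score=3),
    SkillScore("Alternate cover", score=2, comment="Occluder held too briefly"),
])
```

A faculty member would fill in one entry per rubric row during the observed encounter, with the comment field mirroring the rubric's comment column.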

Table 1 Resident preliminary survey questions and response options

 1. Initials (Open response)
 2. Year of training and program you attend (Open response)
 3. How would you rate your ophthalmic exam skills in general? (Poor/Fair/Good/Excellent)
 4. How would you rate your ophthalmic exam skills in neuro-ophthalmology? (Poor/Fair/Good/Excellent)
 5. How would you rate your ophthalmic exam skills in pediatric ophthalmology? (Poor/Fair/Good/Excellent)
 6. How would you rate your comfort level with an adult exam? (Poor/Fair/Good/Excellent)
 7. How would you rate your comfort level with a pediatric exam? (Poor/Fair/Good/Excellent)
 8. How would you rate your level of frustration during an adult ophthalmic exam? (None/Little frustration/Some frustration/Lots of frustration)
 9. How would you rate your level of frustration during a pediatric ophthalmic exam? (None/Little frustration/Some frustration/Lots of frustration)
 10. Which aspects of the pediatric ophthalmic exam do you consider most challenging? (Check all that apply: Rapport building with patient and family/Stereoacuity/Worth four-dot test/Hirschberg's test/Krimsky's test/Cover/Uncover/Alternate cover/Versions/Visual acuity/Pupils/Lids and adnexa/Anterior segment exam/Intraocular pressure measurement/Retinoscopy/Binocular indirect ophthalmoscopy)

To establish content validity, 13 content experts from eight academic institutions, including 12 pediatric ophthalmologists and a lead developer of the OCEX tool, commented on the rubric. The PEAR was modified based on their feedback. The final rubric is shown in [Fig. 1].

Fig. 1 The one-page Pediatric Examination Assessment Rubric (PEAR).

To assess feasibility, the rubric was completed by five pediatric ophthalmology faculty members while observing residents during a new pediatric strabismus encounter. Six residents underwent an observed clinical encounter using the rubric and then completed a postevaluation survey about their experience. Residents were selected for the observed clinical encounter based on their availability, not on the results of the preliminary survey. Five faculty members completed postevaluation surveys (one faculty member observed two different residents). The postevaluation survey included questions about the effectiveness of the rubric in providing feedback to the trainee, the level of distraction caused by the assessment, and suggestions for improvement. The resident and faculty postsurvey questions and response options are listed in [Tables 2] and [3], respectively.

Table 2 Resident postexamination survey questions and response options

 1. Initials (Open response)
 2. Year of training and program you attend (Open response)
 3. How effective was the rubric in providing feedback on your exam skills? (Very effective/Moderately effective/Slightly effective/Not at all effective)
 4. How would you rate your pediatric exam skills following this exercise? (Excellent/Good/Fair/Poor)
 5. Did this exercise distract you from the patient encounter? (Yes/No)
 6. Which aspect(s) of the pediatric ophthalmic exam did you consider most challenging? (Check all that apply: Rapport building with patient and family/Stereoacuity/Worth four-dot test/Hirschberg's test/Krimsky's test/Cover/Uncover/Alternate cover/Versions/Visual acuity/Pupils/Lids and adnexa/Anterior segment exam/Intraocular pressure measurement/Retinoscopy/Binocular indirect ophthalmoscopy)
 7. How would you improve this assessment tool? (Open response)
 8. Other suggestions (Open response)

Table 3 Faculty postexamination survey questions and response options

 1. Initials (Open response)
 2. Year of training and program you attend (Open response)
 3. How effective was the rubric in providing feedback on your exam skills? (Very effective/Moderately effective/Slightly effective/Not at all effective)
 4. How would you rate your pediatric exam skills following this exercise? (Excellent/Good/Fair/Poor)
 5. Did this exercise distract you from the patient encounter? (Yes/No)
 6. Which aspect(s) of the pediatric ophthalmic exam did you consider most challenging? (Check all that apply: Rapport building with patient and family/Stereoacuity/Worth four-dot test/Hirschberg's test/Krimsky's test/Cover/Uncover/Alternate cover/Versions/Visual acuity/Pupils/Lids and adnexa/Anterior segment exam/Intraocular pressure measurement/Retinoscopy/Binocular indirect ophthalmoscopy)
 7. How would you improve this assessment tool? (Open response)
 8. Other suggestions (Open response)



Results

Results from the 11 preliminary surveys are summarized in [Table 4]. The preliminary surveys showed a perceived weakness and lack of confidence in residents' pediatric ophthalmic examination compared with their general adult examination. While 81.8% of residents rated their adult ophthalmic examination skills as good, only 18.2% rated their pediatric ophthalmic examination skills as good. A similar difference (72.7 vs. 36.4%) was seen when comparing comfort levels between adult and pediatric examinations. Residents also acknowledged more frustration with the pediatric eye examination than with the adult general eye examination.

Table 4 Resident pre-examination survey data (n = 11)

                                   Excellent   Good     Fair     Poor
 Pediatric exam skills             0%          18.2%    72.7%    9.1%
 Neuro-ophthalmology exam skills   0%          18.2%    72.7%    9.1%
 General (adult) exam skills       0%          81.8%    18.2%    0%
 Pediatric exam comfort            0%          36.4%    45.5%    18.2%
 General (adult) comfort           27.3%       72.7%    0%       0%

                                   None        Little   Some     Lots
 General (adult) exam frustration  18.2%       81.8%    0%       0%
 Pediatric exam frustration        0%          9.1%     63.6%    27.3%
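The percentages above correspond to simple tallies over the 11 respondents (2/11 ≈ 18.2%, 8/11 ≈ 72.7%, 1/11 ≈ 9.1%). As a minimal sketch of that arithmetic, with hypothetical raw responses chosen to reproduce the first row of [Table 4]:

```python
from collections import Counter

# Hypothetical raw responses reproducing the "Pediatric exam skills" row
# of Table 4 (n = 11): 2 rated "Good", 8 "Fair", 1 "Poor".
responses = ["Good"] * 2 + ["Fair"] * 8 + ["Poor"]

def tally(responses, options=("Excellent", "Good", "Fair", "Poor")):
    """Percentage of respondents choosing each option, to one decimal place."""
    counts = Counter(responses)
    n = len(responses)
    return {opt: round(100 * counts[opt] / n, 1) for opt in options}

print(tally(responses))
# -> {'Excellent': 0.0, 'Good': 18.2, 'Fair': 72.7, 'Poor': 9.1}
```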

The 13 content experts who reviewed the PEAR gave both specific and general comments. Specific comments focused on which skills to include and on the descriptors used for the Likert scale. The reviewers felt that three scoring categories were appropriate. A column for general comments was added based on the expert feedback, which helped create a rubric incorporating both scaled subsection scores and qualitative feedback.

The five faculty members and six residents who participated in the observed resident–patient encounters completed postevaluation surveys. Three of the five faculty members (60%) and all six residents (100%) found the tool to be very effective in providing feedback on examination skills. None of the five faculty members (0%) felt the exercise distracted them from the patient encounter, whereas three of the six residents (50%) felt that it did.

The preliminary and postevaluation surveys also asked residents to identify the pediatric examination skills they found most challenging. Results are shown in [Fig. 2]. Before the exercise, residents felt weakest in retinoscopy, alternate cover testing, binocular indirect ophthalmoscopy, and intraocular pressure testing. After the exercise, residents listed retinoscopy, alternate cover, cover/uncover, and the Worth four-dot test as the most challenging examination skills.

Fig. 2 Pre- and postsurvey data showing the pediatric exam skills residents find most challenging.


Discussion

There has been a shift in residency training from an approach based on years of training to a competency-based approach. The importance of competency-based training in ophthalmology has been established.[6] [7]

To evaluate the six core competencies, the ACGME has mandated that residency programs create valid, reliable, and feasible tools. Several models exist to evaluate a resident, each with its own strengths and weaknesses.[6] Traditionally, most programs evaluate a resident's performance using an informal style of qualitative reviews and global ratings. This style of evaluation tends to generalize a resident's performance and provides little information on specific strengths and areas of weakness. It is also difficult to determine the validity and reliability of this form of evaluation.

Models of evaluation that have been tested and validated in many specialties include the objective structured clinical examination (OSCE) and direct resident–patient observation.[8] [9] [10] [11] The OCEX was created in ophthalmology to evaluate residents in the core competency of patient care. It attempts to combine the most useful features of the OSCE and direct observation into one tool. Validity, feasibility, and reliability of the OCEX have been established.[4] [5]

Although the OCEX is an excellent tool for evaluating a resident's competency in performing a general ophthalmic examination, it is not suited to evaluating pediatric examination skills. For example, surveys from our study show that residents find alternate cover, cover/uncover, the Worth four-dot test, stereoacuity, and retinoscopy to be the most challenging skills of the pediatric ophthalmic examination. These skills are not included in the OCEX rubric.

A tool dedicated to evaluating a resident's pediatric examination skills is important, as trainees' difficulty with this patient population is well documented.[1] [2] To our knowledge, and based on a computerized search of the PubMed database, the PEAR is the first rubric designed specifically to evaluate a resident's pediatric ophthalmic examination skills. The rubric targets examination skills specific to the strabismus encounter, including assessment of binocular sensory function and motor alignment. It also includes sections that assess the entire pediatric ophthalmic examination, making it useful for any type of pediatric encounter.

Feedback from 13 content experts, including 12 pediatric ophthalmologists and 1 developer of the OCEX tool, helped establish content validity for our rubric. We followed a Likert scale design similar to that used by the OCEX. For the alignment section, we modified the rubric because the appropriate test varies from patient to patient; additionally, this aspect of the examination comprises several specific tasks, making checkboxes most appropriate. Most experts agreed with our descriptions of the skills; however, one felt that more detailed descriptions on the Likert scale were warranted. We chose to keep the Likert scale descriptors brief and easily distinguishable so as not to distract the evaluator from the observed clinical encounter. Another expert felt that not every skill would be applicable to each patient encounter. Our hope is that the comment section will allow faculty members to elaborate on their scoring or to indicate that an examination skill was not applicable.

One expert commented that the order of the examination skills in the rubric is not the order he typically follows when conducting an examination. The order listed is the order in which we teach residents at our home institution, but the rubric can be reordered to match the teaching physician's preference.

The potential usefulness of a tool like the PEAR is evident in [Table 4], which shows residents' perceived weakness, discomfort, and frustration with the pediatric ophthalmic examination. The postevaluation survey results demonstrate that the tool successfully facilitates structured feedback for the trainee in the clinical setting. Results displayed in [Fig. 2] show that skills specific to the pediatric examination, including cover/uncover, retinoscopy, the Worth four-dot test, and stereoacuity testing, were considered more challenging after the exercise. Although the small sample size precludes definitive conclusions, this may indicate that the rubric helps residents recognize skills to target for improvement.

Since this was a pilot project, one of our goals was to determine whether using the rubric was feasible during typical clinic workflow. It was easily implemented at our home institution, and we received positive feedback regarding its use at two other academic centers.

A limitation of our pilot project is that we did not assess the reliability of the rubric. This could be done in a future study in the manner used for the OCEX, in which a video-recorded standardized patient encounter was sent to reviewers to determine the interobserver reliability of the assessment tool.[5]
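As one possible design for such a study, agreement between raters scoring the same recorded encounter could be summarized with Cohen's kappa. The sketch below is illustrative only; the rater scores are hypothetical, and a full reliability study might use a different statistic.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' paired categorical ratings."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal rating frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(freq_a) | set(freq_b)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical 3-point PEAR scores from two raters on ten rubric items.
rater_a = [3, 2, 2, 3, 1, 2, 3, 3, 2, 1]
rater_b = [3, 2, 1, 3, 1, 2, 3, 2, 2, 1]
print(round(cohens_kappa(rater_a, rater_b), 2))  # -> 0.7
```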

The rubric is not a “one-size-fits-all” tool, since each pediatric encounter presents its own challenges. It is meant to be a framework for faculty to indicate areas for improvement. Educators may choose to modify the rubric based on their own examination routines and training methods, or to develop entirely novel tools based on the demonstrated need and appreciation for such formative evaluations.

In conclusion, we created, validated, and established the feasibility of a rubric that can be used to evaluate resident examination skills in pediatric ophthalmology, and to facilitate structured feedback. We hope that the PEAR will help improve skills and comfort level in an area of perceived weakness and generate greater interest in this rewarding subspecialty.



Conflict of Interest

None declared.

Acknowledgments

We would like to thank our colleagues for providing their expert feedback on the PEAR rubric: Karl Golnik, Janet Alexander, Michelle Cabrera, Amr Elkamshoushy, Amanda Ely, Robert Gross, Roni Levin, Walker Motley, Daniel Neely, Faruk Orge, Mark Preslan, David Rogers, and Derek Sprunger.

For more information on the PEAR rubric, please visit our Web site: https://sites.google.com/view/pedseyeexam/home.

  • References

  • 1 Foo FY, Leo SW. A nationwide survey of ophthalmology residents' interest in pediatric ophthalmology and strabismus (POS) as a career. J AAPOS 2010; 14 (01) e17
  • 2 Hasan SJ, Castanes MS, Coats DK. A survey of ophthalmology residents' attitudes toward pediatric ophthalmology. J Pediatr Ophthalmol Strabismus 2009; 46 (01) 25-29
  • 3 Holmboe ES, Edgar L, Hamstra S. The Milestones Guidebook. 2016. Available at: http://www.acgme.org/Portals/0/MilestonesGuidebook.pdf. Accessed May 30, 2017
  • 4 Golnik KC, Goldenhar LM, Gittinger Jr JW, Lustbader JM. The ophthalmic clinical evaluation exercise (OCEX). Ophthalmology 2004; 111 (07) 1271-1274
  • 5 Golnik KC, Goldenhar L. The ophthalmic clinical evaluation exercise: reliability determination. Ophthalmology 2005; 112 (10) 1649-1654
  • 6 Lee AG, Carter KD. Managing the new mandate in resident education: a blueprint for translating a national mandate into local compliance. Ophthalmology 2004; 111 (10) 1807-1812
  • 7 Lee AG. The new competencies and their impact on resident training in ophthalmology. Surv Ophthalmol 2003; 48 (06) 651-662
  • 8 Jain SS, Nadler S, Eyles M, Kirshblum S, Delisa JA, Smith A. Development of an objective structured clinical examination (OSCE) for physical medicine and rehabilitation residents. Am J Phys Med Rehabil 1997; 76 (02) 102-106
  • 9 Matsell DG, Wolfish NM, Hsu E. Reliability and validity of the objective structured clinical examination in paediatrics. Med Educ 1991; 25 (04) 293-299
  • 10 Petrusa ER, Blackwell TA, Ainsworth MA. Reliability and validity of an objective structured clinical examination for assessing the clinical performance of residents. Arch Intern Med 1990; 150 (03) 573-577
  • 11 Hauer KE. Enhancing feedback to students using the mini-CEX (clinical evaluation exercise). Acad Med 2000; 75 (05) 524
