CC BY-NC-ND 4.0 · Journal of Academic Ophthalmology 2018; 10(01): e127-e132
DOI: 10.1055/s-0038-1668574
Research Article

The Pediatric Examination Assessment Rubric (PEAR): A Pilot Project

Michael Langue,1 Ingrid U. Scott,1,2 Ajay Soni1

1 Department of Ophthalmology, Penn State Eye Center, Hershey, Pennsylvania
2 Department of Public Health Sciences, Penn State College of Medicine, Hershey, Pennsylvania
Publication History

22 April 2018
09 July 2018
Publication Date: 16 August 2018 (online)

Abstract

Purpose The pediatric ophthalmic examination is often considered challenging for ophthalmologists at any level of training. New tools need to be developed and tested to enhance resident and fellow training in pediatric ophthalmology. To our knowledge, this pilot project introduces the first educational rubric designed specifically for the pediatric ophthalmic examination.

Methods Preliminary surveys were completed by 11 ophthalmology residents, spanning all three postgraduate years (PGY), to gauge comfort level with the pediatric ophthalmic examination. A one-page Pediatric Examination Assessment Rubric (PEAR) was developed and reviewed by 13 content experts (12 pediatric ophthalmologists and a lead developer of the Ophthalmic Clinical Evaluation Exercise [OCEX] tool) at eight academic institutions. Five educators from three academic institutions then used the rubric to evaluate six residents during a new strabismus evaluation. Postevaluation surveys were completed by both the five educators and the six residents.

Results Preliminary surveys showed that only 18.2% of residents rated their pediatric examination skills as good. Residents reported higher levels of frustration and less comfort with the pediatric examination compared with the adult examination. The 13 experts' comments were incorporated into the rubric to establish content validity. Postevaluation surveys showed that 60% of faculty and 100% of residents found the rubric very effective in providing feedback.

Conclusion In this pilot project, we documented the need for more concrete educational tools in pediatric ophthalmology, created such a tool, established its content validity, and demonstrated its feasibility. The PEAR helps residents identify skills to target for improvement based on the quality of their pediatric ophthalmic examinations. At three academic institutions, the PEAR was shown to be easy to use and a useful tool for training residents to perform the pediatric ophthalmic examination.

 