Keywords
electronic medical record - adult learning principles - user satisfaction
Background and Significance
High-quality electronic medical record (EMR) education for junior doctors (residents,
registrars, and fellows) is critical to ensure patient safety and reduce stress and
burnout for doctors.[1] Junior doctors interact heavily with the EMR and are expected to perform at full
capacity from the day of commencement. A sound understanding of the workflows and
tools within the EMR contributes significantly to a junior doctor's ability to operate
efficiently and deliver safe care.
Our institution is a large pediatric training hospital in Melbourne, Australia, with
approximately 89,500 emergency department presentations, 345,000 outpatient appointments,
52,000 inpatient admissions, and 16,500 surgical operations per year.[2] In April 2016, the organization transitioned from a predominantly paper-based record
to an enterprise-wide EMR.
As one of the major pediatric teaching hospitals in the state, our organization takes on large numbers of junior doctors (∼300 per year), all of whom require onboarding EMR education. Initially, this ongoing demand for onboarding education of junior doctors
was met by a traditional EMR training model. Education was provided by EMR technicians
with excellent knowledge of the EMR but no experience with medical workflows. The
curriculum was designed and developed in conjunction with the EMR vendor. The pre-implementation curriculum contained expansive content aimed at equipping users with knowledge of most aspects of the EMR on day one, but lacked any emphasis on clinical application.
The delivery style was highly procedural with many sections of the workbook providing
“follow the trainer” activities, where the learner passively imitated the clicks of
the trainer.
Adult education literature suggests adults are more self-directed in their learning
and have a greater need to understand why they should learn something.[3] [4] Adult learners learn best when learning is seen as relevant to their work, builds on existing knowledge, and requires active engagement. Learning activities should be self-paced, using problem-solving exercises covering content that can be applied to real-life situations. It is therefore important that the educator knows and understands the learner's needs and designs learning activities relevant to those needs.[3]
Using these principles, our existing curriculum was overhauled with the aims of reducing the cognitive overload it imposed and improving learning outcomes.
Objective
We aimed to evaluate the impact of these changes to the EMR onboarding curriculum for junior doctors, based on their satisfaction with the training.
Methods
Curriculum re-design: Given the need to make education relevant and case-based, the curriculum had to be informed by junior doctors with intimate knowledge of the EMR workflows required of medical staff. The EMR training team employed three junior doctors to review the existing curriculum in each of the major settings (outpatient, inpatient, and surgical), in line with adult learning principles ([Table 1]). Each doctor had extensive knowledge of the workflows in their designated area,
having worked with the EMR in our organization for at least 12 months. A senior doctor
with a background in medical education provided oversight and consistency between
the modules. Existing content was reviewed and classified as “required starting,” “good-to-know,” and “expert” EMR knowledge. Based on the content deemed “required starting knowledge,” cases representing real-life scenarios were designed so that doctors had to use the requisite skills to complete each activity. “Pearls,” based on the “good-to-know” content, were offered at the end of each activity. Expert-level EMR knowledge was excluded from initial training. Training sessions were designed to run for 3 hours each for the inpatient and outpatient settings (with some junior doctors attending both sessions), with a 7-hour course for surgical trainees covering all settings (inpatient, outpatient, and operating theaters).
Table 1 Summary of major changes made to curriculum

From | To
Didactic, passive learning | Exploratory, active learning
Procedural/IT workflows | Case based
Content heavy | Survival-based knowledge
Taught by analysts with strong EMR knowledge | Taught by doctors with strong workflow knowledge
Generic training | Relevant to work setting (i.e., inpatient, outpatient, and/or surgical)
Trainers set the pace | Learners set the pace

Abbreviation: EMR, electronic medical record.
In addition, training cohorts were tailored specifically to cater for junior surgical doctors, who had previously undertaken generic medical training with a half-hour add-on theater component. A junior surgical doctor was engaged to re-develop the curriculum
to provide surgical cases, and to prioritize activities such as theater workflows,
theater case bookings, and procedural consent. The surgical doctor also delivered
the classroom education to their fellow surgical doctors.
To facilitate a case-based exploration model, interactive e-learning modules were created to orient new users to the system and to provide a basic knowledge platform from which they could engage in case-based learning activities in the classroom ([Fig. 1]).
Fig. 1 Screenshot from e-learning demonstrating an interactive activity.
Due to the change in format from a “follow the trainer” style to a more active learning style, classroom support was increased to a ratio of one educator for every six learners (previously 1:10). The active learning style incorporated exercise booklets, which enabled learners to try out activities simulating the clinical environment ([Fig. 2]).
Fig. 2 Screenshot from exercise booklet containing an activity that simulates the clinical
environment.
An assessment opportunity was provided via a filmed scenario (i.e., a prerecorded video of a ward-round discharge for inpatient training), allowing learners to experience the required workflows in a stress-free environment and to troubleshoot areas of difficulty before leaving the classroom.
Finally, we continued to offer a 2-week period of floor support to new users. This has always been part of our onboarding process but was restructured to both support new users and provide “expert”-level tips to existing users. One-hour personalization workshops were also offered during the first few weeks of work. Calls to the EMR help desk and floor support team were monitored to ensure that the reduction in class content was not leading to critical knowledge gaps.
Data collection: Because it is difficult to measure efficiency and effectiveness directly, we used learner satisfaction and confidence as surrogate measures of learning success. Junior doctor EMR education satisfaction scores from surveys collected in 2018 (before the curriculum re-design) were retrospectively compared with satisfaction scores from surveys collected in 2019. Four questions were asked in both 2018 and 2019, and a further four questions were asked in 2019 only ([Fig. 3]). Satisfaction was rated on a Likert scale of 1 to 5. None of the learners who underwent training and responded to the surveys had received prior training. Raw data were used to calculate means and standard deviations, and Student's t-test was used to compare the two cohorts.
Fig. 3 Image of survey questions from 2019.
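To illustrate the statistical comparison described above, the following minimal sketch (Python with NumPy and SciPy; the rating arrays are hypothetical placeholders, not the study data) computes cohort means, standard deviations, and an unpaired Student's t-test:

import numpy as np
from scipy import stats

# Hypothetical per-respondent Likert ratings (1-5); placeholders, not the study data
ratings_2018 = np.array([4, 3, 4, 5, 3, 4, 2, 4])
ratings_2019 = np.array([5, 4, 5, 4, 5, 4, 5, 3])

# Mean and standard deviation for each cohort
print(f"2018: mean = {ratings_2018.mean():.1f}, SD = {ratings_2018.std(ddof=1):.2f}")
print(f"2019: mean = {ratings_2019.mean():.1f}, SD = {ratings_2019.std(ddof=1):.2f}")

# Unpaired (two-sample) Student's t-test comparing the two cohorts
t_stat, p_value = stats.ttest_ind(ratings_2018, ratings_2019)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")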
Volume of calls to the EMR Help Desk and average talk-time were compared across both
years using the 4-week period following training for each cohort.
Results
A total of 98 survey responses from 2018 (70% response rate) and 119 survey responses from 2019 (85% response rate) were collected following two major intakes of junior medical staff in February and August ([Table 2]). On the questions asked in both years, satisfaction with EMR training increased from 3.8/5 in 2018 to 4.5/5 in 2019 on the 5-point Likert scale (p < 0.0001).
Table 2 Summary of main findings

 | 2018 | 2019 | Difference
Number of respondents | 98 (70% response rate) | 119 (85% response rate) |
Trainer:learner ratio | 1:10 | 1:6 |
Training duration | 4 h | 3 h (7 h for surgical trainees) |
Satisfaction rating (out of 5) | 3.8 | 4.5 | p < 0.0001
E-learnings gave the right amount of preparation | 3.3 | 4.2 | p < 0.0001
Course was appropriately paced | 3.3 | 4.4 | p < 0.0001
I feel equipped/confident to use the EMR on day 1 | 3.4 | 3.7 (78 respondents) | p = 0.13
Trainers had the appropriate knowledge | | 4.8 |
Able to get support in the classroom | | 4.8 |
Motivated to learn further in the EMR | | 4.6 |
It was useful to have a doctor running the training | | 4.8 |
Calls to help desk | 2,885 calls (total talk time 12,030.50 min) | 2,705 calls (total talk time 11,599.15 min) | 180 fewer calls (total talk time reduced by 431.4 min)

Abbreviation: EMR, electronic medical record.
The highest-rated factors contributing to satisfaction after the curriculum re-design
were “I found it useful to have a doctor provide the training” (average rating of 4.87/5) and “Doctors have the appropriate knowledge to provide EMR training” (average rating of 4.88/5). Free-text comments included “It was great that most of the time
was spent practicing skills—high yield overall,” “easy to get help from trainers who
were fantastic,” “it was great to have doctors teaching…they had the knowledge of
what would be clinically relevant.”
Classroom support at a ratio of one educator to six learners was rated 4.85/5, with 92% of respondents rating it 4 or above. Motivation to engage in further EMR education was rated 4.6/5, with 98% of respondents rating it 4 or above. Pacing of the course was rated 4.4/5, with free-text comments varying from “too long” to “longer session please.”
The lowest-rated factor was “I feel confident to use the system on day 1” (78 respondents), with a rating of 3.7/5. For the August cohort, this question was reworded to “I feel equipped to use the system on day 1” (41 respondents) to better reflect the primary goal of training, receiving an overall rating of 4.1/5. Because of the wording change, only the January cohort was compared with 2018 ([Table 2]). Overall, 25.5% (n = 28) of junior doctors were neutral (rating of 3/5), while 6.4% (n = 7) and 0.9% (n = 1) rated their confidence as 2/5 and 1/5, respectively.
Calls to Help Desk did not increase following the modified training. In the 4 weeks
following each training cohort (January/August), there were a total of 2,885 calls
to Help Desk in 2018 and 2,705 calls in 2019. The average talk time was 4.17 minutes
in 2018 (total talk time 12,030 minutes) and 4.29 minutes (total talk time 11,599 minutes)
in 2019.
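Average talk time here is simply total talk time divided by call volume; as a quick check of the reported figures, a minimal Python sketch using the totals above:

# Reported call volumes and total talk times (minutes) for the 4-week windows
calls_2018, minutes_2018 = 2885, 12030.50
calls_2019, minutes_2019 = 2705, 11599.15

print(f"2018 average talk time: {minutes_2018 / calls_2018:.2f} min")    # approx. 4.17
print(f"2019 average talk time: {minutes_2019 / calls_2019:.2f} min")    # approx. 4.29
print(f"Reduction in calls: {calls_2018 - calls_2019}")                  # 180
print(f"Reduction in talk time: {minutes_2018 - minutes_2019:.2f} min")  # 431.35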
Discussion
The introduction of the EMR has been one of the largest transformations in the way
we deliver medicine in the last decade. For users to be effective and efficient in
the EMR, the importance of quality onboarding training cannot be overstated. We have
adopted adult learning practices and delivered a curriculum designed by doctors, tailored
to doctors' needs. In an observational study design, we have demonstrated that satisfaction
ratings have improved in response to this approach.
As training content was reduced to allow for experiential learning, which is more time intensive, we monitored calls to Help Desk as a surrogate measure of learning gaps. Calls to Help Desk decreased following the modified curriculum, suggesting that while content was reduced, the appropriate content had been retained and was taught effectively.
EMR education is a rapidly evolving area with limited evidence to date on effective education strategies for the computer learning environment.[5] [6] [7] This is the first publication, to our knowledge, to measure pre- and post-intervention
learner satisfaction following the institution of case-based learning informed and
delivered by doctors. The findings are consistent with those of the Arch Collaborative, which has compared education strategies against overall organizational EMR satisfaction.
The Arch Collaborative suggests that organizations with higher overall satisfaction
provide onboarding education that puts meaning into training, utilizes case-based
training with patient scenarios, and provides delivery by clinicians.[7] In addition, our training model is in line with that of many successful organizations achieving high EMR satisfaction ratings: a model built on layers of knowledge through initial online training, followed by in-class training, and subsequently supported by floor support and personalization laboratories. The current evidence supports a minimum of 5 to 6 hours of training broken into several sessions, which our adopted model mirrors.[7] We believe that approaching EMR education as more than training doctors to input data allows us to highlight the benefits of using the EMR well, for both doctors and their patients. Our educators emphasize why things must be done the way they are, in order to achieve good patient care and clear communication with other clinicians. Employing doctors who are passionate educators and strong advocates for the EMR as a clinical tool has been critical, and this is reflected in the free-text survey comments. We acknowledge that while this approach was cost-neutral
and had no negative impact on staffing or hospital activity for our organization,
there are many complex factors that may make this a less feasible approach in other
institutions.
While our findings are positive and encouraging, there is room for further growth
in this area. A successful EMR user should achieve strong mastery of the EMR, feel
a sense of shared ownership, and have the EMR meet their unique needs.[8] There are several aspects within the onboarding process that we can focus on to
set up ongoing successful EMR use for clinicians. These include fostering a sense
of excitement about the benefits of an EMR, demonstrating what is possible, and providing details on how to obtain further training. A key finding that points toward an area for improvement is that the lowest-rated factor in our survey was “I feel confident to use the system on day 1.” In our experience, online learning prior to in-class training tends to have poor adherence. Further efforts to encourage learners to engage in the pre-learning activities would provide a higher platform of knowledge to build upon during classroom activities, which may in turn improve the level of confidence achieved.
There are several ways that we can continue to improve our specialized and tailored
training for doctors. Our institution has adopted other projects not described in
this paper, including optimization strategies and specialty-specific builds. These enable EMR users to see that they can shape changes to the EMR, to feel a sense of ownership, and to have the EMR meet their unique needs.
The conclusions of our study are limited by the small sample size (1 year of data
only) and the number of changes being introduced simultaneously to our training program.
It should be noted, however, that the case mix of doctors attending training had no
major differences between 2018 and 2019. Because multiple improvements were introduced at the same time, it is difficult to know which factor had the largest impact. For example, it is possible that the improvement in satisfaction ratings was driven predominantly by the surgeons, who received tailored education for the first time. However, we would argue that the combination of re-designing the curriculum around interactive case-based learning, tailoring the education to each craft group, and providing peer educators has been synergistic and would be difficult to achieve in isolation. In addition,
our study has predominantly reported user satisfaction as a surrogate measure of learning success. The in-class assessment ensured basic-level proficiency and there was no escalation in calls to the EMR Help Desk; however, a more formal proficiency test would be useful in informing further curriculum design.
Conclusion
Computer-based skill acquisition presents unique challenges to the clinical learner, yet these skills are critically important for doctors working within EMRs. We provide evidence of one effective strategy to improve the learning experience for new medical users. Further research is required to better understand the cost-effectiveness of this model of training, how to build upon these skills beyond onboarding, and how to motivate doctors to prioritize EMR skills equally with clinical acumen.
Clinical Relevance Statement
Effective EMR onboarding education is critical to patient safety and to preventing burnout among junior doctors. Junior doctors perceive an interactive, case-based education session delivered by peers as superior to didactic, generic classroom teaching.