Keywords
asthma - electronic health records - patient portal - health equity - human
Background and Significance
The use of electronic health records (EHRs) and accompanying digital tools offers a promising approach for the targeted recruitment of patients into clinical trials.[1] [2] [3] [4] Investigators conducting human subjects research are increasingly leveraging digital
methods for more efficient cohort identification, recruitment, and data collection.
For example, early studies found patient portal messaging to be almost twice as effective
as traditional methods in recruiting and enrolling study participants.[2] The coronavirus disease 2019 (COVID-19) pandemic has further highlighted the benefits
of using digital tools to minimize the need for in-person contact.[5]
While researchers are increasingly utilizing the EHR and digital tools to screen and
contact patients for recruitment into clinical trials, a key challenge is ensuring
that the sampled cohort accurately represents the eligible population.[6] [7] [8] Equitable representation of the sampled cohort is crucial to ensuring validity and
generalizability of the study results. Vulnerable and underserved populations are
underrepresented in clinical trials due to numerous factors, including investigator
bias, medical mistrust, barriers due to differences in health or research literacy,
and lack of access to transportation.[9] [10] [11] [12] In recent years, growing digital divides, including lack of broadband, devices,
and digital literacy,[13] have exacerbated recruitment disparities.[14] [15] To date, few studies have prospectively evaluated the use of primarily digital-based strategies to recruit participants from vulnerable and underserved populations.[2] [16]
The AppS for The Home Monitoring of Asthma (ASTHMA) study is a pragmatic randomized
clinical trial (RCT) focused on recruiting ambulatory patients with asthma to participate
in an app-based remote monitoring intervention. As a clinical trial of a digital health
intervention, a priority was to leverage our EHR to identify an inclusive cohort of
eligible patients for remote recruitment using a multipronged, primarily digital-based
strategy. Our recruitment period started during the second wave of the COVID-19 pandemic
in the United States, providing a unique opportunity to rigorously assess the impact
of this strategy on recruitment equity.
Objectives
The aims of this study were to (1) implement a multipronged recruitment strategy using
primarily digital methods to screen, approach, and enroll patients into a clinical
trial of an app-based digital health intervention; (2) describe a structured approach
to routinely assess enrollment equity during recruitment; and (3) use mixed methods
to evaluate recruitment outcomes with regard to “TechQuity,” defined as the strategic
development and deployment of technology to advance health equity.[17] Our findings may inform best practices to equitably recruit patients into clinical
trials.
Methods
Overview and Study Design
The ASTHMA study (clinicaltrials.gov identifier: NCT04401332) is an RCT of a clinically
integrated digital health intervention that uses electronic patient-reported outcome
(ePRO) questionnaires to monitor asthma symptoms in adult English- and Spanish-speaking
patients between primary care clinic visits. The intervention was initially designed
and tested in subspecialty pulmonary care, and expanded to the primary care setting
for this study.[18] [19] We used a user-centered approach to design the application, and an implementation
framework (nonadoption, abandonment, scale-up, spread, sustainability [NASSS]) to
maximize scalability.[20] The patient-facing components of the intervention included an app that prompts patients
to complete weekly ePRO questionnaires and offers patients the option to request a
callback from a clinic nurse when symptoms worsen. The clinician practice model included
an EHR-integrated dashboard showing a summary of the ePRO trends for clinicians, and
previsit reminders to prompt both patients and clinicians to discuss reported asthma
symptoms during clinic appointments.[18] The intervention was designed to be usable and beneficial for patients with asthma of any severity.[21]
We conducted a prospective study of the recruitment methods used to screen, approach,
and enroll patients into our RCT, which will evaluate the impact of the digital intervention
on asthma-related quality of life measured by the Mini Asthma Quality of Life Questionnaire
(mini-AQLQ), a validated, patient-reported measure.[22] [23] We used specific EHR criteria (see below) to identify and categorize potentially
eligible patients into tiers of varying asthma activity. We approached potential participants
using eight different strategies implemented over the course of our RCT's recruitment
period ([Fig. 1]). The study team met monthly to assess and implement changes to the recruitment
strategy.
Fig. 1 Recruitment period strategy implementation.
Study Setting, Sites, and Participants
The study was conducted during a 20-month recruitment period between July 2020 and
March 2022 at seven primary care clinics affiliated with Brigham Health, an academic
medical center affiliated with Mass General Brigham (MGB) in Boston, Massachusetts,
United States. All clinics used a commercial EHR system (Epic Systems, Inc.) and were
a part of Brigham Health's Primary Care Practice-Based Research Network. All patients
can enroll in MGB's patient portal, Patient Gateway, which is powered by MyChart (Epic
Systems, Inc.) and is available in Spanish as well as other languages. The institutional
review boards (IRBs) of MGB and the RAND Corporation approved all study procedures.
Potentially eligible adult patients (18 years or older) from these clinics were identified by querying MGB's electronic data warehouse (EDW) for the 24 months prior to study initiation (July 2020) and through subsequent data refreshes during the recruitment period (July 2020 to March 2022). Patients were included if they were assigned to a primary care provider (PCP) affiliated with one of the seven primary care clinics and met either of the following criteria: (1) a prior diagnosis of asthma (ICD-10 code J45.xx) either on the EHR problem list or documented during a subspecialty, inpatient, or emergency department encounter; or (2) a diagnosis of asthma and a referral to an Allergy or Pulmonary subspecialist. Potentially eligible patients who were not
considered appropriate (e.g., complex mental health or social issues) for the study
per their PCP or clinic medical director were excluded. Of note, patient portal enrollment,
defined by an “activated” status in the EHR, was not used to identify this initial
cohort.
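For illustration only, the cohort identification logic described above can be expressed as a simple filter over an EDW extract. The sketch below is a minimal Python example using assumed, hypothetical column names (e.g., "pcp_clinic_id", "icd10_codes", "excluded_by_pcp"); it does not reflect the actual MGB data model or query.

```python
import pandas as pd

def identify_potentially_eligible(edw: pd.DataFrame) -> pd.DataFrame:
    """Apply the study's EHR inclusion criteria to a hypothetical EDW extract.

    Assumed (illustrative) columns: age, pcp_clinic_id, icd10_codes (codes from
    the problem list and subspecialty/inpatient/ED encounters), asthma_dx (bool),
    allergy_pulm_referral (bool), excluded_by_pcp (bool).
    """
    study_clinics = {"clinic_1", "clinic_2", "clinic_3", "clinic_4",
                     "clinic_5", "clinic_6", "clinic_7"}  # placeholder IDs

    def has_asthma_code(codes) -> bool:
        # ICD-10 J45.xx denotes asthma
        return any(str(code).upper().startswith("J45") for code in codes)

    adult = edw["age"] >= 18
    study_pcp = edw["pcp_clinic_id"].isin(study_clinics)
    criterion_1 = edw["icd10_codes"].apply(has_asthma_code)        # prior asthma diagnosis
    criterion_2 = edw["asthma_dx"] & edw["allergy_pulm_referral"]  # asthma dx plus referral
    not_excluded = ~edw["excluded_by_pcp"]  # PCP/medical director exclusions

    return edw[adult & study_pcp & (criterion_1 | criterion_2) & not_excluded]
```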
Potentially eligible patients were approached (detailed below) and further screened
using a web-based eligibility questionnaire to confirm that the patient had an asthma diagnosis, was English- or Spanish-speaking, had a PCP from one of the seven study clinics, and owned and used a smartphone. The web-based eligibility questionnaire was accessed electronically through a hyperlink in a patient portal message, a link in the mailed letter, or a QR code on a flyer or letter, or was completed by phone with a research assistant (RA) during the initial recruitment phone call.
Disease Activity Tiers and Recruitment Approach
We defined tiers of varying levels of disease activity based on a targeted review
of the literature and consultation with two asthma clinicians (D.F. and W.C.; see Acknowledgments).[24] [25] Disease activity tiers were constructed based on encounter data retrieved from the
EDW using the following criteria (one or more unless otherwise specified) in order
of decreasing disease activity: (1) hospital visit, (2) emergency department visit,
(3) prednisone prescription, (4) urgent care or walk-in visit, (5) specialist visit,
(6) two or more visits to any provider, and (7) visit related to asthma in past 2
years and asthma on the problem list. Patients who met the criteria for more than
one recruitment tier were placed in the higher activity tier. After initiation of
recruitment, we refreshed the cohort at regular 2-month intervals using the above
criteria to identify additional patients. Of note, the seventh tier (lowest disease
activity) was added after initiating recruitment of patients from the first six tiers
(higher disease activity).
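To make the tier-assignment rule concrete (a patient meeting criteria for multiple tiers is placed in the highest-activity tier), a minimal Python sketch is shown below; the boolean flags are hypothetical per-patient summaries of EDW encounter data, not actual study variables.

```python
def assign_disease_activity_tier(pt: dict) -> int:
    """Return the disease activity tier (1 = highest activity, 7 = lowest).

    `pt` is a hypothetical per-patient summary of EDW encounter data, e.g.:
    {"hospital_visit": False, "ed_visit": True, "prednisone_rx": False,
     "urgent_care_visit": False, "specialist_visit": False,
     "provider_visits": 1, "asthma_visit_2y_and_problem_list": True}
    """
    # Criteria are checked in order of decreasing disease activity, so a patient
    # who meets several criteria is placed in the highest-activity tier.
    if pt["hospital_visit"]:
        return 1
    if pt["ed_visit"]:
        return 2
    if pt["prednisone_rx"]:
        return 3
    if pt["urgent_care_visit"]:
        return 4
    if pt["specialist_visit"]:
        return 5
    if pt["provider_visits"] >= 2:
        return 6
    if pt["asthma_visit_2y_and_problem_list"]:
        return 7
    return 0  # sentinel: no tier criterion met
```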
Multipronged Recruitment Strategy
Eligible patients were approached using one or more of eight recruitment strategies
([Table 1]) until they consented or declined to participate in the study. Initially, patients
were sent a mailed letter and a patient portal message. For patients who did not respond
to the letter or patient portal message, RAs conducted follow-up phone calls and targeted
phone calls prior to an upcoming appointment at their primary care clinic. Clinician-centered
strategies included “1-click” referrals, a simple digital workflow initiated by referring clinicians from within the EHR,
and entries in electronic “huddle” notes in the EHR to remind clinicians to recruit
specific eligible patients scheduled for a clinic appointment that day ([Fig. 2]). Targeted, in-person recruitment was conducted by a bilingual RA on days when four or more
patients had an appointment at a given clinic or if a patient opted to be recruited
in person when approached using one of the other strategies. At the outset of our
study, we obtained IRB approval for our consent form and protocol, which included
a list of initial recruitment strategies. Subsequently, we submitted amendments to
our IRB protocol to add new recruitment strategies to the original list, all of which
were approved before being implemented.
Fig. 2 Clinician-centered recruitment strategies: “1-click” referral, a button to initiate
a referral accessible from the patient's chart in the EHR (blurred border, Epic Systems,
Inc.), and “huddle” notes, entries in “pre-visit planning” notes in the EHR to remind
clinicians to recruit specific patients scheduled for a clinic visit that day. EHR,
electronic health record.
Table 1
Recruitment strategies

1. Mailed letters
• RA mailed letters to patients through the U.S. postal service.
• Initially, all patients were mailed a hard copy of the study recruitment letter.
2. Patient portal message
• RA sent a minimum of two patient portal messages to patients with an activated patient portal status.
3. Clinic-centered strategies
• Participating clinics were provided flyers with information for participating in the study and instructed to give them to potentially eligible patients.
• Study investigators presented an overview of the study to primary care providers at participating clinics during clinic staff meetings.
4. Follow-up phone call
• RA made follow-up phone calls to patients who were sent a mailed letter and/or patient portal message.
5. Text messages
• RA sent patients a text message inviting them to participate in the study.
6. Clinician-centered strategies
• EHR-based "1-click" referrals[a] (2022 Epic Systems Corporation; see [Fig. 2])
– Study team designed a "1-click" referral button made available from within the EHR for clinicians to refer a patient to the study.
– The "1-click" referral generates a message containing the patient medical record number (MRN) that is automatically sent to the study team via email.
– The RAs followed up with referred patients via follow-up phone call and patient portal messaging if available.
• "Huddle notes"[b]
– RA added a recruitment note to patients' charts before their upcoming appointments to remind the provider to mention the study.
• RA emailed clinicians with multiple upcoming appointments in a week, reminding them to mention the study to eligible patients.
7. Targeted phone calls
• RAs called patients before and/or after appointments scheduled with a provider in an ambulatory setting to discuss the study.
8. Targeted, in-person recruitment
• A bilingual RA recruited patients before or after a clinic visit.
– RAs recruited in person if there was a cluster of upcoming appointments, or if a patient (English- or Spanish-speaking) opted to be consented in person.
Abbreviations: EHR, electronic health record; RA, research assistant.
a “1-click” referrals are automatically generated emails initiated by the clinician
to the study team using a button available in the patient's chart in the EHR. The
email contains patient identifying information for the study team to recruit the patient.
b “Huddle notes” are electronic notes within the eligible patient's chart in the EHR
that are reviewed by medical staff during scheduled clinic encounters.
Structured Recruitment Debrief Sessions
During each month of the recruitment period, we conducted a 20-minute session with
research team members using a structured “recruitment debrief” guide ([Supplementary Appendix A1], available in the online version) and recorded all input and feedback. The recruitment
debrief guide was constructed based on review of the literature and consultation with
an expert on health equity (J.R.).[16] [26] The guide included the following topics: monthly recruitment goals, consented patient
demographics, barriers and facilitators to equitable recruitment, overall recruitment
experience during the preceding month, and possible improvements to the recruitment
process. During each session we tracked and reviewed key demographics based on our
equity variables (described below). Based on our analysis of findings from these discussions
(see below), RAs implemented changes to our approach during subsequent months of the
recruitment period.
Data Collection
RAs collected and tracked all recruitment activities in Microsoft Excel. Collected
data included dates and strategies by which potentially eligible patients were approached,
the recruitment strategy reported to be successful by eligible patients during the
consent phone call, and free text field notes of any barriers or facilitators to recruitment
reported by patients. We retrieved demographic data, patient portal enrollment status
(defined above), Charlson comorbidity indices, structured social determinants of health
(food, housing, transportation, and others), and disease activity (defined above)
from the EHR for eligible patients. For all consented participants, RAs asked how they had been recruited to the study and recorded the specific strategy by which each patient was recruited.
Outcomes and Measures
We defined four groups of patients: (1) potentially eligible patients who met inclusion
criteria (identified in the EHR using disease activity tier criteria); (2) approached
patients recruited using one or more strategies; (3) screened patients who completed
the web-based eligibility questionnaire and provided contact information including
a phone number and/or email; and (4) consented patients who provided written informed
consent.
For consented patients, we defined two main outcomes: patient portal recruit, or participants
who reported completing the web-based eligibility questionnaire on their own after
receiving a patient portal message; and nonpatient portal recruit, or participants
who reported being contacted using any nonpatient portal recruitment strategy prior
to completing the web-based eligibility questionnaire. We defined dichotomous equity
variables for age (greater than 65 years of age vs. less than or equal to 65 years
of age), self-identified sex (female vs. male), race (non-White vs. White), ethnicity
(Hispanic vs. non-Hispanic), primary language (Spanish vs. English), median income
by zip code (≤$63,000 vs. >$63,000), education (no college vs. some college or graduate
education), and clinic location (urban vs. suburban). We defined two process outcomes,
the mean number of approaches per patient, and mean number of unique recruitment strategies
used per patient.
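As a concrete illustration of the dichotomization described above, the sketch below derives the equity variables with Python/pandas; the input column names (age, race, median_income_zip, etc.) are hypothetical placeholders rather than actual EHR fields, and handling of missing values is omitted.

```python
import pandas as pd

def add_equity_variables(df: pd.DataFrame) -> pd.DataFrame:
    """Derive the dichotomous equity variables from hypothetical demographic columns."""
    out = df.copy()
    out["age_gt_65"] = out["age"] > 65
    out["female"] = out["sex"] == "Female"
    out["non_white"] = out["race"] != "White"
    out["hispanic"] = out["ethnicity"] == "Hispanic"
    out["spanish_primary"] = out["primary_language"] == "Spanish"
    out["low_income"] = out["median_income_zip"] <= 63_000  # median income by zip code
    out["no_college"] = ~out["education"].isin(
        ["Some college", "Graduated college", "Graduated higher education"]
    )
    out["urban_clinic"] = out["clinic_location"] == "Urban"
    return out
```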
Mixed Methods Analysis
We linked data queried from the EHR to corresponding data from our patient recruitment
tracker by medical record number. We used descriptive statistics to report demographics
of patients who met EHR inclusion criteria, were approached via the multipronged recruitment strategy, completed the web-based eligibility questionnaire,
and consented; and to report the number and percentages of patient portal recruits
and nonpatient portal recruits. A two-sample t-test was used to compare the mean number of approaches per patient between consented patients and approached patients who did not consent, to examine for disparities in recruitment effort. We used Fisher's exact
test to compare our main outcomes, patient portal and nonpatient portal recruits,
by each dichotomized equity variable (above). Multivariable logistic regression was
used to adjust effect size estimates and control for all covariates. All quantitative
analyses were performed using RStudio (version 2022.02.0+443 for Windows).
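The published analyses were performed in R; purely as an illustrative, non-authoritative sketch, the same comparisons could be structured in Python as below, using SciPy for the two-sample t-test and Fisher's exact test and statsmodels for the multivariable logistic regression. The data frame and column names (approaches, portal_recruit, and the equity covariates) are assumptions carried over from the earlier sketches.

```python
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

def compare_recruitment_outcomes(consented: pd.DataFrame,
                                 not_consented: pd.DataFrame,
                                 recruits: pd.DataFrame):
    """Sketch of the quantitative comparisons (illustrative column names only)."""
    # Two-sample t-test: mean number of approaches, consented vs. approached-but-not-consented.
    t_stat, t_p = stats.ttest_ind(consented["approaches"], not_consented["approaches"])

    # Fisher's exact test: patient portal recruitment vs. one dichotomous equity variable.
    table = pd.crosstab(recruits["portal_recruit"], recruits["non_white"])
    unadjusted_or, fisher_p = stats.fisher_exact(table)

    # Multivariable logistic regression adjusting for all equity covariates;
    # exponentiated coefficients give adjusted odds ratios.
    data = recruits.assign(portal_recruit=recruits["portal_recruit"].astype(int))
    model = smf.logit(
        "portal_recruit ~ age_gt_65 + female + non_white + hispanic + "
        "spanish_primary + low_income + no_college + urban_clinic",
        data=data,
    ).fit(disp=0)
    adjusted_ors = np.exp(model.params)

    return t_stat, t_p, unadjusted_or, fisher_p, adjusted_ors
```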
For our qualitative analysis, two members of the research team trained in grounded
theory (S.P. and J.S.F.) independently reviewed and coded all feedback, notes, and
responses from monthly recruitment debrief sessions, extracted representative quotes,
and identified codes and preliminary themes for key recruitment facilitators and barriers
from the research team perspective.[27] Using a similar process, we compiled and analyzed free-text entries and notes recorded
by RAs in our patient recruitment tracker to identify key facilitators and barriers
from the patient perspective. Preliminary themes were reviewed and reconciled during
a final group consensus meeting. All qualitative analyses were conducted using Microsoft
Excel.
Results
We identified a total of 6,853 patients ([Fig. 3]) from the EDW who met study inclusion criteria; notably, 5,783 (84.4%) were patient
portal enrollees. Of these 6,853 potentially eligible patients, 6,366 (92.9%) were
approached, 627 patients (9.1%) were screened using the web-based eligibility questionnaire,
and 445 patients (6.5%) consented using our multipronged strategy. The demographics
of the 627 patients who were approached and completed the web-based eligibility questionnaire
were similar to the 5,739 patients who were approached but did not complete the eligibility
questionnaire ([Table 2]; see also [Supplementary Appendices A2] and [A3], available in the online version) with several notable exceptions: in absolute percentages,
those who did not complete the eligibility questionnaire were more frequently Hispanic
(+6.7%), more frequently Spanish-speaking (+6.3%), and less frequently patient portal
enrollees (−7.4%). The demographics of the 445 patients who completed the web-based
eligibility questionnaire and consented to participate in the main trial were similar
to the 627 patients who completed the web-based questionnaire. The frequencies of missing social determinants data ([Supplementary Appendix A2], available in the online version) from the EHR were similarly high across all four
groups.
Fig. 3 Patient enrollment flowchart.
Table 2
Patient cohort demographics

Characteristics | Met inclusion criteria, n = 6,853 | Approached, n = 6,366 | Completed eligibility questionnaire: No, n = 5,739 | Completed eligibility questionnaire: Yes, n = 627 | Consented, n = 445 | p-Value[a]
Age in years, mean (SD) | 53.8 (17.1) | 53.9 (17.1) | 55.1 (17.3) | 52.5 (15.5) | 52.0 (15.5) | 0.452
Female sex, no. (%) | 5,158 (75.3) | 4,793 (75.3) | 4,293 (74.8) | 499 (79.6) | 346 (77.8) | 0.065
Race/ethnicity, no. (%) | | | | | | 0.005
  American Indian or Alaska Native | 9 (0.1) | 10 (0.2) | 8 (0.1) | 2 (0.3) | 2 (0.4)
  Asian | 188 (2.7) | 183 (2.9) | 166 (2.9) | 18 (2.9) | 12 (2.7)
  White non-Hispanic | 3,394 (49.5) | 3,245 (51.0) | 2,915 (50.8) | 331 (52.7) | 236 (53.0)
  Black non-Hispanic | 1,164 (17.0) | 1,049 (16.5) | 916 (16.0) | 130 (20.7) | 92 (20.7)
  Hispanic | 1,837 (26.8) | 1,642 (25.8) | 1,519 (26.5) | 124 (19.8) | 89 (20.0)
  Native Hawaiian or other Pacific Islander | 2 (0.03) | 3 (0.05) | 2 (0.03) | 1 (0.2) | 1 (0.2)
  Other | 123 (1.8) | 110 (1.7) | 98 (1.7) | 13 (2.1) | 6 (1.3)
  Declined/unavailable/missing | 136 (2.0) | 124 (1.9) | 115 (2.0) | 8 (1.3) | 7 (1.6)
Marital status, no. (%) | | | | | | 0.693
  Partnered | 2,996 (43.7) | 2,822 (44.3) | 2,563 (44.7) | 261 (41.6) | 189 (42.5)
  Nonpartnered/single | 3,775 (55.1) | 3,465 (54.4) | 3,102 (54.1) | 361 (57.6) | 252 (56.6)
  Unknown/missing | 82 (1.2) | 79 (1.2) | 74 (1.3) | 5 (0.8) | 4 (0.9)
Primary language Spanish, no. (%) | 829 (12.1) | 705 (11.1) | 670 (11.7) | 34 (5.4) | 20 (4.5) | <0.001
Education, no. (%) | | | | | | 0.142
  Less than high school | 659 (9.6) | 554 (8.7) | 505 (8.8) | 47 (7.5) | 30 (6.7)
  Graduated high school | 2,086 (30.4) | 1,858 (29.2) | 1,669 (29.1) | 188 (30.0) | 132 (29.7)
  Graduated college | 2,512 (36.7) | 2,425 (38.1) | 2,185 (38.1) | 241 (38.4) | 169 (38.0)
  Graduated higher education | 741 (10.8) | 727 (11.4) | 648 (11.3) | 80 (12.8) | 61 (13.7)
  Unknown/missing | 855 (12.5) | 802 (12.6) | 732 (12.8) | 71 (11.3) | 53 (11.9)
Socioeconomic status (median income by zip code), no. (%) | | | | | | 0.449
  Less than or equal to $47,000 | 648 (9.5) | 568 (8.9) | 510 (8.9) | 57 (9.1) | 44 (9.9)
  $47,001–$63,000 | 776 (11.3) | 708 (11.1) | 650 (11.3) | 58 (9.3) | 37 (8.3)
  Greater than $63,000 | 5,399 (78.8) | 5,060 (79.5) | 4,554 (79.4) | 507 (80.9) | 362 (81.3)
  Missing | 30 (0.4) | 30 (0.5) | 25 (0.4) | 5 (0.8) | 2 (0.4)
Insurance status, no. (%) | | | | | | 0.719
  Commercial | 3,835 (56.0) | 3,656 (57.4) | 3,298 (57.5) | 365 (58.2) | 257 (57.8)
  Medicaid | 1,299 (19.0) | 1,129 (17.7) | 1,012 (17.6) | 116 (18.5) | 82 (18.4)
  Medicare | 1,635 (23.9) | 1,496 (23.5) | 1,349 (23.5) | 141 (22.5) | 104 (23.4)
  Self-pay | 79 (1.2) | 80 (1.3) | 75 (1.3) | 5 (0.8) | 2 (0.4)
  Other government | 5 (0.1) | 5 (0.1) | 5 (0.1) | 0 (0.0) | 0 (0.0)
Patient portal enrollees,[b] no. (%) | 5,783 (84.4) | 5,575 (87.6) | 4,992 (87.0) | 592 (94.4) | 423 (95.1) | <0.001
Charlson comorbidity index, mean (SD) | 1.8 (1.6) | 1.7 (1.6) | 1.7 (1.6) | 1.7 (1.4) | 1.7 (1.5) | 0.217
Abbreviation: SD, standard deviation.
a p-Values compare the following groups: met inclusion criteria, approached, completed web-based eligibility questionnaire ("yes"), and consented.
b Patient portal enrollees were defined as having an "activated" status in the EHR.
We observed no significant difference in the mean (standard deviation) number of approaches per patient between consented patients (3.2 [2.1]) and approached patients who were not consented (3.1 [1.7]); t = −0.1, 95% confidence interval (CI): −0.20 to 0.18, p = 0.92. Of the 445 enrolled patients (of whom 423 [95.1%] were patient portal enrollees),
241 (54.2%) reported being recruited via the patient portal (i.e., completed the eligibility
questionnaire after receiving a patient portal message). In unadjusted analyses, patient
portal recruits were significantly more likely to be White, non-Hispanic, and higher income, and to have some college education, compared with nonpatient portal recruits ([Table 3]). There were no significant differences in age, sex, or language between patient portal recruits and nonpatient portal recruits. Patients affiliated with
urban clinics were significantly less likely to be recruited via the patient portal.
In adjusted analyses, non-White participants (odds ratio [OR]: 0.46, 95% CI: 0.28–0.77,
p = 0.003) and participants with no college education (OR: 0.60, 95% CI: 0.39–0.91,
p = 0.016) were significantly less likely to be recruited via the patient portal. The
demographics of consented participants recruited by patient portal message and specific
nonpatient portal recruitment strategies are available in [Supplementary Appendix A4] (available in the online version).
Table 3
Patient portal recruits versus nonpatient portal recruits by equity variable

Equity variable | Patient portal recruits, n = 241 | Nonpatient portal recruits, n = 204 | Unadjusted OR | 95% CI | p-Value | Adjusted OR | 95% CI | p-Value
Age, no. (%)
  Greater than 65 | 56 (23.3) | 41 (20.3) | 1.20 | 0.75–1.95 | 0.490 | 0.80 | 0.48–1.32 | 0.385
  Less than or equal to 65 | 185 (76.7) | 163 (79.7)
Sex, no. (%)
  Female | 182 (75.8) | 164 (80.2) | 0.74 | 0.46–1.19 | 0.211 | 0.84 | 0.52–1.36 | 0.484
  Male | 60 (24.2) | 40 (19.8)
Race, no. (%)
  Non-White | 84 (34.6) | 125 (60.9) | 0.34 | 0.23–0.51 | <0.001 | 0.46 | 0.28–0.77 | 0.003[a]
  White | 157 (65.4) | 79 (39.1)
Ethnicity, no. (%)
  Hispanic | 33 (13.3) | 56 (26.7) | 0.42 | 0.25–0.69 | <0.001 | 0.82 | 0.44–1.52 | 0.535
  Non-Hispanic | 208 (86.7) | 148 (73.3)
Primary language, no. (%)
  Non-English | 7 (2.9) | 14 (6.9) | 0.41 | 0.14–1.10 | 0.071 | 0.91 | 0.31–2.51 | 0.856
  English | 234 (97.1) | 190 (93.1)
Median income by zip code, no. (%)
  Low income (≤$63,000) | 33 (13.8) | 50 (24.3) | 0.50 | 0.29–0.82 | 0.005 | 0.91 | 0.52–1.59 | 0.736
  High income (>$63,000) | 208 (86.3) | 154 (75.7)
Education, no. (%)
  No college | 95 (39.2) | 120 (58.4) | 0.46 | 0.31–0.68 | <0.001 | 0.60 | 0.39–0.91 | 0.016[a]
  Some college and above | 146 (60.8) | 84 (41.6)
Clinic location,[b] no. (%)
  Urban | 133 (55.0) | 173 (84.7) | 0.22 | 0.13–0.36 | <0.001 | 0.68 | 0.43–1.04 | 0.079
  Suburban | 108 (45.0) | 31 (15.3)
Abbreviations: CI, confidence interval; OR, odds ratio.
a Hommel-corrected values for race and education were p = 0.026 and 0.115, respectively.
b Urban clinics are those located within Boston city limits.
Table 4
Monthly recruitment debrief findings: key themes and examples

Barriers
Technological issues
• Lack of access to email and smartphones
• Lack of confidence using a smartphone
• Limited data plan availability
Caregiver availability
• Caregivers unavailable to help patients with recruitment documentation and history
Small pool of Spanish-speaking patients
• Diminishing number of Spanish-speaking patients to approach for recruitment toward the end of the recruitment period

Facilitators
Large pool of eligible patients
• Data refreshes were necessary to ensure a large pool of eligible patients and corresponded to an increase in patients who were successfully recruited
• Addition of two clinics later in the recruitment period increased the eligible patient pool
Bilingual study staff
• Bilingual RA facilitated the recruitment of English- and Spanish-speaking patients
• RA was able to offer in-person recruitment to all patients who requested to be recruited in person; all patients who were ultimately recruited in person were Spanish speakers
• Recruitment of Spanish-speaking patients via phone calls around scheduled clinic appointments was a successful strategy for enrollment
English and Spanish recruitment materials
• Patient portal messages, letters, and text messages were sent with improved readability for health literacy in both English and Spanish
• Few patients listed as "Spanish-speaking" and "does not need an interpreter" in the EHR preferred Spanish materials or speaking in Spanish
Targeted recruitment at upcoming patient appointments
• Focusing recruitment efforts (patient portal messages, phone calls, huddle notes) around scheduled patient appointments improved patient engagement
• Higher success rate of patients answering the phone when called within a few days of a scheduled appointment
Clinician-initiated referrals
• Clinicians sent study referrals using the "1-click" referral button available in the EHR
• Patients reported receiving recommendations from their PCP regarding study enrollment
Abbreviations: EHR, electronic health record; PCP, primary care provider; RA, research
assistant.
Eight major themes (three barriers, five facilitators) emerged from the 13 recruitment
debrief sessions. Key barriers included: technological issues related to smartphone
or email access, caregiver availability for patients who expressed needing support
with recruitment procedures, and a small pool of Spanish-speaking patients to recruit
([Table 4]). Key recruitment facilitators included: availability of bilingual study staff,
recruitment from a large pool of eligible patients, use of Spanish-language recruitment
materials, conducting targeted recruitment prior to upcoming patient appointments,
and clinician-initiated referrals.
Of the 6,366 patients approached, 934 (14.7%) reported one or more barriers to participation in the study. The 943 barriers reported by these patients were as follows: mild or well-controlled asthma (331, 35.1%); not interested in
participating (161, 17.1%); technology barriers such as not having a smartphone or
email access (148, 15.7%); unable to make time commitment (141, 15.0%); health issues
(54, 5.7%); spoke a language other than English or Spanish and required an interpreter
(35, 3.7%); out of network or had a change in PCP (23, 2.4%); skepticism about participating
(22, 2.3%); had cognitive issues or dementia (12, 1.3%); not enough study compensation
(7, 0.7%); ineligible due to age (5, 0.5%); or had asthma complications (4, 0.4%).
Discussion
We conducted a mixed methods study to assess equity in recruitment for a clinical
trial of a digital health intervention aimed at remotely monitoring asthma. Our recruitment
efforts coincided with the start of the second wave of the COVID-19 pandemic in the
United States, which provided an opportunity to rigorously evaluate our digital recruitment
methods during a time of mostly remote health care. Most of the patients who were
potentially eligible for our study were patient portal enrollees, according to our
EHR data. Of the potentially eligible patients, 9.1% completed the electronic screening process and, ultimately, 6.5% consented using our multipronged approach. We did observe a few disparities among potentially eligible patients who did not complete the web-based eligibility questionnaire: compared with those who completed it, they were more frequently Hispanic and Spanish-speaking and less frequently patient portal enrollees, although these differences were at most 6 to 8% in absolute terms. While the majority of participants
who completed the electronic eligibility questionnaire and consented were patient
portal enrollees (95%), only 54% reported being recruited via the patient portal.
In adjusted analyses, we observed significant disparities in race and education for patient portal recruits compared with nonpatient portal recruits. Our thematic analysis of monthly structured recruitment debrief sessions indicated that targeted strategies, such as calling eligible patients prior to scheduled clinic appointments and offering in-person recruitment, in addition to patient portal messaging and mailed letters, were effective in achieving diversity. Both the research team and patients identified technological gaps, such as lack of access to email or a smartphone and limited digital literacy, as major barriers to the remote recruitment of Spanish-speaking individuals.
Our research team utilized monthly recruitment debriefs to identify key barriers (e.g.,
lack of email access) and facilitators (e.g., targeted recruitment strategies) for
equitable recruitment. The debriefs followed a structured approach, allowing us to
assess our recruitment approach in real-time and adopt a flexible strategy to optimize
recruitment from underrepresented groups. Comparing the barriers and facilitators
identified by the research team and patients offered insights into the mechanisms
likely required to ensure successful recruitment equity (i.e., targeted, in-person
recruitment using a bilingual RA). For example, both the research team and patients
identified language as a key barrier. Consequently, we prioritized assignment of a
full-time bilingual RA to actively conduct in-person recruitment for Spanish-speaking
patients. Language concordance between prospective patients and a bilingual RA was
subsequently identified as a critical facilitator for recruiting Spanish-speaking
patients and may explain why we did not observe a significant disparity in the language
category in either the unadjusted or adjusted outcomes analyses.
Few studies have utilized a structured process to evaluate and modify a recruitment approach that combines multiple digital and nondigital strategies to achieve equity.[16] Additionally, few studies have examined recruitment in a population with as high a patient portal enrollment rate as ours (which exceeds the current rate [∼70%] reported across the MGB enterprise). Our study presented a unique opportunity
to rigorously evaluate a multipronged recruitment strategy to remotely screen, approach,
and enroll patients into a clinical trial with regard to "TechQuity." Although the patient portal was one of the earliest investments in telehealth technology, its adoption has remained low among underserved populations due to unevenly distributed access to, and affordability of, internet service and devices, as well as limited digital literacy.[28] [29] [30] [31] These factors are likely to exacerbate disparities in equitable recruitment into research trials and may explain why we had a small number of Spanish-speaking patients
who ultimately consented to participate. However, the targeted strategies we describe
in this study could help to address these gaps. Our preliminary data ([Supplementary Appendix A4], available in the online version) suggest that specific nonpatient portal strategies
were more frequently successful at recruiting Hispanic and Spanish-speaking patients,
as well as those with lower educational backgrounds.
The COVID-19 pandemic has accelerated digitization and the transition to remote interactions
in the health care industry, but has also exacerbated existing inequalities.[32] It is important to note that while patient portal enrollment was high among our
cohort, it did not necessarily translate to actual patient portal use, including acting
upon the patient portal recruitment message. Our findings are consistent with previous
research which suggests that patient portal enrollment does not equate to continued
patient portal use across various demographics,[33] and particularly among underserved patients who were less likely to use digital
tools during the pandemic.[34] [35] For instance, all three patients who opted for in-person recruitment by a bilingual RA were Spanish speakers, Hispanic, and elderly (mean age of 66 years), even though two of these patients were enrolled in the patient portal, which offered multilanguage
support. Thus, recruitment through electronic screening and patient portal messaging
alone may not be sufficient to achieve recruitment equity in clinical trials. In the
future, ensuring equity in clinical trial recruitment will require assessing digital
literacy and providing support to patients, such as digital navigators.[36] [37] [38] [39] These findings underscore digital inclusion as an important focus of health care policy and administration.[40]
Our study has several limitations. First, our study was conducted at an institution
with high patient portal enrollment and utilized a web-based screening process which
may have introduced recruitment disparities for specific demographics. Second, relying
on patient portal enrollment status in the EHR to identify patients to recruit via
patient portal message may not accurately reflect patient portal use, which requires
adequate digital literacy. Third, despite our intention to prioritize patients based on disease activity tiers ([Supplementary Appendix A3], available in the online version), we had to recruit patients from all tiers due to low consent rates during the pandemic. To include additional eligible patients, we added
disease activity tier 7 after commencing recruitment. Fourth, we did not engage patient-based
community groups to participate in the recruitment process, though we involved clinic
PCPs and administrative staff as study advocates. Lastly, we did not collect data
to confirm which strategies reached patients, such as how many patient portal messages
were read or phone calls answered by patients.
Conclusion
Our study highlights the importance of intentional and targeted efforts to achieve
diversity in clinical trial recruitment. To improve recruitment equity, researchers
should plan and budget for such efforts in their study design. A practical approach
that researchers can adopt is a multipronged recruitment strategy, including regular
debrief sessions, to identify barriers to equitable recruitment. Future studies could
maximize recruitment equity by staffing projects with multiple bilingual RAs, principal
investigators, and study team members.
Clinical Relevance Statement
With the trend toward digitization of recruitment into clinical trials, strategies
that utilize both digital and nondigital methods will continue to be necessary to
ensure clinical trial equity.
Multiple-Choice Questions
-
When recruiting patients for a digital health clinical trial intervention, which modality
is most likely to contribute to equitable patient enrollment?
Correct Answer: The correct answer is option b. A bilingual research assistant was the modality that
successfully recruited Spanish-speaking patients in this study.
-
Which of the following are barriers to equitable enrollment?
Correct Answer: The correct answer is option d. Lack of smartphone access and email access were common
technical barriers identified in our study. Other types of barriers include lack of
caregiver support and language discrepancy.