Appl Clin Inform 2024; 15(04): 785-797
DOI: 10.1055/s-0044-1788978
Research Article

Evaluation of a Primary Care-Integrated Mobile Health Intervention to Monitor between-Visit Asthma Symptoms

Authors

  • Jorge A. Sulca Flores

    1   Division of General Internal Medicine Primary Care, Brigham and Women's Hospital, Boston, Massachusetts, United States
  • Anuj K. Dalal

    1   Division of General Internal Medicine Primary Care, Brigham and Women's Hospital, Boston, Massachusetts, United States
    2   Harvard Medical School, Boston, Massachusetts, United States
  • Jessica Sousa

    3   Health Care Division, RAND, Boston, Massachusetts, United States
  • Dinah Foer

    2   Harvard Medical School, Boston, Massachusetts, United States
    4   Division of Allergy and Clinical Immunology, Brigham and Women's Hospital, Boston, Massachusetts, United States
  • Jorge A. Rodriguez

    1   Division of General Internal Medicine Primary Care, Brigham and Women's Hospital, Boston, Massachusetts, United States
    2   Harvard Medical School, Boston, Massachusetts, United States
  • Savanna Plombon

    1   Division of General Internal Medicine Primary Care, Brigham and Women's Hospital, Boston, Massachusetts, United States
  • David W. Bates

    1   Division of General Internal Medicine Primary Care, Brigham and Women's Hospital, Boston, Massachusetts, United States
    2   Harvard Medical School, Boston, Massachusetts, United States
  • Adriana Arcia

    5   Hahn School of Nursing and Health Science, University of San Diego, San Diego, California, United States
  • Robert S. Rudin

    3   Health Care Division, RAND, Boston, Massachusetts, United States

Funding This project was supported by grant numbers R18HS026432 and R18HS026432-02S1 from the Agency for Healthcare Research and Quality. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health or the Agency for Healthcare Research and Quality.
 

Abstract

Objectives This study aimed to evaluate implementation of a digital remote symptom monitoring intervention that delivered weekly symptom questionnaires and included the option to receive nurse callbacks via a mobile app for asthma patients in primary care.

Methods Research questions were structured by the NASSS (Nonadoption, Abandonment, Scale-up, Spread, and Sustainability) framework. Quantitative and qualitative methods assessed scalability of the electronic health record (EHR)-integrated app intervention implemented in a 12-month randomized controlled trial. Data sources included patient asthma control questionnaires; app usage logs; EHRs; and interviews and discussions with patients, primary care providers (PCPs), and nurses.

Results We included app usage data from 190 patients and interview data from 21 patients and several clinician participants. Among 190 patients, average questionnaire completion rate was 72.3% and retention was 78.9% (i.e., 150 patients continued to use the app at the end of the trial period). App use was lower among Hispanic and younger patients and those with fewer years of education. Of 1,185 nurse callback requests offered to patients, 33 (2.8%) were requested. Of 84 PCP participants, 14 (16.7%) accessed the patient-reported data in the EHR. Analyses showed that the intervention was appropriate for all levels of asthma control; had no major technical barriers; was desirable and useful for patient treatment; involved achievable tasks for patients; required modest role changes for clinicians; and was a minimal burden on the organization.

Conclusion A clinically integrated symptom monitoring intervention has strong potential for sustained adoption. Inequitable adoption remains a concern. PCP use of patient-reported data during visits could improve intervention adoption but may not be required for patient benefits.


Background and Significance

Mobile health (mHealth) and remote patient monitoring (RPM) technologies have become increasingly available to patients.[1] [2] mHealth involves the use of mobile communication devices or wearables to support patient health,[3] and it can support diagnosis, clinical decision-making, health behavior change, and health education.[4] RPM interventions may leverage devices to transmit patient health data to care teams,[5] such as cardiac implantable electronic devices,[2] digital blood pressure monitors,[6] pulse oximeters,[7] or blood glucose monitors.[8] Some evidence suggests that RPM may improve chronic disease management[9] and reduce acute care utilization,[10] and reimbursement rates increasingly support these efforts.[1] [11] [12] [13] However, there is a notable gap in the trend toward increased adoption of mHealth and RPM: monitoring programs that assess subjective measures of patient symptoms between clinical encounters.

Evidence suggests that remote symptom monitoring interventions can greatly improve management of chronic illness by increasing patients' symptom awareness and by facilitating more timely escalation to clinicians when needed.[14] [15] [16] [17] [18] [19] [20] The widespread adoption of smartphones may facilitate access to remote symptom monitoring.[14] [21] [22] [23] [24] [25] [26] [27] However, other than in oncology and surgical specialties within specific care pathways and protocols,[17] [28] [29] patients with chronic conditions are still largely on their own to manage the symptoms of their conditions and decide when they need to call their clinicians between health care visits. Although recently released billing codes for this type of monitoring (called remote therapeutic monitoring) may help motivate adoption, knowledge of how best to leverage these codes does not yet exist.[12] [13]

Implementation studies of scalable interventions can help identify barriers, facilitators, and best practices, thereby facilitating the widespread adoption of symptom monitoring. We implemented an mHealth intervention to remotely monitor patient-reported symptoms of asthma as part of a pragmatic randomized controlled trial (RCT) (ClinicalTrials.gov Identifier NCT04401332).[18] [30] [31] In this study, we evaluated the intervention's implementation and scalability using the Nonadoption, Abandonment, Scale-up, Spread, and Sustainability (NASSS) framework, which was designed for evaluating the sustainability of health technologies.[32] [33]


Objective

Our objective was to assess the implementation of a remote asthma symptom monitoring intervention using quantitative and qualitative methods to determine the intervention's potential for sustained adoption in primary care. The intervention consisted of an electronic health record (EHR)-integrated mobile app for between-visit symptom monitoring, a clinician dashboard in the EHR, and an integrated practice model ([Table 1]). Specific research questions are shown in [Table 2] and assess the type of patients suitable for the intervention, impact of sociocultural factors, usage of technology, stakeholder value proposition, changes in patient activities and staff roles, and organizational changes required. The results of the RCT evaluating effectiveness of the intervention in terms of asthma-related quality of life and health care utilization will be reported separately.

Table 1

Summary of intervention components and implementation context

Patient-facing mHealth app

Available in Spanish and English, the app allowed patients to complete weekly symptom questionnaires; view their data in graph form; enter notes, triggers and peak flow values; and watch educational videos such as how to use a rescue inhaler. The app was available for iOS and Android devices

Clinician-facing dashboard

All patient-entered data were available to clinicians from the EHR within the patient chart with one click on the “ASTHMA data” tab

Weekly symptom questionnaires

Patients were prompted to complete an initial 5-item baseline ACM questionnaire followed by similar weekly PRO questionnaires. If the PROs reflected problematic symptoms (defined as a 3-point worsening in the ACM compared with their baseline or prior week's score, or the most severe response option on any one question), the app prompted the patient with the option to request a call from a nurse and sent an EHR inbox message to a triage nurse in the patient's primary care clinic (a logic sketch of this rule follows the table)

In-person visit reminders

PCPs received EHR inbox messages prior to scheduled visits with participating patients reminding them to view the PRO data in the dashboard. Patients also received reminders in the app prior to scheduled PCP visits to bring their smartphone and discuss their asthma data with their clinician

Implementation context

The intervention was implemented as part of an RCT conducted July 2020–April 2023 at seven primary care clinics in the practice-based research network affiliated with Brigham Health. Consented patients were randomly assigned to receive the intervention and download the mHealth app or to receive usual care. Patients were allowed to use the app for the 12-mo study period as well as a poststudy period, which varied in duration by patient according to when they enrolled. The research team worked closely with patient representatives, PCPs, nurses, and clinic leadership to design, plan, test, and implement the intervention

Abbreviations: ACM, Asthma Control Measure; EHR, electronic health record; PCP, primary care provider; PRO, patient-reported outcome; RCT, randomized controlled trial.
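To make the escalation rule above concrete, the following is a minimal logic sketch, not the study's production software: it assumes hypothetical type and function names, treats a "3-point worsening" as an increase of at least 3 points in the ACM total (range 0–19, with higher totals indicating worse control), and assumes each item's most severe response option is known.

```python
# Minimal sketch of the callback-trigger rule described in Table 1.
# Assumptions: "3-point worsening" is treated as an increase of >= 3 points in the
# ACM total (range 0-19, higher = worse control); item scoring details and all
# names below are illustrative, not the study's implementation.
from dataclasses import dataclass
from typing import List, Optional

WORSENING_THRESHOLD = 3  # points of ACM worsening vs. baseline or prior week


@dataclass
class QuestionnaireItem:
    score: int       # patient's response to one weekly ACM item
    max_score: int   # the most severe response option for that item


def acm_total(items: List[QuestionnaireItem]) -> int:
    return sum(item.score for item in items)


def is_problematic(items: List[QuestionnaireItem],
                   baseline_total: int,
                   prior_week_total: Optional[int]) -> bool:
    """True if this week's responses meet the escalation rule."""
    current = acm_total(items)
    worsened_vs_baseline = current - baseline_total >= WORSENING_THRESHOLD
    worsened_vs_prior = (prior_week_total is not None
                         and current - prior_week_total >= WORSENING_THRESHOLD)
    any_most_severe = any(item.score >= item.max_score for item in items)
    return worsened_vs_baseline or worsened_vs_prior or any_most_severe


def handle_weekly_submission(items, baseline_total, prior_week_total):
    # Stand-ins for the in-app prompt and EHR inbox message described in Table 1.
    if is_problematic(items, baseline_total, prior_week_total):
        print("Offer the patient the option to request a nurse callback")
        print("Send an EHR inbox message to the triage nurse at the patient's clinic")
```

Checking against both the baseline and the prior week's total means that gradual and abrupt worsening can each trigger the callback offer.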


Table 2

Research questions, data sources and methods

| Research questions (adapted from select domains of the NASSS framework) | Data sources | Quantitative and qualitative methods |
|---|---|---|
| Condition | | |
| How wide a range of asthma control level is suitable for the intervention? | Patient characteristics (REDCap); patient app usage logs | Descriptive statistics of weekly questionnaire use by patient asthma control level |
| | Patient and PCP interviews | Content analysis of interview data to identify patient benefits and burdens |
| Which sociocultural factors are associated with higher or lower levels of app use? | Patient characteristics (REDCap); patient app usage logs | Descriptive statistics of weekly questionnaire use by patient education, race/ethnicity, language, and other variables |
| | PCP interviews | Content analysis to identify sociocultural barriers |
| Technology | | |
| For the various features of the app and dashboard, how frequently are they used and what technical barriers and facilitators do users encounter? | Patient app/clinician dashboard usage logs | Descriptive statistics comparing patient usage of app features and summarizing clinician dashboard usage |
| | Patient and PCP interviews | Content analysis to identify technical barriers and facilitators related to app and dashboard usage |
| Value proposition | | |
| What is the intervention's desirability and efficacy for patient treatment from stakeholder perspectives? | Patient call request data in EHR | Chart audits to assess clinical efficacy of call requests by characterizing downstream actions and relevance to asthma |
| | Patient, PCP, and nurse interviews | Content analysis to assess perceived impact of the intervention on patient treatment, desire to continue using the app/dashboard, and likelihood of recommending it to others |
| Adopter system | | |
| What is expected of the patient and is it achievable? | Patient app usage logs | Weekly questionnaire completion data summarized to assess patients' ability to use the app |
| | Patient interviews | Content analysis of interviews to characterize experience completing tasks in the app and reported degree of task complexity |
| What changes in staff roles and practices are affected? | PCP interviews | Content analysis to identify the nature and significance of workflow/role changes for integration of mHealth app data into patient visit discussions |
| | Nurse interview | Content analysis to identify the impact of nurse workflow/role changes required for managing patient call requests |
| Organization | | |
| What is required of the organization to implement the intervention? | Research team implementation processes | Descriptions of how the research team engaged clinics, clinicians, and patients |
| | Patient app usage logs; patient call request data from EHR | Descriptive statistics of call requests |
| | PCP and nurse interviews | Content analysis of PCP and nurse interviews to identify impact on workload burden |

Abbreviations: EHR, electronic health record; NASSS, Nonadoption, Abandonment, Scale-up, Spread, and Sustainability; PCP, primary care provider; REDCap, Research Electronic Data Capture.



Methods

Intervention and Implementation Context

A summary of the intervention and implementation context is shown in [Table 1]. Our digital intervention for asthma symptom monitoring has been described previously in detail.[18] [30] [31] [34] Briefly, we selected asthma as the initial focus for our remote symptom monitoring intervention because it is a prevalent disease (7.7% of the U.S. adult population as of 2017) that impairs quality of life,[35] [36] [37] [38] and because existing guidelines and evidence support monitoring of asthma symptoms.[39] [40] [41] [42] [43] [44] [45] [46] [47] We previously pilot tested the intervention in pulmonary care.[30] [31] For this implementation study, we adapted the intervention and scaled it to primary care, which is where most patients with asthma receive treatment.[18] Starting with the original intervention, we used user-centered design methods informed by the NASSS framework to adapt it to the primary care setting.[32] [33] This framework was specifically developed to inform the development and evaluation of health technology interventions and to consider potential for widespread adoption. We designed the app in Spanish and English and deliberately sought to recruit patients from primary care clinics with populations of Spanish-speaking asthma patients to assess the intervention's ability to be implemented among underrepresented groups.[34] The implementation was conducted as part of an RCT at seven primary care clinics affiliated with two hospitals—Brigham and Women's Hospital and Brigham and Women's Faulkner Hospital—that used the same EHR (Epic Systems).


Study Design and Framework

We used the NASSS framework to structure our specific research questions and inform our data collection and analysis ([Table 2]).[32] [33]


Study Populations and Data Collection

We included intervention patients who completed the baseline Asthma Control Measure (ACM)[48] [49] in the app and who did not withdraw from the study. As part of the RCT, patients were paid $25 to complete baseline surveys and another $25 to complete surveys at study completion.

Our quantitative data for patients included surveys collected as part of the RCT via REDCap[50] (baseline ACM scores, demographics); mHealth app usage and callback request data; and EHR data for encounters of any type during the 4 weeks following any callback request made through the app (encounter type, ICD-10 [10th revision of the International Statistical Classification of Diseases] coded primary and secondary diagnoses, and medication changes). To understand primary care provider (PCP) and nurse experiences with implementation, we also collected data on dashboard usage and EHR inbox messages sent as part of the intervention.

For qualitative data, we conducted semi-structured interviews with a subset of patients, purposively sampled to include a diverse range of language, age, gender, race, ethnicity, smartphone type, clinic affiliation, and app usage. We also conducted semi-structured interviews with a sample of participating PCPs, targeting those with the most experience interacting with intervention patients (PCPs with the most intervention patients or with the greatest usage of the dashboard in the EHR) and the nurse with greatest recall of experience responding to callback requests. Empirical assessments suggest that saturation can typically be achieved with 9 to 17 interviews[51]—we planned to conduct 20 or more interviews.

Interviews were conducted from December 2021 to March 2023. We developed the interview guides based on the quantitative findings and topics under the NASSS domain relevant to each research question. For research questions aimed at identifying barriers and facilitators, we used interview questions derived from published best practices.[52] Patient interview topics included experience with app features, barriers and facilitators to app usage, desirability of using the app after study completion, interest in recommending the app to someone with asthma, and content of conversations conducted during routine PCP appointments. Interviews with PCPs and the participating nurse covered experience using the dashboard and impact on clinic workflows and patient care. Interviewees received a $25 gift card. Interviews lasted 30 to 60 minutes and were conducted via phone or video call.


Data Analysis

Analyses were conducted to support each research question in [Table 2]. Most research questions were addressed using a combination of quantitative and qualitative analyses; some involved qualitative methods only as appropriate for the data type.

For quantitative findings, we used descriptive statistics to report patient demographic characteristics, distribution of patients among PCPs, ACM scores, weekly questionnaire completion rates, proportion of patients who were high app users (>50% completion rate), frequency of patients who navigated to the three different tabs in the mHealth app (history/graph of symptoms, add triggers/notes/peak flow data, view educational information), frequency of the app providing the callback request option, and number of patient-initiated call requests via the app. For callback requests, we reported descriptive statistics on characteristics of callback requests, patients who requested callbacks, and follow-up appointments. For PCPs and nurses, we calculated frequency of navigations to the EHR-integrated dashboard.
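As an illustration, the following is a minimal sketch, with assumed column names and data layout, of how the adherence and retention metrics reported in [Table 4] could be computed from the weekly questionnaire logs; it is not the analysis code used in the study.

```python
# Minimal sketch of per-patient metrics: summary adherence (completed/available
# questionnaires), "high app user" status (>50% completion), monthly adherence
# (>= 1 completion in every study month), and retention (>= 1 completion within
# the final 4 weeks of the 12-month study). Column names are assumptions.
import pandas as pd

WEEKS_IN_STUDY = 52  # 12-month study with weekly questionnaires


def patient_metrics(log: pd.DataFrame) -> pd.DataFrame:
    """Assumes `log` has one row per offered weekly questionnaire with columns:
    patient_id, study_week (1-52), completed (bool)."""
    # Approximate mapping of 52 study weeks onto 12 study months.
    log = log.assign(study_month=(log["study_week"] - 1) * 12 // WEEKS_IN_STUDY + 1)
    by_patient = log.groupby("patient_id")

    # Summary adherence: questionnaires completed / questionnaires available.
    summary_adherence = by_patient["completed"].mean()

    # High app user: completed more than half of available questionnaires.
    high_app_user = summary_adherence > 0.5

    # Monthly adherence: >= 1 completed questionnaire in every study month.
    monthly_adherence = (
        log.groupby(["patient_id", "study_month"])["completed"].any()
        .groupby(level="patient_id").all()
    )

    # Retention: >= 1 completed questionnaire within the final 4 weeks.
    retention = (
        log[log["study_week"] > WEEKS_IN_STUDY - 4]
        .groupby("patient_id")["completed"].any()
        .reindex(summary_adherence.index, fill_value=False)
    )

    return pd.DataFrame({
        "summary_adherence": summary_adherence,
        "high_app_user": high_app_user,
        "monthly_adherence": monthly_adherence,
        "retention": retention,
    })
```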

For qualitative findings, we used a combination of deductive codes organized by the NASSS domain and inductive codes that emerged from the interview data. The initial codebook for patients was developed by one research team member (J.S.F.) and revised in discussion by two other research team members (J.S. and R.S.R.) until consensus was reached. The first two interview transcripts were coded independently by two research team members (J.S.R. and J.S.) who compared codes, which were largely in agreement. Differences were discussed and resolved. One research team member (J.S.R.) coded the remaining transcripts, convening regularly with team members to discuss emerging categories and verify a sample of coded excerpts (J.S. and R.S.R.). We followed the same steps for analysis of qualitative data collected from PCPs and nurses. We then used qualitative content analysis to group codes into categories and summarize findings.[53] We refer to “most” respondents to indicate greater than 50%, “several” to mean 4 or more, and “few” as 2 or 3. Because the research team played a role in implementing the intervention, we also described the processes executed by research team members (e.g., training PCPs, recruiting patients) and their experiences.

Based on the quantitative and qualitative findings, we assessed level of complexity as defined by the NASSS framework for each domain as an indicator of potential for sustained adoption. To do this, we applied criteria from the NASSS framework and assigned level of complexity (simple, complicated, or complex) based on consensus among the research team.[32] [33]



Results

Summary results of each research question are shown in [Table 3]. A total of 190 patients enrolled in the intervention and had similar characteristics to the 21 interviewed patients ([Table 4], [Supplementary Table S1], available in the online version). A total of 84 PCPs had at least one patient in the intervention (range = 1–7 patients per PCP; mean = 2.3; standard deviation [SD] = 1.5). Interviews were also conducted with four PCPs from three unique clinics and one nurse who worked at three different clinics during the study. Additional quotes for each research question are presented in [Supplementary Table S2] (available in the online version).

Table 3

Summary results of research questions within NASSS domains

| Research question by NASSS domain | Summary of results | Level of complexity[a] |
|---|---|---|
| 1. The condition | | |
| How wide a range of asthma control level is suitable for the intervention? | Patients at any level of control may benefit from the intervention | Simple (easy to determine patient suitability) |
| Which sociocultural factors are associated with higher or lower levels of app use? | Questionnaire completion rates varied by education (43.0% completion for patients with no high school degree, 75.2% for those with a graduate degree) and ethnicity (44.7% for Hispanic, 78.7% for non-Hispanic patients). Older users had higher adherence (83.0% for patients 65+ y old) than younger users (59.8% for those 18–33 y old) | Complicated (digital health and language equity and inclusion may need to be factored into the service model) |
| 2. The technology | | |
| For the various features of the technology, how frequently are they used and what technical barriers and facilitators do users encounter? | For patients, the weekly questionnaires were used most frequently (72.3% adherence), followed by the History tab (graph of symptoms), Learn tab (education), and Add data tab (notes, triggers, peak flows). No technical barriers were found for patients. For clinicians, no technical barriers were found related to nurse callback requests. PCPs had some technical challenges installing the asthma study data tab in their EHR workflow space and required assistance from a research assistant | Simple (minimal technical barriers, substantial use of the weekly questionnaire feature) |
| 3. The value proposition | | |
| What is its desirability and efficacy for patient treatment? | PRO questionnaires of asthma control showed trends toward improvement. Most patients given the opportunity to continue using the app in the poststudy period did so. Most interviewed patients would continue to use the app and recommend it to others. Interviewed PCPs and the nurse believed the intervention helped improve asthma care | Simple (evidence and perception of high value and efficacy for patients) |
| 4. The adopter system | | |
| What is expected of the patient and is it achievable? | Adherence and retention rates, and patient interview data, suggest that most patients found completion of weekly questionnaires achievable | Simple (tasks for patients were achievable) |
| What changes in staff roles and practices would be affected? | Clinical role changes were modest. PCPs helped recruit patients and could review and discuss the patient-reported data with their patients. The nurse called patients who requested a call, similar to existing callback workflows | Complicated (modest changes to nurse triage and clinician interactions with patients) |
| 5. The organization | | |
| What is required of the organization to implement and support the intervention? | EHR integration, patient recruitment, and training of PCPs and the nurse were required for implementation. Ongoing support was minimal; the organization was able to easily handle the 33 nurse callback requests during the 12-mo study | Implementation: Complicated (multiple components, including EHR integration and digital health equity considerations). Ongoing support: Simple (low volume/burden of callback requests; other intervention features were optional) |

Abbreviations: EHR, electronic health record; NASSS, Nonadoption, Abandonment, Scale-up, Spread, and Sustainability; PCP, primary care provider; PRO, patient-reported outcome.


a Complexity level was determined by the research team based on NASSS definitions: “simple (straightforward, predictable, few components), complicated (multiple interacting components or issues), or complex (dynamic, unpredictable, not easily disaggregated into constituent components).” Modified from Greenhalgh et al 2017.[33]


Table 4

Weekly asthma app questionnaire adherence and retention[a]

| Category | Number of patients | Monthly adherence: patients who completed ≥1 questionnaire in every study month, n (%) | Retention: patients who completed ≥1 questionnaire within final 4 wk of study, n (%) | Summary adherence: questionnaires completed/available (%) |
|---|---|---|---|---|
| Overall | 190 | 130 (68.4) | 150 (78.9) | 7,141/9,880 (72.3) |
| Age (y) | | | | |
| 18–33 | 36 | 20 (55.6) | 22 (61.1) | 1,120/1,872 (59.8) |
| 34–48 | 48 | 32 (66.7) | 35 (72.9) | 1,768/2,496 (70.8) |
| 49–64 | 64 | 44 (68.8) | 52 (81.3) | 2,440/3,328 (73.3) |
| 65+ | 42 | 34 (81) | 41 (97.6) | 1,813/2,184 (83.0) |
| Ethnicity | | | | |
| Hispanic | 45 | 18 (40) | 21 (46.7) | 1,047/2,340 (44.7) |
| Non-Hispanic | 136 | 103 (75.7) | 118 (86.8) | 5,566/7,072 (78.7) |
| Unknown/missing | 21 | 9 (42.9) | 11 (52.4) | 528/1,092 (48.4) |
| Education | | | | |
| No high-school degree | 8 | 3 (37.5) | 3 (37.5) | 179/416 (43.0) |
| High-school degree or GED | 15 | 5 (33.3) | 7 (46.7) | 381/780 (48.8) |
| Some college | 23 | 16 (69.6) | 18 (78.3) | 948/1,196 (79.3) |
| 2-y college | 15 | 9 (60) | 10 (66.7) | 422/780 (54.1) |
| 4-y college | 52 | 39 (75) | 46 (88.5) | 2,132/2,704 (78.8) |
| More than 4-y college | 77 | 58 (75.3) | 66 (85.7) | 3,079/4,004 (76.9) |
| Sex | | | | |
| Male | 57 | 41 (71.9) | 45 (78.9) | 2,172/2,974 (73.3) |
| Female | 133 | 89 (66.9) | 105 (78.9) | 4,969/6,916 (71.8) |
| Language | | | | |
| English | 184 | 127 (69) | 145 (78.8) | 6,931/9,568 (72.4) |
| Spanish | 6 | 3 (50) | 5 (83.3) | 210/312 (67.3) |
| Race | | | | |
| American Indian or Alaska Native | 1 | 1 (100) | 1 (100) | 27/52 (51.9) |
| Asian | 8 | 6 (75) | 6 (75) | 355/416 (85.3) |
| Black | 36 | 20 (55.6) | 31 (86.1) | 1,317/1,872 (70.4) |
| More than one race | 15 | 7 (46.7) | 9 (60) | 395/780 (50.6) |
| Native Hawaiian or Other Pacific Islander | 1 | 1 (100) | 1 (100) | 52/52 (100) |
| White | 115 | 92 (80) | 97 (84.3) | 4,718/5,980 (78.9) |
| Unknown/not reported/missing | 14 | 3 (21.4) | 5 (35.7) | 277/728 (38) |
| Smartphone | | | | |
| Android | 45 | 30 (66.7) | 34 (75.6) | 1,756/2,340 (75) |
| iPhone | 135 | 99 (73.3) | 114 (84.4) | 5,272/7,020 (75.1) |
| Other | 10 | 1 (100) | 2 (20) | 113/520 (21.7) |
| Asthma control at baseline[b] | | | | |
| Controlled | 66 | 51 (77.3) | 60 (90.9) | 2,743/3,432 (79.9) |
| Not well controlled | 84 | 57 (67.9) | 65 (77.4) | 3,113/4,368 (71.3) |
| Very poorly controlled | 40 | 22 (55) | 25 (62.5) | 1,285/2,080 (61.8) |

a All data are for 12-month study period.


b Asthma control was assessed at baseline using the Asthma Control Measure (ACM).


NASSS Domain: The Condition

How Wide a Range of Asthma Control Level is Suitable for the Intervention?

Weekly questionnaire completion rates were highest among patients with higher levels of baseline asthma control as measured by the ACM ([Table 4]). However, interviews with patients suggest that level of asthma control may be a poor indicator of suitability for the intervention. While a few patients expressed that they did not need the app because their asthma was already well-controlled, others who reported well-controlled asthma still found it useful for improving their awareness and reminding them to continue their medications. Similarly, among patients with uncontrolled asthma, some found the app helpful and reported they would continue using it to help improve their asthma control, whereas others said they would likely not continue, despite finding benefits such as improved awareness of their asthma. In one of the latter cases, the patient already felt “very much in touch with my asthma” (Patient 362). Another patient with poor asthma control said they had too many other health apps collecting their health data (e.g., blood pressure and glucose) and would be more likely to use one app for all their conditions. Notably, the latter two patients were not aware of, and had not used, features of the app other than the weekly questionnaires.


Which Sociocultural Factors are Associated with Higher or Lower Levels of App Use?

Patients with more years of formal education, non-Hispanic patients, and older patients generally had higher weekly questionnaire completion rates ([Table 4]). Despite designing the intervention with input from Spanish speakers and employing a multimodal recruitment strategy, few Spanish-speaking patients enrolled in the study. PCPs interviewed from the study clinics with larger populations of Spanish-speaking patients suggested this population could be further engaged with greater attention to digital literacy.



NASSS Domain: The Technology

For the Various Features of the Technology, How Frequently are They Used and What Technical Barriers and Facilitators do Users Encounter?

The weekly symptom questionnaire was the primary feature of the patient-facing app and saw substantial overall usage, with 72% adherence and 79% (150/190) retention over the course of the 12-month study ([Table 4]). The 40 patients who did not complete questionnaires during the final month of the study completed their final questionnaire an average of 27.85 weeks (SD = 15.3) after they started using the app. Callbacks were requested via the app by 28 unique patients on 33 different occasions ([Supplementary Table S3], available in the online version). Other features of the app were used less often ([Supplementary Table S4], available in the online version), largely because patients were unaware of them, according to patient interviews. Most patients reported no technical barriers to installing and using the app or accessing the various features.

Although patients faced few challenges with installing the app, PCPs did report start-up challenges associated with the EHR dashboard. Several PCPs whom we invited to install the asthma study data tab in their EHR workspace at the start of the study encountered technical challenges, so a research assistant manually added it for them and for most other PCPs. Once added, 14 of 84 (16.7%) PCPs accessed the tab at least once while the study was active: nine accessed it 1 to 2 times, three accessed it 3 to 4 times, one accessed it 6 times, and one accessed it 1 time. Interviewed PCPs cited nontechnical reasons for minimal use of this feature (see “The adopter system” domain below).



NASSS Domain: The Value Proposition

What is Its Desirability and Efficacy for Patient Treatment?

Average weekly patient-reported outcome (PRO) scores showed an improving trend over the course of the study ([Fig. 1]). Of patients given the opportunity to continue using the app for 3 or more months after completing the 12-month study, 67% completed more than half of the available questionnaires.

Fig. 1 Average Asthma Control Measure (ACM) score by study week. Data include only patients who completed the questionnaires. Lower scores indicate better asthma control (fewer asthma symptoms). The scoring range is 0 to 19.

Interviewed patients largely found the app desirable and believed it helped them be more aware of and manage their asthma symptoms. Several patients reported that the simple act of reflecting on their asthma symptoms on a weekly basis was useful for identifying patterns and avoiding triggers. However, a few patients felt the weekly questions were not detailed enough to fully reflect their asthma symptoms. One of the interviewed PCPs believed the intervention was efficacious in improving asthma care and referenced the patient-reported data in the EHR dashboard several times during patient visits. The other three interviewed PCPs used the asthma data less often but still believed the intervention was helpful for asthma care.

With respect to the callback request feature, most patients found it valuable, even if they declined the option to request a call when prompted. Some patients felt confident self-managing their asthma or preferred other means of communicating with their care team if needed. One patient had a close relationship with their pulmonologist and preferred to contact that doctor directly instead of going through their PCP. None of the interviewed PCPs were aware of callback requests that occurred via the app. This was not surprising given that the intervention was designed to fit within the existing workflows of nurses, who would then contact PCPs, similar to existing protocols for managing patient medical queries. The interviewed nurse believed the callback requests improved patient care.

Most patients viewed the history tab, which shows the patients a graph of their reported weekly questionnaire scores over time, at least once. Patients were more likely to navigate to this tab right after completing a weekly questionnaire (when prompted to do so) compared with other times (i.e., unprompted) ([Supplementary Table S4], available in the online version). Many interviewed patients were unaware of or did not recall using the history tab. However, most of those that viewed it found it helpful for their asthma self-management, such as by helping them understand the seasonal or situational causes of their symptoms, anticipate when they are more likely to need medicine, and assess the effect of taking their medicine on their asthma symptoms.

Most patients accessed the learn tab at least once to watch videos or read information related to asthma and how to use the app, but many interviewed patients reported they were unaware of this tab. Those who used the learn tab found it helpful. The app features that allowed for adding notes, triggers, and peak flows were used less frequently, and interviews showed that most patients were not aware of these features. Notes recorded by patients included descriptions of symptoms, triggers, interactions with health care providers, treatment plans given to the patient, and instances of medications taken or missed.



NASSS Domain: The Adopters

What is Expected of the Patient and is It Achievable?

The only expectation for patients was to complete the weekly questionnaires. Completion rates show this was largely achievable ([Table 4]) and did not diminish for the duration of the 12-month study ([Fig. 2]). Interviews confirmed these results—said one patient: “I like the consistency of the questions so I knew what to expect. I didn't really dislike anything—it was very straight-forward” (Patient 23). Among interviewed patients with missed questionnaires, some reported they had planned to do it later and then forgot or became too busy (e.g., with multiple jobs). Many interviewed patients commented on their appreciation of the feature that prompted them with only 1 question (instead of 5) when they had reported no symptoms in the prior week, indicating that this reduced the burden associated with completing the questionnaire. Patients largely relied on the reminders (push notifications and e-mail) to prompt them to complete the weekly questionnaires. Most interviewed participants found the 48-hour deadline to be acceptable, but some wanted more time or the ability to retrospectively complete questionnaires they missed.

Fig. 2 Patient adherence to weekly app questionnaires by study week. Reported problematic asthma symptoms are defined as an Asthma Control Measure score worse than the previous week's or baseline score by 3 points, or the most severe response option on any question.

What Changes in Staff Roles and Practices Would Be Affected?

PCP role changes during the implementation of the intervention involved reviewing lists of patients to confirm eligibility; if possible, telling patients about the study during medical visits to support recruitment; and reviewing PRO data in the EHR before or during a visit and discussing it with the patient. Interviewed physicians described the recruitment process as similar to how they recruit patients for other studies or interventions. One was more willing to engage in this intervention because of its importance to patient care: “I'm willing to go through a long list [of candidate patients] because it matters” (PCP 1). Three of the four interviewed PCPs used the patient-reported asthma data as part of care, which they said did not require any training. One PCP found the data helpful for prompting conversations with patients who had not remembered their asthma symptoms in the past. Another PCP found that their review of the PRO data in the EHR confirmed what they already knew about the patient's symptoms. When PCPs did not discuss the asthma data with patients during a visit, they said it was due to other visit priorities or that the EHR inbox reminder failed to remind them at the right time (several suggested using an EHR-based alert instead of an inbox message). The nurse interview and discussions with institutional nursing leadership indicated that responding to app-generated callback requests was a minimal change in practice because the callback process was like nurses' response to any patient-initiated phone call.



NASSS Domain: The Organization

What is Required of the Organization to Implement and Support the Intervention?

To implement this intervention, the research team worked with the health system's technology team to integrate the app into the EHR, as described previously[18]; met with PCPs at each clinic to describe the intervention[34]; worked with study clinics to identify and recruit eligible patients within PCP panels; installed the asthma study data tab in each PCP's EHR (few PCPs installed the tab after we sent them instructions, so a research team member met briefly with each PCP to complete this task, which took a few seconds); and worked with nurse leadership and asthma experts to develop a nurse-driven triage protocol (for use during the study as well as for any phone call from patients with asthma symptoms) and train the nurses to use it and handle requests for calls. To support the intervention during the study period, the research team worked with the health system's technology team to maintain software integration and ensure proper functioning, and monitored the callback requests to ensure the health system had adequate capacity for handling them. Few callback requests were made—patients declined the offer in 97.2% of the instances in which the app gave them the opportunity to request one. Most callback requests resulted in an asthma-related treatment change (see [Supplementary Table S3], available in the online version, for detailed analysis of callback requests).




Discussion

We evaluated an EHR-integrated between-visit remote symptom monitoring intervention for asthma and found that it can be implemented sustainably in primary care. Across five NASSS domains, we found minimal barriers to implementation. The intervention may be helpful for patients at any level of asthma control, simplifying implementation (Condition domain). High completion rates of weekly questionnaires and moderate use of the other features suggest the intervention was simple from a technical perspective and that technical barriers were not a major challenge for patients, although barriers to digital health equity (e.g., digital access, digital literacy) may still impede adoption by some patients, as illustrated by lower usage among Hispanic patients and those with fewer years of education (Technology).[54] [55] Most patients with the opportunity to continue to use the app after the study period did so, and the few PCPs who used the asthma data in the EHR found it helpful for care (Value proposition). Expectations of patients were achievable, and the new activities required of clinicians were generally consistent with their existing roles (Adopter system). The minimal number of callback requests and the qualitative data suggest that once this intervention is up and running, minimal effort is needed on the part of the health system to continue its operation (Organization). The most salient sources of complexity were the need to address digital health equity, including how digital literacy and language contribute to patient uptake, which should be a focus of future adaptations to the intervention[56]; EHR integration, which may become easier as interoperability improves[57]; and modest changes in PCP and nurse roles. These results suggest that this clinically integrated mHealth intervention can empower a substantial number of asthma patients to better manage their medical conditions and get help from clinicians when needed, with minimal burden on clinicians or the health system.

This intervention likely achieved these results because of the user-centered design approach we used to develop it and the refinements made during prior pilot testing.[18] [30] [31] It is notable that a substantial number of patients continued to use the intervention and found it helpful despite largely not discussing it with their PCP. Still, our high adherence and retention may be due to the intervention's connection with patients' existing clinics and care teams; most mHealth studies that lack this connection experience far higher attrition.[26] [27] Although increased PCP involvement may further improve adoption and use,[58] we designed the intervention so that PCPs are not required to change practice in order for patients to benefit. Likewise, the minimal volume of callbacks we observed can be attributed to changes we made to the intervention based on our pilot testing. Integrating the technology into the EHR helped minimize clinical burden by allowing callback requests and patient-reported data to be incorporated into existing clinical workflows.

To our knowledge, this is the first evaluation of an EHR-integrated between-visit symptom monitoring intervention for a chronic condition designed systematically for sustained adoption.[23] [45] [59] Other asthma interventions may be less scalable than this one because they contain components that are costly for health care providers, such as additional staff (e.g., dedicated nurses, population health managers)[60] or for patients or payers (e.g., devices).[61] [62] [63] [64] Although less-scalable interventions may also improve patient care, we have demonstrated the potential of a more scalable model for one medical condition that relies solely on software-based questionnaires and existing medical staff.

Although we did not formally assess two additional NASSS domains, they may influence adoption and sustainability. The Wider Context domain covers financial and regulatory requirements and support from professional bodies. Symptom monitoring is already well supported by asthma guidance. Value-based care arrangements and recently introduced remote therapeutic monitoring billing codes may provide additional financial incentives.[12] The Embedding and Adaptation domain suggests that interventions with scope for evolving as local conditions change are more scalable. Additional implementations in new settings are needed to assess this domain.

Limitations

The intervention was implemented as part of a clinical trial and differed from implementation in routine care. For example, because PCPs knew the study was time-limited and had few patients using the app, they may have been less willing to adapt their workflows. Had the app been implemented as part of routine care and offered to all asthma patients, PCP engagement might have been substantially greater. We may not have achieved data saturation with PCPs, but the similarity of PCP workflows suggests we identified most key concepts.[51] Our evaluation focused on primary care clinics in one academic health center—other settings may have different challenges. Only one nurse agreed to our interview requests; others declined because they could not recall much about their involvement in the study, which is not surprising given that 97.2% of offered callback requests were declined by patients. Despite our efforts to recruit Spanish-preferring patients for interviews, none responded, which may be explained by the low number of Spanish-preferring participants in the trial.

The results also suggest some limitations in terms of scalability due to the intervention design, which may be addressed with additional input from users and further testing. For example, there may be opportunities to improve adherence and retention for patients with worse baseline asthma control. These are the patients with the greatest potential benefit from the intervention; however, they demonstrated worse adherence and retention compared with those with better asthma control. Confounding may explain this result: factors associated with generally worse adherence may drive both worse baseline asthma control and lower adherence to the questionnaires. Improving discoverability of app features, including asthma specialists as well as PCPs in the intervention, and incorporating asthma functionality into a more comprehensive, multicondition monitoring intervention may better engage these patients. Lower adherence among patients with fewer years of formal education and challenges recruiting Spanish speakers suggest additional effort is needed to engage these groups to ensure equitable access, such as by addressing digital access and literacy challenges, considering patient preferences for technology, and offering multimodal intervention options (e.g., a text message-based version of the intervention).



Conclusion

This study provides encouraging evidence that an EHR-integrated remote symptom monitoring intervention for asthma has strong potential for sustained adoption in primary care and offers lessons for successful implementation. The intervention shows evidence of desirability and efficacy for patient treatment, involves achievable tasks for patients, and requires minimal role changes for clinicians. Designing the intervention with user-centered design methods was critical to ensure it met the needs of users. Implementing it within the EHR was necessary to minimize clinical burden by facilitating its use as part of existing clinical workflows. The intervention may be used more by older patients or those with more years of education. Additional effort is needed to understand the nature and magnitude of potential digital health equity considerations (i.e., patient access to technology, digital literacy supports, patient preference, or readiness to use technology for their health). Findings from this work will inform future refinements to the app and intervention to further improve patient and clinician satisfaction and engagement. Although we focused on asthma, the intervention components and implementation lessons may apply to other conditions as well.[65] For patients with multiple chronic conditions, a single app that integrates all symptom monitoring may be necessary for sustained use. With additional demonstrations of this intervention in different settings and wider availability of the technology, symptom monitoring may finally become the standard of care so that patients will no longer be on their own between health care visits.


Clinical Relevance Statement

Despite advances in remote patient monitoring, most patients are still on their own between visits to manage their symptoms. Clinically integrated interventions based on simple weekly symptom questionnaires can be implemented so that patients engage and the burden on clinicians is minimal.


Multiple-Choice Questions

  1. When asthma patients are completing weekly symptom monitoring questionnaires and are given the option to request a callback from a nurse, what portion of the time will they request the call?

    • >50% of the time

    • 25–50% of the time

    • 10–25% of the time

    • 5–10% of the time

    • <5% of the time

    Correct Answer: The correct answer is option e. Patients requested a callback in fewer than 5% of the instances in which the option was offered.

  2. Which of the following changes to staff roles are needed to integrate asthma symptom monitoring into primary care?

    • New clinical responsibilities for primary care clinicians to treat asthma

    • Increased staffing to handle patient monitoring demands

    • Having nurses receive callback requests in the electronic health record

    • Add the expectation for primary care clinicians to review patient-reported data during patient visits

    • c and d

    Correct Answer: The correct answer is option e. The only changes to staff roles are for nurses to respond to callback requests, which are similar to how they handle patient phone calls, and primary care clinicians using the patient-reported data during visits.



Conflict of Interest

D.W.B. reports grants and personal fees from EarlySense, personal fees from CDI Negev, equity from ValeraHealth, equity from Clew, equity from MDClone, personal fees and equity from AESOP, personal fees and equity from Feelbetter, equity from Guided Clinical Solutions, and grants from IBM Watson Health, outside the submitted work.

Acknowledgments

The authors would like to thank Monique Martineau for assistance with the figures.

Protection of Human and Animal Subjects

This study was approved by the Mass General Brigham and RAND Institutional Review Boards.


Note

Preliminary results were presented at the American Medical Informatics Association (AMIA) Annual Symposium, November 2023, New Orleans, Louisiana, United States.



Address for correspondence

Robert S. Rudin, PhD
Health Care Division
RAND, 20 Park Plaza, Suite 910, Boston, MA 02116
United States   

Publication History

Received: 22 April 2024

Accepted: 17 July 2024

Article published online:
02 October 2024

© 2024. Thieme. All rights reserved.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

