Appl Clin Inform 2022; 13(01): 001-009
DOI: 10.1055/s-0041-1740481
Research Article

Facilitators and Barriers to Implementing a Digital Informed Decision Making Tool in Primary Care: A Qualitative Study

Nicole Puccinelli-Ortega
1   Department of Internal Medicine, Wake Forest School of Medicine, Medical Center Boulevard, Winston-Salem, North Carolina, United States
,
Mark Cromo
2   Department of Internal Medicine, University of Kentucky, Lexington, Kentucky, United States
,
Kristie L. Foley
1   Department of Internal Medicine, Wake Forest School of Medicine, Medical Center Boulevard, Winston-Salem, North Carolina, United States
,
Mark B. Dignan
2   Department of Internal Medicine, University of Kentucky, Lexington, Kentucky, United States
,
Ajay Dharod
1   Department of Internal Medicine, Wake Forest School of Medicine, Medical Center Boulevard, Winston-Salem, North Carolina, United States
,
Anna C. Snavely
1   Department of Internal Medicine, Wake Forest School of Medicine, Medical Center Boulevard, Winston-Salem, North Carolina, United States
,
David P. Miller
1   Department of Internal Medicine, Wake Forest School of Medicine, Medical Center Boulevard, Winston-Salem, North Carolina, United States
Funding The study was funded by U.S. Department of Health and Human Services, National Cancer Institute (R01CA218416), and Wake Forest Comprehensive Cancer Center (P30CA012197).
 

Abstract

Background Informed decision aids provide information in the context of the patient's values and improve informed decision making (IDM). To overcome barriers that interfere with IDM, our team developed an innovative iPad-based application (aka “app”) to help patients make informed decisions about colorectal cancer screening. The app assesses patients' eligibility for screening, educates them about their options, and empowers them to request a test via the interactive decision aid.

Objective The aim of the study is to explore how informed decision aids can be implemented successfully in primary care clinics, including the facilitators and barriers to implementation; strategies for minimizing barriers; adequacy of draft training materials; and any additional support or training desired by clinics.

Design This is a multicenter qualitative study conducted in rural and urban settings.

Participants A total of 48 individuals participated including primary care practice managers, clinicians, nurses, and front desk staff.

Approach Focus groups and semi-structured interviews were conducted; data analysis was guided by thematic analysis.

Key Results Salient emergent themes were time, workflow, patient age, literacy, and electronic health record (EHR) integration. Saving time was important to most participants. Patient flow was a concern for all clinic staff, who said that any slowdown caused by patients using the iPad module, or any perceived additional work for clinic staff, would make staff less motivated to use the program. Participants also voiced concern about older patients being unwilling or unable to use the iPad and about whether patients with low literacy would be able to read or comprehend the information.

Conclusion Integrating new IDM apps into the current clinic workflow with minimal disruptions would increase the probability of long-term adoption and ultimate sustainability.

NIH trial registry number R01CA218416-A1.



Background and Significance

Patients receive the best care when they understand their options and participate in decisions. This process is often referred to as informed decision making (IDM).[1] The use of decision aids that provide information in the context of the patient's values improves IDM.[2] IDM is particularly important for colorectal cancer (CRC) screening because national guidelines recommend patients choose one of the several effective tests.[3] [4] Unfortunately, clinicians often lack the time to discuss preventive care[5] and using decision aids takes time. Therefore, it is not surprising that IDM rarely happens in practice.[6] [7]

To overcome time barriers and improve IDM, our team developed an innovative iPad-based application (aka “app”) for CRC screening called mPATH (mobile Patient Technology for Health). The mPATH app assesses patients' eligibility for CRC screening, provides education about screening options, and lets them request a screening test via the program. In a prior randomized controlled trial (RCT) conducted in six community-based primary care practices, patients who used mPATH were twice as likely to complete CRC screening.[8] In that RCT, participants were asked to arrive at the practice 45 minutes early to meet with a research team member and use the iPad program. While mPATH was efficacious in this trial, it was unclear how best to integrate the IDM mobile health tool into the workflow of a busy practice using only clinical staff.

Incorporating apps into outpatient health care settings, often referred to as mobile health or mHealth, has become more commonplace in recent years, but few studies have tested strategies for incorporating patient-facing apps in primary care settings. A Cochrane review examined strategies for encouraging the adoption of information and communication technologies by health professionals; however, none of the 10 studies identified by the review focused on patient-facing apps, and the results of those studies make it difficult to draw conclusions about the effectiveness of one strategy over another.[9] Our theoretical model for implementing mPATH is the Technology Acceptance Model (TAM). The TAM theorizes that implementation of a new technology is determined by social influences and characteristics of the technology (which determine its perceived usefulness) and individuals' characteristics and experiences (which determine its perceived ease of use).[10]

Our team is conducting a cluster-randomized implementation trial to test different strategies for embedding the mPATH app in usual care. To inform our implementation strategy, we conducted a qualitative study to answer the primary research question: what do primary care practice staff, practice administrators, and providers identify as the facilitators of and barriers to implementing digital health tools in primary care practice settings?



Methods

We invited primary care providers, practice managers, and clinic personnel from four practices to participate in focus groups or key informant interviews. We selected these participants because their roles would be critical to the eventual implementation and sustainment of mPATH in actual care. We did not include patients in our sample because we had previously demonstrated the app's usability and acceptability with items from the System Usability Scale.[11] In a prior study of 450 patients, more than one-third of whom had limited health literacy, over 90% agreed the program was easy to use, that most people would learn to use the program very quickly, and that they felt confident using it.[12]

To increase the likelihood that the opinions and perspectives provided by respondents in our sample could be generalized to other primary care settings, we selected the four practices to represent diversity in location (rural vs. suburban/urban) and organizational structure (practices affiliated with an academic health system vs. a Federally Qualified Health Center [FQHC]). Three clinics were suburban community-based primary care practices in central North Carolina, and one was an FQHC in the Appalachian region of rural Kentucky.[13] We chose a qualitative approach because the facilitators and barriers to implementing digital health tools in primary care workflow are unknown, and a qualitative approach allows for in-depth exploration of themes.[14]

The Wake Forest School of Medicine Institutional Review Board and the University of Kentucky, through a reliance agreement, approved the study with a waiver of signed consent. Prior to beginning the focus groups and interviews, all participants were informed about the purpose of the study, their rights, and the risks and benefits for participating. Each participant received a $50 gift card for their time. To protect confidentiality, we assigned a unique study identifier to each participant.

Data Collection

At each practice, we invited front desk and nursing staff to participate in a focus group of six to 10 people. We also invited two providers (physicians or advanced practice professionals) and one administrator from each practice to participate in individual semi-structured interviews. We conducted individual interviews with the practice administrator and providers for practical reasons, as scheduling focus groups for these roles was less feasible. Combining viewpoints from these different roles gave us the perspectives needed to understand facilitators and barriers across the whole practice. A study team member with qualitative expertise moderated the focus groups and conducted the interviews between September and November 2018. The focus group and interview guides ([Supplementary Appendix 1], available in the online version) were structured to elicit stakeholder perspectives about the facilitators and barriers to using mPATH, clinic workflow, and training and support.

Focus groups were conducted in the clinic setting, and interviews were conducted in person at the clinic or by phone. The trained moderator ensured that all focus group participants contributed to the discussions. To spur conversation, participants were shown materials demonstrating the iPad program and the proposed workflow for embedding it in practice. Kentucky participants were presented a slightly different technical workflow because electronic health record (EHR) integration was not feasible in that clinic system. Following each focus group or interview, the data were summarized and preliminary themes were identified. We used this process to determine whether we were hearing similar or new themes and to confirm that we had reached data saturation, indicating that further data collection was unnecessary.[15]



Proposed Clinic Workflow

Our clinic workflow plan ([Fig. 1]) proposed that front desk staff hand patients an iPad upon check-in. The iPad app would conduct mandated screening for depression, fall risk, and intimate partner violence, which all practices were currently doing verbally. Age-appropriate patients would also be asked about CRC screening, and if screening was due, the app would display a brief educational video and allow patients to request a colonoscopy or stool blood test via the program. The app would alert the rooming nurse to the desired test, which the nurse would then enter into the EHR as an order requiring clinician co-signature for later review. The iPad would stay with the patient during rooming, and a nurse would return the iPad to the front desk once the visit was completed.

Fig. 1 Original clinic flow presented.
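As an illustration of the branching logic in this proposed workflow, the sketch below walks one patient's check-in through the steps described above. It is a minimal sketch only; the class and field names (e.g., Patient, crc_screening_due) and the age window are hypothetical assumptions, not drawn from the mPATH source code.

    # Minimal sketch of the proposed check-in flow (Fig. 1); not mPATH source code.
    # Class, field, and function names and the age window are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Patient:
        age: int
        crc_screening_due: bool

    @dataclass
    class CheckInResult:
        screening_answers: dict = field(default_factory=dict)  # depression, fall risk, IPV items
        crc_test_requested: str | None = None                  # "colonoscopy" or "stool blood test"
        nurse_alert_needed: bool = False                        # nurse pends a co-signature order

    def run_checkin(patient: Patient, screening_answers: dict, crc_choice: str | None) -> CheckInResult:
        """Apply the branching logic of the proposed workflow to one patient's responses."""
        result = CheckInResult(screening_answers=screening_answers)

        # Offer the CRC decision module only to age-appropriate patients who are due for screening.
        if 50 <= patient.age <= 75 and patient.crc_screening_due:
            if crc_choice in ("colonoscopy", "stool blood test"):
                # The rooming nurse is alerted and enters a co-signature-required order in the EHR.
                result.crc_test_requested = crc_choice
                result.nurse_alert_needed = True

        return result

    # Example: a 62-year-old who is due for screening and requests a colonoscopy.
    print(run_checkin(Patient(age=62, crc_screening_due=True), {"depression": "negative"}, "colonoscopy"))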


Data Analysis

All focus groups and interviews were audio recorded, professionally transcribed, de-identified, and reviewed against the original audio to ensure accuracy. Transcripts were imported into ATLAS.ti[16] to store and manage the data. Two study team members (N.P., M.C.) inductively developed a codebook to identify meaningful categories of the data based on study aims. Each transcript was independently coded by two study team members, who met periodically to resolve discrepancies in coding and revise the codebook as needed. The data from focus groups and interviews were analyzed together, and any differences between participant groups were labeled. After the data were coded, the data within each category were abstracted and synthesized into themes. Themes were determined inductively by their prevalence and salience in the data, per the principles of thematic analysis.[17]
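For readers less familiar with dual coding, the snippet below is a minimal sketch of the comparison step described above: flagging segments where two coders applied different codes so the discrepancies can be resolved by discussion. The segment identifiers and codes are hypothetical, and the study team performed this work in ATLAS.ti rather than in code.

    # Minimal illustration of flagging coding discrepancies between two coders.
    # Segment identifiers and codes are hypothetical; the study team used ATLAS.ti.

    coder_a = {
        "seg_01": {"time", "workflow"},
        "seg_02": {"patient_age"},
        "seg_03": {"ehr_integration", "time"},
    }
    coder_b = {
        "seg_01": {"time"},
        "seg_02": {"patient_age"},
        "seg_03": {"ehr_integration", "literacy"},
    }

    def discrepancies(a: dict, b: dict) -> dict:
        """Return, per segment, codes applied by only one coder (to be reconciled by discussion)."""
        return {
            seg: a.get(seg, set()) ^ b.get(seg, set())  # symmetric difference of code sets
            for seg in sorted(set(a) | set(b))
            if a.get(seg, set()) ^ b.get(seg, set())
        }

    print(discrepancies(coder_a, coder_b))
    # e.g. {'seg_01': {'workflow'}, 'seg_03': {'time', 'literacy'}}  (set ordering may vary)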



Results

A total of 48 individuals participated in a focus group or individual interview ([Table 1]). Five focus groups consisting of nurses, nursing assistants, front desk staff, and clinic coordinators were conducted at the primary care clinics in North Carolina (N = 3) and rural Kentucky (N = 2). One focus group (Kentucky) included one provider and one practice manager. Focus groups lasted 45 minutes on average.

Table 1 Participant characteristics (N = 48)

                                  Providers    Practice managers    Clinic staff
                                  (N = 7)      (N = 4)              (N = 37)
Gender, male                      1            1                    1
Gender, female                    6            3                    36
Age (years), range                22–64        34–62                27–65
Age (years), mean                 40.6         48                   42.8
Time at clinic (years), range     1 to >10     5 to >10             <1 to >10

We conducted 11 interviews (seven providers and four administrators). Two providers from each practice were interviewed with the exception of one clinic in North Carolina where only one provider was interviewed due to the lack of response to interview requests. Focus groups were conducted in person and interviews either in person or by phone.

Because insights that emerge from a focus group represent the views of the group as a whole,[18] as opposed to individual views, group perspectives are represented using direct quotes identified by “clinic staff,” location, and group number. Individual interview perspectives are represented using direct quotes identified by “provider” or “practice manager,” location, and participant identification number (PID).

Participant Identified Facilitators for App Implementation

Providers and staff were asked about factors that would motivate them to use the app. All participants indicated that clinic staff would be motivated to use the program if the technology improved efficiency by asking patients routine screening questions ahead of time and automatically transferring the answers into the patient's chart.

“So cutting down our time in getting the patient back, being able to do the vitals. That would cut down our time.” (Clinic Staff, NC FG2)

“... the nursing staff already has so many responsibilities when it comes to refills and medication refills and all this stuff, that taking something off their plate to—I don't know, more automating that might make them feel a little bit better.” (Provider, NC11)

Practice managers said they would be motivated to use the program if it helped to improve quality metrics for routine screening. Similarly, providers said they would be more likely to use mPATH if it helped educate patients about CRC screening and increased their interest in screening. Both providers and clinical staff said they would be motivated to use the program if it led patients to answer the depression screening questions more honestly.

“They have quality metrics that they have to meet that those things are going to pop up in the system. And so I think what intrigues me with this is it's focusing on one idea right now. Colonoscopy is the idea that it's focusing on right now. It does have the PHQ-9 screening, so really, those two.” (Practice Manager, NC12)

“I think if it helped with the work flow and got more of our patients educated about colorectal screening, we would be for that…I don't know that a metric would impress me as much as maybe a patient who had been resistant to colorectal screening goes through that and changes their minds or at least considers it in the future.” (Provider, NC13)
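Participants repeatedly tied motivation to routine screening questions being asked up front and filed automatically, and the practice manager quoted above mentions the PHQ-9 depression screen specifically. As a hedged illustration of what such an up-front screen yields for the chart, the snippet below scores a PHQ-9 response set using the instrument's standard 0 to 27 scale; it is not code from the mPATH app.

    # Hypothetical illustration: scoring a PHQ-9 depression screen captured on the tablet.
    # Standard PHQ-9 scoring: nine items rated 0-3 each, for a total of 0-27.

    SEVERITY_BANDS = [
        (0, 4, "minimal"),
        (5, 9, "mild"),
        (10, 14, "moderate"),
        (15, 19, "moderately severe"),
        (20, 27, "severe"),
    ]

    def score_phq9(item_scores: list) -> tuple:
        """Return (total score, severity band) for the nine PHQ-9 item scores."""
        if len(item_scores) != 9 or any(s not in (0, 1, 2, 3) for s in item_scores):
            raise ValueError("PHQ-9 requires nine item scores between 0 and 3")
        total = sum(item_scores)
        band = next(label for low, high, label in SEVERITY_BANDS if low <= total <= high)
        return total, band

    print(score_phq9([1, 2, 1, 0, 0, 1, 2, 0, 0]))  # (7, 'mild')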



Participant Identified Barriers for App Implementation

Participants said their number one concern was time. They said they would be discouraged from using the program if it added work or time to their current responsibilities or if it took too long for patients to complete, thus slowing patient throughput. Participants in both North Carolina and Kentucky said the program must integrate well into their current processes and EHR; if it slowed down the system in place, they would not want to continue using it. Examples they gave included manually entering data into the EHR, technical malfunctions, and the program being too complex for patients to use. They also expressed concern that if patients provided negative feedback about the iPads for any reason, clinics would be discouraged from using them.

“If it was taking the patients way too long to get through it before the doctor could actually go in and see the patient and treat them.” (Clinic Staff, FGNC)

“If it got to be overly burdensome to either the front staff, the nursing staff, or was taking a lotta time.” (Provider, NC13)

“I think it's a good program if you can get it to merge with your EHRs, because that would be the only negative comment that I would have so far, is just will it merge into the system for the docs and the nurses or whatever, so there's no outside paperwork that has to be scanned into the system kind of thing. And the order and all of that stuff, that would be great. That would save some time for the nurses.” (Practice Manager, KY06)


#

Participant Identified Barriers for Patients

Both focus group and interview participants were asked about barriers that patients were likely to encounter. They said that older patients may be unwilling to use the iPads because they are uncomfortable using technology. They had other concerns regarding language barriers, data security, and patients' worries about the iPads spreading infections. Participants who had experience with a prior iPad program at their clinic expressed the most concern about patients being unwilling to use iPads in a clinical setting because they had received negative feedback from patients about that program, which ultimately led those clinics to end its use. Participants from the practice in Appalachia said that they were worried about whether their patients would be able to read the material because of low literacy levels.

“I think the biggest thing would be ‘cause our population of patients is older, that concern if they’re not being able to use the actual iPad. Sort of them not wanting to go down that whole path, like the patients themselves.” (Provider, NC11)

“I guess my pause with this is always that if there's literacy on the part of patients. Not only literacy level, but being able to—being comfortable using an iPad, which may be true or you know, of most of our patients.” (Provider, KY10)

“A lot of ‘em can’t read and write.” (Clinic Staff, FGKY)

Participants said that, in their experience, patients are reluctant to use a program at follow-up or subsequent visits because it seems redundant to them.

“But if they've done it last week, they don't wanna do it again. If they done it last month, they don't wanna do it again. So what we run into is the patient saying: 'Well, I done that last time. I don't wanna.'” (Clinic Staff, FGKY)

“Are we going to ask them every time if they want a colonoscopy?” (Clinic Staff, FGNC)



Clinic Workflow

Participants were clear that any new program must fit into the current clinic workflow.

Participants said front desk staff were the best people to hand out the iPads because patients could complete the program while waiting to be roomed, making the best use of their time without cutting into the provider's time with patients. At the same time, they were concerned that patients may not have time to complete both the screening questions and the CRC educational module before being called back to the exam room, because patients typically wait only 5 minutes in the waiting room. They also said it was unlikely that nurses would have time to return the iPads to the front desk.

“I'd say if it took more than I would say five to seven minutes…the nurses are usually out there [waiting room] by that time. They don't usually have to wait that long.” (Practice Manager, KY06)

“Everything flowed until you got to the nurse returning the iPad back to the receptionist...I think that that's gonna be a step that they would consider an extra step and they're not used to doing that. Once they've taken the patient back, there's not if any interaction with the front desk.” (Practice Manager, NC15)

Given the choice between a printed patient usage summary and an electronic one, most participants said they would favor the electronic summary. They also said it was important to be alerted when a test needed to be ordered. Most respondents said that in their current workflow the provider, not the nurses, ordered colonoscopies. They also said the summary must be in an easy-to-read format that providers could quickly skim for key pieces of information.

“I think it looks good aside from the printed page. If I'm already going into Epic into my in-basket to sign off on the order, I could just as easily open an electronic communication.” (Provider, NC14)

“I was a little taken aback on: you have to hand the provider a piece of paper and they have to check in their in-basket for something. Sometimes providers don't have enough time to be able to look through all that” (Clinic Staff, FGNC)



Functionality

All participants said it would be important for the program to integrate with the EHR to automate the process and eliminate duplicate work for staff. Participants' experiences with prior EHR systems, particularly at the Kentucky clinics, were an important factor in how respondents answered because technology shortcomings in the past had directly affected their productivity.

“I think if it crossed over into our records and updated the patient's chart, it would be great.” (Clinic Staff, FGKY)

“So it's gonna cut down from the questions you're gonna ask but yet you're still gonna have to do the inputting. So really no. I mean, you just don't have to ask the questions, correct? They're still gonna have to type it in.” (Clinic Staff, FGKY)
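One widely used way to satisfy this requirement, shown in the sketch below, is to package tablet-collected answers as an HL7 FHIR QuestionnaireResponse and post it to the EHR's FHIR endpoint so that nothing has to be re-keyed. The endpoint URL, questionnaire reference, and linkId are placeholders; this is an assumption-laden illustration, not a description of how mPATH's EHR integration was actually built.

    # Hypothetical sketch: filing tablet-collected answers to an EHR via an HL7 FHIR API.
    # The endpoint URL, questionnaire reference, and linkId are placeholders,
    # not mPATH's actual integration.
    import requests

    FHIR_BASE = "https://ehr.example.org/fhir"  # placeholder EHR FHIR endpoint

    def file_crc_request(patient_id: str, crc_choice: str) -> None:
        """POST a QuestionnaireResponse so staff do not have to re-key answers into the chart."""
        questionnaire_response = {
            "resourceType": "QuestionnaireResponse",
            "status": "completed",
            "questionnaire": "Questionnaire/checkin-screening",  # placeholder reference
            "subject": {"reference": f"Patient/{patient_id}"},
            "item": [
                {
                    "linkId": "crc-test-requested",
                    "text": "Colorectal cancer screening test requested",
                    "answer": [{"valueString": crc_choice}],
                }
            ],
        }
        resp = requests.post(
            f"{FHIR_BASE}/QuestionnaireResponse",
            json=questionnaire_response,
            headers={"Content-Type": "application/fhir+json"},
            timeout=10,
        )
        resp.raise_for_status()

    # Example (requires a real FHIR server): file_crc_request("12345", "colonoscopy")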

We asked participants for their thoughts about how we could mitigate noise if patients viewed the CRC video in the waiting room. They said earbuds might be a good idea if they were not cost prohibitive, and they also liked the idea of using closed captions. They suggested that we allow patients to fast-forward through the video or skip it entirely.

“Now, who would cover the cost for the earbuds ‘cause I know eventually I know those will probably be our—mainly disposable earbuds—I can only imagine they’re kind of expensive.” (Practice Manager, NC04)



Suggestions for Additional iPad Features

We asked participants for additional features ([Table 2]) to add to the mPATH program in future versions. Although time and efficiency were underlying themes across feature suggestions, improving quality metrics and increasing routine screening were also important.

Table 2 Additional features participants requested

Providers: Medicare wellness questions; standardized physical forms; HIPAA authorizations; additional cancer screenings

Practice managers: emergency contacts; patient contact information; quality metrics; patient portal enrollment

Clinic staff: medication refills; medication updates; allergies to medications; Medicare wellness questions; current symptoms; review of systems



Training and Support

Participants were presented with a draft plan for training and support that included a 45-minute lunch and learn session, identification of a clinic champion, and technical support. Most respondents said that 30 to 45 minutes would be appropriate, and providers said they would only require limited training or a handout.

“45 minutes should be okay, as long as it's not something that needs to be done every three months or when you guys update new stuff and stuff like that.” (Practice Manager, NC04)

Some participants said it would be important to have someone respond to technical issues within 24 hours, while others said the expected response time would depend on the severity of the issue, with more severe issues requiring a quicker response. They also said it would be important for the program to have few technical problems in order to instill confidence in users. When asked about the best methods for communicating an issue, respondents said that email, phone, and instant message/text should all be available options.

“I do think from a sort of buy-in perspective it is super important that it work well from the beginning and that things aren't that bumpy. And so I think part of that is having good support that responds quickly. Especially at the beginning I think it inspires confidence.” (Provider, NC03)

We asked participants the best way to identify a “clinic champion,” who would help keep the program going and serve as the primary clinic contact. They suggested someone who is already in a training role, is comfortable with technology, and is available onsite most days. Roles they suggested included the clinic coordinator, practice manager, and front desk team leader.

“The biggest things is just make sure that your focus is don't overwhelm them. If you try to give them too much information at any one time, they'll shut down. That's the honest truth. It has to be sold as a patient satisfier and a clinical-staffing satisfier.” (Practice Manager, NC12)



Comparisons of Responses by Setting

The Kentucky participants worked in a standalone FQHC in rural Kentucky, while the North Carolina participants worked in primary care clinics affiliated with a large academic medical center in suburban Winston-Salem, NC. These settings serve very different patient populations. While the Kentucky participants were concerned about their patients being unable to read and to use technology, the North Carolina participants had concerns about language barriers. EHR integration was a concern for the Kentucky participants because they said they do not like their system and it changes frequently, whereas the North Carolina clinics' EHR is more robust.

These differences in clinic and patient characteristics may have led to the differences in participant feedback ([Table 3]).

Table 3 Patient barriers and concerns by setting

Barrier                          Suburban NC    Appalachian KY
Literacy level                   No             Yes
Confidentiality of data          No             Yes
Language barriers                Yes            No
Requiring reading glasses        Yes            No
Quality metrics                  Yes            No
EHR integration                  No             Yes
Stealing iPads                   Yes            Yes
Frequent EHR vendor changes      No             Yes
Positive EHR experience          Yes            No

Abbreviation: EHR, electronic health record.




Discussion

Overall, participants said the most important factors to facilitate implementation of the mPATH IDM tool in their clinics were that it fit into their current workflow, saved them time, and increased efficiency. These factors directly relate to the “perceived ease of use” and “perceived usefulness” of an application as outlined in the TAM.[10] Primary care clinic staff are often burdened with competing demands and priorities, time constraints, and numerous administrative and reporting requirements.[19] [20] [21] Participants' initial feedback on the tool was consistent with the literature: providers and staff would be motivated to use it if it fit easily into their current workflow and improved the efficiency of their tasks.[13] [22] [23] Similar to other studies, participants said barriers to using the tool included interference with patient flow in the clinic and the time required for patients to use the tool in the waiting room and return it to the front desk. These concerns were expressed more often by front desk and nursing staff than by providers or practice managers because the responsibility for maintaining patient flow lies with them.[24]

Interestingly, in our study, the provider role was not noted as a significant factor to facilitate implementation of mPATH. This may be because the focus of discussion was primarily on the front-end implementation of mPATH into the workflow rather than the back-end step of the provider ultimately ordering the CRC screening test; alternatively, this may reflect the changing role of providers from sole “decision maker” to “partner” in the overall clinical care team. However, another implementation study found that the provider role was the most important factor in ultimately determining the success of integrating decision support interventions.[25]

Participants said an additional incentive for them to use mPATH would be the ability to customize the app to incorporate other routine tasks, such as Medicare wellness questions and patient self-registration into their health record, but only if the answers could be uploaded automatically into the EHR without the need to manually re-enter any data. This relates back to the importance of time savings and efficiency for clinic staff and providers, as clinical teams often spend more time working in the EHR than in face-to-face time with patients.[26] [27] [28] It also suggests that improving IDM alone would be an insufficient motivator for practices to adopt a new program.

The rural Kentucky clinic participants, in particular, expressed dissatisfaction with their EHR overall and concern with how mPATH would be able to integrate with their EHR. Rural clinics in general have been slower to adopt and adapt to EHR requirements than more urban and academic-based clinics and systems,[29] [30] such as the North Carolina clinics, and this disparity was evident in this qualitative study.

Participants at both sites expressed concern about older patients being unable to use or apprehensive about using the iPad app; however, several studies contradict this concern. In our prior work, which included patients with low health literacy, we showed that 93% of patients aged 50 to 74 years were able to use mPATH without any assistance.[8] [12] Another recent study found that half of patients aged 55 and older had high eHealth literacy,[31] and Xie reported success implementing an eHealth program with older adults aged 56 to 91.[32]

We altered our strategy for incorporating mPATH in clinics based on what we heard from participants in this study. Given concerns about slowing clinic flow and the short time patients spend in the waiting room, we divided the program into a modular structure. Patients now complete only the check-in questions in the waiting room, and then if CRC screening is needed, they complete the CRC module while waiting for the provider in the exam room. This modular structure takes advantage of where patients spend most of their time waiting for an appointment.

We also purchased antimicrobial cases for the iPads and created a cleaning protocol in response to concerns about infection risk. This infection control strategy worked well until the COVID-19 pandemic started, at which time some practices again expressed concern. Shortly after the pandemic started, we placed mPATH on hold at all sites.

There were some limitations to this study. While we selected participating clinics to represent diversity in terms of rural/urban location and organizational structure, all clinics were located in the southeastern United States and only one rural FQHC was included in our sample. Clinics from other regions of the country may perceive different barriers and facilitators than what we captured. In addition, one focus group in Kentucky included not only clinic staff but also one provider and one practice manager, which may have affected how freely clinic staff responded to questions. As creators of the mPATH program, two of the coauthors would have an ownership interest if it were commercialized; however, the program has not been commercialized and these two coauthors did not participate in data collection or analysis.



Conclusion

Successful implementation of technology depends largely on its effects on staff and provider workload and clinic workflow, and on its functionality and integration with the clinic's EHR system. A positive impact in one or more of these areas is likely to generate buy-in from clinic personnel, leading to long-term utilization of the program. Building confidence and trust in the technology itself is also an important aspect that should not be overlooked in working toward long-term sustainability.



Clinical Relevance Statement

While we know the value of patients having the autonomy to make informed decisions about their health care, implementing an informed decision aid into routine clinical care is often the challenge that prevents patients from benefitting from these helpful tools. Our study sought to learn from staff and clinical providers in primary care practices how best to implement an IDM mHealth tool into routine care. We chose qualitative methods so that those who work in the clinics day to day could tell us how best to implement the mPATH tool into their workflow and thereby inform our implementation strategy, and we share this information so that others can learn from our experience.



Multiple Choice Questions

  1. Which of the following did participants say was important to implementing this digital health tool into routine practice at primary care clinics?

    a. It integrated with their clinic's electronic health record.

    b. It saved staff time.

    c. It was accessible for patients who speak Spanish.

    d. All of the above.

    Correct Answer: The correct answer is option d. Participants in this study said that all of these criteria would be important to implementing this tool into routine practice.

  2. Which of the statements is NOT true for successful implementation of new digital health tools?

    a. It is important that they do not interfere with patient throughput.

    b. Provider materials must be in an easy-to-read format that providers can skim for key pieces of information.

    c. There is no one-size-fits-all strategy for implementation into clinics.

    d. Clinic staff and nurses have extra time to help patients use new digital health tools.

    Correct Answer: The correct answer is option d. Participants said that clinic staff do not have extra time to assist patients with using this tool.



Conflict of Interest

D.P.M. and A.D. are co-inventors of mPATH. D.P.M., A.D., and Wake Forest University Health Sciences would have an ownership interest in the mPATH app if it were commercialized in the future. Neither D.P.M. nor A.D. participated in data collection or analysis. K.L.F. reports grants from the National Cancer Institute during the conduct of the study. A.C.S. reports grants from NIH/NCI during the conduct of the study, and grants and personal fees from Shattuck Labs outside the submitted work. A.D. reports grants from the National Cancer Institute during the conduct of the study; in addition, A.D. has an ownership interest in the mPATH app. D.P.M. reports grants from the National Cancer Institute during the conduct of the study; in addition, D.P.M. has an ownership interest in the mPATH app. The other authors declare that they do not have a conflict of interest to report.

Acknowledgments

The authors wish to thank Diana E. Flores, MPS at Wake Forest School of Medicine for her important contributions to this manuscript.

Author Contributions

N.P.O. developed the interview and focus group guides, conducted interviews and focus groups in Winston-Salem, conducted complete analysis of the data, and was a major contributor in writing the manuscript. M.C. contributed to the interview and focus group guides, conducted interviews and focus groups in Kentucky, conducted complete analysis of the data, and was a major contributor in writing the manuscript. K.L.F. contributed to the interview and focus group guides and was a major contributor in writing the manuscript. M.B.D. and A.D. contributed to the interview and focus group guides and contributed to writing the manuscript. A.C.S. contributed to writing the manuscript. D.P.M. contributed to the interview and focus group guides, and was a major contributor in writing the manuscript. All authors read and approved the final manuscript.


Protection of Human and Animal Subjects

Study participants were informed of their rights as participants and of the risks and benefits of participation, and financial disclosures were declared. A waiver of signed informed consent was approved by the IRB. The Wake Forest School of Medicine Institutional Review Board approved this study (IRB00048919).


Contributions to the Literature

While there is a body of evidence suggesting strategies for incorporating mHealth tools into health care, there is no one-size-fits-all strategy for implementing patient-facing shared decision making tools across diverse clinic settings and populations.


The few studies that have examined implementation strategies for incorporating health apps into primary care have yielded mixed results, and the optimal strategies remain unknown.


Although we found strategies for general implementation of mHealth tools in the literature, researchers must recognize that clinic- and patient-level barriers vary widely and should be identified so that tools can be adapted for more successful implementation.


We found that the ability to adapt the implementation strategy to protect or improve patient throughput is critical for successful implementation and maintenance. This finding contributes to the literature and will guide others seeking to implement new interventions in busy clinical environments.


Note

Due to the nature of this qualitative study, data sharing is not applicable to this article as no datasets other than transcripts were generated or analyzed during the current study.


References

  • 1 Hughes TM, Merath K, Chen Q. et al. Association of shared decision-making on patient-reported health outcomes and healthcare utilization. Am J Surg 2018; 216 (01) 7-12
  • 2 Stacey D, Légaré F, Lyddiatt A. et al. Translating evidence to facilitate shared decision making: development and usability of a consult decision aid prototype. Patient 2016; 9 (06) 571-582
  • 3 Rex DK, Boland CR, Dominitz JA. et al. Colorectal cancer screening: recommendations for physicians and patients from the U.S. Multi-Society task force on colorectal cancer. Gastroenterology 2017; 153 (01) 307-323
  • 4 Bibbins-Domingo K, Grossman DC, Curry SJ. et al; US Preventive Services Task Force. Screening for colorectal cancer: US preventive services task force recommendation statement. JAMA 2016; 315 (23) 2564-2575
  • 5 Yarnall KSH, Pollak KI, Østbye T, Krause KM, Michener JL. Primary care: is there enough time for prevention?. Am J Public Health 2003; 93 (04) 635-641
  • 6 Ling BS, Trauth JM, Fine MJ. et al. Informed decision-making and colorectal cancer screening: is it occurring in primary care?. Med Care 2008; 46 (09, Suppl 1): S23-S29
  • 7 Chen J, Mullins CD, Novak P, Thomas SB. Personalized strategies to activate and empower patients in health care and reduce health disparities. Health Educ Behav 2016; 43 (01) 25-34
  • 8 Miller Jr DP, Denizard-Thompson N, Weaver KE. et al. Effect of a digital health intervention on receipt of colorectal cancer screening in vulnerable patients: a randomized controlled trial. Ann Intern Med 2018; 168 (08) 550-557
  • 9 Gagné ME, Boulet LP. Implementation of asthma clinical practice guidelines in primary care: a cross-sectional study based on the knowledge-to-action cycle. J Asthma 2018; 55 (03) 310-317
  • 10 Venkatesh V, Bala H. Technology acceptance model 3 and a research agenda on interventions. Decis Sci 2008; 39 (02) 273-315
  • 11 Bangor A, Kortum P, Miller J. An empirical evaluation of the system usability scale. Int J Hum Comput Interact 2008; 24 (06) 574-594
  • 12 Miller Jr DP, Weaver KE, Case LD. et al. Usability of a novel mobile health iPad app by vulnerable populations. JMIR Mhealth Uhealth 2017; 5 (04) e43
  • 13 Fernald DH, Jortberg BT, Hessler DM. et al. Recruiting primary care practices for research: reflections and reminders. J Am Board Fam Med 2018; 31 (06) 947-951
  • 14 Sandelowski M, Leeman J. Writing usable qualitative health research findings. Qual Health Res 2012; 22 (10) 1404-1413
  • 15 Saunders B, Sim J, Kingstone T. et al. Saturation in qualitative research: exploring its conceptualization and operationalization. Qual Quant 2018; 52 (04) 1893-1907
  • 16 ATLAS.ti Version 7.5.18 [computer software]. Berlin: Scientific Software Development GmbH; 2017
  • 17 Green J, Thorogood N. Qualitative methods for health research. London; Los Angeles: SAGE; 2018
  • 18 Sim J, Waterfield J. Focus group methodology: some ethical challenges. Qual Quant 2019; 53 (06) 3003-3022
  • 19 Shires DA, Stange KC, Divine G. et al. Prioritization of evidence-based preventive health services during periodic health examinations. Am J Prev Med 2012; 42 (02) 164-173
  • 20 Harry ML, Truitt AR, Saman DM. et al. Barriers and facilitators to implementing cancer prevention clinical decision support in primary care: a qualitative study. BMC Health Serv Res 2019; 19 (01) 534
  • 21 Lafata JE, Shay LA, Brown R, Street RL. Office-based tools and primary care visit communication, length, and preventive service delivery. Health Serv Res 2016; 51 (02) 728-745
  • 22 Cresswell K, Majeed A, Bates DW, Sheikh A. Computerised decision support systems for healthcare professionals: an interpretative review. Inform Prim Care 2012; 20 (02) 115-128
  • 23 Granja C, Janssen W, Johansen MA. Factors determining the success and failure of eHealth interventions: systematic review of the literature. J Med Internet Res 2018; 20 (05) e10235
  • 24 Nápoles AM, Appelle N, Kalkhoran S, Vijayaraghavan M, Alvarado N, Satterfield J. Perceptions of clinicians and staff about the use of digital technology in primary care: qualitative interviews prior to implementation of a computer-facilitated 5As intervention. BMC Med Inform Decis Mak 2016; 16: 44
  • 25 Frosch DL, Singer KJ, Timmermans S. Conducting implementation research in community-based primary care: a qualitative study on integrating patient decision support interventions for cancer screening into routine practice. Health Expect 2011; 14 (Suppl. 01) 73-84
  • 26 Young RA, Burge SK, Kumar KA, Wilson JM, Ortiz DF. A time-motion study of primary care physicians' work in the electronic health record era. Fam Med 2018; 50 (02) 91-99
  • 27 Kroth PJ, Morioka-Douglas N, Veres S. et al. Association of electronic health record design and use factors with clinician stress and burnout. JAMA Netw Open 2019; 2 (08) e199609
  • 28 Arndt BG, Beasley JW, Watkinson MD. et al. Tethered to the EHR: primary care physician workload assessment using EHR event log data and time-motion observations. Ann Fam Med 2017; 15 (05) 419-426
  • 29 Mack D, Zhang S, Douglas M, Sow C, Strothers H, Rust G. Disparities in primary care EHR adoption rates. J Health Care Poor Underserved 2016; 27 (01) 327-338
  • 30 Skillman SM, Andrilla CH, Patterson DG, Fenton SH, Ostergard SJ. Health information technology workforce needs of rural primary care practices. J Rural Health 2015; 31 (01) 58-66
  • 31 Arcury TA, Sandberg JC, Melius KP. et al. Older adult internet use and eHealth literacy. J Appl Gerontol 2020; 39 (02) 141-150
  • 32 Xie B. Effects of an eHealth literacy intervention for older adults. J Med Internet Res 2011; 13 (04) e90

Address for correspondence

Nicole Puccinelli-Ortega, MS
Department of Internal Medicine, Medical Center Boulevard
Winston-Salem, NC 27157-1063
United States   

Publication History

Received: 09 March 2021

Accepted: 27 October 2021

Article published online: 05 January 2022

© 2022. Thieme. All rights reserved.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

