Original Paper
Abstract
Background: After hospital discharge, patients with acute coronary syndrome (ACS) often experience symptoms that prompt them to seek acute medical attention. Early evaluation of postdischarge symptoms by health care providers may reduce unnecessary acute care utilization. However, hospital-initiated follow-up encounters are insufficient for timely detection and assessment of symptoms. While digital health tools can help address this issue, little is known about the intention to use such tools in ACS patients.
Objective: This study aimed to assess ACS patients’ intention to use digital health apps that support postdischarge symptom monitoring by health care providers and identify patient-perceived facilitators and barriers to app use.
Methods: Between December 2020 and April 2021, we used email invitations or phone calls to recruit ACS patients discharged from a central Massachusetts health care system to participate in the study. Surveys were delivered online or via phone to individual participants. Demographics and access to technology were assessed. The intention to use a symptom monitoring app was assessed using 5-point Likert-type items (from strongly agree to strongly disagree), such as “If this app were available to me, I would use it.” Responses were compared across demographic subgroups and survey delivery methods. Two open-ended questions assessed perceived facilitators and barriers to app use, with responses analyzed using qualitative content analysis.
Results: Among 100 respondents (response rate 8.1%), 45 (45%) completed the survey by phone. The respondents were on average 68 years old (SD 13 years), with 90% (90/100) White, 39% (39/100) women, and 88% (88/100) having access to the internet or a mobile phone. Most participants (65/100, 65%) agreed or strongly agreed that they would use the app, among which 53 (82%) would use the app as often as necessary. The percentage of participants with the intention to use the app was 75% among those aged 65-74 years and dropped to 43% among those aged 75 years or older. The intention to use was higher in online survey respondents (vs phone survey respondents; odds ratio 3.07, 95% CI 1.20-7.88) after adjusting for age and access to technology. The analysis of open-ended questions identified the following 4 main facilitators (motivations): (1) easily reaching providers, (2) accessing or providing information, (3) quickly reaching providers, and (4) consulting providers for symptoms, and the following 4 main barriers: (1) privacy/security concerns, (2) uncomfortable using technology, (3) user-unfriendly app interface, and (4) preference for in-person/phone care.
Conclusions: There was a strong intention to use a symptom monitoring app postdischarge among ACS patients. However, this intent decreased in patients aged 75 years or older. The survey identified barriers related to technology use, privacy/security, and the care delivery mode. Further research is warranted to determine whether such intent translates into app use, and better symptom management and health care quality.
doi:10.2196/34452
Introduction
The transition from inpatient care to home is challenging for patients with acute coronary syndrome (ACS) [
- ]. After hospital discharge, ACS patients often experience symptoms that prompt them to seek acute medical attention [ - ]. A large portion of these symptoms are noncardiac [ - ] and could be assessed and managed through close follow-up care in the outpatient setting to reduce unnecessary acute care utilization [ , - ]. Symptom assessment and management are integral to transitional care [ - ] and are also part of the transitional care management services supported by Medicare [ ]. However, hospital-initiated follow-up activities alone may be inadequate to detect symptoms in a timely fashion, as new or worsening symptoms may occur between the initial contact and the follow-up appointment [ ]. Intensive transitional care programs offering multiple follow-up phone calls or home visits may better capture patients’ symptom episodes [ , ], but providing such thorough contact increases the need for staff resources and time, and can be challenging to scale up.

Digital health tools for symptom monitoring can support timely detection and evaluation of patients’ symptoms [
- ], and have been successfully integrated with routine cancer care [ , , - ]. Some tools allowed patients to report symptoms frequently or at any time [ , ]. However, in general, evidence about the feasibility and efficacy of using these tools to improve patient outcomes is still limited, especially in patients with ACS. A recent study analyzed data related to using a digital symptom monitoring tool (which allowed patients to self-rate and track their symptoms of fatigue) to enhance a patient-centered care intervention for cardiac rehabilitation [ ]. This study found that the enhanced intervention improved patient-reported self-efficacy at 6 months postdischarge, compared with usual care (P=.01). However, only 39% of the patients in the intervention group chose to use the digital health tool.

More research is needed to understand the intention, barriers, and facilitators related to digital health symptom monitoring in ACS patients. This is particularly true among older adults (≥65 years old), who are representative of the ACS population. Older adults have unique barriers to using technology, such as lack of knowledge and confidence, age-related changes or disabilities, and skepticism about the benefits [
, ]. Prior studies showed that most patients, including older adults, are ready to accept digital health tools for monitoring mental health conditions and symptoms, but the intention to use decreased with age [ , ]. Understanding these issues may help improve the design and development of, and adherence to, digital symptom monitoring in ACS patients.

This study aimed to assess ACS patients’ intention to use digital health tools that support symptom monitoring by providers after hospital discharge. We conducted a survey, using both close-ended and open-ended questions, to assess the intention to use, the differences in intention by patient characteristics (eg, age), and the facilitators and barriers to using these tools in this patient population. We also compared the intention to use between 2 survey delivery modes (online vs phone).
Methods
Study Design
We analyzed data collected through a survey using both close-ended and open-ended questions. The survey was delivered via 1 of 2 modes (an online survey or a phone call) to ensure a balanced sample of participants who were and were not comfortable using technology (ie, completing online surveys).
Ethics Approval
The study was approved by the Institutional Review Board at the University of Massachusetts Chan Medical School. The ethics approval number (ie, the Institutional Review Board Docket Number) for this study is H00018298. The Institutional Review Board approved the use of informed verbal consent procedures. We obtained verbal informed consent from each participant by email or phone.
Survey
The survey design was informed by prior literature on assessing participants’ intention to use digital interventions [
, ]. One researcher (with expertise in health informatics and implementation science) created the initial survey by adapting a subset of validated questions from a survey assessing participants’ intention to use mobile apps for COVID-19 symptom monitoring [ ]. A cardiologist and 2 research team members (with training in public health and clinical research, respectively) reviewed the survey content and provided feedback on clarifying and simplifying the language of the introduction paragraph, the survey questions, and the response options.

The final survey (
) included 5 items to assess participants’ demographics (age, sex, and race) and access to technology (internet and smartphone), and 5 items (3 close-ended and 2 open-ended questions) related to the intention to use a hypothetical symptom monitoring app. The demographics questions and the open-ended questions were optional. Intention to use the app was assessed using a 5-point Likert-type item (from “strongly agree” to “strongly disagree”), also called the intention-to-use question, as follows: “If this app were available to me, I would use it.” Participants who responded “strongly agree,” “agree,” or “neutral” to this item were prompted to respond to 2 additional items. The first was a 5-point Likert-type item as follows: “I plan to use this app as often as necessary,” with response options ranging from “strongly agree” to “strongly disagree.” The second was a multiple-choice item as follows: “I’d like the app to be designed as …,” with the following 3 options: “mobile app,” “web app,” and “other.” The 2 remaining open-ended questions collected free-text comments on the facilitators (ie, motivations) and barriers to using the app.

Recruitment and Data Collection
We recruited patients from UMass Memorial Health Care, the largest health care system in Central Massachusetts, serving most patients hospitalized with cardiovascular diseases in this region.
Using information from electronic health records (EHRs), we identified adult patients (>18 years old) who were hospitalized for ACS (ICD-10 codes: I24.9, I21, I21.x, I21.xx, and I25.110) between January 2019 and December 2020, as eligible participants. Study data were collected and managed using REDCap electronic data capture tools hosted at the study institution [
, ].

We recruited participants with a 2-stage procedure, using emails and phone calls, respectively. In the first stage (December 2020), we emailed invitations to 782 candidate participants. Once a participant replied to the email to indicate their interest, we sent the online survey via a secure REDCap link to their email address. An unanswered survey was automatically disabled in REDCap 30 days after being sent to the participant. Recruitment stopped after more than 40 participants responded to the online survey.
In the second stage (January 2021 to April 2021), we recruited participants who did not have an email address listed in the EHR via phone calls. Recruitment calls were made to 448 candidate participants until the total number of responses to the survey (from both email and phone recruitment) met the target (N=100). For phone recruitment, we documented the reasons for declining participation. Participants recruited by phone were given the option to complete the survey online (using the same procedure described for stage 1) or via phone. For surveys answered by phone, a research staff member documented participants’ verbal responses in REDCap. Each survey participant (for both stages of participant recruitment) was provided a US $10 gift card to compensate for their time.
Research Questions
The following 4 research questions were considered: (1) Do patients have the intention to use the app for symptom monitoring by providers? (Q1); (2) Is there a difference in the intention to use the app for symptom monitoring across subgroups characterized by participants’ characteristics, including age and access to technology? (Q2); (3) Is there a difference in the intention to use the app for symptom monitoring between participants responding to the survey online and those responding by phone? (Q3); and (4) What are the main factors that motivate or discourage patients’ use of an app for symptom monitoring by providers? (Q4).
Statistical Analyses
Statistical analyses were performed using STATA/IC 15.1 (StataCorp). We first calculated descriptive statistics of participants’ characteristics and examined their distributions over the 2 survey delivery modes. We then analyzed the data to answer research questions 1 to 3. We used participants’ age information from the EHR, which has greater granularity than the survey responses, for these analyses.
First, we calculated descriptive statistics of participants’ responses to the 3 close-ended survey questions related to the intention to use the symptom monitoring app (Q1). Second, we examined the distribution of the intention to use over participants’ characteristics and access to technology (Q2). Third, we assessed the associations between survey delivery mode and participants’ intention to use the app (Q3), using multivariable logistic regression to adjust for potential confounding factors related to participants’ characteristics and access to technology. We identified the confounders based on the literature and on the examination of the distribution of participants’ characteristics over survey delivery mode (P<.05). In addition, we combined access to the internet and access to a smartphone into 1 variable, access to technology, when adjusting for confounders in this association analysis, because the 2 variables are interdependent (Fisher exact test P<.001).
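As an illustration of this interdependence check, the following sketch (not the authors’ Stata code) runs a Fisher exact test on a 2×2 internet-by-smartphone table; the cell counts are back-calculated from the marginal counts reported in the Results (85 participants with internet access, 75 with a smartphone, and 88 with either).

```python
# Illustrative sketch only: Fisher exact test of the interdependence between
# internet access and smartphone access. Cell counts are back-calculated from
# the reported marginals (85 internet, 75 smartphone, 88 with either; N=100).
from scipy.stats import fisher_exact

#                   smartphone: yes  smartphone: no
table = [[72, 13],  # internet: yes
         [3, 12]]   # internet: no

odds_ratio, p_value = fisher_exact(table)
print(f"Fisher exact test: OR = {odds_ratio:.1f}, P = {p_value:.2g}")
```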
When conducting analyses related to questions 2 and 3, we grouped the 5 response options of the intention-to-use question into 2 categories, with 1 representing “agree” and “strongly agree” and 0 representing the other options. In addition, we assigned numeric values to the 5 response options (1: strongly disagree, 2: disagree, 3: neutral, 4: agree, 5: strongly agree) and presented the summary statistics of the responses.
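To make these steps concrete, here is a minimal sketch of the dichotomization, scoring, and adjusted logistic regression described above. It is not the study’s actual Stata code; it assumes a pandas DataFrame with hypothetical columns `intent` (Likert response text), `online_survey` (1 = online, 0 = phone), `age` (years), and `access_tech` (1 = has internet or a smartphone, 0 = neither).

```python
# A minimal sketch (assumed column names, not the study's code) of dichotomizing
# the Likert responses and fitting the adjusted logistic regression.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

LIKERT_SCORE = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
                "agree": 4, "strongly agree": 5}

def prepare(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    # Numeric scores (1-5) for summary statistics.
    out["intent_score"] = out["intent"].str.lower().map(LIKERT_SCORE)
    # Dichotomize: 1 = agree or strongly agree, 0 = all other options.
    out["intent_pos"] = (out["intent_score"] >= 4).astype(int)
    return out

def adjusted_odds_ratios(df: pd.DataFrame) -> pd.DataFrame:
    # Logistic regression of positive intention on survey delivery mode,
    # adjusting for age and access to technology (the identified confounders).
    model = smf.logit("intent_pos ~ online_survey + age + access_tech",
                      data=df).fit(disp=False)
    # Exponentiate coefficients to obtain adjusted odds ratios with 95% CIs.
    table = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
    table.columns = ["OR", "CI 2.5%", "CI 97.5%"]
    return table
```

In Stata, the same adjustment could be expressed as a single `logistic` command reporting odds ratios; the Python sketch above simply mirrors that model.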
Qualitative Analyses
To answer research question 4, we analyzed survey responses to the 2 open-ended survey questions through an iterative process using qualitative content analysis. Qualitative content analysis is a research method widely used to analyze written, verbal, or visual communication messages through the systematic coding and identification of themes or patterns [
- ]. Following established techniques [ , ], we carried out the analysis over 3 phases (ie, preparation, organizing, and reporting).

In the preparation phase, GEE (premed student with training in biology, neuroscience, and clinical research) read through the survey responses and assigned initial codes to the responses. JC (with expertise in health informatics and implementation science), JGW (with training in public health and health education), and GEE discussed the initial coding results and created the initial codebook. Using the initial codebook, GEE, JGW, and LML (with training in clinical research and neuroscience) coded all survey responses independently. Codes were assigned to each response (primarily single sentences), and double coding was allowed. The coded responses were discussed among GEE, JGW, LML, and JC to resolve discrepancies, and new codes were added when necessary. This process resulted in the final codebook (
), with 9 codes (4 categories) for the facilitator question and 8 codes (4 categories) for the barrier question. Based on the coding results, JC segmented survey responses into units that each entail a single code. Most segments were single sentences; some were phrases or contained multiple sentences.

In the organizing phase, JC and JGW independently coded the segments using the final codebook. The intercoder agreement was 86% for the facilitator question and 87% for the barrier question. Discrepancies were discussed and resolved between JC and JGW to generate the final coding results.
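The intercoder agreement reported here is percent agreement, that is, the proportion of segments to which the 2 coders assigned the same code; a minimal sketch with hypothetical code labels (not from the study codebook) is shown below.

```python
# A minimal sketch of percent intercoder agreement; the segment codes below are
# hypothetical and do not come from the study codebook.
def percent_agreement(coder1: list[str], coder2: list[str]) -> float:
    """Proportion of segments that received the same code from both coders."""
    if len(coder1) != len(coder2):
        raise ValueError("Both coders must code the same set of segments.")
    matches = sum(a == b for a, b in zip(coder1, coder2))
    return matches / len(coder1)

coder_a = ["easy_access", "privacy_concern", "quick_access", "provide_info"]
coder_b = ["easy_access", "privacy_concern", "provide_info", "provide_info"]
print(f"Agreement: {percent_agreement(coder_a, coder_b):.0%}")  # prints 75%
```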
In the reporting phase, we reported the definitions, frequencies, and representative quotes of codes and summarized key findings [
, ]. We identified the major barriers and facilitators to app use by considering code/category frequency and the existing literature on health app use among patients or older adults, and through discussion within the research team. In addition, we compared the most salient facilitators and barriers for the following 2 age groups: younger than 65 years and 65 years or older.

Results
Participant Characteristics
Among 782 patients contacted by email, 59 (7.5%) showed interest in participating in the study, and 48 (81%) of them responded to the survey. Among 448 patients contacted by phone calls, 61 (13.6%) showed interest, and 52 (85%) of them responded to the survey. Overall, the survey response rate was 8.1% (100/1230). There was no difference in age between patients who responded to the survey and patients who did not, including those who did not show interest in participating in the study (67.6 vs 67.7 years, P=.94). Of the patients contacted for this study who declined to participate, 73 provided reasons for nonparticipation. The common reasons included poor health condition (n=31, 42%), no interest (n=17, 23%), no time (n=11, 15%), and no access to or discomfort with the use of technology (n=9, 12%).
Among 100 respondents, 45% (ie, 45 of the participants recruited by phone) completed the survey by phone and 55% completed it online. The respondents were on average 68 years old (SD 13 years), with 90% (90/100) White, 39% (39/100) women, and 88% (88/100) reporting having access to the internet or a mobile phone. As shown in the table below, the rates of access to the internet (P<.001) and a smartphone (P<.001) were higher in online survey respondents than in phone survey respondents. Among the 62 older participants (≥65 years old), 49 (79%) and 41 (66%) reported having access to the internet and a smartphone, respectively.

| Characteristic | Total (N=100), n (%) | Phone (n=45), n (%) | Online (n=55), n (%) | P value^a |
| --- | --- | --- | --- | --- |
| Age group | | | | .82 |
| <65 years | 38 (38) | 16 (36) | 22 (40) | |
| 65-74 years | 32 (32) | 14 (31) | 18 (33) | |
| ≥75 years | 30 (30) | 15 (33) | 15 (27) | |
| Gender | | | | .41 |
| Female | 39 (39) | 20 (44) | 19 (35) | |
| Male | 59 (59) | 24 (53) | 35 (64) | |
| Not reported | 2 (2) | 1 (2) | 1 (2) | |
| Race | | | | >.99 |
| White | 90 (90) | 39 (87) | 51 (93) | |
| Others | 6 (6) | 3 (7) | 3 (5) | |
| Not reported | 4 (4) | 3 (7) | 1 (2) | |
| Has access to the internet | | | | <.001^b |
| No | 15 (15) | 14 (31) | 1 (2) | |
| Yes | 85 (85) | 31 (69) | 54 (98) | |
| Has a smartphone | | | | <.001^b |
| No | 25 (25) | 19 (42) | 6 (11) | |
| Yes | 75 (75) | 26 (58) | 49 (89) | |
^a Calculated by the Fisher exact test for categorical variables, using complete case analysis (ie, ignoring missing values for gender and race).
^b Statistically significant (P<.05).
Intention to Use the Symptom Monitoring App
All participants (N=100) responded to the intention-to-use survey item, with responses of strongly agree (n=19), agree (n=46), neutral (n=15), disagree (n=15), and strongly disagree (n=5). A total of 74 participants responded to the survey item “I plan to use this app as often as necessary,” with responses of strongly agree (n=22), agree (n=35), neutral (n=16), disagree (n=1), and strongly disagree (n=0). Among the 65 (65%) respondents with a positive intention (agree or strongly agree) to use the app, 53 (82%) agreed or strongly agreed that they would use the app as often as necessary. Among the 73 respondents to the app design question, 28 (38%) preferred a mobile app, 30 (41%) preferred a web-based app, 14 (19%) liked both mobile and web-based apps, and 1 (1%) preferred another design (unspecified).
Intention to Use by Patient Characteristics
Among the 62 older participants (≥65 years old), 37 (60%) reported having the intention to use the app. As shown in the table below, survey respondents aged 75 years or older had a lower rate of intention (ie, agree or strongly agree) to use the app (43%) than those in other age groups (74% for ages under 65 years and 75% for ages 65-74 years; Fisher exact test P=.02). There was no difference in the intention to use by gender or race. The rate of the intention to use the app was higher in respondents with access to the internet or a smartphone than in those without access (72% vs 17%, P<.001).

The mean and median scores of the intention to use, as well as the distributions of the 5 response levels, stratified by participant characteristics, showed similar patterns.

| Variable^a | Response score^b, mean (SD) | Positive (agree or strongly agree) intention, n/N | Positive intention, % | P value^c |
| --- | --- | --- | --- | --- |
| All | 3.6 (1.1) | 65/100 | 65 | |
| Age group | | | | .02^d |
| <65 years | 3.9 (0.8) | 28/38 | 74 | |
| 65-74 years | 3.7 (1.1) | 24/32 | 75 | |
| ≥75 years | 3.1 (1.3) | 13/30 | 43 | |
| Gender | | | | >.99 |
| Female | 3.6 (1.0) | 25/39 | 64 | |
| Male | 3.6 (1.2) | 39/59 | 66 | |
| Race | | | | .66 |
| White | 3.6 (1.1) | 59/90 | 66 | |
| Others | 3.3 (0.8) | 3/6 | 50 | |
| Has access to technology (internet or a smartphone) | | | | <.001^d |
| No | 2.2 (1.0) | 2/12 | 17 | |
| Yes | 3.8 (1.0) | 63/88 | 72 | |
| Survey delivery mode | | | | .001^d |
| Phone | 3.1 (1.3) | 21/45 | 47 | |
| Online | 4.0 (0.8) | 44/55 | 80 | |
^a The gender and race variables had 2 and 4 missing values, respectively.
^b Scores assigned to the response options were as follows: 1, strongly disagree; 2, disagree; 3, neutral; 4, agree; 5, strongly agree.
^c Calculated by the Fisher exact test for all the items.
^d Statistically significant (P<.05).
Intention to Use by the Survey Delivery Mode
The rate of a positive intention to use the app was higher in online survey respondents than in phone survey respondents (80% vs 47%, P=.001). After adjusting for age and access to technology, the difference remained significant (adjusted odds ratio 3.07, 95% CI 1.20-7.88).

Similarly, the mean and median scores of the intention to use were higher in online survey respondents (mean 4.0, median 4) than in phone survey respondents (mean 3.1, median 3).

Facilitators and Barriers to Using the App
A total of 84 (84%) participants responded to the facilitator question, for which we identified 73 segments (from 66 participants) that described facilitators. A total of 80 (80%) participants responded to the barrier question, for which we identified 70 segments (from 63 participants) that described barriers. The analyses of these segments identified 9 facilitators or motivations and 9 barriers. The major facilitators included (1) easily reaching providers, (2) accessing or providing information, (3) quickly reaching providers, and (4) consulting providers for symptoms. We distinguished between facilitators 1 and 3, with facilitator 1 focusing on convenience of care access (see the codebook for code definitions and more example quotes). The main barriers included (1) privacy/security concerns, (2) being uncomfortable using technology, (3) a user-unfriendly app interface, and (4) a preference for in-person/phone care.

Among participants under 65 years, 87% (33/38) mentioned facilitators to app use, with the most noticeable one being “easily reach providers” (frequency of 14). Among participants aged 65 years or older, 53% (33/62) mentioned facilitators, with the most noticeable one being “access and provide information” (frequency of 8). Among participants under 65 years, 55% (21/38) mentioned barriers to app use, with the most noticeable one being “lack of timely response” (frequency of 5). Among participants aged 65 years or older, 65% (40/62) mentioned barriers, with the most noticeable one being “uncomfortable with technology” (frequency of 12).
Discussion
Principal Findings
This is the first study to assess the intention to use a postdischarge symptom monitoring app in ACS patients. We found that most (65/100, 65%) ACS patients had the intention to use an app to monitor and report postdischarge symptoms to providers. Compared with other participants, those aged 75 years or older or lacking access to technology (ie, internet and smartphones) had a lower intention to use the app. Furthermore, phone survey respondents had a lower intention to use the app than online survey respondents. Open-ended survey questions identified important facilitators and barriers to using the app in the following 4 domains: access to care, communication, technology, and privacy.

Intention to Use Digital Symptom Monitoring in Older Patients With ACS
Although ACS patients are mostly older adults, we still found a high intention to use the symptom monitoring tool in this population. Specifically, 60% of older participants (≥65 years old) had the intention to use the app. Furthermore, the percentage of participants aged 65-74 years who had the intention to use the app (75%) was as high as that (74%) among younger participants. Our findings are compatible with previous findings on the intention to use health information technology, including symptom monitoring apps, in older adults [
, ]. For example, prior studies found that 46%-51% of participants older than 60 years would like to use a mobile app to track mental health conditions [ , ]. Other studies also found mobile symptom tracking apps acceptable for older patients with heart failure [ , ], and an app incorporating design features specific to older adults received high usability scores [ ]. Similar to prior studies [ , ], we found that older participants had a lower intention to use the app, but we saw this pattern only in participants aged 75 years or older.

Lack of an Email Address in the EHR: A Potential Indicator for a Low Intention to Use Digital Symptom Monitoring
For this study, we intentionally used phone calls to recruit patients who did not have an email address in the EHR. The absence of an email address may imply a lack of email access, infrequent use of email, or less comfort with sending and receiving emails. Most of these participants (ie, those without an email address in the EHR) chose to complete the survey over the phone and had a lower intention to use a symptom monitoring app, even after adjusting for age and access to technology. This suggests that a lack of an email address itself may be a useful predictor and provide meaningful information for health care teams making decisions about remote symptom monitoring postdischarge. In the future, this information (ie, lack of an email address in the EHR) can be used to purposefully sample key informants to help design and user test symptom monitoring apps and identify patients who may need greater training and support in app use.
Patient-Perceived Facilitators and Barriers to Using Digital Symptom Monitoring
This study also identified important facilitators and barriers to using a symptom monitoring app in ACS patients. Prior studies found that perceived usefulness significantly influenced the intention to use medical apps in older patients [
, , , ]. Similarly, we found that the facilitators or motivations for using a symptom monitoring app were mainly related to the perceived usefulness of the app, such as reaching health providers easily, accessing and providing health information, and consulting with providers regarding symptom management. The major barrier identified was patients’ concerns about privacy and security. This is common with digital health interventions and needs to be addressed from the perspectives of both the app and the users [ - ]. In addition to following the regulations and incorporating standard security features in app design [ , ], it is important to assess user opinions on desired privacy and security features in their local context [ , ]. In this study, we found that ACS patients were concerned about who will access their health information and about the disclosure of their health information to a third party without their knowledge and authorization. Using hospital-authorized apps, clearly communicating an app’s privacy statement to patients, and providing options for choosing which information to disclose and with whom may reduce this barrier. Similar to prior studies [ , ], we found that the most notable barrier to using the symptom monitoring app in older (≥65 years old) ACS patients was being uncomfortable using technology. Patient-centered app design, in-hospital training for app use, and app use support from caregivers may help reduce these barriers [ ].

Previous studies found that patients sometimes have challenges in deciding when to use an app to report symptoms. For example, patients sometimes reported urgent issues via secure messaging services designed for communicating nonurgent issues [
- ]. In addition, prior studies found that ACS patients were more stressed by certain symptoms and that 15% of patients developed stress disorder symptoms after ACS [ , ]. It is likely that some patients would unnecessarily seek acute care when experiencing nonurgent symptoms [ ]. In this study, we did not find these issues to be a theme when analyzing patient-reported barriers to app use. However, it is important to communicate with patients about the appropriate use of a symptom monitoring app and how frequently providers would review or respond to patient reports. Patient education on how to assess the severity of symptoms, for example, identifying typical ACS symptoms that need urgent care, is also relevant and may improve health care utilization.

Implications for App Design and Development
Whether an intention to use a digital health app can translate into real use depends on many factors, such as app design and implementation strategies to support app use. In addition to general app design principles (eg, secure and easy to use), this study suggests additional considerations in app development for ACS patients. Specifically, we found that older age and lack of access to technology were associated with a low intention to use the app, and the most common barrier to app use in older adults was being uncomfortable using technology. This suggests that a multimodal strategy may be more effective in engaging these patients. For those who have nonsmart phones or are less comfortable using apps, text messaging may serve as an additional communication channel. Alternatively, app design may allow for the involvement of family members or caregivers in symptom tracking. In addition, accessible design principles for older adults may be incorporated by including a consistent and simple interface, making the most essential functionalities readily visible and available, and making it easier to “undo” an unintended action [
, ]. A co-creation approach that engages older patients in all stages of app development and user testing is also important for improving app adoption and user experience [ , ].

In this study, we also found that patients were motivated to use an app to easily reach providers. Therefore, the app should allow providers to easily access symptom reports, triage symptoms, and respond to patient symptoms and concerns. It is also critical to engage providers in all phases of app design and testing. App adoption will need to address how to integrate information from the app into the EHR, and assess the impact of the app on provider burden and clinical workflow [
, ].Limitations
Our sample was relatively small and drawn from a health care system in 1 state, and most participants were non-Hispanic White. Therefore, our findings may not be generalizable to other settings. Because of the survey format, participants’ responses to the open-ended questions were typically short and lacked detailed information about the contextual factors related to the perceived facilitators and barriers. We interpreted these qualitative results in light of the existing literature. In-depth qualitative studies are warranted to better understand certain barriers, such as the preference for in-person care and phone communication.
Conclusions
We found a strong intention to use a symptom monitoring app postdischarge among ACS patients. However, this intent was lower in patients aged 75 years or older. Our survey identified barriers related to privacy and security, technology use, and the care delivery mode. Using hospital-authorized apps and providing in-hospital training may reduce these barriers. Further research is warranted to determine whether such intent translates into app use, and better symptom management and health care quality.
Acknowledgments
This work was supported by the National Heart, Lung and Blood Institute (grant 1K12HL138049-3). TKH, RSS, and JC also received funding support from the National Cancer Institute (grant 1P50CA244693). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Authors' Contributions
JC and RSS conceived this study. JC designed the study with inputs from RSS, JGW, and LML. LML and GEE recruited and surveyed the participants. JC analyzed the quantitative survey data. GEE, JGW, LML, and JC coded responses to open-ended survey questions, and JC and JGW finalized the qualitative analyses. RSS, BSG, and TKH provided expertise in digital health research and critical intellectual inputs to data analysis and result interpretation. JC drafted the manuscript. JGW, GEE, LML, and BSG contributed to paper writing. RSS, BSG, and TKH provided important feedback for paper revision. All authors reviewed and provided feedback for the manuscript and approved the final manuscript.
Conflicts of Interest
None declared.
Survey to assess the intention to use a symptom monitoring app.
DOCX File, 18 KB
Codebook and example quotes for facilitators and barriers to using a symptom monitoring app.
DOCX File, 28 KB
Intention to use a symptom monitoring app, stratified by patient characteristics and the survey delivery mode.
DOCX File, 24 KB

References
- Virani SS, Alonso A, Benjamin EJ, Bittencourt MS, Callaway CW, Carson AP, American Heart Association Council on EpidemiologyPrevention Statistics CommitteeStroke Statistics Subcommittee. Heart Disease and Stroke Statistics-2020 Update: A Report From the American Heart Association. Circulation 2020 Mar 03;141(9):e139-e596 [FREE Full text] [CrossRef] [Medline]
- Dharmarajan K, Hsieh AF, Lin Z, Bueno H, Ross JS, Horwitz LI, et al. Diagnoses and timing of 30-day readmissions after hospitalization for heart failure, acute myocardial infarction, or pneumonia. JAMA 2013 Jan 23;309(4):355-363 [FREE Full text] [CrossRef] [Medline]
- Wasfy JH, Strom JB, O'Brien C, Zai AH, Luttrell J, Kennedy KF, et al. Causes of short-term readmission after percutaneous coronary intervention. Circ Cardiovasc Interv 2014 Feb;7(1):97-103. [CrossRef] [Medline]
- Southern DA, Ngo J, Martin B, Galbraith PD, Knudtson ML, Ghali WA, et al. Characterizing types of readmission after acute coronary syndrome hospitalization: implications for quality reporting. J Am Heart Assoc 2014 Sep 18;3(5):e001046 [FREE Full text] [CrossRef] [Medline]
- Kwok CS, Shah B, Al-Suwaidi J, Fischman DL, Holmvang L, Alraies C, et al. Timing and Causes of Unplanned Readmissions After Percutaneous Coronary Intervention: Insights From the Nationwide Readmission Database. JACC Cardiovasc Interv 2019 Apr 22;12(8):734-748 [FREE Full text] [CrossRef] [Medline]
- Shah M, Patil S, Patel B, Agarwal M, Davila CD, Garg L, et al. Causes and Predictors of 30-Day Readmission in Patients With Acute Myocardial Infarction and Cardiogenic Shock. Circ Heart Fail 2018 Apr;11(4):e004310. [CrossRef] [Medline]
- Iribarne A, Chang H, Alexander JH, Gillinov AM, Moquete E, Puskas JD, et al. Readmissions after cardiac surgery: experience of the National Institutes of Health/Canadian Institutes of Health research cardiothoracic surgical trials network. Ann Thorac Surg 2014 Oct;98(4):1274-1280 [FREE Full text] [CrossRef] [Medline]
- Enderlin CA, McLeskey N, Rooker JL, Steinhauser C, D'Avolio D, Gusewelle R, et al. Review of current conceptual models and frameworks to guide transitions of care in older adults. Geriatr Nurs 2013;34(1):47-52. [CrossRef] [Medline]
- Medicare Program; CY 2020 Revisions to Payment Policies Under the Physician Fee Schedule and Other Changes to Part B Payment Policies; Medicare Shared Savings Program Requirements; Medicaid Promoting Interoperability Program Requirements for Eligible Professionals; Establishment of an Ambulance Data Collection System; Updates to the Quality Payment Program; Medicare Enrollment of Opioid Treatment Programs and Enhancements to Provider Enrollment Regulations Concerning Improper Prescribing and Patient Harm; and Amendments to Physician Self-Referral Law Advisory Opinion Regulations Final Rule; and Coding and Payment for Evaluation and Management, Observation and Provision of Self-Administered Esketamine Interim Final Rule. Centers for Medicare & Medicaid Services. 2019. URL: https://tinyurl.com/2p8zk663 [accessed 2022-02-14]
- Jack B, Paasche-Orlow M, Mitchell S, Forsythe S, Martin J, Brach C. An overview of the Re-Engineered Discharge (RED) Toolkit (Prepared by Boston University under Contract No. HHSA290200600012i). Boston University. Rockville, MD: Agency for Healthcare Research and Quality; 2012. URL: https://www.bu.edu/fammed/projectred/newtoolkit/ProjectRED-tool1-overview.pdf [accessed 2022-02-14]
- Coleman EA, Parry C, Chalmers S, Min S. The care transitions intervention: results of a randomized controlled trial. Arch Intern Med 2006 Sep 25;166(17):1822-1828. [CrossRef] [Medline]
- Hirschman KB, Shaid E, McCauley K, Pauly MV, Naylor MD. Continuity of Care: The Transitional Care Model. Online J Issues Nurs 2015 Sep 30;20(3):1 [FREE Full text] [Medline]
- Burke RE, Guo R, Prochazka AV, Misky GJ. Identifying keys to success in reducing readmissions using the ideal transitions in care framework. BMC Health Serv Res 2014 Sep 23;14:423 [FREE Full text] [CrossRef] [Medline]
- Bloink J, Adler KG. Transitional care management services: new codes, new requirements. Fam Pract Manag 2013;20(3):12-17 [FREE Full text] [Medline]
- Felix HC, Seaberg B, Bursac Z, Thostenson J, Stewart MK. Why do patients keep coming back? Results of a readmitted patient survey. Soc Work Health Care 2015;54(1):1-15 [FREE Full text] [CrossRef] [Medline]
- Bennett AV, Jensen RE, Basch E. Electronic patient-reported outcome systems in oncology clinical practice. CA Cancer J Clin 2012;62(5):337-347 [FREE Full text] [CrossRef] [Medline]
- Jensen RE, Snyder CF, Abernethy AP, Basch E, Potosky AL, Roberts AC, et al. Review of electronic patient-reported outcomes systems used in cancer clinical care. J Oncol Pract 2014 Jul;10(4):e215-e222 [FREE Full text] [CrossRef] [Medline]
- Masterson Creber RM, Maurer MS, Reading M, Hiraldo G, Hickey KT, Iribarren S. Review and Analysis of Existing Mobile Phone Apps to Support Heart Failure Symptom Monitoring and Self-Care Management Using the Mobile Application Rating Scale (MARS). JMIR Mhealth Uhealth 2016 Jun 14;4(2):e74 [FREE Full text] [CrossRef] [Medline]
- Wang K, Varma DS, Prosperi M. A systematic review of the effectiveness of mobile apps for monitoring and management of mental health symptoms or disorders. J Psychiatr Res 2018 Dec;107:73-78. [CrossRef] [Medline]
- Menni C, Valdes AM, Freidin MB, Sudre CH, Nguyen LH, Drew DA, et al. Real-time tracking of self-reported symptoms to predict potential COVID-19. Nat Med 2020 Jul 11;26(7):1037-1040 [FREE Full text] [CrossRef] [Medline]
- Basch E, Deal AM, Kris MG, Scher HI, Hudis CA, Sabbatini P, et al. Symptom Monitoring With Patient-Reported Outcomes During Routine Cancer Treatment: A Randomized Controlled Trial. J Clin Oncol 2016 Feb 20;34(6):557-565 [FREE Full text] [CrossRef] [Medline]
- Denis F, Basch E, Septans A, Bennouna J, Urban T, Dueck AC, et al. Two-Year Survival Comparing Web-Based Symptom Monitoring vs Routine Surveillance Following Treatment for Lung Cancer. JAMA 2019 Jan 22;321(3):306-307 [FREE Full text] [CrossRef] [Medline]
- Basch E, Deal AM, Dueck AC, Scher HI, Kris MG, Hudis C, et al. Overall Survival Results of a Trial Assessing Patient-Reported Outcomes for Symptom Monitoring During Routine Cancer Treatment. JAMA 2017 Jul 11;318(2):197-198 [FREE Full text] [CrossRef] [Medline]
- Wolf A, Fors A, Ulin K, Thorn J, Swedberg K, Ekman I. An eHealth Diary and Symptom-Tracking Tool Combined With Person-Centered Care for Improving Self-Efficacy After a Diagnosis of Acute Coronary Syndrome: A Substudy of a Randomized Controlled Trial. J Med Internet Res 2016 Feb 23;18(2):e40 [FREE Full text] [CrossRef] [Medline]
- Gitlow L. Technology Use by Older Adults and Barriers to Using Technology. Physical & Occupational Therapy In Geriatrics 2014 Aug 12;32(3):271-280. [CrossRef]
- Vaportzis E, Clausen MG, Gow AJ. Older Adults Perceptions of Technology and Barriers to Interacting with Tablet Computers: A Focus Group Study. Front Psychol 2017 Oct 04;8:1687 [FREE Full text] [CrossRef] [Medline]
- Torous J, Friedman R, Keshavan M. Smartphone ownership and interest in mobile applications to monitor symptoms of mental health conditions. JMIR Mhealth Uhealth 2014 Jan 21;2(1):e2 [FREE Full text] [CrossRef] [Medline]
- Torous J, Chan SR, Yee-Marie Tan S, Behrens J, Mathew I, Conrad EJ, et al. Patient Smartphone Ownership and Interest in Mobile Apps to Monitor Symptoms of Mental Health Conditions: A Survey in Four Geographically Distinct Psychiatric Clinics. JMIR Ment Health 2014;1(1):e5 [FREE Full text] [CrossRef] [Medline]
- van Velsen L, van der Geest T, van de Wijngaert L, van den Berg S, Steehouder M. Personalization has a Price, Controllability is the Currency: Predictors for the Intention to use Personalized eGovernment Websites. Journal of Organizational Computing and Electronic Commerce 2015 Feb 05;25(1):76-97. [CrossRef] [Medline]
- Jansen-Kosterink S, Hurmuz M, den Ouden M, van Velsen L. Predictors to use mobile apps for monitoring COVID-19 symptoms and contact tracing: A survey among Dutch citizens. medRxiv. 2020. URL: https://www.medrxiv.org/content/10.1101/2020.06.02.20113423v1 [accessed 2022-02-14]
- Harris PA, Taylor R, Minor BL, Elliott V, Fernandez M, O'Neal L, REDCap Consortium. The REDCap consortium: Building an international community of software platform partners. J Biomed Inform 2019 Jul;95:103208 [FREE Full text] [CrossRef] [Medline]
- Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)--a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009 Apr;42(2):377-381 [FREE Full text] [CrossRef] [Medline]
- Hsieh H, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res 2005 Nov;15(9):1277-1288. [CrossRef] [Medline]
- Vaismoradi M, Turunen H, Bondas T. Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study. Nurs Health Sci 2013 Sep;15(3):398-405. [CrossRef] [Medline]
- Elo S, Kyngäs H. The qualitative content analysis process. J Adv Nurs 2008 Apr;62(1):107-115. [CrossRef] [Medline]
- Parker SJ, Jessel S, Richardson JE, Reid MC. Older adults are mobile too! Identifying the barriers and facilitators to older adults' use of mHealth for pain management. BMC Geriatr 2013 May 06;13:43 [FREE Full text] [CrossRef] [Medline]
- Hung L, Lyons JG, Wu C. Health information technology use among older adults in the United States, 2009-2018. Curr Med Res Opin 2020 May;36(5):789-797. [CrossRef] [Medline]
- Portz JD, Vehovec A, Dolansky MA, Levin JB, Bull S, Boxer R. The Development and Acceptability of a Mobile Application for Tracking Symptoms of Heart Failure Among Older Adults. Telemed J E Health 2018 Feb 12;24(2):161-165 [FREE Full text] [CrossRef] [Medline]
- Reading Turchioe M, Grossman LV, Baik D, Lee CS, Maurer MS, Goyal P, et al. Older Adults Can Successfully Monitor Symptoms Using an Inclusively Designed Mobile Application. J Am Geriatr Soc 2020 Jun 10;68(6):1313-1318 [FREE Full text] [CrossRef] [Medline]
- Askari M, Klaver NS, van Gestel TJ, van de Klundert J. Intention to use Medical Apps Among Older Adults in the Netherlands: Cross-Sectional Study. J Med Internet Res 2020 Sep 04;22(9):e18080 [FREE Full text] [CrossRef] [Medline]
- de Veer AJE, Peeters JM, Brabers AEM, Schellevis FG, Rademakers JJDJM, Francke AL. Determinants of the intention to use e-Health by community dwelling older people. BMC Health Serv Res 2015 Mar 15;15:103 [FREE Full text] [CrossRef] [Medline]
- Sohn A, Speier W, Lan E, Aoki K, Fonarow G, Ong M, et al. Assessment of Heart Failure Patients' Interest in Mobile Health Apps for Self-Care: Survey Study. JMIR Cardio 2019 Oct 29;3(2):e14332 [FREE Full text] [CrossRef] [Medline]
- Cajita MI, Hodgson NA, Budhathoki C, Han H. Intention to Use mHealth in Older Adults With Heart Failure. J Cardiovasc Nurs 2017 Feb 28;32(6):E1-E7 [FREE Full text] [CrossRef] [Medline]
- Zhang Y, Liu C, Luo S, Xie Y, Liu F, Li X, et al. Factors Influencing Patients' Intentions to Use Diabetes Management Apps Based on an Extended Unified Theory of Acceptance and Use of Technology Model: Web-Based Survey. J Med Internet Res 2019 Aug 13;21(8):e15023 [FREE Full text] [CrossRef] [Medline]
- Dennison L, Morrison L, Conway G, Yardley L. Opportunities and challenges for smartphone applications in supporting health behavior change: qualitative study. J Med Internet Res 2013 Apr 18;15(4):e86 [FREE Full text] [CrossRef] [Medline]
- Zhou L, Bao J, Watzlaf V, Parmanto B. Barriers to and Facilitators of the Use of Mobile Health Apps From a Security Perspective: Mixed-Methods Study. JMIR Mhealth Uhealth 2019 Apr 16;7(4):e11223 [FREE Full text] [CrossRef] [Medline]
- Arora S, Yttri J, Nilse W. Privacy and Security in Mobile Health (mHealth) Research. Alcohol Res 2014;36(1):143-151 [FREE Full text] [Medline]
- Morera EP, de la Torre Díez I, Garcia-Zapirain B, López-Coronado M, Arambarri J. Security Recommendations for mHealth Apps: Elaboration of a Developer's Guide. J Med Syst 2016 Jun;40(6):152. [CrossRef] [Medline]
- Atienza AA, Zarcadoolas C, Vaughon W, Hughes P, Patel V, Chou WS, et al. Consumer Attitudes and Perceptions on mHealth Privacy and Security: Findings From a Mixed-Methods Study. J Health Commun 2015;20(6):673-679. [CrossRef] [Medline]
- Veinot TC, Mitchell H, Ancker JS. Good intentions are not enough: how informatics interventions can worsen inequality. J Am Med Inform Assoc 2018 Aug 01;25(8):1080-1088 [FREE Full text] [CrossRef] [Medline]
- North F, Crane SJ, Stroebel RJ, Cha SS, Edell ES, Tulledge-Scheitel SM. Patient-generated secure messages and eVisits on a patient portal: are patients at risk? J Am Med Inform Assoc 2013;20(6):1143-1149 [FREE Full text] [CrossRef] [Medline]
- Lanham HJ, Leykum LK, Pugh JA. Examining the Complexity of Patient-Outpatient Care Team Secure Message Communication: Qualitative Analysis. J Med Internet Res 2018 Jul 11;20(7):e218 [FREE Full text] [CrossRef] [Medline]
- Shimada SL, Petrakis BA, Rothendler JA, Zirkle M, Zhao S, Feng H, et al. An analysis of patient-provider secure messaging at two Veterans Health Administration medical centers: message content and resolution through secure messaging. J Am Med Inform Assoc 2017 Sep 01;24(5):942-949 [FREE Full text] [CrossRef] [Medline]
- Ayers S, Copland C, Dunmore E. A preliminary study of negative appraisals and dysfunctional coping associated with post-traumatic stress disorder symptoms following myocardial infarction. Br J Health Psychol 2009 Sep;14(Pt 3):459-471. [CrossRef] [Medline]
- Edmondson D, Rieckmann N, Shaffer JA, Schwartz JE, Burg MM, Davidson KW, et al. Posttraumatic stress due to an acute coronary syndrome increases risk of 42-month major adverse cardiac events and all-cause mortality. J Psychiatr Res 2011 Dec;45(12):1621-1626 [FREE Full text] [CrossRef] [Medline]
- Kwok CS, Wong CW, Shufflebotham H, Brindley L, Fatima T, Shufflebotham A, et al. Early Readmissions After Acute Myocardial Infarction. Am J Cardiol 2017 Sep 01;120(5):723-728. [CrossRef] [Medline]
- Fisk A, Czaja S, Rogers W, Charness N, Sharit J. Designing for Older Adults: Principles and Creative Human Factors Approaches. Boca Raton: CRC Press; 2019.
- Farage MA, Miller KW, Ajayi F, Hutchins D. Design principles to accommodate older adults. Glob J Health Sci 2012 Feb 29;4(2):2-25 [FREE Full text] [CrossRef] [Medline]
- Mansson L, Wiklund M, Öhberg F, Danielsson K, Sandlund M. Co-Creation with Older Adults to Improve User-Experience of a Smartphone Self-Test Application to Assess Balance Function. Int J Environ Res Public Health 2020 May 26;17(11):3768 [FREE Full text] [CrossRef] [Medline]
- Mannheim I, Schwartz E, Xi W, Buttigieg SC, McDonnell-Naughton M, Wouters EJM, et al. Inclusion of Older Adults in the Research and Design of Digital Technology. Int J Environ Res Public Health 2019 Oct 02;16(19):3718 [FREE Full text] [CrossRef] [Medline]
- Mishuris RG, Yoder J, Wilson D, Mann D. Integrating data from an online diabetes prevention program into an electronic health record and clinical workflow, a design phase usability study. BMC Med Inform Decis Mak 2016 Jul 11;16:88 [FREE Full text] [CrossRef] [Medline]
- Borycki E, Kushniruk A, Nohr C, Takeda H, Kuwata S, Carvalho C, et al. Usability Methods for Ensuring Health Information Technology Safety: Evidence-Based Approaches. Contribution of the IMIA Working Group Health Informatics for Patient Safety. Yearb Med Inform 2013;8:20-27. [Medline]
Abbreviations
ACS: acute coronary syndrome
EHR: electronic health record
Edited by A Kushniruk; submitted 24.10.21; peer-reviewed by M Stemmer, H Gandhi; comments to author 05.12.21; revised version received 17.12.21; accepted 19.12.21; published 07.03.22
Copyright©Jinying Chen, Jessica G Wijesundara, Gabrielle E Enyim, Lisa M Lombardini, Ben S Gerber, Thomas K Houston, Rajani S Sadasivam. Originally published in JMIR Human Factors (https://humanfactors.jmir.org), 07.03.2022.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Human Factors, is properly cited. The complete bibliographic information, a link to the original publication on https://humanfactors.jmir.org, as well as this copyright and license information must be included.