Published in Vol 9, No 1 (2022): Jan-Mar

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/23794.
User-Centered Design to Enhance mHealth Systems for Individuals With Dexterity Impairments: Accessibility and Usability Study

Original Paper

1Department of Physical Medicine and Rehabilitation, University of Pittsburgh School of Medicine, Pittsburgh, PA, United States

2Department of Health Information Management, School of Health and Rehabilitation Sciences, University of Pittsburgh, Pittsburgh, PA, United States

3Human Engineering Research Laboratories, Department of Veterans Affairs, Veterans Affairs Pittsburgh Healthcare System, Pittsburgh, PA, United States

4Occupational Therapy Doctorate Program, Johnson & Wales University, Providence, RI, United States

Corresponding Author:

Bambang Parmanto, PhD

Department of Health Information Management

School of Health and Rehabilitation Sciences

University of Pittsburgh

6052 Forbes Tower

Pittsburgh, PA, 15260

United States

Phone: 1 412 383 6649

Email: parmanto@pitt.edu


Background: Mobile health systems have been shown to be useful in supporting self-management by promoting adherence to schedules and longitudinal health interventions, especially in people with disabilities. The Interactive Mobile Health and Rehabilitation (iMHere) system was developed to empower people with disabilities and those with chronic conditions with supports needed for self-management and independent living. Since the first iteration of the iMHere 1.0 app, several studies have evaluated the accessibility and usability of the system. Potential opportunities to improve and simplify the user interface were identified, and the iMHere modules were redesigned accordingly.

Objective: In this study, we aim to evaluate the usability of the redesigned modules within the iMHere 1.0 app.

Methods: We evaluated the original and redesigned iMHere modules—MyMeds and SkinCare. The Purdue Pegboard Test was administered to assess the participants’ dexterity levels. Participants were then asked to perform a set of tasks using both the original and redesigned MyMeds and SkinCare modules to assess their efficiency and effectiveness. Usability was measured using the Telehealth Usability Questionnaire to evaluate 10 new accessibility features that were added to the redesigned app. Participants were also asked which version they preferred.

Results: In total, 24 participants with disabilities and varying degrees of dexterity impairments completed the entire study protocol. Participants displayed improved efficiency and effectiveness when using the redesigned modules compared with the original modules. The participants also reported improved usability and preferred the redesigned modules.

Conclusions: This study demonstrated that the iMHere system became more efficient, effective, and usable for individuals with dexterity impairments after redesigning it according to user-centered principles.

JMIR Hum Factors 2022;9(1):e23794

doi:10.2196/23794

Background

The advent of smartphones has transcended the mobile phone’s original purpose—the ability to make phone calls anywhere. Notably, smartphones have radically altered the way people communicate with friends and family, coordinate daily activities, and organize their lives. At the most fundamental level, smartphone users expect their devices to provide an immediate and reliable means of managing their everyday lives [1,2].

One of the most significant emerging trends in the health-related use of smartphones is the proliferation of mobile health (mHealth) apps. These apps can be implemented in a variety of settings, with many focusing on monitoring, managing, and supporting health-related behavior changes [3]. One of the most common types of health-related apps focuses on the management of chronic conditions, such as obesity, chronic pain, and type 2 diabetes mellitus, through patient empowerment [4-6].

People with disabilities, however, are one of the largest populations facing health issues that limit their function and participation. The World Health Organization estimates that over 1 billion people, about 15% of the world’s population, live with some form of disability [7]. As the population continues to age, the rate of disability continues to rise, in part owing to chronic conditions and the effects of aging itself. Many people with disabilities also have limited access to health care.

Given the high degree of portability and adaptability, mHealth can facilitate self-management and community integration by providing support when the user is between medical visits or in any location, including outside the home. These features may be particularly useful in supporting people with disabilities, who often have limited access to health care and community-based resources to support independent living. The support provided by mHealth may allow users to address secondary complications, which are not always addressed adequately in the outpatient setting, thereby reducing the cost of care [8-10]. Strong evidence supports the importance of using tools to promote self-management skills to improve the health outcomes and independence of people with disabilities [11,12].

Despite the need for mHealth tools to support self-management, a Pew Research Center survey in 2016 found that 65% of people with disabilities have low confidence in their ability to use the internet and other communication devices to keep up with information [13]. This is further compounded by a general lack of awareness of the accessibility features of apps and the skills to use mobile devices optimally [14]. In addition, many mainstream mHealth apps are not designed to address usability or accessibility [15].

The Interactive Mobile Health and Rehabilitation (iMHere) system is an mHealth system that was developed to empower people with disabilities and those with chronic conditions with the skills needed for self-management and independent living [16]. The iMHere 1.0 system originally consisted of a smartphone app for people with disabilities and a web-based portal for clinicians, bridged by a 2-way communication protocol (Figure 1).

Figure 1. Interactive Mobile Health and Rehabilitation platform—MyMeds and SkinCare modules as seen by user.

The iMHere 1.0 app comprised a suite of 5 modules to support medication management (MyMeds), skin integrity (SkinCare), bowel management, bladder self-catheterization, and mental health. People with disabilities could use this suite of modules to report compliance with treatment regimens, ask questions, and receive personalized treatment plans, educational materials, and messages from the clinician. On the clinician side, a web-based monitoring portal allowed clinicians to engage with patients and track their adherence to a specific and individualized treatment plan. By accessing the iMHere portal, clinicians were able to monitor patients’ adherence to self-management activities, view reported problems and issues, and send personalized treatment plans to patients [16].

Given the vast health care implications of using mHealth solutions in people with disabilities, usability testing of mHealth apps is needed. Usability testing has been widely used in the people with disabilities population to test mobile self-management programs. Payne et al [17] demonstrated that usability testing of a web-based e-counseling platform to promote behavioral self-management in patients with chronic heart failure had favorable outcomes in improving the navigation of the website. Williams et al [18] also assessed the usability of a pediatric cardiovascular disease risk factor tool, which yielded revisions through tester feedback to make the mobile app more user-friendly. Thirumalai et al [19] evaluated the development process of a telehealth app used by people with multiple sclerosis through a usability study, which incorporated revisions into the final app. These previous works highlight the importance of usability testing, as it can help identify issues specific to the people with disabilities population that may not have been addressed by program developers in the first iteration of the mHealth solution.

We conducted extensive user acceptance and usability testing of the iMHere system. In the past, 3 studies on the accessibility of iMHere 1.0 have been conducted. In the first study, by Yu et al [20], the iMHere 1.0 system was tested for usability: the modules were tested for self-management workflow, user interface and navigation, and patient-clinician communication. All participants were interested in using the phone app daily, and the MyMeds and SkinCare modules were used frequently by all users, as demonstrated by consistent use of the app during the 6-month intervention period. The clinical portal allowed clinicians to continually monitor patients’ conditions and take appropriate steps to prevent secondary complications [20].

In a subsequent study by Yu et al [21], the iMHere 1.0 app was tested for accessibility in 6 participants with spina bifida (SB). The study specifically explored participants’ experiences with the user interface and the navigation of the modules. All 6 participants viewed the modules positively with regard to their support for self-management activities, as indicated by Telehealth Usability Questionnaire (TUQ) scores (6.52/7 points, 93%). This was reinforced by efficient performance: task completion times and error rates decreased over repeated trials. The study also identified a few avenues for improving accessibility, including the need for changes to accommodate users with dexterity impairments.

In a third study, Yu et al [22] explored the accessibility needs and preferences of iMHere users with various disabilities that lead to dexterity impairments. Participants completed 5 tasks, and difficulty-on-performance (DP) scores were calculated. As expected, participants with a higher degree of dexterity impairment had more problems completing the tasks. A few potential issues and barriers were identified, including the need for a consistent user interface design, instructive guidance, and simpler cognitive processes in the use of the app.

Objectives

The modules within iMHere 1.0 were redesigned based on these prior studies. The aim of this study is to evaluate the usability of redesigned modules within iMHere 1.0. Hypothesis 1 was that usability (as defined by efficiency and effectiveness) would be higher when completing tasks in the redesigned modules compared with the original modules. Hypothesis 2 was that usability (as measured by the TUQ, which evaluates learnability and satisfaction) would be higher in the redesigned modules, compared with the original modules [23].


Overview

Modules

This study was designed to evaluate the usability of two modules of the iMHere system: the original and redesigned versions of MyMeds and SkinCare. These modules were specifically selected because of the high rates of medication use and pressure ulcers in the people with disabilities population. Medication mismanagement and inadequate care of pressure injuries are the causes of high rates of hospitalization in the people with disabilities population and significantly increase morbidity [24-26]. These modules were also the most complex iMHere modules in terms of functionality.

MyMeds Module

The MyMeds module helps users manage their medications by providing reminders and monitoring medication adherence. Persons with conditions such as SB and spinal cord injuries (SCIs), for example, are frequently prescribed several medications to manage neurogenic bowel, neurogenic bladder, spasticity, pain, and depression. Taking multiple medications several times per day while consistently following other complex self-management routines can be particularly challenging. The MyMeds module helps patients adhere to their medication regimens by providing reminders or cues, keeping track of all their medications and medication schedules (including medications currently prescribed or prescribed in the past), and reporting if and when medications have been taken.

SkinCare Module

The SkinCare module reminds users to perform routine skin inspections, enables users to take pictures and track any wound or skin condition that has developed, and provides the ability to communicate with clinicians about how to care for these problems. For people with SB or SCI, for example, constant vigilance is needed to prevent pressure injuries, particularly in the lower limbs and buttocks, where sensation may be impaired. Pressure injuries are detrimental to the patient owing to increased mortality and longer intensive care unit and hospital stays, and they also present a significant health care burden given the increased costs and health care use following discharge [27-29].

Study Design

This study was approved by the Institutional Review Board of the University of Pittsburgh (PRO12090453). All participants provided informed consent for participation. The participants were recruited from local outpatient rehabilitation medicine clinics. A sample size calculation was performed using the Wilcoxon signed-rank test (2-sided). A sample size of 14 achieved 91% power to detect a mean of paired differences of 1.0, with an estimated SD of paired differences of 1.0, with a significance level (α) of .05.
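As a rough illustration (not the authors' actual calculation), the reported power can be approximated in Python by treating the Wilcoxon signed-rank test as a paired t-test discounted by its asymptotic relative efficiency; the function name and the ARE constant are our assumptions.

```python
from scipy import stats

def wilcoxon_power_approx(n, mean_diff=1.0, sd_diff=1.0, alpha=0.05, are=0.955):
    """Approximate the power of a two-sided Wilcoxon signed-rank test via the
    noncentral t distribution of a paired t-test, with the sample size
    discounted by the Wilcoxon test's asymptotic relative efficiency
    (~0.955 vs the t-test under normality)."""
    n_eff = n * are                               # effective sample size
    df = n_eff - 1
    nc = (mean_diff / sd_diff) * n_eff ** 0.5     # noncentrality parameter
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    # P(reject H0) under the noncentral t, counting both rejection tails
    return (1 - stats.nct.cdf(t_crit, df, nc)) + stats.nct.cdf(-t_crit, df, nc)

power = wilcoxon_power_approx(14)   # roughly 0.9, in line with the reported 91%
```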

The inclusion criteria were as follows: users must be between the ages of 18 and 64 years, have fine motor dexterity impairments, have potential for skin breakdown (defined by diagnosis or lack of sensation), and use at least one (prescription or nonprescription) medication. The exclusion criteria were as follows: users with vision, hearing, or speech limitations that entirely precluded the use of a smartphone. Individuals were not excluded if they had used iMHere in a prior study, but a 4-month washout period was used to mitigate learning effects.

Usability was defined according to the usability attributes by Nielsen [30]. The Nielsen model of usability was selected as a framework for this study, as it is multifaceted in its approach to the many dimensions of usability. We examined the usability constructs of efficiency and effectiveness (including errors; hypothesis 1) by assessing task time and errors made. We also used a validated usability survey (TUQ) to measure the usability constructs of learnability and satisfaction (hypothesis 2). We also evaluated user preferences. This design has been used in prior research [31-34]. Participants were first randomly assigned to use either the original or redesigned modules. Participants were then crossed over and provided with alternate modules, such that each participant served as his or her own matched control. As such, we elected not to test memorability in this study, as testing of memorability would confound our washout period between testing of the original and redesigned modules. Data were collected either in the laboratory or at the site of the participant's choosing (ie, home or office).

Demographics, Training, and Dexterity

A background questionnaire was administered to collect the participants’ demographic data, previous experience with mobile phones, and knowledge of mHealth technologies.

A face-to-face orientation and training session (approximately 15 minutes) was conducted to introduce the MyMeds and SkinCare modules. Participants were trained to perform the tasks for each of the modules using a trial medication bottle and a mock skin problem image. Participants were scheduled to complete the second set of modules after a 3-week period. This crossover period served as the washout period to minimize the learning effects.

To assess the participants’ dexterity levels, the Purdue Pegboard Test (PPBT) was administered to measure the movements of a person’s fingers, hands, and arms [35-39]. The PPBT was initially developed by Joseph Tiffin in 1948 to test the manual dexterity of those seeking employment in industrial jobs, such as assembly-line factory work. Although few people now hold such jobs, technological advances have created new demands for fine dexterity, such as typing on a computer keyboard or messaging on a cell phone. Despite these cultural shifts, the PPBT has been shown to be valid and reliable for wrist and hand disorders and has been adapted for testing dexterity in the clinical setting [40,41].

The PPBT consists of 3 tests of 30 seconds each, performed with the right hand, the left hand, and both hands. In each test, participants were asked to pick up pins, collars, or washers from the top of the pegboard and place them in the peg holes as rapidly as possible within 30 seconds. The score for each test was the total number of pins, collars, or washers placed in the holes correctly. A composite score was calculated by summing the scores from these 3 tests, yielding the right+left+both hands score, which represents a participant’s overall dexterity level. Lower right+left+both hands scores indicate a higher degree of dexterity impairment. On the basis of their right+left+both hands scores, participants were categorized into the following 3 groups reflecting their dexterity levels:

  • Group 1: users with mild dexterity issues as defined by PPBT scores for the right+left+both hands scores ranging from 1 SD to 3 SD below the generic mean of factory workers.
  • Group 2: users with moderate dexterity issues as defined by right+left+both hands scores >3 SD below the generic mean of factory workers.
  • Group 3: users with severe dexterity issues as defined by the inability to perform the PPBT (right+left+both hands score=0).
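The grouping rule above can be sketched as a small function; the function name is hypothetical, and the factory-worker norms (mean 46.8, SD 4) are the reference values reported later in this paper.

```python
def dexterity_group(rlb_score, norm_mean=46.8, norm_sd=4.0):
    """Categorize a PPBT right+left+both hands composite score into the
    study's 3 dexterity groups. All study participants scored at least
    1 SD below the factory-worker norm, so scores above that threshold
    are not handled here."""
    if rlb_score == 0:
        return 3                              # severe: unable to perform the PPBT
    if rlb_score > norm_mean - 3 * norm_sd:   # between 1 SD and 3 SD below the mean
        return 1                              # mild
    return 2                                  # more than 3 SD below the mean: moderate
```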

Efficiency and Effectiveness

Participants were then asked to perform a set of tasks using both the original and redesigned MyMeds and SkinCare modules. The think-aloud method for product design and development was used to gain comprehensive knowledge of participants’ experiences, including any frustration they experienced [42]. Specifically, participants were asked to verbally describe their intentions and actions to the researcher as they performed the following tasks:

  • Task 1: schedule a new medication—participants were asked to locate the correct medication, add information about their regimen, and set up a reminder.
  • Task 2: modify a medication reminder—participants were asked to change the alert time for medication.
  • Task 3: respond to a medication alert—participants were asked to indicate whether a medication was taken.
  • Task 4: set up a schedule to check the skin—participants were asked to set a daily alert to conduct a skin evaluation.
  • Task 5: modify an alert for skin check—participants were asked to change the alert time for the scheduled skin evaluation.
  • Task 6: report a skin issue—participants were asked to identify a skin issue, and then take a picture and enter data into predefined fields within the module, describing the affected skin region, including location, color, size, depth, and tissue condition.
  • Task 7: update or track changes in an existing skin issue—participants were asked to reassess previously identified skin issues and track changes by taking pictures and filling out a form describing the affected skin region, including location, color, size, depth, and tissue condition.
  • Task 8: set personalized configurations for user interface presentations—participants were asked to record a preferred module list, background, text size, and target size to optimize interactions. This task was conducted only for the redesigned module.

Task 8 was performed only when a participant was testing the redesigned modules.

The following variables were collected:

  • Efficiency
    • Average task time: the time in seconds for a participant to complete each task was measured and then averaged across all 8 tasks.
  • Effectiveness
    • Number of steps in each task: number of actions taken by the participant to complete a given task.
    • Number of mistakes in each task: when a participant reported a problem finishing a task, it was counted as a mistake. Mistakes were tallied for each task.
    • Error rate: the sum of mistakes divided by the total number of steps required to complete a task.
    • Mistake recovery: the ability of participants to correct mistakes. Step-by-step observation notes were used to record the status of each mistake recovery and to describe the DP experienced by a participant during recovery. The DP score was calculated as the sum of the weighted scores below; a lower DP score indicated better and easier performance on the task.
      1. The participant solved the problem without any help.
      2. The participant needed help solving the problem, addressed in one sentence.
      3. The participant needed help solving the problem, addressed in 2-4 sentences.
      4. The participant did not solve the problem.
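A minimal sketch of how the error rate and DP score defined above could be computed (the function names are our own, not from the study):

```python
def error_rate(mistakes, total_steps):
    """Error rate: sum of mistakes divided by the total number of steps
    required to complete a task."""
    return mistakes / total_steps

def dp_score(recovery_weights):
    """Difficulty-on-performance: sum of the per-mistake recovery weights
    (1 = solved alone, 2 = one-sentence hint, 3 = 2-4 sentence hint,
    4 = did not solve). Lower totals indicate easier recovery."""
    assert all(w in (1, 2, 3, 4) for w in recovery_weights)
    return sum(recovery_weights)
```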

Learnability and Satisfaction

Overview

Usability was measured using the TUQ (Table 1). The TUQ measures constructs of usability, such as learnability and satisfaction. Learnability, as defined by Nielsen [30], assesses how easily users can accomplish a task the first time they encounter the interface and how many repetitions it takes for them to become efficient at that task. The TUQ has been shown to have high validity, reliability, and internal consistency [23]. It provides a more comprehensive evaluation of telehealth tools because it combines existing instruments from telemedicine (such as the Telemedicine Satisfaction Questionnaire) and computer software interfaces (such as the Technology Acceptance Model and the IBM Post-Study System Usability Questionnaire). Participants were asked to rate the extent to which they agreed with 21 statements using a scale from 1 to 7 (minimum score 21; maximum score 147). Statements are grouped into 6 domains: usefulness, ease of use and learnability, interface quality, interaction quality, reliability, and satisfaction and future use. Average TUQ scores were calculated for each of the 6 domains and overall. A higher score reflects higher usability.
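The TUQ scoring just described can be sketched as follows; the per-domain item counts mirror the 21 items in Table 1, while the function and dictionary names are hypothetical.

```python
# Item counts per domain, in questionnaire order (3+3+4+4+3+4 = 21 items)
TUQ_DOMAINS = {
    "usefulness": 3,
    "ease_of_use_and_learnability": 3,
    "interface_quality": 4,
    "interaction_quality": 4,
    "reliability": 3,
    "satisfaction_and_future_use": 4,
}

def tuq_scores(responses):
    """Average the 21 ratings (each 1-7) per domain and overall."""
    assert len(responses) == 21 and all(1 <= r <= 7 for r in responses)
    scores, i = {}, 0
    for domain, n_items in TUQ_DOMAINS.items():
        scores[domain] = sum(responses[i:i + n_items]) / n_items
        i += n_items
    scores["overall"] = sum(responses) / 21
    return scores
```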

Table 1. Telehealth Usability Questionnaire items.

Usefulness
  1. Telehealth improves my access to health care services
  2. Telehealth saves me time traveling to a hospital or specialist clinic
  3. Telehealth provides for my health care needs

Ease of use and learnability
  1. It was simple to use this system
  2. It was easy to learn to use the system
  3. I believe I could become productive quickly using this system

Interface quality
  1. The way I interact with this system is pleasant
  2. I like using the system
  3. The system is simple and easy to understand
  4. This system is able to do everything I would want it to be able to do

Interaction quality
  1. I could easily talk to the clinician using the telehealth system
  2. I could hear the clinician clearly using the telehealth system
  3. I felt I was able to express myself effectively
  4. Using the telehealth system, I could see the clinician as well as if we met in person

Reliability
  1. I think the visits provided over the telehealth system are the same as in-person visits
  2. Whenever I made a mistake using the system, I could recover easily and quickly
  3. The system gave error messages that clearly told me how to fix problems

Satisfaction and future use
  1. I feel comfortable communicating with the clinician using the telehealth system
  2. Telehealth is an acceptable way to receive health care services
  3. I would use telehealth services again
  4. Overall, I am satisfied with this telehealth system
User Preferences

We measured user preferences by asking each participant whether they preferred the original or redesigned modules and the reasons for those preferences.

Accessibility

The following 10 accessibility features were demonstrated to participants in the redesigned app as part of the training during the study:

  1. Customized module list: this feature provides the user with the ability to customize their app by hiding or showing a selected module from the home screen. The participants were able to personalize their home screens with the modules that were most applicable to them.
  2. Customized text display: this feature allows the user to set up a reading size that is comfortable for them in the redesigned modules. The size, color, bold, and italic versions of titles, text, attention text, and warning text were predefined in the iMHere modules relative to the settings of the display text.
  3. Customized theme: this feature allows the user to select their preferred background and text color.
  4. Customized button size: the customized button size was created after a user pressed an index finger on the screen to record his or her fingertip size. The smartphone then adapted the button or icon size accordingly for all iMHere modules. Given the dexterity impairments in the study population, this feature improved accuracy in making selections using a customized button target size.
  5. Customized keyboard: the iMHere app provided a customized keyboard with softer keys, larger key sizes, and preconfigured characters. Customized keyboards were used primarily in the MyMeds module, where users could easily enter medication dosage information. When using the customized keypad to enter 2 tablets, for instance, only 2 touches were needed: “2” and “tablet.” This customized keypad was designed to reduce the number of touches required on the smartphone screen.
  6. Ability to take pictures of a pill or bottle: using this feature, users could take a photo of a pill or medication bottle and upload it into his or her medication schedule.
  7. Color-coding: this feature matched the color with the module name. For instance, the title for the SkinCare module on the home page was highlighted in red. When navigating through the SkinCare modules, all screens under the module had a red bar.
  8. Navigational short cut: this feature allowed users to create personalized settings for the home screen, such as a list of modules.
  9. Text guidance: the modules provided short text cues with self-training instructional notes on the screen and were highlighted in a particular color.
  10. Voice guidance: the modules used text-to-speech technology, which allowed users to listen to text guidance as audio output.

We asked participants to rank the importance of each accessibility feature, using a 10-point Likert scale (1 indicated that this feature was the most important and 10 indicated that this feature was the least important). The average ranking was then calculated for each accessibility feature.
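The averaging step above can be sketched as follows, assuming each participant's rankings are stored as a dictionary keyed by feature name (all names here are hypothetical):

```python
def average_rankings(all_rankings):
    """Average each accessibility feature's rank across participants.
    all_rankings: list of dicts mapping feature name -> rank, where
    1 = most important and 10 = least important. Lower averages
    therefore indicate more important features."""
    features = all_rankings[0].keys()
    n = len(all_rankings)
    return {f: sum(r[f] for r in all_rankings) / n for f in features}
```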

Statistical Analysis

Descriptive statistics were calculated for the demographic and usability variables, including PPBT scores.

The α level was set at .05. All statistical analyses regarding hypotheses 1 and 2 were carried out using Wilcoxon signed-rank tests. To test the first hypothesis, the original and redesigned modules were compared in terms of efficiency (average task time) and effectiveness (number of steps, number of mistakes, error rate, and mistake recovery). As some experienced users were recruited, a secondary analysis using the Mann-Whitney U test was used to explore whether differences in average task time for the original and redesigned modules between experienced and inexperienced users could be because of a learning effect not mitigated by the washout period. To test the second hypothesis, the original and redesigned modules were compared in terms of usability (average overall TUQ and average TUQ domain scores).
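These analyses map directly onto SciPy's nonparametric tests. A sketch with made-up task times (not study data):

```python
from scipy.stats import wilcoxon, mannwhitneyu

# Hypothetical paired average task times (seconds), one value per task
original   = [110, 42, 5, 27, 22, 80, 55]
redesigned = [70, 25, 4, 17, 16, 49, 39]

# Hypothesis 1: paired comparison of the same participants on both versions
# All 7 differences favor the redesign, so the statistic is 0 and P ~ .016
w_stat, w_p = wilcoxon(original, redesigned)

# Secondary analysis: experienced vs inexperienced (independent groups)
experienced   = [49.0, 51.2, 47.5]
inexperienced = [48.3, 46.9, 50.1, 47.0]
u_stat, u_p = mannwhitneyu(experienced, inexperienced, alternative="two-sided")
```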


Overview

A total of 28 participants were recruited for this study; 2 (7%) participants were excluded based on the exclusion criteria: 1 (4%) user was blind, and 1 (4%) user had both vision and dexterity impairments that precluded the use of a smartphone. Moreover, 4% (1/28) of participants was not able to complete the entire protocol because of severe dexterity impairments, as assessed by PPBT scores. In addition, 4% (1/28) of participants dropped out because of scheduling conflicts. Therefore, in total, 24 participants (n=8, 33% females and n=16, 67% males) completed the entire study protocol.

Demographics and Dexterity

The demographics of the participants are presented in Table 2. Participants’ ages ranged from 18 to 64 years, with an average age of 28 years (SD 6.3 years). Of the 24 participants, 14 (58%) had SB, 5 (21%) had SCI, 3 (13%) had cerebral palsy, 1 (4%) had muscular dystrophy, and 1 (4%) had cerebellar ataxia. Of the 24 participants, 22 (92%) were right-hand dominant, 21 (88%) were smartphone users, 2 (8%) were regular mobile phone users, and 1 (4%) did not use any mobile device; 12 (50%) participants had used a mobile phone for <2 years, and 20 (83%) used a smartphone for at least 60 minutes per day. In addition, 21% (5/24) of participants had finished graduate-level education, while 71% (17/24) had received a high school or equivalent education.

Table 2. Participant demographics (N=24).
Demographic detailsValues
Age (years), mean (SD)28 (6.3)
Gender, n (%)

Male15 (63)

Female9 (38)
Highest level of education, n (%)

High school17 (71)

Higher education5 (21)
Disability, n (%)

Spina bifida14 (58)

Spinal cord injury5 (21)

Cerebral palsy3 (13)

Muscular dystrophy1 (4)

Cerebellar ataxia1 (4)
Type of phone, n (%)

Regular2 (8)

Smart21 (88)

N/Aa1 (4)
Years of use, n (%)

<212 (50)

>211 (46)

N/Aa1 (4)
Daily use, n (%)

<60 min/day3 (13)

>60 min/day20 (83)

N/Aa1 (4)

aN/A: not applicable.

Of the 24 participants, 7 (29%) participants had previously used the iMHere modules (experienced), and 17 (71%) participants had not previously used any iMHere modules (inexperienced). The experienced participants had stopped using iMHere for at least 4 months before participating in this study, a washout period during which we expected participants would not carry over significant learning from previous experience. Of the 7 experienced users, 4 (57%) remembered approximately 5% of the process to complete the tasks in the original modules and approximately 10% of the process in the redesigned modules. Furthermore, 43% (3/7) of participants had no recollection of how to use the modules.

All participants’ PPBT scores (right+left+both hands) were at least 1 SD below the generic mean (46.8, SD 4) of factory workers (Multimedia Appendix 1). There were 8 participants in group 1, 12 participants in group 2, and 5 participants in group 3.

Efficiency: Average Task Time

Table 3 shows the average time in seconds for all participants to complete tasks 1-7 using the original and redesigned modules. The average time for the 24 participants to complete tasks 1-7 in the original modules was approximately 48 seconds. This time dropped by 35% to 31 seconds when completing the tasks using the redesigned modules. Participants’ speed in completing tasks 1, 2, 4, and 6 improved by >30% when comparing the redesigned modules with the original modules. A significant difference was found in the average task time for all tasks, except task 3, when comparing the original with the redesigned modules. Overall, a Wilcoxon signed-rank test showed that the total average task time for each participant was significantly different between the original and the redesigned modules (W=0.0; Z=−4.3; P<.001).

Table 3. Comparison of the average task time for all participants.
Tasks | Original modules (task time in seconds), mean (SD) | Redesigned modules (task time in seconds), mean (SD) | Time difference, seconds (%) | W value | Z value | P value
Task 1: schedule a medication alert | 110.5 (36.5) | 68.9 (23.1) | −41.7 (−37.7) | 3 | −4.2 | <.001
Task 2: modify a medication alert | 39.6 (15.2) | 25.1 (11.1) | −14.5 (−36.5) | 24 | −3.6 | <.001
Task 3: respond to a medication alert | 4.2 (3.1) | 4.3 (2.9) | 0.1 (1.8) | 144 | −0.2 | .85
Task 4: schedule a skin check | 25.3 (11.2) | 16.7 (6.6) | −8.5 (−33.7) | 17 | −3.8 | <.001
Task 5: modify a skincare alert | 21.8 (9.4) | 16.5 (9.5) | −5.3 (−24.4) | 56 | −2.7 | .007
Task 6: report a new skin problem | 81.2 (17.8) | 48.5 (12.0) | −32.7 (−40.2) | 1 | −4.3 | <.001
Task 7: track the changes of a skin issue | 56.0 (15.2) | 38.8 (11.0) | −17.2 (−30.6) | 9 | −4.0 | <.001

The average time in seconds to complete tasks using the original and the redesigned modules for the 29% (7/24) experienced participants and the 71% (17/24) inexperienced participants is shown in Table 4. A secondary analysis revealed no significant difference in average task time between the experienced (n=7; mean 49.0, SD 36.6) and inexperienced participants (n=17; mean 48.0, SD 37.4) when using the original modules (U=59; Z=−0.03; P=.98), or between the experienced (n=7; mean 31.6, SD 23.8) and inexperienced participants (n=17; mean 31.1, SD 21.7) when using the redesigned modules (U=59, Z=−0.03; P=.98).
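The between-group comparison above uses the Mann-Whitney U statistic. As a hedged illustration (a pure-Python sketch with made-up numbers, not the authors' code; in practice a library routine such as SciPy's `mannwhitneyu` would be used):

```python
def mann_whitney_u(group_a, group_b):
    """Mann-Whitney U statistic for two independent samples.

    Counts, over all cross-group pairs, how often a value from
    group_a exceeds one from group_b (ties count 0.5), and returns
    the smaller of U and its complement n_a*n_b - U.
    """
    u = sum(
        1.0 if a > b else 0.5 if a == b else 0.0
        for a in group_a
        for b in group_b
    )
    return min(u, len(group_a) * len(group_b) - u)

# Hypothetical average task times for two heavily overlapping groups:
# U near its maximum possible value (n_a*n_b/2) indicates no
# separation, the pattern behind the nonsignificant U=59 reported
# for experienced vs inexperienced participants.
print(mann_whitney_u([48, 50, 47], [49, 46, 51]))  # -> 4.0
```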

Table 4. Experienced versus inexperienced: average task time for all participants.
Tasks | Original modules (seconds): experienced / inexperienced, mean (SD) | Redesigned modules (seconds): experienced / inexperienced, mean (SD)
Task 1: schedule a medication alert | 109.0 (49.2) / 111.2 (31.8) | 74.1 (36.5) / 66.7 (15.8)
Task 2: modify a medication alert | 46.0 (19.5) / 37.0 (12.9) | 21.4 (7.7) / 26.7 (12.1)
Task 3: respond to a medication alert | 4.2 (1.6) / 4.2 (3.6) | 3.9 (1.2) / 4.5 (3.3)
Task 4: schedule a skin check | 25.4 (15.8) / 25.2 (9.4) | 16.5 (7.1) / 16.8 (6.6)
Task 5: modify a skincare alert | 21.2 (10.1) / 22.0 (9.5) | 18.4 (13.1) / 15.7 (8.0)
Task 6: report a new skin problem | 81.1 (18.4) / 81.2 (18.1) | 47.6 (12.7) / 48.9 (12.0)
Task 7: track the changes in skin issues | 58.7 (18.4) / 54.9 (17.1) | 39.2 (9.2) / 38.7 (12.0)

As shown in Table 5, participants with severe dexterity impairments (group 3) required approximately 55 seconds on average to complete the tasks using the original modules. With the redesigned modules, this time dropped by approximately 19 seconds (35%), the largest absolute improvement among the 3 groups. Participants with mild and moderate dexterity impairments (groups 1 and 2) completed the tasks approximately 29% and 37% faster, respectively, with the redesigned modules.

Table 5. Group comparison of the average task time.
Groups | Original modules (task time in seconds), mean (SD) | Redesigned modules (task time in seconds), mean (SD) | Time difference, seconds (%)
Group 1 | 44.6 (8.0) | 31.7 (5.7) | −12.8 (−28.8)
Group 2 | 47.9 (11.4) | 30.2 (6.5) | −17.7 (−37.0)
Group 3 | 54.9 (14.1) | 35.8 (10.8) | −19.1 (−34.8)

The activities in task 8 for configuring personalized settings include choosing preferred modules, changing the background and text color, changing the text display size, and choosing the button or target size. Participants required approximately 36 seconds (SD 9.0 seconds) to complete this task. Specifically, participants with mild dexterity issues (group 1) spent 32.8 seconds (SD 7.07 seconds), participants with moderate dexterity issues (group 2) spent 34.4 seconds (SD 9.98 seconds), and those with severe dexterity issues (group 3) spent 42.2 seconds (SD 6.67 seconds) to complete this task.

Effectiveness

Overview

Table 6 shows the total number of steps to complete the tasks, the total number of mistakes committed, the calculated error rate, and the total DP scores recorded for participants completing tasks 1-7 using the original and redesigned modules.

Table 6. Comparison of total steps, mistakes, and error rate.
Tasks | Original modules: steps / mistakes / error rate (%) / DPa | Redesigned modules: steps / mistakes / error rate (%) / DPa
Task 1: schedule a new medication | 360 / 32 / 9.3 / 69 | 264 / 4 / 1.5 / 8
Task 2: modify a medication alert | 192 / 21 / 10.9 / 41 | 144 / 2 / 1.4 / 4
Task 3: respond to a medication alert | 24 / 0 / 0 / 0 | 24 / 0 / 0 / 0
Task 4: schedule a skin check | 144 / 5 / 2.9 / 9 | 120 / 0 / 0 / 0
Task 5: modify skin check alert | 168 / 6 / 3.1 / 12 | 120 / 3 / 2.5 / 5
Task 6: report new skin problem | 480 / 13 / 2.6 / 21 | 312 / 4 / 1.3 / 8
Task 7: update the existing skin problem | 264 / 16 / 5.9 / 36 | 192 / 3 / 1.6 / 5
Total | 1632 / 93 / 5.7 / 188 | 1176 / 16 / 1.4 / 30

aDP: difficulty-on-performance.

Number of Steps

Figure 2 shows the average number of steps required by each participant to complete tasks 1-7 using the original and redesigned modules. On average, 68 steps (15+8+1+6+7+20+11) were required to complete tasks 1-7 using the original modules. This number dropped by approximately 28% to 49 steps (11+6+1+5+5+13+8) for the redesigned modules. In both versions, tasks 1 and 6 required the greatest number of steps. A statistically significant difference was found in the number of steps per task between the original (mean 9.71, SD 6.26) and redesigned modules (W=0.0; Z=−2.2; P=.03).

Figure 2. Number of steps for participants to complete tasks.
Number of Mistakes and Error Rate

A total of 93 mistakes were identified when participants completed the tasks using the original modules, compared with only 16 mistakes using the redesigned modules, an 82.8% reduction. The total number of mistakes committed across tasks 1-7 was significantly lower with the redesigned modules (mean 0.63, SD 1.13) than with the original modules (mean 3.88, SD 2.66; W=0.0; Z=−2.2; P=.03).
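The error rates and the reduction in mistakes follow directly from the totals in Table 6; as a quick arithmetic check (an illustrative recomputation, not the authors' code):

```python
# Totals from Table 6 (all 24 participants, tasks 1-7).
orig_steps, orig_mistakes = 1632, 93
redesign_steps, redesign_mistakes = 1176, 16

# Error rate = mistakes per step, as a percentage.
orig_error_rate = 100 * orig_mistakes / orig_steps
redesign_error_rate = 100 * redesign_mistakes / redesign_steps
# Relative reduction in the total number of mistakes.
mistake_reduction = 100 * (orig_mistakes - redesign_mistakes) / orig_mistakes

print(round(orig_error_rate, 1),
      round(redesign_error_rate, 1),
      round(mistake_reduction, 1))  # -> 5.7 1.4 82.8
```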

Mistake Recovery

The total DP score for participants to complete tasks 1-7 using the redesigned modules (mean 4.29, SD 3.30) was significantly lower than that for the original modules (mean 26.86, SD 23.65; W=0.0; Z=−2.2; P=.03).

While using the original modules, participants were able to self-correct 22% (21/93) of the mistakes without any assistance (DP=1), 55% (52/93) after 1 sentence of assistance (DP=2), and 21% (20/93) after 2 sentences of assistance (DP=3). With the redesigned modules, participants were able to self-correct 18% (3/16) of the mistakes without any assistance (DP=1), 69% (11/16) after 1 sentence of assistance (DP=2), and 6% (1/16) after 2 sentences of assistance (DP=3).

Learnability and Satisfaction

Figure 3 compares the mean scores across the domains of the TUQ for the original and redesigned modules. On average, participants’ usability scores improved from 83% (5.86/7 points, SD 0.97) for the original modules to 92% (6.46/7 points, SD 0.53) for the redesigned modules, an improvement of 8.6 percentage points. The greatest improvements in user satisfaction were noted for ease of use and learning (15.45%), reliability (13.78%), interface quality (10.97%), and interaction (10.24%). The average TUQ scores for usefulness and for satisfaction and future use increased by >7%. The difference in usability between the original and redesigned modules was significant (W=210; Z=3.9; P<.001).
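The percentage figures above convert the 7-point TUQ means to a percentage of the scale maximum, with the improvement expressed in percentage points; a quick illustrative check:

```python
# Mean TUQ scores on the 7-point scale, from the text above.
original, redesigned, scale_max = 5.86, 6.46, 7

pct_original = 100 * original / scale_max      # ~83.7% of scale maximum
pct_redesigned = 100 * redesigned / scale_max  # ~92.3% of scale maximum
improvement = pct_redesigned - pct_original    # in percentage points

print(round(pct_original, 1),
      round(pct_redesigned, 1),
      round(improvement, 1))  # -> 83.7 92.3 8.6
```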

Figure 4 illustrates the average overall TUQ scores for each of the 24 participants using the original and redesigned modules. With the exception of participants 15 and 21, who had the same average overall TUQ score for both modules, all other participants had higher scores for the redesigned modules. The average overall TUQ scores were significantly different when comparing scores for the original and redesigned modules (P<.001).

Figure 3. Comparison of Telehealth Usability Questionnaire (TUQ) factors and scores.
Figure 4. Telehealth Usability Questionnaire (TUQ) scores from participants.

User Preferences

Of the 24 participants, 11 (46%) tested the original modules first, followed by the redesigned modules, and 12 (50%) tested the redesigned modules first, followed by the original modules. When asked which version they preferred, 79% (19/24) of participants indicated that they preferred the redesigned modules, 13% (3/24) possibly preferred the redesigned modules, and 4% (1/24) preferred the original modules.

Participants who preferred the redesigned modules appreciated their ease of navigation and display, owing to less typing and larger targets. Others found the voice guidance useful, noting that it draws the user’s attention to directional cues.

Only 4% (1/24) of participants preferred the original modules, stating that they looked cleaner than the redesigned modules. This participant had chosen a picture of bamboo as the background in the redesigned modules, which made them look busy. Even so, the participant preferred the redesigned modules in terms of flow of use.

Importance of Accessibility Features

Table 7 shows rankings of the 10 new accessibility features.

Table 7. Importance of accessibility features.
Serial no | Feature (ranked on a 10-item scale; 1=most important, 10=not important) | Average score | Ranking based on the average scores
1 | Customized module list | 2.8 | 2
2 | Customized text display | 4.0 | 9
3 | Customized theme | 5.3 | 10
4 | Customized button size | 3.1 | 3
5 | Customized keyboard | 3.3 | 4
6 | Ability to take a picture of a pill or a bottle | 3.8 | 8
7 | Color-coding | 3.8 | 7
8 | Navigational short cuts | 3.5 | 6
9 | Text guidance | 2.7 | 1
10 | Voice guidance | 3.3 | 4

Table 8 summarizes the accessibility importance rankings grouped by dexterity level. Regardless of dexterity level, all participants valued text guidance, ranking it highly across groups. Participants with mild to moderate dexterity impairments valued voice guidance and text guidance equally. However, users with severe dexterity impairments ranked the voice guidance feature as less important: owing to physical limitations in holding a smartphone, they had problems turning off the voice guidance using the volume control button. The ability to change the button size and use the customized keyboard was more essential for participants with severe dexterity impairments.

Table 8. User preference for new accessibility features.
Features | Average scores (group 1 / group 2 / group 3) | Rankings (group 1 / group 2 / group 3)
Customized module list | 3.7 / 2.2 / 3.2 | 7 / 1 / 4
Customized text display | 4.0 / 3.8 / 4.4 | 8 / 7 / 8
Customized theme | 7.3 / 4.5 / 4.2 | 10 / 10 / 7
Customized button size | 3.6 / 3.0 / 2.8 | 4 / 4 / 1
Customized keyboard | 3.0 / 3.8 / 3.0 | 2 / 7 / 2
Ability to take a picture of a pill or a bottle | 4.7 / 3.6 / 3.0 | 9 / 6 / 2
Color-coding | 3.4 / 3.9 / 3.8 | 3 / 9 / 6
Navigational short cuts | 3.6 / 3.1 / 4.4 | 4 / 5 / 8
Text guidance | 2.0 / 2.5 / 3.2 | 1 / 2 / 4
Voice guidance | 3.6 / 2.5 / 5.0 | 4 / 2 / 10

Principal Findings

The use of mHealth as a self-management intervention is a relatively new field of research. The iMHere system is unique in that it is specifically designed to support the self-management of people with disabilities. A previous systematic review by Nussbaum et al [43] identified several mHealth apps relevant to the field of rehabilitation medicine, of which only 3 focused on self-management, including the iMHere system. The iMHere system has been shown to be feasible for use in the SB and SCI populations, and its use has been associated with improvements in self-management skills, caregiver assistance needed, frequency of urinary tract infections, and depressive symptoms [44,45]. In addition, Nguyen et al [46] used a web-based application to promote dyspnea self-management in persons with chronic obstructive pulmonary disease, and Duggan et al [5] evaluated the SMART2 app for the self-management of chronic pain. Both mHealth apps demonstrated positive outcomes and effectiveness in the self-management of the conditions they targeted. However, there remains a paucity of apps focused on self-management in people with disabilities with motor, cognitive, and sensory impairments.

This study further adds to the literature on the usability of mHealth systems in people with disabilities with various dexterity-limiting disabilities, as it demonstrates that mHealth systems can be made more usable by improving efficiency, effectiveness, learnability, and user satisfaction.

Our first hypothesis addressed the usability constructs of efficiency and effectiveness (including errors). The efficiency and effectiveness of the redesigned modules were significantly better than those of the original modules, resulting in improved user performance and reduced user error. These changes were likely because of the design criteria that were implemented after careful consideration of how dexterity affects workflow and recovery from errors. The most apparent improvements in efficiency were seen in those with severe dexterity issues who benefited from text cues and color-coding of modules. These features allowed users to troubleshoot their own actions and reduce the overall error rate. Those with mild to moderate dexterity impairments benefited most from voice guidance, changes to button size, and custom keyboard options. Voice guidance, similar to text cues, also helped participants troubleshoot and reduce errors. The ability to change the target button size helped improve the user’s accuracy. The customized keyboard simplified the process of data entry. It is important to note that the improvements in efficiency gained from these features may be a result of the modules becoming more intuitive from a cognitive perspective.

Our second hypothesis addressed learnability and satisfaction. The improved usability of the redesigned modules was also evidenced by the participants’ preference for them. With the addition of accessibility features, we were able to further improve learnability through features such as navigational shortcuts and voice or text guidance. In addition, we added features to improve customizability, such as custom themes and lists. As seen in the improvements in TUQ scores, participants were more satisfied with the redesigned modules and would use the iMHere modules in the future. Of note, the significant improvement in usability detected in the redesigned modules compared with the original modules might have been even larger were it not for a possible ceiling effect in the TUQ.

Future work on the translation of mHealth to various models of care for people with disabilities is planned. We are currently carrying out a clinical trial evaluating the community integration of people with disabilities using mHealth to supplement services provided by a community-based organization that supports independent living. We are also carrying out implementation studies to evaluate how iMHere 2.0 can be used to deliver support to caregivers of people with disabilities and those with chronic conditions and to help facilitate long-term services and support such as caregiving services.

Study Limitations

Some limitations of this study deserve discussion. First, we recruited a small sample, which limits the types of statistical analyses that can be performed. Second, although we redesigned all iMHere modules, this study assessed the design of only 2 modules. We chose these 2 modules because they are the most complex, containing both advanced features and basic features that are also found in the other 3 iMHere modules. As the 3 less-complex modules contain features similar to those tested in the more complex modules, we expect that their usability testing results would have been similar. Third, a variety of tools exist to measure dexterity and usability. We chose our tests and measures intentionally, based on the proposed usability theory, but certainly other theories, constructs, and tools are available. For instance, we did not test memorability as a measure of usability; we plan to incorporate this attribute in future studies. Fourth, iMHere was not designed to support every disability or medical need, but its design is the result of research involving over 200 people with various disabilities and chronic conditions, from children to older adults, and a diverse group of professionals and support personnel involved in their care. Finally, the participants in this study had a variety of diagnoses that resulted not only in dexterity impairments but also in sensory and cognitive impairments. Thus, we were not able to determine which usability or accessibility issues were related to impairments other than dexterity. Future studies will expand the participant population and stratify the results to further investigate the usability and accessibility needs of individuals based on their unique impairments and abilities.

Conclusions

This study demonstrated that the iMHere mHealth system became more usable for individuals with disabilities after being redesigned according to user-centered design principles. Our findings demonstrate that users became more efficient and effective when using the redesigned modules. In addition, we found that the redesigned modules were easier to use and learn for first-time users, and users were satisfied with them. By including users in the iterative usability testing process, we were able to identify features in our original modules that benefited from redesign. Since the completion of this work, iMHere has launched a subsequent version (iMHere 2.0) with additional features focused on enhancing the user experience. The associated app now integrates the family and formal caregiver interface with the client app. In addition to the existing modules, new modules focused on physical activity, nutrition, goal setting, and education have been added. In the future, we hope to complete usability testing with studies that incorporate memorability into user testing. With successful implementation of iMHere among our test participants, we hope to make this app available to different disability populations in the community to promote independence in self-management, with improved clinical integration to bolster continuity of care.

Conflicts of Interest

BED, AF, BP, and GP are inventors of the iMHere system with no other financial interests in this technology.

Multimedia Appendix 1

Purdue Pegboard Test results (time to complete in seconds for right, left, and both hands).

DOCX File , 18 KB

  1. Kane S, Jayant C, Wobbrock J, Ladner R. Freedom to roam: a study of mobile device adoption and accessibility for people with visual and motor disabilities. In: Proceedings of the 11th international ACM SIGACCESS conference on Computers and accessibility. 2009 Presented at: Proceedings of the 11th international ACM SIGACCESS conference on Computers and accessibility; Oct 25 - 28, 2009; Pittsburgh Pennsylvania USA. [CrossRef]
  2. Dawe M. Desperately seeking simplicity: how young adults with cognitive disabilities and their families adopt assistive technologies. In: Proceedings of the CHI 2006 Conference on Human Factors in Computing Systems. 2006 Presented at: CHI06: CHI 2006 Conference on Human Factors in Computing Systems; Apr 22 - 27, 2006; Montréal Québec Canada. [CrossRef]
  3. Tirado M. Role of mobile health in the care of culturally and linguistically diverse US populations. Perspect Health Inf Manag 2011 Jan 01;8:1e [FREE Full text] [Medline]
  4. Hebden L, Balestracci K, McGeechan K, Denney-Wilson E, Harris M, Bauman A, et al. 'TXT2BFiT' a mobile phone-based healthy lifestyle program for preventing unhealthy weight gain in young adults: study protocol for a randomized controlled trial. Trials 2013 Mar 18;14:75 [FREE Full text] [CrossRef] [Medline]
  5. Duggan GB, Keogh E, Mountain GA, McCullagh P, Leake J, Eccleston C. Qualitative evaluation of the SMART2 self-management system for people in chronic pain. Disabil Rehabil Assist Technol 2015 Jan;10(1):53-60. [CrossRef] [Medline]
  6. Kristjánsdóttir OB, Fors EA, Eide E, Finset A, Stensrud TL, van Dulmen S, et al. A smartphone-based intervention with diaries and therapist-feedback to reduce catastrophizing and increase functioning in women with chronic widespread pain: randomized controlled trial. J Med Internet Res 2013 Jan 07;15(1):e5 [FREE Full text] [CrossRef] [Medline]
  7. Disability and health. World Health Organization. 2020.   URL: https://www.who.int/news-room/fact-sheets/detail/disability-and-health [accessed 2021-11-21]
  8. Wagner EH, Bennett SM, Austin BT, Greene SM, Schaefer JK, Vonkorff M. Finding common ground: patient-centeredness and evidence-based chronic illness care. J Altern Complement Med 2005;11 Suppl 1:S7-15. [CrossRef] [Medline]
  9. Bodenheimer T, Wagner EH, Grumbach K. Improving primary care for patients with chronic illness: the chronic care model, Part 2. JAMA 2002 Oct 16;288(15):1909-1914. [CrossRef] [Medline]
  10. Holman H. Chronic disease--the need for a new clinical education. JAMA 2004 Sep 01;292(9):1057-1059. [CrossRef] [Medline]
  11. Lorig KR, Holman H. Self-management education: history, definition, outcomes, and mechanisms. Ann Behav Med 2003 Aug;26(1):1-7. [CrossRef] [Medline]
  12. Clark NM. Management of chronic disease by patients. Annu Rev Public Health 2003;24:289-313. [CrossRef] [Medline]
  13. Anderson M, Perrin A. Disabled Americans less likely to use technology. Pew Research Center. 2017.   URL: https://www.benton.org/headlines/disabled-americans-are-less-likely-use-technology [accessed 2021-11-21]
  14. Franz R, Wobbrock J, Cheng Y, Findlater L. Perception and adoption of mobile accessibility features by older adults experiencing ability changes. In: Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility. 2019 Presented at: ASSETS '19: The 21st International ACM SIGACCESS Conference on Computers and Accessibility; Oct 28 - 30, 2019; Pittsburgh PA USA. [CrossRef]
  15. Lippincot B, Thompson N, Morris J, Jones M, DeRuyter F. Survey of user needs: mobile apps for mHealth and people with disabilities. In: Computers Helping People with Special Needs. Cham: Springer; 2020.
  16. Parmanto B, Pramana G, Yu DX, Fairman AD, Dicianno BE, McCue MP. iMHere: a novel mHealth system for supporting self-care in management of complex and chronic conditions. JMIR Mhealth Uhealth 2013 Jul 11;1(2):e10 [FREE Full text] [CrossRef] [Medline]
  17. Payne AY, Surikova J, Liu S, Ross H, Mechetiuc T, Nolan RP. Usability testing of an internet-based e-counseling platform for adults with chronic heart failure. JMIR Hum Factors 2015 May 08;2(1):e7 [FREE Full text] [CrossRef] [Medline]
  18. Williams PA, Furberg RD, Bagwell JE, LaBresh KA. Usability testing and adaptation of the pediatric cardiovascular risk reduction clinical decision support tool. JMIR Hum Factors 2016 Jun 21;3(1):e17 [FREE Full text] [CrossRef] [Medline]
  19. Thirumalai M, Rimmer JH, Johnson G, Wilroy J, Young H, Mehta T, et al. TEAMS (Tele-Exercise and Multiple Sclerosis), a tailored telerehabilitation mHealth app: participant-centered development and usability study. JMIR Mhealth Uhealth 2018 May 24;6(5):e10181 [FREE Full text] [CrossRef] [Medline]
  20. Yu DX, Parmanto B, Dicianno BE, Pramana G. Accessibility of mHealth self-care apps for individuals with Spina Bifida. Perspect Health Inf Manag 2015;12:1h [FREE Full text] [Medline]
  21. Yu DX, Parmanto B, Dicianno BE, Watzlaf VJ, Seelman KD. Accessibility needs and challenges of a mHealth system for patients with dexterity impairments. Disabil Rehabil Assist Technol 2017 Jan;12(1):56-64. [CrossRef] [Medline]
  22. Yu D, Parmanto B, Dicianno B. An mHealth app for users with dexterity impairments: accessibility study. JMIR Mhealth Uhealth 2019 Jan 08;7(1):e202 [FREE Full text] [CrossRef] [Medline]
  23. Parmanto B, Lewis AN, Graham KM, Bertolet MH. Development of the Telehealth Usability Questionnaire (TUQ). Int J Telerehabil 2016 Jul 1;8(1):3-10 [FREE Full text] [CrossRef] [Medline]
  24. Whitney DG, Schmidt M, Peterson MD, Haapala H. Polypharmacy among privately insured adults with cerebral palsy: a retrospective cohort study. J Manag Care Spec Pharm 2020 Sep;26(9):1153-1161. [CrossRef] [Medline]
  25. Cadel L, Everall AC, Packer TL, Hitzig SL, Patel T, Lofters AK, et al. Exploring the perspectives on medication self-management among persons with spinal cord injury/dysfunction and providers. Res Social Adm Pharm 2020 Dec;16(12):1775-1784. [CrossRef] [Medline]
  26. Morley CP, Struwe S, Pratte MA, Clayton GH, Wilson PE, Dicianno BE, et al. Survey of U.S. adults with spina bifida. Disabil Health J 2020 Apr;13(2):100833 [FREE Full text] [CrossRef] [Medline]
  27. Beierwaltes P, Munoz S, Wilhelmy J. Integument: guidelines for the care of people with spina bifida. J Pediatr Rehabil Med 2020;13(4):543-548 [FREE Full text] [CrossRef] [Medline]
  28. Baron J, Swaine J, Presseau J, Aspinall A, Jaglal S, White B, et al. Self-management interventions to improve skin care for pressure ulcer prevention in people with spinal cord injuries: a systematic review protocol. Syst Rev 2016 Sep 06;5(1):150 [FREE Full text] [CrossRef] [Medline]
  29. Groah SL, Schladen M, Pineda CG, Hsieh CJ. Prevention of pressure ulcers among people with spinal cord injury: a systematic review. PM R 2015 Jun;7(6):613-636. [CrossRef] [Medline]
  30. Nielsen J. Usability Engineering. Amsterdam: Elsevier; 1993.
  31. Zeng X. Evaluation and enhancement of web content accessibility for persons with disabilities. Doctoral Dissertation, University of Pittsburgh. 2004.   URL: http://d-scholarship.pitt.edu/7311/ [accessed 2021-11-21]
  32. Hackett S. An exploration into two solutions to propagating web accessibility for blind computer users. Doctoral Dissertation, University of Pittsburgh. 2007.   URL: http://d-scholarship.pitt.edu/10108/ [accessed 2021-11-21]
  33. Scotch M, Parmanto B, Monaco V. Evaluation of SOVAT: an OLAP-GIS decision support system for community health assessment data analysis. BMC Med Inform Decis Mak 2008 Jun 09;8:22 [FREE Full text] [CrossRef] [Medline]
  34. Saptono A. Development of an integrated telerehabilitation information management system to support remote wheelchair prescription. Doctoral Dissertation, University of Pittsburgh. 2011.   URL: https://tinyurl.com/yckpm5ys [accessed 2021-11-21]
  35. Desrosiers J, Hébert R, Bravo G, Dutil E. The Purdue Pegboard Test: normative data for people aged 60 and over. Disabil Rehabil 1995 Jul;17(5):217-224. [CrossRef] [Medline]
  36. Ozçelik IB, Purisa H, Sezer I, Mersa B, Kabakaş F, Tuncer S, et al. [Evaluation of long-term results in mutilating hand injuries]. Ulus Travma Acil Cerrahi Derg 2009 Mar;15(2):164-170 [FREE Full text] [Medline]
  37. Delp HL, Newton RA. Effects of brief cold exposure on finger dexterity and sensibility in subjects with Raynaud's phenomenon. Phys Ther 1986 Apr;66(4):503-507. [CrossRef] [Medline]
  38. Wilson BC, Iacoviello JM, Wilson JJ, Risucci D. Purdue Pegboard performance of normal preschool children. J Clin Neuropsychol 1982 May;4(1):19-26. [Medline]
  39. Smoot B, Wong J, Cooper B, Wanek L, Topp K, Byl N, et al. Upper extremity impairments in women with or without lymphedema following breast cancer treatment. J Cancer Surviv 2010 Jun;4(2):167-178 [FREE Full text] [CrossRef] [Medline]
  40. Amirjani N, Ashworth NL, Olson JL, Morhart M, Chan KM. Validity and reliability of the Purdue Pegboard Test in carpal tunnel syndrome. Muscle Nerve 2011 Feb;43(2):171-177. [CrossRef] [Medline]
  41. Buddenberg LA, Davis C. Test-retest reliability of the Purdue Pegboard Test. Am J Occup Ther 2000;54(5):555-558. [CrossRef] [Medline]
  42. Lewis C. Using the "Thinking Aloud" Method in Cognitive Interface Design. Yorktown Heights, NY: IBM T. J. Watson Research Center; 1982.
  43. Nussbaum R, Kelly C, Quinby E, Mac A, Parmanto B, Dicianno BE. Systematic review of mobile health applications in rehabilitation. Arch Phys Med Rehabil 2019 Jan;100(1):115-127. [CrossRef] [Medline]
  44. Dicianno BE, Fairman AD, McCue M, Parmanto B, Yih E, McCoy A, et al. Feasibility of using mobile health to promote self-management in spina bifida. Am J Phys Med Rehabil 2016 Jun;95(6):425-437. [CrossRef] [Medline]
  45. Kryger MA, Crytzer TM, Fairman A, Quinby EJ, Karavolis M, Pramana G, et al. The effect of the interactive mobile health and rehabilitation system on health and psychosocial outcomes in spinal cord injury: randomized controlled trial. J Med Internet Res 2019 Aug 28;21(8):e14305 [FREE Full text] [CrossRef] [Medline]
  46. Nguyen HQ, Donesky-Cuenco D, Wolpin S, Reinke LF, Benditt JO, Paul SM, et al. Randomized controlled trial of an internet-based versus face-to-face dyspnea self-management program for patients with chronic obstructive pulmonary disease: pilot study. J Med Internet Res 2008 Apr 16;10(2):e9 [FREE Full text] [CrossRef] [Medline]


DP: difficulty-on-performance
iMHere: Interactive Mobile Health and Rehabilitation
mHealth: mobile health
PPBT: Purdue Pegboard Test
SB: spina bifida
SCI: spinal cord injury
TUQ: Telehealth Usability Questionnaire


Edited by A Kushniruk; submitted 23.08.20; peer-reviewed by C Jones, A Miguel-Cruz; comments to author 30.09.20; revised version received 23.12.20; accepted 02.08.21; published 24.02.22

Copyright

©Kuntal Chowdhary, Daihua Xie Yu, Gede Pramana, Matthew Mesoros, Andrea Fairman, Brad Edward Dicianno, Bambang Parmanto. Originally published in JMIR Human Factors (https://humanfactors.jmir.org), 24.02.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Human Factors, is properly cited. The complete bibliographic information, a link to the original publication on https://humanfactors.jmir.org, as well as this copyright and license information must be included.