Published on 24.09.18 in Vol 5, No 3 (2018): Jul-Sept

A Patient-Facing Diabetes Dashboard Embedded in a Patient Web Portal: Design Sprint and Usability Testing

Original Paper

1Division of General Internal Medicine and Public Health, Vanderbilt University Medical Center, Nashville, TN, United States

2Health Information Technology, Vanderbilt University Medical Center, Nashville, TN, United States

3Department of Medicine, Vanderbilt University Medical Center, Nashville, TN, United States

4Department of Biomedical Informatics, Vanderbilt University Medical Center, Nashville, TN, United States

5Department of Pediatrics, Vanderbilt University Medical Center, Nashville, TN, United States

6School of Nursing, Vanderbilt University, Nashville, TN, United States

7Quality, Safety & Risk Prevention, Vanderbilt University Medical Center, Nashville, TN, United States

Corresponding Author:

William Martinez, MD, MS

Division of General Internal Medicine and Public Health

Vanderbilt University Medical Center

2525 West End Avenue

Suite 450

Nashville, TN, 37203

United States

Phone: 1 615 322 7277

Fax: 1 615 936 1269

Email: william.martinez@vumc.org


Background: Health apps and Web-based interventions designed for patients with diabetes offer novel and scalable approaches to engage patients and improve outcomes. However, careful attention to the design and usability of these apps and Web-based interventions is essential to reduce the barriers to engagement and maximize use.

Objective: The aim of this study was to apply design sprint methodology paired with mixed-methods, task-based usability testing to design and evaluate an innovative, patient-facing diabetes dashboard embedded in an existing patient portal and integrated into an electronic health record.

Methods: We applied a 5-day design sprint methodology developed by Google Ventures (Alphabet Inc, Mountain View, CA) to create our initial dashboard prototype. We identified recommended strategies from the literature for using patient-facing technologies to enhance patient activation and designed dashboard functionality to match each strategy. We then conducted a mixed-methods, task-based usability assessment of dashboard prototypes with individual patients. Measures included validated metrics of task performance on 5 common and standardized tasks, semistructured interviews, and a validated usability satisfaction questionnaire. After each round of usability testing, we revised the dashboard prototype in response to usability findings before the next round of testing until the majority of participants successfully completed tasks, expressed high satisfaction, and identified no new usability concerns (ie, the stop criterion was met).

Results: The sample (N=14) comprised 5 patients in round 1, 3 patients in round 2, and 6 patients in round 3, at which point we reached our stop criterion. The participants’ mean age was 63 years (range 45-78 years), 57% (8/14) were female, and 50% (7/14) were white. Our design sprint yielded an initial patient-facing diabetes dashboard prototype that displayed and summarized 5 measures of patients’ diabetes health status (eg, hemoglobin A1c). The dashboard used graphics to visualize and summarize health data and reinforce understanding, incorporated motivational strategies (eg, social comparisons and gamification), and provided educational resources and secure-messaging capability. More than 80% of participants were able to successfully complete all 5 tasks using the final prototype. Interviews revealed usability concerns with design, efficiency of use, and content and terminology, which led to improvements. Overall satisfaction (1=worst and 7=best) improved from the initial to the final prototype (mean 5.8, SD 0.4 vs mean 6.7, SD 0.5).

Conclusions: Our results demonstrate the utility of the design sprint methodology paired with mixed-methods, task-based usability testing to efficiently and effectively design a patient-facing, Web-based diabetes dashboard that is satisfying for patients to use.

JMIR Hum Factors 2018;5(3):e26

doi:10.2196/humanfactors.9569

Keywords



Background

Diabetes is a leading cause of kidney failure, heart disease, stroke, visual impairment, and nontraumatic lower limb amputations [1]. Many of these complications can be delayed or prevented through disease control. Research demonstrates that diabetes self-monitoring, preventative health services, medication adherence, regular exercise, and attention to diet can lead to improved outcomes [2,3]. Despite their importance, few patients consistently receive all recommended services or engage in recommended self-care behaviors that can be challenging to implement and sustain [4,5]. Many patients with diabetes struggle with the knowledge and motivation necessary to successfully manage their disease [6].

Interventions aimed at enhancing patients’ motivation, skills, knowledge, and confidence in diabetes self-care have had limited success, with many relying on face-to-face interactions that are costly and challenging to scale [7,8]. Web-based diabetes self-management interventions have the potential to overcome these limitations; however, these interventions have also demonstrated variable effects on patients’ self-care and glycemic control [9,10]. Mixed results have been attributed to differences in the design and usability of these Web-based interventions, leading to varying degrees of user engagement [10,11]. Web-based interventions with greater user engagement are associated with better outcomes [12,13]. However, some Web-based interventions have not involved end users in the design process [14,15], and many have failed to include one or more recommended features for increasing patient engagement, including (1) ability to track, visualize, and summarize health data; (2) guidance in response to the data displayed; (3) ability to communicate with health care providers; (4) peer support; and (5) motivational challenges using elements of game design and competition [11,16].

Human-centered design is an approach to software development that emphasizes optimal user experience by integrating users directly into the design process and helps ensure the creation of a suitable user interface [17,18]. One human-centered design method, called design sprint, is a rapid 5-phase user-centered process that utilizes design principles to understand the problem, explore creative solutions, identify and map the best ideas, prototype, and ultimately test [17,18]. Usability testing ensures that Web-based interventions meet users’ expectations and work as intended, such that users are able to efficiently and effectively interact with the website [11]. Although usability testing is sometimes performed once the Web-based intervention has been fully developed, incorporating usability testing into the design process beginning with the earliest prototype provides the greatest opportunity to inform and improve the user interface design [17,18].

Objectives

This paper describes the application of design sprint methodology paired with mixed-methods, task-based usability testing to design and evaluate an innovative, patient-facing diabetes dashboard embedded in an existing patient portal, My Health at Vanderbilt (MHAV) [19], and integrated into an electronic health record. In particular, we sought to design a dashboard that addresses the needs of users, allows users to easily comprehend their diabetes health data, incorporates recommended strategies for increasing user engagement, and is satisfying and easy to use.


Dashboard Design

We utilized a 5-day design sprint methodology [17,18] developed by Google Ventures (Alphabet Inc, Mountain View, CA) to design our initial dashboard prototype. The process was facilitated by an experienced health information technology expert (ALT) who specializes in user experience (UX) and product design. We chose a 5-day design sprint over other iterative agile methodologies because it offered the ability to rapidly develop a user-centered solution in the form of a prototype that could be tested and revised before investing limited research funds in programming the dashboard.

On day 1, we began by mapping out our challenge (Figure 1) to create a dashboard that would satisfy patients’ desire for information regarding their diabetes health status and address existing challenges in patients’ diabetes knowledge and motivation for diabetes self-management [5,20]. This process was informed by a review of the literature [14,21-30] from which we identified factors contributing to the limited efficacy of existing digital interventions, including (1) absence of user-centered design [14], (2) lack of integration with the health care delivery system [22,28], (3) absence of key features to maximize patient engagement, including patient-centered motivational strategies [29], and (4) failure to account for the unique needs of older patients and those with limited health literacy [30-32]. In addition, we reviewed recommended strategies to increase patient activation [6,33] (ie, the motivation, knowledge, skills, and confidence for managing one’s health condition) using mobile apps [16] and prior research on the potential role of social comparison information for motivating diabetes self-care [27,34].

Figure 1. Whiteboard image mapping out the challenge to create a patient-facing diabetes dashboard.

We also met one-on-one with expert stakeholders (eg, patient portal users with diabetes, diabetes educators, behavioral scientists, physicians, educators, and nurses) to ask questions aimed at enhancing our understanding of the challenge and refining our map. We identified expert stakeholders by approaching organizational leaders with a description of the project and asking them to identify individuals in their area who could provide valuable input. For example, we approached the director of the Vanderbilt University Hospital Patient and Family Advisory Council, who connected us with Council members who had diabetes, were current patient portal users, and expressed interest in improving care for people with diabetes. Experts’ comments were recorded in the form of how might we (HMW) statements [17,18]. The HMW method is used in design thinking to take insights and challenges and reframe them as opportunities [17,18]. Consistent with design sprint methodology, experts’ HMW statements were reviewed (Figure 2) to identify statements that shared a common theme and were then grouped into categories based on emerging themes to identify the most useful ideas for building the prototype. Experts encouraged the authors to consider how we might design the dashboard to (1) maximize accessibility, (2) frame diabetes health data in ways that promote patients’ understanding and motivate health behaviors, (3) facilitate patient action in response to the data they see (eg, patient resources and referral services), (4) enable communication with their health care team, (5) enhance social supports, and (6) incorporate strategies (eg, goal setting, progress tracking, and positive reinforcement) that motivate health behavior and keep users engaged.

On day 2, the existing ideas, architecture, and designs from health care and other industries related to the challenge were reviewed to establish the building blocks of our prototype. For example, existing solutions for displaying health and performance data and other types of quantitative, longitudinal, and benchmarked data from other industries (eg, finance and education) were reviewed. Subsequently, findings from the review and the meetings with expert stakeholders were used to sketch our own solutions (Figure 3).

On day 3, the solutions were critiqued, and the solutions with the greatest potential to successfully meet the challenge in the long term were selected by consensus. Following this, the authors adapted the chosen solutions to create a storyboard, or step-by-step plan, for the prototype (Figure 4).

On day 4, the authors developed the prototype using Apple Keynote (Apple Inc, Cupertino, CA) [35]. They collected assets (eg, stock imagery and icons) and stitched all components of the prototype together. Keynote slides (ie, screens) were linked using the animate feature so that the prototype transitioned from one slide to the next based on the action the user performed. This resulted in an initial prototype (Figure 5) that functioned much like a real webpage and was ready for the first round of usability testing on day 5. The initial prototype displayed and summarized 5 measures of patients’ diabetes health status (ie, hemoglobin A1c [HbA1c], systolic blood pressure, low-density lipoprotein cholesterol, microalbumin, and flu vaccination status). The existing literature on patients’ information needs when interpreting test results and strategies for improving comprehension was reviewed [36-38]. In addition, the authors identified recommended strategies for using patient-facing technologies to increase patient activation and incorporated dashboard functionality to match each strategy. For example, for each measure, the dashboard used graphics to visualize and summarize health data and reinforce understanding, with a color-coded system (red, yellow, and green) similar to the National Heart, Lung, and Blood Institute’s asthma treatment guideline [39] to indicate when action is needed. To facilitate understanding, we paired each measure with hyperlinks to literacy level–appropriate educational materials. To help motivate patients, the dashboard provided patients with social and goal-based comparison information regarding their diabetes health status [27,34]. In addition, using elements of game design, a star rating gave patients feedback on the number of measures at goal. To facilitate communication with their health care team, patients could click a link to contact their doctor’s office via a secure message. Reminders for self-care (eg, taking medication and exercising) could be set and delivered to patients’ mobile phones or email, and diabetes self-care goals could be set and tracked.
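
To illustrate the color-coded system and star rating described above, the following is a minimal sketch in Python. The HbA1c thresholds and the example statuses are illustrative assumptions for exposition only; they are not the clinical cutoffs or goal values used in the prototype.

```python
# Illustrative thresholds only; the prototype's actual clinical cutoffs
# and goal values are not specified in this paper.
HBA1C_GREEN_MAX = 7.0   # hypothetical "at goal" boundary (%)
HBA1C_YELLOW_MAX = 8.0  # hypothetical "caution" boundary (%)

def hba1c_status(value: float) -> str:
    """Map an HbA1c value to the dashboard's red/yellow/green indicator."""
    if value <= HBA1C_GREEN_MAX:
        return "green"
    if value <= HBA1C_YELLOW_MAX:
        return "yellow"
    return "red"

def star_rating(statuses) -> int:
    """Game-design element: one star per measure currently at goal (green)."""
    return sum(1 for s in statuses if s == "green")

# The 5 tracked measures with example statuses -> a 3-of-5 star rating
measures = {"hba1c": hba1c_status(6.8), "systolic_bp": "yellow",
            "ldl": "green", "microalbumin": "green", "flu_vaccine": "red"}
print(star_rating(measures.values()))  # -> 3
```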

Usability Study Design

From September to October 2016, we conducted a mixed-methods, task-based usability study of dashboard prototypes with individual patients under controlled conditions. Patients were recruited from the Vanderbilt Adult Primary Care (VAPC) clinic. Individual usability sessions lasted between 30 and 75 min. Given that the majority of usability problems are commonly identified within the first 5 usability evaluations [40-42], each round of usability testing included between 3 and 6 participants. After each round of usability testing, the dashboard prototype was revised in response to usability findings before the next round of testing.

Figure 2. Design sprint day 1—expert comments/ideas organized into categories.
Figure 3. Design sprint day 2—solution sketches.
Figure 4. Design sprint day 3—dashboard storyboard.
Figure 5. Design sprint day 4—screenshot of initial dashboard prototype. A1c: hemoglobin A1c.

Setting

The VAPC clinic is located within the Vanderbilt University Medical Center (VUMC) in Nashville, TN. The clinic cares for about 25,000 unique patients annually, of whom approximately 4500 (18%) have diabetes. All clinical data are entered into an electronic health record, and patients are provided access to their clinical data via a Web portal.

Participants and Recruitment

Participants were eligible for the study if they had type 2 diabetes mellitus, were English-speaking, were aged 21 years or older, and were current users of the VUMC patient Web portal, MHAV. Potential participants were identified automatically using VUMC’s Subject Locator to query the electronic health records of patients with upcoming clinic appointments for discrete inclusion and exclusion criteria. Identified patients (n=334) were mailed a letter describing the study and asked to contact the investigators if they were interested in participating. Interested patients (n=22) contacted the research coordinator to learn more about the study and confirm eligibility. Patients who agreed to participate (n=17) were scheduled to participate in a usability session on the day of their clinic appointment. Overall, 3 patients canceled due to weather or a conflicting appointment. A total of 14 patients ultimately completed a usability session and provided written informed consent before participating in their session. The Vanderbilt University Institutional Review Board approved this research.
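
As a sketch of how such an automated eligibility screen might work, the following Python fragment filters hypothetical patient records against the study's stated inclusion criteria; the PatientRecord fields are illustrative stand-ins for the discrete EHR criteria queried by VUMC's Subject Locator, whose actual interface is not described here.

```python
from dataclasses import dataclass

@dataclass
class PatientRecord:
    # Hypothetical fields standing in for discrete EHR criteria
    has_type2_diabetes: bool
    speaks_english: bool
    age: int
    active_mhav_user: bool

def eligible(p: PatientRecord) -> bool:
    """Inclusion criteria from the Methods: type 2 diabetes,
    English-speaking, aged 21 years or older, current MHAV user."""
    return (p.has_type2_diabetes and p.speaks_english
            and p.age >= 21 and p.active_mhav_user)

candidates = [PatientRecord(True, True, 63, True),
              PatientRecord(True, True, 19, True)]
mailing_list = [p for p in candidates if eligible(p)]  # recruitment letters
print(len(mailing_list))  # -> 1
```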

Data Collection and Measures

Before the usability testing session, enrolled patients completed a short questionnaire. The questionnaire included basic demographic questions, including items about computer and smartphone usage and internet access, as well as validated measures of health literacy [43] and numeracy [44]. In addition, data regarding comorbidities were extracted from participants’ medical records as documented by physicians in the patients’ problem lists.

Each participant received a standardized introduction to the dashboard and the think-aloud procedure that allows testing observers to understand and track a participant’s thought processes as they navigate the dashboard [45]. One of the authors (ALT) led each session using a semistructured interview guide, while another author (WM) observed and took notes. With a dashboard prototype that contained fictitious patient data, participants were asked to perform common standardized tasks including logging in, retrieving HbA1c data, messaging their doctor, setting a reminder, and setting a goal. The tasks were designed to represent what typical users might do when visiting their dashboard. All participants accessed and navigated the dashboard using a 15-inch MacBook Pro 11,3 (2014 generation) with an external mouse and Chrome Web browser with default resolution. In addition, after participants attempted each assigned task (eg, message your doctor), the interviewers used open-ended questions outlined in the interview guide to elicit participants’ (1) expectations for the feature’s functionality, (2) ability to comprehend the information displayed, (3) ability to navigate to and from the feature, (4) satisfaction with the feature, and (5) how the feature might be improved. Each session was audio-recorded, and the computer screen was video-recorded using QuickTime Player (Apple Inc, Cupertino, CA).

To assess and quantify participant satisfaction with the dashboard, at the conclusion of their usability session, participants completed 12 items from the Computer System Usability Questionnaire (CSUQ), which assess participants’ perceptions of the dashboard’s ease of use, likability of the interface, and overall satisfaction using a 7-point Likert response scale (1=strongly disagree to 7=strongly agree), with 7 indicating the highest possible satisfaction [46].
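
As a worked example of CSUQ item scoring, the sketch below computes a mean and SD from 7-point responses. The individual scores are hypothetical, chosen so that the summary reproduces the round 1 overall-satisfaction value reported later in Table 3 (mean 5.8, SD 0.4).

```python
from statistics import mean, stdev

# Hypothetical 1-7 Likert responses for one CSUQ item from 5 participants
responses = {"Overall, I am satisfied with this system.": [6, 5, 6, 6, 6]}

for item, scores in responses.items():
    # Per-item mean and sample SD, as reported in Table 3
    print(f"{item} mean={mean(scores):.1f}, SD={stdev(scores):.1f}")
# -> mean=5.8, SD=0.4
```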

Data Analysis

Task Completion Analysis

Task completion was coded with a usability rating scale utilized in prior studies [47-49]. Task completion was rated on a 5-category scale: (1) successful/straightforward, (2) successful/prolonged, (3) partial, (4) unsuccessful/prolonged, and (5) gave up [47]. Two coders first coded the same usability session video (not used in the analysis) to calibrate their coding. They subsequently coded the remaining videos independently. Disagreements were resolved by consensus, and both coders were blinded to whether a given video showed the initial prototype or a revised prototype.
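
A minimal sketch of this dual-coding workflow follows; the ratings shown are hypothetical and serve only to illustrate how the 5-category scale can be represented and how disagreements are flagged for consensus.

```python
from enum import Enum

class TaskRating(Enum):
    """The 5-category task completion scale from prior studies [47]."""
    SUCCESSFUL_STRAIGHTFORWARD = 1
    SUCCESSFUL_PROLONGED = 2
    PARTIAL = 3
    UNSUCCESSFUL_PROLONGED = 4
    GAVE_UP = 5

# Hypothetical ratings of one task for 3 participants by two coders
coder_a = [TaskRating.PARTIAL, TaskRating.GAVE_UP,
           TaskRating.SUCCESSFUL_PROLONGED]
coder_b = [TaskRating.PARTIAL, TaskRating.UNSUCCESSFUL_PROLONGED,
           TaskRating.SUCCESSFUL_PROLONGED]

# Flag disagreements for resolution by consensus, as in the Methods
disagreements = [i for i, (a, b) in enumerate(zip(coder_a, coder_b)) if a != b]
print(disagreements)  # -> [1]
```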

Interview Analysis

Audio files of interviews were submitted to a professional transcription service, Rev.com Inc (San Francisco, CA). Transcripts were checked for accuracy, and identifying information was removed. Deidentified transcripts were imported into NVivo (version 10; QSR International, Burlington, VT) for coding and analysis. Similar to other health app usability studies [47,50], we used selective coding to capture participants’ comments about usability concerns [51]. Participant comments were sorted into categories that addressed 3 elements of usability: design, efficiency of use, and content and terminology [52]. A research assistant with training in qualitative methods coded all interviews. After the initial coding, a second trained coder reviewed each code and noted any discrepancies. The 2 coders then met and resolved any differences by consensus. Illustrative quotes from participants were edited slightly for grammar and clarity for inclusion in this paper. Participants’ comments informed revisions to the dashboard prototype.

Statistical Analysis

Descriptive statistics were used to characterize the study participants, task completion, and survey data. All analyses were completed with SAS version 9.4 (SAS Institute, Inc, Cary, NC).

Stop Criteria

Data analysis began after the initial round of testing, and the authors used the findings to inform prototype revisions before the subsequent round of testing. Additional rounds of testing were conducted until the majority of participants within a round of testing (1) were able to successfully complete all tasks, (2) indicated high overall satisfaction with the dashboard as assessed by the overall satisfaction item on the CSUQ (score≥6), and (3) expressed no new usability concerns during the interview (ie, saturation).
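
Expressed as code, the stop rule might look like the following sketch; the per-participant field names are hypothetical and do not come from the study's analysis scripts.

```python
def majority(flags) -> bool:
    """True when more than half of the participants meet a criterion."""
    flags = list(flags)
    return sum(flags) > len(flags) / 2

def stop_criteria_met(round_results) -> bool:
    """round_results: one dict per participant; field names are illustrative."""
    return (majority(r["completed_all_tasks"] for r in round_results)
            and majority(r["csuq_overall"] >= 6 for r in round_results)
            and majority(not r["new_usability_concerns"] for r in round_results))

# Example: a round in which every participant met all three criteria
round3 = [{"completed_all_tasks": True, "csuq_overall": 7,
           "new_usability_concerns": False}] * 6
print(stop_criteria_met(round3))  # -> True: no further rounds needed
```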


Participants

Table 1 shows participant characteristics. The sample (N=14) comprised 5 patients in round 1, 3 patients in round 2, and 6 patients in round 3; at this point, the authors reached their stop criteria. Participants’ mean age was 63 years (range 45-78 years), 57% (8/14) were female, and 50% (7/14) were white. All participants reported using a home computer, and 64% (9/14) reported using a smartphone. All participants had home internet access. Most participants had one or more comorbid diseases in addition to diabetes.

Task-Based Usability

Figure 6 illustrates task performance among the 5 participants in round 1 who tested the initial prototype compared with the 6 participants in round 3 who tested the final prototype. Participants attempted 5 tasks that ranged in complexity from logging in to setting a reminder.

Tasks: (A) Log-In and (B) Set a Goal

All participants in both rounds straightforwardly logged in to the dashboard and set a goal.

Task: (C) Identify Most Recent Hemoglobin A1c

Only one participant in the initial round of testing was able to identify their most recent HbA1c value from the dashboard. Most participants had difficulty interpreting the dial display, were confused regarding which icon on the dial indicated the user’s most current value, and could not comprehend the HbA1c data. In response, the authors revised the data display design and status indicator icons. They relocated the features aimed at facilitating patients’ understanding of their health data, including a hover-over info icon providing a nontechnical description of the measure (eg, HbA1c) and links to literacy level–sensitive educational materials, so that these features were adjacent to the data (see Figure 5, initial prototype, and Figure 7, final prototype). After revisions, all 6 participants in the final round were able to complete the task and comprehend their data.

Task: (D) Message Doctor’s Office

All 5 participants in the initial round were able to message their doctor’s office; however, 2 participants hesitated or demonstrated some confusion despite completing the task. Participants indicated that they were accustomed to using the existing messaging icon within the header of the patient portal, and some struggled to locate the messaging icon within the dashboard. After the icon was revised in response to feedback (ie, larger text, added color, and a button icon), the majority of participants in the final round successfully completed the task. However, 3 participants still initially attempted messaging via the existing icon in the header, one of whom completed the task only after being directed to the correct button icon.

Table 1. Participant characteristics.
Characteristic | Total (N=14) | Round 1 (N=5) | Round 2 (N=3) | Round 3 (N=6)
Age (years), mean (SD) | 63.4 (11.0) | 62.2 (10.3) | 75.7 (3.2) | 58.2 (9.9)
Age (years), n (%)
  40-49 | 1 (7) | 0 (0) | 0 (0) | 1 (16)
  50-59 | 4 (29) | 2 (40) | 0 (0) | 2 (33)
  60-69 | 4 (29) | 2 (40) | 0 (0) | 2 (33)
  70-79 | 5 (36) | 1 (20) | 3 (100) | 1 (16)
Gender, n (%)
  Female | 8 (57) | 3 (60) | 0 (0) | 5 (83)
  Male | 6 (43) | 2 (40) | 3 (100) | 1 (17)
Race, n (%)
  White | 7 (50) | 3 (60) | 1 (33) | 3 (50)
  African American | 3 (21) | 1 (20) | 1 (33) | 1 (17)
  Asian | 2 (14) | 1 (20) | 1 (33) | 0 (0)
  Other | 2 (14) | 0 (0) | 0 (0) | 2 (33)
Education, n (%)
  High school degree/graduate equivalency degree | 1 (7) | 1 (20) | 0 (0) | 0 (0)
  Some college | 3 (21) | 1 (20) | 0 (0) | 2 (33)
  College degree | 5 (36) | 1 (20) | 2 (67) | 2 (33)
  Postgraduate degree | 5 (36) | 2 (40) | 1 (33) | 2 (33)
Health literacy, mean (range^a) | 13.4 (11-15) | 13.2 (12-15) | 12.7 (11-15) | 14.0 (13-15)
Numeracy, mean (range^b) | 15.0 (7-18) | 13.0 (7-18) | 17.0 (16-18) | 15.7 (10-18)
Home computer user^c, n (%) | 14 (100) | 5 (100) | 3 (100) | 6 (100)
Smartphone user, n (%) | 9 (64) | 3 (60) | 2 (67) | 4 (67)
Home internet access, n (%) | 14 (100) | 5 (100) | 3 (100) | 6 (100)
Comorbidities, n (%)
  Hyperlipidemia | 10 (71) | 3 (60) | 3 (100) | 4 (67)
  Atherosclerotic cardiovascular disease | 3 (21) | 0 (0) | 1 (33) | 2 (33)
  Hypertension | 7 (50) | 2 (40) | 3 (100) | 2 (33)
  Chronic kidney disease | 3 (21) | 1 (20) | 1 (33) | 1 (17)

^a Possible score range: 3 (worst) to 15 (best).

^b Possible score range: 3 (worst) to 18 (best).

^c Includes desktops, laptops, or tablets.

Figure 6. Task-based usability ratings for initial and final prototype iterations. The asterisk indicates that one participant within the final round of testing was not asked to complete the task due to time constraints. HbA1c: hemoglobin A1c.
Figure 7. Screenshot of final dashboard prototype. A1c: hemoglobin A1c.
Task: (E) Set a Reminder

Only 2 participants in round 1 were able to set a reminder on the dashboard. Participants struggled to set the frequency of recurrence and a stop date for reminders they wished to receive only for a specified time. Subsequently, the authors revised the layout of the “set reminder” pop-up window to include a clear start and stop date and time, as well as a drop-down menu to set recurrence (eg, daily or weekly). After revisions, 4 of 6 participants in round 3 were able to set a reminder, with one additional participant successfully completing the task with prolonged effort.
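
As an illustration of the revised design, the pop-up's fields can be modeled with a simple data structure; the field names and example values below are hypothetical, not the production schema.

```python
from dataclasses import dataclass
from datetime import date, time

@dataclass
class Reminder:
    """Sketch of the revised 'set reminder' pop-up fields."""
    label: str           # eg, "Take metformin"
    start_date: date
    stop_date: date      # explicit stop date, added after round 1 feedback
    time_of_day: time
    recurrence: str      # drop-down options: "daily", "weekly", ...

# A reminder that recurs daily for a specified period and then stops
r = Reminder("Take metformin", date(2016, 10, 1), date(2016, 12, 31),
             time(8, 0), "daily")
```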

Participant Interviews

Table 2 shows the participants’ comments about usability concerns grouped by usability area. Several revisions were made in response to participants’ usability concerns, including revising the display of patients’ health data and star status, revising the icons indicating the patient’s value and the “patients like me” value, standardizing educational links and adding diet information, grouping and standardizing action items, enlarging the font size, and providing a frequently asked questions page (see Figure 5, initial prototype, and Figure 7, final prototype).

Satisfaction Survey

Table 3 reports mean scores for the CSUQ items among participants in round 1 who tested the initial prototype compared with participants in round 3 who tested the final prototype. Participants who tested the initial prototype and those who tested the final prototype rated the usability above average (ie, scores >4 on a 7-point scale) for all 12 items. The mean score improved from the initial to the final prototype for 11 of the 12 items; the exception was “It is easy to find the information I need,” which declined from 5.6 to 4.8.

Table 2. Participants’ concerns with dashboard usability.
Usability element and unique concern type | Illustrative quote
Design
  Font size | It’s very clear to me but I would definitely want to enlarge the size of the font.
  Patient status indicator | I don’t know what [the indicator] is supposed to be. Still I want to figure it out. The person’s goal would be about 6.2 and the actual would be 7.5. Is that correct?
  Reminder functionality | That’s a reminder, oh! That’s a clock symbol. Gotcha. It could be clearer [laughs].
  Patients like me indicator | Not clear that this [icon] is for individual. This [icon] is for group. Up to here [group icon], just add one more figure so that will show more people.
  Star rating | There’s a star over here, on this side, but does it indicate the same thing as the star rating over here? By rating, is that telling me that I’m doing poor, good, with my goals?
  Hover over functionality | No I wouldn’t have known [I could hover over]. Once you clicked, then I realized.
  Goal setting functionality | The end date [for the goal], you’re talking about the last day of your, I don’t get that. The end date [for the goal]. Help me.
Efficiency of use
  Redundancy | I mean those two things [my medical concerns drop down menu] and the message subject [free text] are the same.
Content and terminology
  Historical values | I’d actually like to see what my last three [HbA1c] were.
  Medical jargon | I don’t even know what [microalbumin] is. I’ve never heard of that.
  Diet information | If you could just do something about diet. I don’t see that on there anywhere. I mean, because that’s like a big part of it, like what can I eat, what should I eat.
  Online community | You’re not going to be able to communicate with other patients and talk about the key things they do for support. That might be something you would add.
Table 3. Computer System Usability Questionnaire survey items assessing the dashboard usability: initial versus final prototype.
Item | Initial prototype (n=5), mean (SD) | Final prototype (n=6), mean (SD)
Overall, I am satisfied with how easy it is to use this system. | 5.6 (1.1) | 6.3 (0.8)
It is simple to use this system. | 6.0 (0.8) | 6.3 (0.8)
I feel comfortable using this system. | 5.7 (1.3) | 6.5 (1.3)
It was easy to learn to use this system. | 6.2 (0.8) | 6.5 (0.8)
It is easy to find the information I need. | 5.6 (1.5) | 4.8 (1.2)
The information provided with the system is easy to understand. | 5.4 (1.7) | 5.8 (1.2)
The organization of information on the system screens is clear. | 4.2 (2.2) | 6.5 (0.5)
The interface of this system is pleasant. | 5.4 (1.3) | 6.5 (0.5)
I like using the interface of this system. | 5.4 (1.1) | 6.5 (0.5)
The system has all the functions and capabilities I expect it to have. | 6.0 (0.7) | 6.2 (0.8)
Overall, I am satisfied with this system. | 5.8 (0.4) | 6.7 (0.5)
The system is visually appealing. | 5.8 (1.3) | 6.5 (0.5)

Principal Findings

Our study illustrates the use of design sprint methodology alongside mixed-methods, task-based usability testing in the design of a Web-based intervention for patients with diabetes. By using this design approach, we were able to rapidly create a prototype and rigorously assess task-based usability before any programming. Task-based usability testing and qualitative analysis of interviews with a small number of participants quickly identified usability challenges that led to improvements in successive iterations. Participant feedback informed changes in the data display that led to improved comprehension of diabetes health data. Participants’ usability satisfaction surveys demonstrated a high level of satisfaction with the dashboard that improved from initial to final prototype. The final prototype incorporated recommended strategies to enhance patient activation across the engagement spectrum, from providing educational resources to promoting behavior change through rewards (see Figure 8) [16].

Building Upon Prior Research

Several prior studies have reported the design and usability of patient-facing health apps and Web-based interventions for patients with diabetes [50,53-58]. Approaches to the design of these health apps and Web-based interventions typically employ some variation of user-centered design [56-59]. A significant limitation of prior design approaches is the time and cost they require, a particular concern given the rapidly evolving pace of technology [60,61]. This study is, to our knowledge, the first to report the design of a digital health intervention using design sprint methodology and to demonstrate its utility in efficiently and effectively designing a Web-based intervention that is satisfying to use.

By utilizing design sprint methodology, we were able to create a viable initial prototype within 5 days. Given rapidly evolving technology and patient expectations of health technology [60,62], an efficient yet rigorous design methodology is essential. We were able to enhance the scientific rigor of the design sprint approach by using validated measures of usability [46] and task performance [47-49], as well as an established qualitative methodology to analyze interviews and determine saturation [51]. This approach allows usability concerns to be identified before programming, potentially saving the researcher both time and money. Consistent with the findings of Nielsen, we found that the majority of usability problems were identified in the first 5 usability evaluations, with diminishing returns after the eighth evaluation [40-42]. Although enrolling additional participants in our study may have revealed additional usability concerns, our sample was sufficient to establish a minimally viable product (ie, the final prototype) that allowed us to proceed to program the dashboard with reasonable confidence that most usability issues had been identified and addressed. As with any app or website, ongoing attention to user feedback and iterative improvements are likely to continue indefinitely as technology and users evolve. Although some usability studies employ a large number of participants, this is mostly done to provide a sufficient sample size for quantitative analyses, and additional participants yield relatively few new usability concerns [40-42].

In addition, our usability findings build upon other recent studies of patient-facing diabetes health apps [50,53,59]. Georgsson et al used a similar mixed-methods approach to evaluate the usability of their mHealth system for type 2 diabetes self-management [53]. Like our study, their study included task-based testing with a think-aloud protocol, semistructured interviews, and a questionnaire on patients’ experiences using the system. Consistent with Georgsson et al, we found that a mixed-methods approach resulted in a comprehensive understanding of usability. Our study extends these findings by demonstrating the effectiveness of this approach to objectively assess and track usability in response to iterative revisions of a prototype in the design phase.

Figure 8. Recommended strategies for patient activation and paired dashboard functionality by level of patient engagement. The asterisk refers to the engagement pyramid reported by Singh et al, 2016 [16]. HbA1c: hemoglobin A1c.

Our study also has implications for the design of patient portals and the display of patients’ health data. By giving patients direct access to their health data, patient portals can improve patient engagement [63] and empower patients to actively participate in their care [64]. However, research suggests that patients struggle to understand health data communicated to them via patient portals [65]. A recent study by Giardina et al suggests that current patient portals do not display health data in a patient-centered way, which can lead to misunderstandings and patient distress [66]. In our study, patients had difficulty comprehending HbA1c data in the dial display (Figure 5), and comprehension improved with the ruler display (Figure 7), demonstrating the importance of user-centered design. Although the content was relatively unchanged, we revised the display based on user feedback, resulting in increased comprehension and improved visibility of features aimed at facilitating patients’ understanding of their health data.

Limitations

This study has important limitations. We recruited a convenience sample of patients from a single, large, urban academic medical center, which may limit the generalizability of our findings. Our sample included patients who were more educated and had greater computer and internet access than the overall population of patients with diabetes [67,68]. For future studies, researchers should consider purposive sampling to recruit patients with specific characteristics. Given the known barriers to usability among older patients [15], a strength of our sample was that a majority of participants were over the age of 60 years, which allowed us to assess the dashboard’s usability among this demographic. In addition, although we were able to directly observe individual users as they attempted several assigned tasks using the dashboard, our data are subject to the Hawthorne effect (ie, altered behavior due to an awareness of being observed). Similarly, we did not collect data on how patients would engage with the dashboard on their own. It would be useful to collect actual-use data in future studies, including the level of engagement with specific dashboard functions over time. Although we designed the dashboard with elements aimed at increasing patient activation, this study focused on the design and task-based usability of the dashboard and not on the evaluation of its impact. Further research is needed to test the efficacy of the dashboard on cognitive, behavioral, and clinical outcomes, including patient activation.

Researchers and others considering design sprint methodology should also be aware of some of its limitations. Although a standard design sprint that unfolds over 5 days is generally recommended [17,18], researchers may wish to experiment with shorter or, more likely, longer sprints. Design sprint methodology relies on understanding the user (ie, the consumer and their needs), and in some instances, it may be necessary to spend additional time before the design sprint to understand the target user and their needs and challenges. In our case, a literature review on patients’ experiences with portal use, challenges with diabetes self-management, and the limitations of existing diabetes apps provided insights about our target users. Design sprints also rely heavily on the ideas generated from the solutions sketched by team members on day 2. Therefore, this phase of idea generation should not be shortened and may, in fact, benefit from more time.

Conclusions

In conclusion, the results underscore the value of design sprint methodology for efficiently creating a viable user-centric prototype of a Web-based intervention and the importance of mixed-methods evaluation of usability as part of the design phase, beginning with the initial prototype. Design sprints offer an efficient way to define the problem, assess the needs of users, iteratively generate ideas, and develop a viable product for testing, whereas usability evaluation methods ensure that health apps and Web-based interventions appeal to users and support their use.

Acknowledgments

This work was supported by the National Institute of Diabetes and Digestive and Kidney Diseases/National Institutes of Health (K23DK106511 and 2P30DK092986-07) and the National Center for Advancing Translational Sciences/National Institutes of Health (UL1 TR000445). We are grateful to Ricardo J Trochez, BA, and Kemberlee R Bonnet, MA, for their assistance as coders.

Conflicts of Interest

None declared.

  1. Centers for Disease Control and Prevention. Atlanta, GA: US Department of Health and Human Services; 2017. National Diabetes Statistics Report, 2017   URL: https://www.cdc.gov/diabetes/pdfs/data/statistics/national-diabetes-statistics-report.pdf [WebCite Cache]
  2. Powers MA, Bardsley J, Cypress M, Duker P, Funnell MM, Fischl AH, et al. Diabetes self-management education and support in type 2 diabetes. Diabetes Educ 2017 Dec;43(1):40-53. [CrossRef] [Medline]
  3. Shrivastava SR, Shrivastava PS, Ramasamy J. Role of self-care in management of diabetes mellitus. J Diabetes Metab Disord 2013 Mar 05;12(1):14 [FREE Full text] [CrossRef] [Medline]
  4. Bennett KJ, McDermott S, Mann JR, Hardin J. Receipt of recommended services among patients with selected disabling conditions and diabetes. Disabil Health J 2017 Jan;10(1):58-64. [CrossRef] [Medline]
  5. McBrien KA, Naugler C, Ivers N, Weaver RG, Campbell D, Desveaux L, et al. Barriers to care in patients with diabetes and poor glycemic control: a cross-sectional survey. PLoS One 2017 May 1;12(5):e0176135 [FREE Full text] [CrossRef] [Medline]
  6. Sacks RM, Greene J, Hibbard J, Overton V, Parrotta CD. Does patient activation predict the course of type 2 diabetes? A longitudinal study. Patient Educ Couns 2017 Jul;100(7):1268-1275. [CrossRef] [Medline]
  7. Bolen SD, Chandar A, Falck-Ytter C, Tyler C, Perzynski AT, Gertz AM, et al. Effectiveness and safety of patient activation interventions for adults with type 2 diabetes: systematic review, meta-analysis, and meta-regression. J Gen Intern Med 2014 Aug;29(8):1166-1176 [FREE Full text] [CrossRef] [Medline]
  8. Sullivan SD, Dalal MR, Burke JP. The impact of diabetes counseling and education: clinical and cost outcomes from a large population of US managed care patients with type 2 diabetes. Diabetes Educ 2013;39(4):523-531. [CrossRef] [Medline]
  9. Wu Y, Yao X, Vespasiani G, Nicolucci A, Dong Y, Kwong J, et al. Mobile app-based interventions to support diabetes self-management: a systematic review of randomized controlled trials to identify functions associated with glycemic efficacy. JMIR Mhealth Uhealth 2017 Mar 14;5(3):e35 [FREE Full text] [CrossRef] [Medline]
  10. Cotter AP, Durant N, Agne AA, Cherrington AL. Internet interventions to support lifestyle modification for diabetes management: a systematic review of the evidence. J Diabetes Complications 2014;28(2):243-251 [FREE Full text] [CrossRef] [Medline]
  11. Yu CH, Bahniwal R, Laupacis A, Leung E, Orr MS, Straus SE. Systematic review and evaluation of web-accessible tools for management of diabetes and related cardiovascular risk factors by patients and healthcare providers. J Am Med Inform Assoc 2012;19(4):514-522 [FREE Full text] [CrossRef] [Medline]
  12. Glasgow RE, Christiansen SM, Kurz D, King DK, Woolley T, Faber AJ, et al. Engagement in a diabetes self-management website: usage patterns and generalizability of program use. J Med Internet Res 2011 Jan 25;13(1):e9 [FREE Full text] [CrossRef] [Medline]
  13. Bennett GG, Herring SJ, Puleo E, Stein EK, Emmons KM, Gillman MW. Web-based weight loss in primary care: a randomized controlled trial. Obesity (Silver Spring) 2010 Feb;18(2):308-313 [FREE Full text] [CrossRef] [Medline]
  14. McCurdie T, Taneva S, Casselman M, Yeung M, McDaniel C, Ho W, et al. mHealth consumer apps: the case for user-centered design. Biomed Instrum Technol 2012;Suppl:49-56. [CrossRef] [Medline]
  15. Isaković M, Sedlar U, Volk M, Bešter J. Usability pitfalls of diabetes mHealth apps for the elderly. J Diabetes Res 2016;2016:1604609 [FREE Full text] [CrossRef] [Medline]
  16. Singh K, Drouin K, Newmark LP, Rozenblum R, Lee J, Landman A, et al. Developing a framework for evaluating the patient engagement, quality, and safety of mobile health applications. Issue Brief (Commonw Fund) 2016 Feb;5:1-11. [Medline]
  17. Banfield R, Lombardo CT, Wax T. Design Sprint: A Practical Guidebook for Building Great Digital Products. Sebastopol, CA: O'Reilly Media, Inc; 2015.
  18. Knapp J, Zeratsky J, Kowitz B. Sprint: How to Solve Big Problems and Test New Ideas in Just Five Days. New York, NY: Simon and Schuster; 2016.
  19. Osborn CY, Rosenbloom ST, Stenner SP, Anders S, Muse S, Johnson KB, et al. MyHealthAtVanderbilt: policies and procedures governing patient portal functionality. J Am Med Inform Assoc 2011 Dec;18 Suppl 1:i18-i23 [FREE Full text] [CrossRef] [Medline]
  20. Sweileh WM, Zyoud SH, Abu Nab'a RJ, Deleq MI, Enaia MI, Nassar SM, et al. Influence of patients' disease knowledge and beliefs about medicines on medication adherence: findings from a cross-sectional survey among patients with type 2 diabetes mellitus in Palestine. BMC Public Health 2014 Jan 30;14:94 [FREE Full text] [CrossRef] [Medline]
  21. Amante DJ, Hogan TP, Pagoto SL, English TM. A systematic review of electronic portal usage among patients with diabetes. Diabetes Technol Ther 2014 Nov;16(11):784-793. [CrossRef] [Medline]
  22. Eng DS, Lee JM. The promise and peril of mobile health applications for diabetes and endocrinology. Pediatr Diabetes 2013 Jun;14(4):231-238 [FREE Full text] [CrossRef] [Medline]
  23. Kuo A, Dang S. Secure messaging in electronic health records and its impact on diabetes clinical outcomes: a systematic review. Telemed J E Health 2016 Dec;22(9):769-777. [CrossRef] [Medline]
  24. Osborn CY, Mayberry LS, Mulvaney SA, Hess R. Patient web portals to improve diabetes outcomes: a systematic review. Curr Diab Rep 2010 Dec;10(6):422-435 [FREE Full text] [CrossRef] [Medline]
  25. Payne HE, Lister C, West JH, Bernhardt JM. Behavioral functionality of mobile apps in health interventions: a systematic review of the literature. JMIR Mhealth Uhealth 2015 Feb 26;3(1):e20 [FREE Full text] [CrossRef] [Medline]
  26. Martínez-Pérez B, de la Torre-Díez I, López-Coronado M. Mobile health applications for the most prevalent conditions by the World Health Organization: review and analysis. J Med Internet Res 2013 Jun 14;15(6):e120 [FREE Full text] [CrossRef] [Medline]
  27. Schokker MC, Keers JC, Bouma J, Links TP, Sanderman R, Wolffenbuttel BH, et al. The impact of social comparison information on motivation in patients with diabetes as a function of regulatory focus and self-efficacy. Health Psychol 2010 Jul;29(4):438-445. [CrossRef] [Medline]
  28. El-Gayar O, Timsina P, Nawar N, Eid W. Mobile applications for diabetes self-management: status and potential. J Diabetes Sci Technol 2013 Jan 01;7(1):247-262 [FREE Full text] [CrossRef] [Medline]
  29. Hood M, Wilson R, Corsica J, Bradley L, Chirinos D, Vivo A. What do we know about mobile applications for diabetes self-management? A review of reviews. J Behav Med 2016 Dec;39(6):981-994. [CrossRef] [Medline]
  30. Lyles CR, Sarkar U, Osborn CY. Getting a technology-based diabetes intervention ready for prime time: a review of usability testing studies. Curr Diab Rep 2014 Oct;14(10):534 [FREE Full text] [CrossRef] [Medline]
  31. Arnhold M, Quade M, Kirch W. Mobile applications for diabetics: a systematic review and expert-based usability evaluation considering the special requirements of diabetes patients age 50 years or older. J Med Internet Res 2014 Apr 09;16(4):e104 [FREE Full text] [CrossRef] [Medline]
  32. Sarkar U, Karter AJ, Liu JY, Adler NE, Nguyen R, Lopez A, et al. The literacy divide: health literacy and the use of an internet-based patient portal in an integrated health system-results from the diabetes study of northern California (DISTANCE). J Health Commun 2010;15 Suppl 2:183-196 [FREE Full text] [CrossRef] [Medline]
  33. Hibbard JH, Stockard J, Mahoney ER, Tusler M. Development of the Patient Activation Measure (PAM): conceptualizing and measuring activation in patients and consumers. Health Serv Res 2004 Aug;39(4 Pt 1):1005-1026 [FREE Full text] [CrossRef] [Medline]
  34. Martinez W, Wallston KA, Schlundt DG, Hickson GB, Bonnet KR, Trochez RJ, et al. Patients' perspectives on social and goal-based comparisons regarding their diabetes health status. BMJ Open Diabetes Res Care 2018;6(1):e000488 [FREE Full text] [CrossRef] [Medline]
  35. Siddiqui H. Keynote Animation – How To Prototype UI. Freiburg, Germany: Smashing Magazine; 2015.   URL: https://www.smashingmagazine.com/2015/08/animating-in-keynote/ [WebCite Cache]
  36. Elder NC, Barney K. “But what does it mean for me?” Primary care patients' communication preferences for test results notification. Jt Comm J Qual Patient Saf 2012 Apr;38(4):168-176. [Medline]
  37. Torsvik T, Lillebo B, Mikkelsen G. Presentation of clinical laboratory results: an experimental comparison of four visualization techniques. J Am Med Inform Assoc 2013;20(2):325-331 [FREE Full text] [CrossRef] [Medline]
  38. Zikmund-Fisher BJ, Scherer AM, Witteman HO, Solomon JB, Exe NL, Tarini BA, et al. Graphics help patients distinguish between urgent and non-urgent deviations in laboratory test results. J Am Med Inform Assoc 2017 May 1;24(3):520-528. [Medline]
  39. National Institutes of Health. 2007. Guidelines for the Diagnosis and Management of Asthma   URL: https://www.nhlbi.nih.gov/files/docs/guidelines/asthsumm.pdf [WebCite Cache]
  40. Nielsen J, Landauer TK. A mathematical model of the finding of usability problems. 1993 Presented at: INTERACT'93 and CHI'93 Conference on Human Factors in Computing Systems; April 24-29, 1993; Amsterdam, The Netherlands p. 206-213.
  41. Nielsen J. Usability inspection methods. 1994 Presented at: CHI 94 ACM Conference Companion on Human Factors in Computing Systems; April 24-28, 1994; Boston, MA p. 413-414. [CrossRef]
  42. Nielsen J. Nielsen Norman Group. 2000. Why You Only Need to Test with 5 Users   URL: https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/ [WebCite Cache]
  43. Sarkar U, Schillinger D, López A, Sudore R. Validation of self-reported health literacy questions among diverse English and Spanish-speaking populations. J Gen Intern Med 2011 Mar;26(3):265-271 [FREE Full text] [CrossRef] [Medline]
  44. Fagerlin A, Zikmund-Fisher BJ, Ubel PA, Jankovic A, Derry HA, Smith DM. Measuring numeracy without a math test: development of the Subjective Numeracy Scale. Med Decis Making 2007;27(5):672-680. [Medline]
  45. Jaspers MW, Steen T, van den Bos C, Geenen M. The think aloud method: a guide to user interface design. Int J Med Inform 2004 Nov;73(11-12):781-795. [CrossRef] [Medline]
  46. Lewis JR. IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use. Int J Hum-Comput Interact 1995 Jan;7(1):57-78. [CrossRef]
  47. Sarkar U, Gourley GI, Lyles CR, Tieu L, Clarity C, Newmark L, et al. Usability of commercially available mobile applications for diverse patients. J Gen Intern Med 2016 Dec;31(12):1417-1426 [FREE Full text] [CrossRef] [Medline]
  48. Taha J, Sharit J, Czaja SJ. The impact of numeracy ability and technology skills on older adults' performance of health management tasks using a patient portal. J Appl Gerontol 2014 Jun;33(4):416-436 [FREE Full text] [CrossRef] [Medline]
  49. Segall N, Saville JG, L'Engle P, Carlson B, Wright MC, Schulman K, et al. Usability evaluation of a personal health record. AMIA Annu Symp Proc 2011;2011:1233-1242 [FREE Full text] [Medline]
  50. Nelson LA, Bethune MC, Lagotte AE, Osborn CY. The usability of diabetes MAP: a web-delivered intervention for improving medication adherence. JMIR Hum Factors 2016 May 12;3(1):e13 [FREE Full text] [CrossRef] [Medline]
  51. Strauss A, Corbin J. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. Newbury Park, CA: Sage Publications; 1990.
  52. US Department of Health Human Services. Usability.gov. 2014. Usability evaluation basics   URL: https://www.usability.gov/what-and-why/usability-evaluation.html [WebCite Cache]
  53. Georgsson M, Staggers N. An evaluation of patients' experienced usability of a diabetes mHealth system using a multi-method approach. J Biomed Inform 2016 Feb;59:115-129 [FREE Full text] [CrossRef] [Medline]
  54. Georgsson M, Staggers N. Quantifying usability: an evaluation of a diabetes mHealth system on effectiveness, efficiency, and satisfaction metrics with associated user characteristics. J Am Med Inform Assoc 2016;23(1):5-11 [FREE Full text] [CrossRef] [Medline]
  55. Fu H, McMahon SK, Gross CR, Adam TJ, Wyman JF. Usability and clinical efficacy of diabetes mobile applications for adults with type 2 diabetes: a systematic review. Diabetes Res Clin Pract 2017 Sep;131:70-81. [CrossRef] [Medline]
  56. Grant RW, Wald JS, Poon EG, Schnipper JL, Gandhi TK, Volk LA, et al. Design and implementation of a web-based patient portal linked to an ambulatory care electronic health record: patient gateway for diabetes collaborative care. Diabetes Technol Ther 2006 Oct;8(5):576-586 [FREE Full text] [CrossRef] [Medline]
  57. Osborn CY, Mulvaney SA. Development and feasibility of a text messaging and interactive voice response intervention for low-income, diverse adults with type 2 diabetes mellitus. J Diabetes Sci Technol 2013 May 01;7(3):612-622 [FREE Full text] [CrossRef] [Medline]
  58. Yu CH, Parsons JA, Hall S, Newton D, Jovicic A, Lottridge D, et al. User-centered design of a web-based self-management site for individuals with type 2 diabetes-providing a sense of control and community. BMC Med Inform Decis Mak 2014 Jul 23;14:60 [FREE Full text] [CrossRef] [Medline]
  59. Alanzi T, Istepanian R, Philip N. Design and usability evaluation of social mobile diabetes management system in the gulf region. JMIR Res Protoc 2016 Sep 26;5(3):e93 [FREE Full text] [CrossRef] [Medline]
  60. Patrick K, Hekler EB, Estrin D, Mohr DC, Riper H, Crane D, et al. The pace of technologic change: implications for digital health behavior intervention research. Am J Prev Med 2016 Nov;51(5):816-824. [CrossRef] [Medline]
  61. Da Silva TS, Martin A, Maurer F, Silveira M. User-centered design and agile methods: a systematic review. 2011 Presented at: Agile Conference (AGILE); August 7-13, 2011; Salt Lake City, UT p. 77-86. [CrossRef]
  62. Lithgow K, Edwards A, Rabi D. Smartphone app use for diabetes management: evaluating patient perspectives. JMIR Diabetes 2017 Jan 23;2(1):e2. [CrossRef]
  63. Tulu B, Trudel J, Strong DM, Johnson SA, Sundaresan D, Garber L. Patient portals: an underused resource for improving patient engagement. Chest 2016 Jan;149(1):272-277. [CrossRef] [Medline]
  64. Archer N, Fevrier-Thomas U, Lokker C, McKibbon KA, Straus SE. Personal health records: a scoping review. J Am Med Inform Assoc 2011;18(4):515-522 [FREE Full text] [CrossRef] [Medline]
  65. Giardina TD, Modi V, Parrish DE, Singh H. The patient portal and abnormal test results: an exploratory study of patient experiences. Patient Exp J 2015;2(1):148-154 [FREE Full text] [Medline]
  66. Giardina TD, Baldwin J, Nystrom DT, Sittig DF, Singh H. Patient perceptions of receiving test results via online portals: a mixed-methods study. J Am Med Inform Assoc 2018 Apr 01;25(4):440-446 [FREE Full text] [CrossRef] [Medline]
  67. Geiss LS, Wang J, Cheng YJ, Thompson TJ, Barker L, Li Y, et al. Prevalence and incidence trends for diagnosed diabetes among adults aged 20 to 79 years, United States, 1980-2012. J Am Med Assoc 2014 Sep 24;312(12):1218-1226. [CrossRef] [Medline]
  68. Lyles CR, Harris LT, Jordan L, Grothaus L, Wehnes L, Reid RJ, et al. Patient race/ethnicity and shared medical record use among diabetes patients. Med Care 2012 May;50(5):434-440. [CrossRef] [Medline]


CSUQ: Computer System Usability Questionnaire
HbA1c: hemoglobin A1c
HMW: how might we
MHAV: My Health at Vanderbilt
UX: user experience
VAPC: Vanderbilt Adult Primary Care
VUMC: Vanderbilt University Medical Center


Edited by G Eysenbach; submitted 04.12.17; peer-reviewed by GL Dimaguila, J Waycott; comments to author 17.03.18; revised version received 17.07.18; accepted 17.07.18; published 24.09.18

Copyright

©William Martinez, Anthony L Threatt, S Trent Rosenbloom, Kenneth A Wallston, Gerald B Hickson, Tom A Elasy. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 24.09.2018.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Human Factors, is properly cited. The complete bibliographic information, a link to the original publication on http://humanfactors.jmir.org, as well as this copyright and license information must be included.