This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Human Factors, is properly cited. The complete bibliographic information, a link to the original publication on http://humanfactors.jmir.org, as well as this copyright and license information must be included.
Health apps and Web-based interventions designed for patients with diabetes offer novel and scalable approaches to engage patients and improve outcomes. However, careful attention to the design and usability of these apps and Web-based interventions is essential to reduce the barriers to engagement and maximize use.
The aim of this study was to apply design sprint methodology paired with mixed-methods, task-based usability testing to design and evaluate an innovative, patient-facing diabetes dashboard embedded in an existing patient portal and integrated into an electronic health record.
We applied a 5-day design sprint methodology developed by Google Ventures (Alphabet Inc, Mountain View, CA) to create our initial dashboard prototype. We identified recommended strategies from the literature for using patient-facing technologies to enhance patient activation and designed dashboard functionality to match each strategy. We then conducted a mixed-methods, task-based usability assessment of dashboard prototypes with individual patients. Measures included validated metrics of task performance on 5 common and standardized tasks, semistructured interviews, and a validated usability satisfaction questionnaire. After each round of usability testing, we revised the dashboard prototype in response to usability findings before the next round of testing, until the majority of participants successfully completed tasks, expressed high satisfaction, and identified no new usability concerns (ie, the stop criterion was met).
The sample (N=14) comprised 5 patients in round 1, 3 patients in round 2, and 6 patients in round 3, at which point we reached our stop criterion. The participants’ mean age was 63 years (range 45-78 years), 57% (8/14) were female, and 50% (7/14) were white. Our design sprint yielded an initial patient-facing diabetes dashboard prototype that displayed and summarized 5 measures of patients’ diabetes health status (eg, hemoglobin A1c). The dashboard used graphics to visualize and summarize health data and reinforce understanding, incorporated motivational strategies (eg, social comparisons and gamification), and provided educational resources and secure-messaging capability. More than 80% of participants successfully completed all 5 tasks using the final prototype. Interviews revealed usability concerns with the design, efficiency of use, and content and terminology, which led to improvements. Overall satisfaction (1=worst and 7=best) improved from the initial to the final prototype (mean 5.8, SD 0.4 vs mean 6.7, SD 0.5).
Our results demonstrate the utility of the design sprint methodology paired with mixed-methods, task-based usability testing to efficiently and effectively design a patient-facing, Web-based diabetes dashboard that is satisfying for patients to use.
Diabetes is a leading cause of kidney failure, heart disease, stroke, visual impairment, and nontraumatic lower limb amputations [
Interventions aimed at enhancing patients’ motivation, skills, knowledge, and confidence in diabetes self-care have had limited success, with many relying on face-to-face interactions that are costly and challenging to scale [
Human-centered design is an approach to software development that emphasizes optimal user experience by integrating users directly into the design process and helps ensure the creation of a suitable user interface [
This paper describes the application of design sprint methodology paired with mixed-methods, task-based usability testing to design and evaluate an innovative, patient-facing diabetes dashboard embedded in an existing patient portal, My Health at Vanderbilt (MHAV) [
We utilized a 5-day design sprint methodology [
On day 1, we began by mapping out our challenge (
Whiteboard image mapping out the challenge to create a patient-facing diabetes dashboard.
We also met one-on-one with expert stakeholders (eg, patient portal users with diabetes, diabetes educators, behavioral scientists, physicians, educators, and nurses) to ask questions aimed at enhancing our understanding of the challenge and refining our map. We identified expert stakeholders by approaching organizational leaders with a description of the project and asking them to identify individuals in their area who could provide valuable input. For example, we approached the director of the Vanderbilt University Hospital Patient and Family Advisory Council, who connected us with Council members who had diabetes, were current patient portal users, and had expressed interest in improving care for people with diabetes. Experts’ comments were recorded in the form of “how might we” (HMW) notes and organized into categories.
On day 2, we reviewed existing ideas, architecture, and designs related to the challenge from health care and other industries to establish the building blocks of our prototype. For example, we reviewed existing solutions for displaying health and performance data and other types of quantitative, longitudinal, and benchmarked data from industries such as finance and education. We then used findings from this review and from the meetings with expert stakeholders to sketch our own solutions (
On day 3, we critiqued the solutions and decided by consensus which ones had the greatest potential to meet the challenge in the long term. We then adapted the chosen solutions to create a storyboard, a step-by-step plan for the prototype (
On day 4, we developed the prototype using Apple Keynote (Apple Inc, Cupertino, CA) [
From September to October 2016, we conducted a mixed-methods, task-based usability study of dashboard prototypes with individual patients under controlled conditions. Patients were recruited from the Vanderbilt Adult Primary Care (VAPC) clinic. Individual usability sessions lasted between 30 and 75 min. Given that the majority of usability problems are commonly identified within the first 5 usability evaluations [
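As a back-of-the-envelope check on this rule of thumb (not part of the study's own analysis), Nielsen and Landauer's problem-discovery model estimates the proportion of usability problems uncovered by n evaluators; with their commonly cited single-user discovery rate of about 31%, 5 users surface roughly 84% of problems:

```python
# Problem-discovery model: P(n) = 1 - (1 - L)**n, where L is the
# probability that one user exposes a given problem. L = 0.31 is
# Nielsen and Landauer's pooled estimate, used here as an assumption
# rather than a value measured in this study.
def problems_found(n_users: int, discovery_rate: float = 0.31) -> float:
    return 1 - (1 - discovery_rate) ** n_users

print(f"{problems_found(5):.0%}")  # -> 84%
```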
Design sprint day 1—expert comments/ideas organized into categories.
Design sprint day 2—solution sketches.
Design sprint day 3—dashboard storyboard.
Design sprint day 4—screenshot of initial dashboard prototype. A1c: hemoglobin A1c.
The VAPC clinic is located within the Vanderbilt University Medical Center (VUMC) in Nashville, TN. The clinic cares for approximately 25,000 unique patients annually, of whom approximately 4500 (18%) have diabetes. All clinical data are entered into an electronic health record, and patients are provided access to their clinical data via a Web portal.
Participants were eligible for the study if they had type 2 diabetes mellitus, were English-speaking, were aged 21 years or older, and were current users of the VUMC patient Web portal, MHAV. Potential participants were identified automatically using VUMC’s Subject Locator to query the electronic health records of patients with upcoming clinic appointments for discrete inclusion and exclusion criteria. Identified patients (n=334) were mailed a letter describing the study and asked to contact the investigators if they were interested in participating. Interested patients (n=22) contacted the research coordinator to learn more about the study and confirm eligibility. Of the 17 patients who agreed to participate and were scheduled for a usability session on the day of their clinic appointment, 3 canceled because of weather or a conflicting appointment. A total of 14 patients completed a usability session, each providing written informed consent beforehand. The Vanderbilt University Institutional Review Board approved this research.
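For readers interested in how such automated prescreening can work, the following is a minimal sketch of an eligibility filter in Python; the DataFrame and column names are hypothetical stand-ins for illustration, not the Subject Locator's actual interface.

```python
import pandas as pd

def screen_eligible(upcoming_appointments: pd.DataFrame) -> pd.DataFrame:
    """Return patients meeting the study's discrete inclusion criteria."""
    df = upcoming_appointments
    return df[
        (df["diagnosis"] == "type 2 diabetes mellitus")  # hypothetical coding
        & (df["age_years"] >= 21)
        & (df["english_speaking"])
        & (df["active_portal_user"])  # current MHAV users only
    ]
```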
Before their usability testing session, enrolled patients completed a short questionnaire that included basic demographic items, questions about computer and smartphone usage and internet access, and validated measures of health literacy [
Each participant received a standardized introduction to the dashboard and the
To assess and quantify participant satisfaction with the dashboard, at the conclusion of their usability session, participants completed 12 items from the Computer System Usability Questionnaire (CSUQ), which assess participants’ perceptions of the dashboard’s ease of use, likability of the interface, and overall satisfaction using a 7-point Likert response scale (1=strongly disagree to 7=strongly agree), with 7 indicating the highest possible satisfaction [
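As an illustration only, scoring such a questionnaire reduces to per-item means and SDs over the 1 to 7 ratings; the sketch below uses made-up ratings, not study data.

```python
import statistics

# Placeholder ratings (1=strongly disagree to 7=strongly agree) for two
# of the 12 CSUQ items; these values are illustrative, not study data.
responses = {
    "It is simple to use this system.": [6, 7, 5, 7, 6, 7],
    "Overall, I am satisfied with this system.": [7, 6, 7, 7, 6, 7],
}

for item, ratings in responses.items():
    print(f"{item} mean {statistics.mean(ratings):.1f} "
          f"(SD {statistics.stdev(ratings):.1f})")
```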
Task completion was coded with a usability rating scale utilized in prior studies [
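Below is a minimal sketch of how coded task outcomes can be tallied into completion rates; the three outcome levels are an assumption modeled on the distinctions described in the Results (success, success with difficulty, failure), not the exact published scale.

```python
from enum import Enum

class TaskOutcome(Enum):
    SUCCESS = "completed with ease"
    SUCCESS_WITH_DIFFICULTY = "completed with difficulty or prompting"
    FAILURE = "unable to complete"

def completion_rate(outcomes: list[TaskOutcome]) -> float:
    """Share of participants completing the task, with or without difficulty."""
    completed = sum(o is not TaskOutcome.FAILURE for o in outcomes)
    return completed / len(outcomes)
```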
Audio files of interviews were submitted to a professional transcription service, Rev.com Inc (San Francisco, CA). Transcripts were checked for accuracy and identifying information was removed. Deidentified transcripts were imported into NVivo 10 (version 10; QSR International, Burlington, VT) for coding and analysis. Similar to other health app usability studies [
Descriptive statistics were used to characterize the study participants, task completion, and survey data. All analyses were completed with SAS version 9.4 (SAS Institute, Inc, Cary, NC).
Data analysis began after the initial round of testing, and we used the findings to inform prototype revisions before the subsequent round of testing. We conducted additional rounds of testing until the majority of participants within a round (1) were able to successfully complete all tasks, (2) indicated high overall satisfaction with the dashboard as assessed by the overall satisfaction item on the CSUQ (score≥6), and (3) expressed no new usability concerns during the interview (ie, saturation).
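Expressed as code, this stop criterion amounts to three checks applied to each round's data; a sketch with illustrative input names follows.

```python
def stop_criterion_met(completed_all_tasks: list[bool],
                       overall_satisfaction: list[int],
                       new_usability_concerns: int) -> bool:
    """True when a round meets all three stopping conditions."""
    n = len(completed_all_tasks)
    majority_completed = sum(completed_all_tasks) / n > 0.5
    majority_satisfied = sum(s >= 6 for s in overall_satisfaction) / n > 0.5
    saturation_reached = new_usability_concerns == 0  # no new concerns raised
    return majority_completed and majority_satisfied and saturation_reached
```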
All participants in both the initial and final rounds of testing logged in to the dashboard and set a goal without difficulty.
Only one participant in the initial round of testing was able to identify their most recent HbA1c value from the dashboard. Most participants had difficulty interpreting the dial display, were confused about which icon on the dial indicated their most current value, and could not comprehend the HbA1c data. In response, we revised the data display design and status indicator icons. We relocated the features aimed at facilitating patients’ understanding of their health data, including a hover-over info icon providing a nontechnical description of the measure (eg, HbA1c) and links to literacy level–sensitive educational materials, so they were adjacent to the data (see
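To make the status-indicator logic concrete, here is a hedged sketch of mapping an HbA1c value onto dial zones; the cut points are common clinical reference values chosen for illustration, not the dashboard's actual configuration, and individualized targets vary.

```python
def hba1c_zone(value_pct: float, goal_pct: float = 7.0) -> str:
    """Map an HbA1c value to an illustrative dial zone.

    The 7.0% goal and 1-point buffer are common reference points used
    only for illustration; individualized clinical targets differ.
    """
    if value_pct < goal_pct:
        return "at goal"
    if value_pct < goal_pct + 1.0:
        return "near goal"
    return "above goal"
```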
All 5 participants in the initial round were able to message their doctor’s office; however, 2 hesitated or showed some confusion despite completing the task. Participants indicated that they were accustomed to using the existing messaging icon within the header of the patient portal, and some struggled to locate the messaging icon within the dashboard. After we revised the icon in response to this feedback (ie, larger text, added color, and a button icon), the majority of participants in the final round completed the task successfully. However, 3 participants still initially attempted to send a message via the existing header icon, one of whom completed the task only after being directed to the correct button.
Participant characteristics.
Characteristic | Total (N=14) | Round 1 (n=5) | Round 2 (n=3) | Round 3 (n=6)
Age (years), mean (SD) | 63.4 (11.0) | 62.2 (10.3) | 75.7 (3.2) | 58.2 (9.9)
Age group (years), n (%)
  40-49 | 1 (7) | 0 (0) | 0 (0) | 1 (17)
  50-59 | 4 (29) | 2 (40) | 0 (0) | 2 (33)
  60-69 | 4 (29) | 2 (40) | 0 (0) | 2 (33)
  70-79 | 5 (36) | 1 (20) | 3 (100) | 1 (17)
Sex, n (%)
  Female | 8 (57) | 3 (60) | 0 (0) | 5 (83)
  Male | 6 (43) | 2 (40) | 3 (100) | 1 (17)
Race, n (%)
  White | 7 (50) | 3 (60) | 1 (33) | 3 (50)
  African American | 3 (21) | 1 (20) | 1 (33) | 1 (17)
  Asian | 2 (14) | 1 (20) | 1 (33) | 0 (0)
  Other | 2 (14) | 0 (0) | 0 (0) | 2 (33)
Education, n (%)
  High school degree or graduate equivalency degree | 1 (7) | 1 (20) | 0 (0) | 0 (0)
  Some college | 3 (21) | 1 (20) | 0 (0) | 2 (33)
  College degree | 5 (36) | 1 (20) | 2 (67) | 2 (33)
  Postgraduate degree | 5 (36) | 2 (40) | 1 (33) | 2 (33)
Health literacy, mean (range^a) | 13.4 (11-15) | 13.2 (12-15) | 12.7 (11-15) | 14.0 (13-15)
Numeracy, mean (range^b) | 15.0 (7-18) | 13.0 (7-18) | 17.0 (16-18) | 15.7 (10-18)
Home computer user^c, n (%) | 14 (100) | 5 (100) | 3 (100) | 6 (100)
Smartphone user, n (%) | 9 (64) | 3 (60) | 2 (67) | 4 (67)
Home internet access, n (%) | 14 (100) | 5 (100) | 3 (100) | 6 (100)
Comorbid conditions, n (%)
  Hyperlipidemia | 10 (71) | 3 (60) | 3 (100) | 4 (67)
  Atherosclerotic cardiovascular disease | 3 (21) | 0 (0) | 1 (33) | 2 (33)
  Hypertension | 7 (50) | 2 (40) | 3 (100) | 2 (33)
  Chronic kidney disease | 3 (21) | 1 (20) | 1 (33) | 1 (17)
^a Possible score range: 3 (worst) to 15 (best).
^b Possible score range: 3 (worst) to 18 (best).
^c Includes desktops, laptops, or tablets.
Task-based usability ratings for initial and final prototype iterations. The asterisk indicates that one participant within the final round of testing was not asked to complete the task due to time constraints. HbA1c: hemoglobin A1c.
Screenshot of final dashboard prototype. A1c: hemoglobin A1c.
Only 2 participants in round 1 were able to set a reminder on the dashboard. Participants struggled to set the frequency of recurrence and a stop date for reminders they wished to receive only for a specified time. In response, we revised the layout of the “set reminder” pop-up window to include clear start and stop dates and times, as well as a drop-down menu to set the recurrence (eg, daily or weekly). After these revisions, 4 of 6 participants in round 3 were able to set a reminder, and one additional participant completed the task successfully with prolonged effort.
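The following is a minimal sketch of the revised reminder model, assuming explicit start and stop dates plus a drop-down recurrence; field names are illustrative, not the dashboard's actual implementation.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Reminder:
    label: str       # eg, "Check blood sugar" (hypothetical)
    start: date      # clear start date from the revised pop-up window
    stop: date       # clear stop date for time-limited reminders
    every_days: int  # recurrence from the drop-down: 1=daily, 7=weekly

    def occurrences(self) -> list[date]:
        """All dates on which the reminder fires, inclusive of start and stop."""
        assert self.every_days > 0, "recurrence interval must be positive"
        dates, current = [], self.start
        while current <= self.stop:
            dates.append(current)
            current += timedelta(days=self.every_days)
        return dates
```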
Participants’ concerns with dashboard usability.
Usability element and unique concern type | Illustrative quote
Font size |
Patient status indicator |
Reminder functionality |
Patients like me indicator |
Star rating |
Hover-over functionality |
Goal-setting functionality |
Redundancy |
Historical values |
Medical jargon |
Diet information |
Online community |
Computer System Usability Questionnaire survey items assessing dashboard usability: initial versus final prototype.
Item | Initial prototype (n=5), mean (SD) | Final prototype (n=6), mean (SD) |
Overall, I am satisfied with how easy it is to use this system. | 5.6 (1.1) | 6.3 (0.8) |
It is simple to use this system. | 6.0 (0.8) | 6.3 (0.8) |
I feel comfortable using this system. | 5.7 (1.3) | 6.5 (1.3) |
It was easy to learn to use this system. | 6.2 (0.8) | 6.5 (0.8) |
It is easy to find the information I need. | 5.6 (1.5) | 4.8 (1.2) |
The information provided with the system is easy to understand. | 5.4 (1.7) | 5.8 (1.2) |
The organization of information on the system screens is clear. | 4.2 (2.2) | 6.5 (0.5) |
The interface of this system is pleasant. | 5.4 (1.3) | 6.5 (0.5) |
I like using the interface of this system. | 5.4 (1.1) | 6.5 (0.5) |
The system has all the functions and capabilities I expect it to have. | 6.0 (0.7) | 6.2 (0.8) |
Overall, I am satisfied with this system. | 5.8 (0.4) | 6.7 (0.5) |
The system is visually appealing. | 5.8 (1.3) | 6.5 (0.5) |
Our study illustrates the use of design sprint methodology alongside mixed-methods, task-based usability testing in the design of a Web-based intervention for patients with diabetes. By using this design approach, we were able to rapidly create a prototype and rigorously assess task-based usability before any programming. Task-based usability testing and qualitative analysis of interviews with a small number of participants quickly identified usability challenges that led to improvements in successive iterations. Participant feedback informed changes in the data display that led to improved comprehension of diabetes health data. Participants’ usability satisfaction surveys demonstrated a high level of satisfaction with the dashboard that improved from initial to final prototype. The final prototype incorporated recommended strategies to enhance patient activation across the engagement spectrum, from providing educational resources to promoting behavior change through rewards (see
Several prior studies have reported the design and usability of patient-facing health apps and Web-based interventions for patients with diabetes [
By utilizing design sprint methodology, we were able to create a viable initial prototype within 5 days. Given the rapidly evolving technology and patient expectations of health technology [
Recommended strategies for patient activation and paired dashboard functionality by level of patient engagement. The asterisk refers to the engagement pyramid reported by Singh et al, 2016 [
Our study also has implications for the design of patient portals and the display of patients’ health data. By giving patients direct access to their health data, patient portals can improve patient engagement [
This study has important limitations. We recruited a convenience sample of patients from a single, large, urban academic medical center, which may limit the generalizability of our findings. Our sample included patients who were more educated and had greater computer and internet access than the overall population of patients with diabetes [
Researchers and others considering using design sprint methodology should also consider some of the limitations of the approach. Although a standard design sprint that unfolds over 5 days is generally recommended [
In conclusion, the results underscore the value of design sprint methodology for efficiently creating a viable, user-centric prototype of a Web-based intervention, and the importance of mixed-methods usability evaluation as part of the design phase, beginning with the initial prototype. Design sprints offer an efficient way to define the problem, assess the needs of users, iteratively generate ideas, and develop a viable product for testing, whereas usability evaluation methods ensure that health apps and Web-based interventions appeal to users and support their use.
CSUQ: Computer System Usability Questionnaire
HbA1c: hemoglobin A1c
HMW: how might we
MHAV: My Health at Vanderbilt
UX: user experience
VAPC: Vanderbilt Adult Primary Care
VUMC: Vanderbilt University Medical Center
This work was supported by the National Institute of Diabetes and Digestive and Kidney Diseases / National Institutes of Health (K23DK106511 and 2P30DK092986-07) and the National Center for Advancing Translational Sciences / National Institutes of Health (UL1 TR000445). We are grateful to Ricardo J Trochez, BA, and Kemberlee R Bonnet, MA, for their assistance as coders.
None declared.