JMIR Publications

JMIR Human Factors


Published on 24.04.17 in Vol 4, No 2 (2017): Apr-Jun


    Original Paper

    Toward a More Usable Home-Based Video Telemedicine System: A Heuristic Evaluation of the Clinician User Interfaces of Home-Based Video Telemedicine Systems

    1Department of Civil Engineering, Clemson University, Clemson, SC, United States

    2Department of Industrial Engineering, Clemson University, Clemson, SC, United States

    3MUSC Center for Telehealth, Department of Public Health Sciences, Medical University of South Carolina, Charleston, SC, United States

    4School of Dental Medicine, Southern Illinois University at Edwardsville, Edwardsville, IL, United States

    *these authors contributed equally

    Corresponding Author:

    Kapil Chalil Madathil, PhD

    Department of Civil Engineering

    Clemson University

    116 Lowry Hall

    Clemson, SC, 29634

    United States

    Phone: 1 7132946499

    Fax: 1 8646560856

    Email:


    ABSTRACT

    Background: Telemedicine is the use of technology to provide and support health care when distance separates the clinical service and the patient. Home-based telemedicine systems use such technology to connect patients with clinicians for medical support and care from the comfort of their homes. For such systems to be used extensively, it is necessary to understand not only the issues faced by patients in using them but also those faced by clinicians.

    Objectives: The aim of this study was to conduct a heuristic evaluation of 4 telemedicine software platforms—Doxy.me, Polycom, Vidyo, and VSee—to assess possible problems and limitations that could affect the usability of the system from the clinician’s perspective.

    Methods: Five experts individually evaluated all 4 systems using Nielsen’s list of heuristics, classifying the issues identified based on a severity rating scale.

    Results: A total of 46 unique problems were identified by the experts. The heuristics most frequently violated were visibility of system status and error prevention, each accounting for 24% (11/46) of the issues. Esthetic and minimalist design was second, contributing 13% (6/46) of the total errors.

    Conclusions: Heuristic evaluation coupled with a severity rating scale was found to be an effective method for identifying problems with the systems. Prioritization of these problems based on the rating provides a good starting point for resolving the issues affecting these platforms. There is a need for better transparency and a more streamlined approach to how physicians use telemedicine systems. Visibility of the system status and speaking the users’ language are key to achieving this.

    JMIR Hum Factors 2017;4(2):e11

    doi:10.2196/humanfactors.7293


    Introduction

    Health Care System

    The health care system in the United States is currently undergoing extensive changes. Its delivery system faces several challenges: increased demand for health care as a growing patient population and changing lifestyles lead to a rise in chronic diseases; the demand for increased accessibility of care outside hospitals, moving health services into patients’ own homes; the need for increased efficiency, individualization, and equity of quality-oriented health care with limited financial resources; and the difficulties of recruiting and retaining personnel in health care services in general, and in home and elderly care in particular [1,2]. Telehealth, the use of electronic information and telecommunication technologies to support long-distance clinical health care, patient and professional health-related education, public health, and health administration [1,3], has the potential to address these issues. One subsection, telemedicine, the use of technology to provide and support health care when distance separates the clinical service and the patient, appears particularly attractive [4].

    Although telemedicine plays an important role in addressing the health issues of patients living in underserved and rural areas, it is now attracting attention beyond these limited regions. It offers mechanisms for centralizing specialists, reducing costs for specialty care, and supporting primary care clinicians in the urban and suburban areas they typically serve [5-9]. The possible benefits of using these systems have resulted in increased interest in telemedicine. For example, these systems help patients with chronic illness and those with limited mobility connect with a health care facility from the comfort of their homes [1], reducing the stress on patients who would otherwise have to travel long distances for their appointments [10]. Currently, this remote care is used extensively for clinical visits that do not require physical presence, such as behavioral health [11,12], counseling [13-15], follow-up [8,16], and patient education [17], with studies finding telemedicine an appealing solution for the real-time remote monitoring of patients. It has also shown promise for improving patients’ knowledge of health care, helping them better manage their diseases or illnesses. With the expanding technical capabilities and decreasing costs of telemedicine software solutions, home-based telemedicine is becoming more widely used, as evidenced by a recent workshop conducted by the National Academies that discussed the potential of scaling such delivery of care for a growing number of patients [2,18].

    Telemedicine System

    Unlike face-to-face encounters, in which clinicians and patients are both located in the same setting, telemedicine participants usually use teleconferencing systems at their respective locations. Thus, for a telemedicine system to become widely accepted, it should be not only functional but also user-friendly [19-23]. The Institute of Medicine (IOM) recently emphasized the role of usability in telemedicine systems, given their potential to replace regular clinical visits, which are both time-consuming and resource-demanding [2,8,23].

    However, limited research has been conducted evaluating the perceived usefulness and usability of such tools from a home-based video telemedicine system perspective [24,25]. The evaluation of a user interface can be carried out in 4 ways: formally, using analytical tools; automatically, using computer technology; empirically, that is, by testing with users; and heuristically [25-27]. Heuristic evaluation is a usability inspection process wherein evaluators are provided with an interface and asked to comment on it based on a set of heuristics [27,28]. The efficiency of this method of evaluation allows for an iterative design process for user interfaces [18]. Studies have found that this type of evaluation can reveal both major and minor usability issues, including problems that lead to errors and user dissatisfaction [29-31]. Furthermore, it has been used extensively to ascertain usability issues in the medical field, ranging from websites and medical devices [25,32-34] to health information technology applications [35-40]. In light of these advantages, this study aimed to understand the issues with the clinician interfaces of 4 telemedicine platforms. The issues uncovered through this heuristic evaluation could serve as a basis for improving the clinician interface in telemedicine platforms.


    Methods

    Telemedicine Systems

    The criteria for a telemedicine system to be included in this heuristic evaluation were as follows: (1) the system is primarily used to deliver video-based telemedicine at home; (2) the system does not require specialized or proprietary equipment for home use; (3) the system runs on an Internet-connected computer with audio and video capabilities; and (4) the system is Health Insurance Portability and Accountability Act (HIPAA) compliant, protecting the health rights and privacy of patients [41]. Initially, the telemedicine systems used by the medical staff at the Medical University of South Carolina (MUSC) Center for Telehealth and South Carolina (SC) Telehealth Alliance were reviewed. Subsequently, 8 software applications—Adobe Connect, Cisco WebEx, Cisco Jabber, Doxy.me, Polycom, Skype, Vidyo, and VSee—were identified as potential candidates based on this preliminary review. Next, a detailed analysis of the features and primary use of each was conducted. A total of 5 key stakeholders, including physicians and directors associated with the MUSC Center for Telehealth and SC Telehealth Alliance, were consulted. It was understood that Adobe Connect, Skype, Cisco WebEx, and Cisco Jabber could be used to deliver video-based telemedicine but were not being used extensively for this purpose. On the basis of this feedback, the telemedicine tools selected for this research were (1) Doxy.me, (2) Vidyo, (3) VSee, and (4) Polycom.

    Doxy.me

    Doxy.me is a free Web-based system (as opposed to a downloaded desktop application) specifically designed for telemedicine purposes. Clinicians create an account and a personalized waiting room where they communicate with their patients, sending each patient the address of the waiting room either by copying it into an email or by emailing it directly from the system. By clicking on this link, the patient is directed to the clinician’s waiting room. There is a self-view box at the top right and a chat box at the bottom right of the screen. Volume and video control buttons are located below the patient’s video. In addition to these features, the clinician can edit the waiting room and change the account settings. Figure 1 below shows the Doxy.me log-in screen, waiting room, and clinician’s view.

    Figure 1. Doxy.me log-in screen, waiting room, and clinician’s view.
    Vidyo

    Vidyo is one of the leading telemedicine videoconferencing desktop-based application solutions. After creating an account, the clinician receives an email with log-in credentials and a link to the Vidyo portal. The Vidyo desktop application is downloaded by clicking this portal link. The clinician sends a Vidyo meeting invitation after logging into this application. Video quality and other settings can be changed by clicking the configuration button on the task bar, which also includes volume and video control buttons, a group chat option, a self-view option, a screen layout option, and an end call option. Figure 2 shows the Vidyo desktop application from the clinician’s perspective.

    Figure 2. Vidyo log-in screen, contact list, and clinician’s view.
    VSee

    VSee is a telemedicine system that requires the clinician to create an account and install a desktop application. After logging into the account, the clinician invites patients by entering their email addresses or copying and emailing them an invitation link. This system includes an option for text chatting with the patient as well as separate windows for self-view, the clinician’s video, the chat box, and contacts, with the microphone and camera settings found on the self-view window. Figure 3 shows the VSee log-in screen, application screen, and clinician’s view.

    Figure 3. VSee log-in screen, application screen, and clinician’s view.
    Polycom

    Polycom, a licensed Web-based application that can be purchased from the Polycom website, provides telemedicine and video services for remote conferencing and collaboration. Although this company provides a hardware-based telemedicine solution, this study used a lightweight product for home-based care. The clinician receives an email with log-in credentials and a link for accessing the account. After logging into the account, the clinician selects the device or system to be used, with options for adding participants and managing meetings. The clinician invites patients by emailing them the address of his or her chat room; after clicking this link, the patient is directed to the meeting. There is a self-view option on the left side of the screen, the participant list on the right, and the patient’s video in the middle. The control buttons are located below the patient’s video. Figure 4 shows the Polycom log-in screen, welcome room, and clinician’s view.

    Figure 4. Polycom log-in screen, welcome screen, and clinician’s view.

    Study Personnel

    The investigation reported here was based on the experts’ heuristic evaluations, with the severity of the problems graded using a severity rating scale. A heuristic evaluation is a discount usability method in which an interface is evaluated against established usability principles. Five human factors engineers—three PhD students, one assistant professor, and one master’s degree student, all with prior training in conducting heuristic evaluations—were recruited to serve as the subject matter experts for this study. All received verbal information about the purpose and goal of the study and a detailed written task flow to guide the evaluation of the 4 telemedicine platforms using a modified heuristic evaluation procedure. They were compensated with a US $20 Amazon gift card for their time.

    Study Design and Procedure

    The method of evaluation used in the study was a heuristic evaluation, a discount usability evaluation method, combined with a severity rating scale [27,42]. Specifically, Nielsen’s heuristics were used because of their widespread use and acceptability [27,43-45]. The heuristics, which are listed in Table 1, were used to highlight possible usability issues.

    Table 1. Usability heuristics used for evaluating the telemedicine interfaces (adapted from Nielsen’s heuristics [27,28]).

    A heuristic evaluation, typically conducted with 5 experts, detects up to 80% of the problems [18]. For this study, the experts individually conducted the assessment in a closed lab setting to avoid bystander bias. A 5-point severity scale was applied to each of the usability issues to indicate the level of concern [27]. The scale ranged from issues which may not impact the usability of the system to issues that could potentially lead to its failure. The 5-point scale is as follows [46]:

    0—May not be a problem: other observers do not agree that this is a usability problem

    1—Cosmetic problem only: it need not be fixed unless extra time is available

    2—Minor usability problem: fixing it should be given low priority

    3—Major usability problem: important to fix, so it should be given high priority

    4—Usability catastrophe: imperative to fix before the product can be released
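For analysis, the 0–4 scale above maps naturally onto an ordered integer type. The following Python sketch is illustrative only; the names and code are not part of the study’s materials:

```python
from enum import IntEnum

class Severity(IntEnum):
    """The 5-point severity scale listed above (0-4)."""
    NOT_A_PROBLEM = 0  # other observers do not agree this is a usability problem
    COSMETIC = 1       # fix only if extra time is available
    MINOR = 2          # fixing should be given low priority
    MAJOR = 3          # important to fix; high priority
    CATASTROPHE = 4    # must be fixed before the product can be released
```

Because `IntEnum` values compare as integers, ratings can be averaged and ranked directly, which matches how the scale is used to prioritize issues.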

    Understanding the source of errors in a task begins with an in-depth understanding of the task flow [33]. This study thus began with a detailed task analysis of each of the 4 telemedicine systems to help understand the feedback they provide and the potential problems the user could face. This task analysis also included determining the knowledge the user must have to perform the task successfully. Before the actual evaluation, the researcher discussed the detailed task analysis, heuristics, and severity rating scale with the experts. As the experts were from the field of human factors and familiar with heuristic evaluation studies, only context-specific instructions for evaluating the telemedicine platforms were provided. A separate sheet containing the list of heuristics and the severity rating chart was also given to the experts for their reference. The experts then evaluated the systems from the clinician’s perspective with the help of a hypothetical patient with whom no communication was carried out, after which they listed the heuristic violations individually. The tasks to be completed by the evaluators were as follows:

    Initiation: Create an account, log into the portal or desktop application, send an email invitation for the telemedicine session, call the patient.

    Consultation: Toggle microphone and video, enter full screen mode, enter data into a chat box (where applicable), and end video call.

    On completing the individual evaluations, the experts discussed their findings with one another in a postevaluation debriefing session. In cases of extreme inconsistency, the evaluators discussed the issue and came to a consensus about the appropriate rating. The individual lists were subsequently compiled for data analysis. Figure 5 outlines the experimental procedure followed in this study, which lasted approximately 1 h.

    Figure 5. Experimental procedure followed.

    Data Analysis

    For the evaluation, each expert recorded the heuristic violations for the respective tasks, grading the severity of each issue. The individual ratings from the evaluators were averaged to obtain a single severity value per issue. These data were then compiled to determine which heuristics were violated most often, and the severity values were analyzed to prioritize the problems. The number of heuristics violated was graphed for the telemedicine session initiation and consultation phases.
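The averaging and compilation steps described above can be sketched in a few lines of Python. The issue names, heuristic labels, and ratings below are hypothetical placeholders, not data from this study:

```python
from collections import Counter
from statistics import mean

# Hypothetical records: (issue, heuristic violated, one severity rating per expert).
# All names and numbers here are illustrative placeholders.
ratings = [
    ("no-download-feedback", "Visibility of system status", [3, 4, 3, 3, 4]),
    ("default-email-client", "Match between system and real world", [4, 3, 4, 4, 3]),
    ("hidden-logout", "Error prevention", [3, 3, 2, 3, 3]),
]

# Average the individual expert ratings to a single severity value per issue
severity = {issue: mean(expert_ratings) for issue, _, expert_ratings in ratings}

# Count how often each heuristic was violated across the compiled list
violations = Counter(heuristic for _, heuristic, _ in ratings)

# Prioritize issues by averaged severity, highest first
priority = sorted(severity, key=severity.get, reverse=True)
```

The averaged severity gives each issue a single comparable score, and the heuristic counter is what a bar chart of violations per heuristic (as in Figures 6–8) would be built from.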


    Results

    Heuristic Violations

    The experts’ heuristic evaluations of the clinician’s interface revealed a total of 46 unique issues: 11 for Doxy.me, 10 for Vidyo, 12 for VSee, and 13 for Polycom. Of these, 22% (10/46) were identified by all the experts. Tables 2-4 list the important usability issues and the heuristics they violated for initiating, conducting, and concluding a telemedicine session, respectively. Figures 6-8 are graphical representations of the heuristic violations for these 3 phases. As these figures show, 61% (28/46) of the issues were identified in the initiation phase, 33% (15/46) during the telemedicine session, and 7% (3/46) in the conclusion phase. Sharing or setting up the microphone and camera was one of the specific issues observed in the initiation phase (Figure 9). The use of the default email client (Microsoft Outlook) to invite patients was one of the issues identified during the telemedicine session. During the conclusion, difficulty finding the log-out button was pointed out as an important issue (Figure 10).

    Table 2. Heuristic violations identified in the telemedicine initiation session.
    Table 3. Heuristic violations identified during the telemedicine session.

    The heuristics most frequently violated were visibility of system status and error prevention, each with 11 violations (24%, 11/46), with esthetic and minimalist design second with 6 violations (13%, 6/46). No violations were observed for the heuristics user control and freedom and flexibility and efficiency of use. Four violations (9%, 4/46) were recorded for each of the consistency and standards; recognition rather than recall; help users recognize, diagnose, and recover from errors; and help and documentation heuristics, and 2 violations (4%, 2/46) were observed for the match between system and real world heuristic. Specific issues related to visibility of system status included a lack of feedback while downloading the setup file (.exe), a lack of saliency of notifications on receiving a message or when a patient enters a Web-based waiting room, and the absence of salient call end and log-out icons. Inconspicuous check boxes, inadequate labeling of icons, and failure to exit full screen on completion of a session were identified under error prevention.

    Table 4. Heuristic violations identified while concluding the telemedicine session.
    Figure 6. Heuristics violated during the telemedicine initiation session.
    Figure 7. Heuristics violated during the telemedicine consultation.
    Figure 8. Heuristics violated during the telemedicine session conclusion.
    Figure 9. Microphone and camera sharing option.
    Figure 10. Absence of log-out option.

    Issues Requiring Immediate Attention

    The experts rated two issues as requiring immediate action. One was the multiple check-in options available on the welcome screen of Polycom: the experts found that the availability of multiple options confused the user and that the welcome screen did not contain sufficient information to choose the appropriate device (Figure 11). The second problem highlighted by the experts was the use of a default email client to email invitations to the patients. Three of the four platforms (Doxy.me, Vidyo, and Polycom) redirect the user to the default email client to send email invitations to the patients. It may be more effective to let users choose their preferred email client.

    During the debriefing, the experts discussed their most and least favorite aspects of each of the platforms. They indicated that they preferred interfaces that were not cluttered with too many options, language that they could relate to the real world, and systems that provided adequate and timely feedback for their actions. The least enjoyable aspects were welcome emails from the telemedicine platforms containing multiple links, the use of a default email client to invite patients, and the failure of many options to respond as expected.

    Figure 11. Multiple log-in options in Polycom.

    Discussion

    Principal Findings

    The heuristic evaluation was conducted using a structured table containing the task flow and columns for the experts to enter the problems they found for each task, the heuristic violated, the severity of the violation, and possible solutions. Several issues were identified during the course of this evaluation, with visibility of system status, error prevention, and esthetic and minimalist design being the heuristics violated most frequently. The effect of such issues on the ability of a user to process information can be explained using the information processing model [47]. A detailed description of the aspects of the information processing model affected by these issues can help in developing better solutions.

    The information processing model [47] can be used to understand the impact of these issues on the user’s ability to understand and make decisions (Figure 12). This model illustrates the human cognition process. The sensory register comprises the sense organs that help a person take in cues from the surroundings, which then leads to understanding or perceiving these cues. Working memory refers to the understanding and retention of information only for the span of completing a task, whereas long-term memory involves the retention of information for a longer period of time, such as weeks, months, or years. Using the information in working memory and long-term memory, thinking and decision-making occur to reach a decision about the cues obtained from the environment. Once thoughts about the cues have been formed, an appropriate response is developed and, based on this, an action is executed in response to the cue obtained. The execution of the response is again taken in by the sensory register and stored in long-term memory for future situations. Throughout this process, there is also a constant requirement for attentional resources, which help the user focus on the necessary information and filter out the rest.

    On the basis of the issues identified, the lack of feedback would have a direct effect on the perception of the process; as a result, the user would have difficulty deciding on the subsequent step in the procedure. It was also indicated that the popup for the chat box was not salient. This directly affects the sensory register: because the popup is not visible, the user fails to perceive that a message has been received. Icon size and design would similarly affect the sensory register and perception. The content of the email invitation, which was reported to require immediate attention, is an important issue affecting working memory; responsible for understanding and retaining information until the completion of a task, working memory would be burdened by either too much or too little information in the email invitation. Another problem reported as requiring immediate attention was the use of the default email client to send email invitations to patients. This could require the retrieval of passwords to log into the system, which draws on long-term memory.

    Although of lesser importance, there are other issues that should be studied with respect to their impact on a user’s decision-making. One such issue is the need to enter a large amount of data during registration, which could strain the limits of a person’s working memory, as they would be required to read and retain multiple pieces of data to enter. In three of the four platforms analyzed—Doxy.me, Polycom, and Vidyo—the clinician was required to send an email invitation for every meeting. This adds to the working memory load of processing immediately available information and to the long-term memory load of remembering patient names and email addresses to send the emails. The popups used to share the microphone and camera were indicated to be inconspicuous, resulting in an additional load on the sensory register due to the lack of visibility.

    On the basis of this understanding of the areas of the information processing model affected and the issues highlighted by the experts, certain design recommendations were developed for telemedicine systems. Some of the key findings for improving the physician’s interaction with the interface to enhance the usability of telemedicine platforms are given in Textbox 1.


    Textbox 1. Key findings for improving the physician-interface interaction.

    Apart from providing recommendations for improvement based on the information processing model, this study also demonstrates the practicality and ease of applying heuristic evaluation in usability studies. The entire process of conducting the study and analyzing the results took a week and did not involve the use of any software applications. The efficiency of this method makes it well suited for use during the early development stages [18].

    Figure 12. Information processing model.

    Limitations

    This study is not without its limitations. We evaluated the telemedicine systems on a Windows 7 computer with the Mozilla Firefox browser only; these systems need to be tested on multiple operating systems and Web browsers. In addition, the evaluators for this study were not medically trained professionals, so future studies should be conducted with actual clinicians to find usability issues from their perspective. Furthermore, as indicated by Nielsen and Molich [27], this heuristic evaluation, like all others, only helps to identify the usability issues without providing solutions to address them.

    Conclusions

    Multiple studies have been carried out examining the effectiveness of telemedicine in providing medical care, with little research focusing on the ease of use and usability of these systems [24]. In this study, we used a heuristic evaluation and severity rating method to assess the usability of 4 telemedicine software platforms, focusing on the interface issues faced by the clinician. Furthermore, the information processing model was used as a baseline to explain the impact of these issues on the user’s ability to make decisions. The heuristic evaluation and severity rating method was found to be effective in uncovering interface issues, with 46 unique issues identified across the 4 platforms. The prominence of these issues, whose impact was explained using the information processing model, indicates the need for further human factors-based studies of the interfaces of telemedicine systems.

    With a focus on the clinician’s interface design, this heuristic evaluation was found to be an effective method for uncovering violations. This heuristic evaluation identified only potential usability problems in an existing interface; usability studies involving physicians could further indicate aspects of the system that work well and identify the most appropriate functionalities. However, with limited resources available, heuristic evaluation is a practical, affordable, and efficient method for revealing usability problems. Experts liked systems that had a straightforward and simple interface and that did not require installation. In addition, they preferred systems that sent simple welcome emails. From a telemedicine point of view, this is important as clinicians and technicians do not have the time to spend navigating and comprehending complex platforms.

    Heuristic evaluation is a discount usability evaluation method with limited generalizability. Future studies need to focus on detailed usability evaluations with actual clinicians and patients. Conducting retrospective interviews with users helps designers understand their needs and, in turn, design or modify the system appropriately.

    Conflicts of Interest

    None declared.

    References

    1. Koch S. Home telehealth--current state and future trends. Int J Med Inform 2006 Aug;75(8):565-576. [CrossRef] [Medline]
    2. Board on Health Care Services, Institute of Medicine. In: Lustig TA, editor. The Role of Telehealth in an Evolving Health Care Environment: Workshop Summary. Washington, DC: National Academies Press; 2012.
    3. American Telemedicine Association. What is telemedicine   URL: [WebCite Cache]
    4. Perednia DA. Telemedicine technology and clinical applications. JAMA 1995 Feb 08;273(6):483. [CrossRef]
    5. Doolittle GC, Spaulding AO. Providing access to oncology care for rural patients via telemedicine. JOP 2006 Sep;2(5):228-230. [CrossRef]
    6. Coelho KR. Identifying telemedicine services to improve access to specialty care for the underserved in the san francisco safety net. Int J Telemed Appl 2011;2011:523161 [FREE Full text] [CrossRef] [Medline]
    7. Levine SR, Gorman M. “Telestroke” : the application of telemedicine for stroke. Stroke 1999 Feb 01;30(2):464-469. [CrossRef]
    8. Field MJ. Telemedicine: a guide to assessing telecommunications in health care. Washington, DC: National Academy Press; 1996.
    9. Chen L, Ho T, Shih C, Lin F, Lai F, Guo J, et al. Urban-rural difference in patients utilizing the service of telehealthcare. JHA 2015 Sep 01;4(6):9. [CrossRef]
    10. Miller EA. The technical and interpersonal aspects of telemedicine: effects on doctor-patient communication. J Telemed Telecare 2003;9(1):1-7. [CrossRef] [Medline]
    11. Glascock AP, Kutzik DM. Behavioral telemedicine: a new approach to the continuous nonintrusive monitoring of activities of daily living. Telemed J 2000 May;6(1):33-44. [CrossRef]
    12. Shaw RJ, Kaufman MA, Bosworth HB, Weiner BJ, Zullig LL, Lee SY, et al. Organizational factors associated with readiness to implement and translate a primary care based telemedicine behavioral program to improve blood pressure control: the HTN-IMPROVE study. Implement Sci 2013 Sep 08;8:106 [FREE Full text] [CrossRef] [Medline]
    13. Frank AP, Wandell MG, Headings MD, Conant MA, Woody GE, Michel C. Anonymous HIV testing using home collection and telemedicine counseling. a multicenter evaluation. Arch Intern Med 1997 Feb 10;157(3). [Medline]
    14. Riemer-Reiss ML. Utilizing distance technology for mental health counseling. J Ment Health Couns 2000 Jul;22(3):189.
    15. Ertelt TW, Crosby RD, Marino JM, Mitchell JE, Lancaster K, Crow SJ. Therapeutic factors affecting the cognitive behavioral treatment of bulimia nervosa via telemedicine versus face-to-face delivery. Int J Eat Disord 2010 Nov 15;44(8):687-691. [CrossRef]
    16. Welch G, Balder A, Zagarins S. Telehealth program for type 2 diabetes: usability, satisfaction, and clinical usefulness in an urban community health center. Telemed J E Health 2015 May 08;21(5):395-403. [CrossRef]
    17. Izquierdo RE, Knudson PE, Meyer S, Kearns J, Ploutz-Snyder R, Weinstock RS. A comparison of diabetes education administered through telemedicine versus in person. Diabetes Care 2003 Apr 01;26(4):1002-1007. [CrossRef]
    18. Lilholt PH, Jensen MH, Hejlesen OK. Heuristic evaluation of a telehealth system from the Danish TeleCare North Trial. Int J Med Inform 2015 May;84(5):319-326. [CrossRef]
    19. Yellowlees PM. Successfully developing a telemedicine system. J Telemed Telecare 2005;11(7):331-335. [Medline]
    20. Klaassen B, van Beijnum BJF, Hermens HJ. Usability in telemedicine systems—a literature survey. Int J Med Inform 2016 Sep;93:57-69. [CrossRef]
    21. Evans J, Papadopoulos A, Silvers CT, Charness N, Boot WR, Schlachta-Fairchild L, et al. Remote health monitoring for older adults and those with heart failure: adherence and system usability. Telemed J E Health 2016 May 25;22(6):480-488. [CrossRef]
    22. Rogers H, Chalil Madathil K, Agnisarman S, Narasimha S, Ashok A, Nair A, et al. A systematic review of the implementation challenges of telemedicine systems in ambulances. Telemed J E Health 2017 Mar 15 Epub ahead of print. [CrossRef]
    23. Agnisarman SO, Chalil MK, Smith K, Ashok A, Welch B, McElligott J. Lessons learned from the usability assessment of home-based telemedicine systems. Appl Ergon 2017 Jan;58:424-434. [CrossRef] [Medline]
    24. Narasimha S, Chalil MK, Agnisarman S, Rogers H, Welch B, Ashok A, et al. Designing telemedicine systems for geriatric patients: a review of the usability studies. Telemed J E Health 2016 Nov 22 Epub ahead of print. [CrossRef] [Medline]
    25. Narasimha S, Agnisarman S, Chalil Madathil K, Gramopadhye AK, Welch B, Mcelligott J. An investigation of the usability issues of home-based video telemedicine systems with geriatric patients. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2016 Presented at: Human Factors and Ergonomics Society Annual Meeting, 2016; September 19–23, 2016; Washington, DC p. 1804.
    26. Chalil MK, Greenstein JS. An investigation of the efficacy of collaborative virtual reality systems for moderated remote usability testing. Appl Ergon 2017 Feb 27 Epub ahead of print. [CrossRef] [Medline]
    27. Nielsen J, Molich R. Heuristic evaluation of user interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York, NY: ACM; 1990 Presented at: SIGCHI Conference on Human Factors in Computing Systems; April 01 - 05, 1990; Seattle, WA p. 249-256. [CrossRef]
    28. Nielsen J. Usability inspection methods. In: CHI '94 Conference Companion on Human Factors in Computing Systems. New York, NY: ACM; 1994 Presented at: Conference Companion on Human Factors in Computing Systems; April 24 - 28, 1994; Boston, MA.
    29. Tan W, Liu D, Bishu R. Web evaluation: Heuristic evaluation vs. user testing. Int J Ind Ergonom 2009;39(4):621-627. [CrossRef]
    30. Hvannberg ET, Law E, Lárusdóttir MK. Heuristic evaluation: comparing ways of finding and reporting usability problems. Interact Comput 2007 Mar;19(2):225-240. [CrossRef]
    31. Chalil Madathil K, Greenstein J. Synchronous remote usability testing: a new approach facilitated by virtual worlds. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York, NY: ACM; 2011 Presented at: SIGCHI Conference on Human Factors in Computing Systems; May 07 - 12, 2011; Vancouver, British Columbia, Canada p. 2225-2234.
    32. Graham MJ, Kubose T, Jordan D, Zhang J, Johnson T, Patel V. Heuristic evaluation of infusion pumps: implications for patient safety in Intensive Care Units. Int J Med Inform 2004 Nov;73(11-12):771-779. [CrossRef] [Medline]
    33. Rogers WA, Mykityshyn A, Campbell R, Fisk A. Analysis of a “simple” medical device. Ergon Des 2001 Jan 01;9(1):6-14. [CrossRef]
    34. Zhang J, Johnson T, Patel V, Paige D, Kubose T. Using usability heuristics to evaluate patient safety of medical devices. J Biomed Inform 2003;36(1-2):23-30 [FREE Full text] [Medline]
    35. Kushniruk AW, Patel V. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform 2004 Feb;37(1):56-76 [FREE Full text] [CrossRef] [Medline]
    36. Stellefson M, Chaney B, Chaney D. Heuristic evaluation of online COPD respiratory therapy and education video resource center. Telemed J E Health 2014 Oct;20(10):972-976 [FREE Full text] [CrossRef] [Medline]
    37. Chalil Madathil K, Alapatt GF, Greenstein JS. An investigation of the usability of image-based CAPTCHAs. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2010 Presented at: Human Factors and Ergonomics Society Annual Meeting, 2010; September 27-October 1, 2010; San Francisco, CA.
    38. Chalil Madathil K, Koikkara R, Dorlette-Paul M, Ranganayakulu S, Greenstein JS, Gramopadhye AK. An investigation of format modifications on the comprehension of information in consent form when presented on mobile devices. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2012 Presented at: Human Factors and Ergonomics Society Annual Meeting, 2012; October 22-26, 2012; Boston, MA.
    39. Chalil Madathil K, Koikkara R, Gramopadhye AK, Greenstein JS. An empirical study of the usability of consenting systems iPad, touchscreen and paper-based systems. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2011 Presented at: Human Factors and Ergonomics Society Annual Meeting, 2011; September 19-23, 2011; Las Vegas, NV.
    40. Wismer AJ, Chalil Madathil K, Koikkara R, Juang KA, Greenstein JS. Evaluating the usability of CAPTCHAs on a mobile device with voice and touch input. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2016 Presented at: Human Factors and Ergonomics Society Annual Meeting, 2016; September 19-23, 2016; Washington, DC.
    41. U.S. Department of Health & Human Services. 2015 Aug 26. Health Information Privacy   URL: https://www.hhs.gov/hipaa/index.html [accessed 2017-03-17] [WebCite Cache]
    42. Nielsen J. Nielsen Norman Group.: Nielsen Norman Group; 1995. Severity ratings for usability problems   URL: https://www.nngroup.com/articles/how-to-rate-the-severity-of-usability-problems/ [accessed 2017-03-17] [WebCite Cache]
    43. Custódio RAR, Almeida A, Correa JE, Almeida RMA, Mello CHP, Muller Júnior EL. Using heuristic analysis to support usability evaluation of a low risk medical device under development process. In: IFMBE Proceedings. Switzerland: Springer International Publishing; 2015 Presented at: World Congress on Medical Physics and Biomedical Engineering; 2015; Toronto, Ontario. [CrossRef]
    44. Horberry T, Teng YC, Ward J, Clarkson PJ. Safe design of medical equipment: employing usability heuristics to examine the issue of guidewire retention after surgery. In: HFESA 2013. 2013 Presented at: 49th Annual Human Factors and Ergonomics Society of Australia Conference 2013; December 2-4, 2013; Perth, Australia.
    45. Horberry T, Teng YC, Ward J, Clarkson PJ. Employing usability heuristics to examine the issue of guidewire retention after surgery. Ergonomics Australia 2014 Feb 10;1(1):1-5.
    46. Nielsen J. Finding usability problems through heuristic evaluation. In: CHI '92 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York, NY: ACM; 1992 Presented at: SIGCHI Conference on Human Factors in Computing Systems; May 03-07, 1992; Monterey, CA p. 373-380.
    47. Wickens CD, Lee JP, Liu Y, Gordon-Becker S, Gordon SE. An introduction to human factors engineering. Upper Saddle River, NJ: Pearson Prentice Hall; 2004.


    Abbreviations

    HIPAA: Health Insurance Portability and Accountability Act
    IOM: Institute of Medicine
    MUSC: Medical University of South Carolina
    SC: South Carolina


    Edited by P Santana-Mancilla; submitted 10.01.17; peer-reviewed by D Alhuwail, B Chaudry; comments to author 03.02.17; revised version received 22.02.17; accepted 23.02.17; published 24.04.17

    ©Sruthy Agnisarman, Shraddhaa Narasimha, Kapil Chalil Madathil, Brandon Welch, FNU Brinda, Aparna Ashok, James McElligott. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 24.04.2017.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Human Factors, is properly cited. The complete bibliographic information, a link to the original publication on http://humanfactors.jmir.org, as well as this copyright and license information must be included.