Even in the era of digital technology, many hospitals still rely on paper-based forms for data entry in patient admission, triage, drug prescription, and procedures. Paper-based forms can be quick and convenient to complete, but often at the expense of data quality, completeness, sustainability, and automated data analytics. Digital forms can improve data quality by assisting the user in deciding on the appropriate response to certain data inputs (eg, classifying symptoms). Greater data quality via digital form completion not only helps with auditing, service improvement, and patient record keeping but also enables novel data science and machine learning research. Although digital forms are becoming more prevalent in health care, there is a lack of empirical best practices and guidelines for their design. The hospital in this study had a definite plan to abolish the paper form; hence, it was not necessary to compare the digital forms with the paper form.
This study aims to assess the usability of three different interactive forms: a single-page digital form (in which all data input is required on one web page), a multipage digital form, and a conversational digital form (a chatbot).
The three digital forms were developed as candidates to replace the current paper-based form used to record patient referrals to an interventional cardiology department (Cath-Lab) at Altnagelvin Hospital. We recorded usability data in a counterbalanced usability test (60 usability tests: 20 participants × 3 forms). The usability data included task completion times, System Usability Scale (SUS) scores, User Experience Questionnaire data, and data from a postexperiment questionnaire.
We found that the single-page form outperformed the other two digital forms in almost all usability metrics. The mean SUS score for the single-page form was 76 (SD 15.8;
In conclusion, the digital single-page form outperformed the other two forms in almost all usability metrics; it had the shortest task completion time of the three digital forms. Moreover, in responses to the open-ended question in the final customized postexperiment questionnaire, the single-page form was the preferred choice.
Currently, when a primary percutaneous coronary intervention (PPCI) referral is made, the activator nurse in the coronary care unit triages the patient using written notes. Typically, when a patient experiences chest pain, paramedics arrive and record an electrocardiogram. If the paramedic suspects a heart attack, they contact the PPCI department at a hospital and describe the symptoms and electrocardiogram findings to an activator nurse, who then completes the paper form shown in
The current paper-based form being used at Altnagelvin Hospital.
This is not unusual, as most hospitals and cardiac care units often rely on paper-based forms for data entry in patient admission, drug prescription, and other general procedures. Working with paper-based systems can be challenging, especially for health care staff working in a sensitive and highly stressful environment, such as cardiac care. Digitalization is slowly being introduced into the health service to improve the medical workflow at different stages and levels. Many applications serve many purposes, including facilitating communication between a patient and a provider, remotely monitoring patients, and measuring population health objectives, such as disease trends. The collected information can be used to make informed decisions about health care services, at either the population or individual level, to improve care [
Similar to other fields, digitalization and digital transformation play an essential role in health care. Health care technologies are rapidly growing and evolving; for example, electronic health record (EHR) systems are becoming routine [
Given the demand for effective digital forms, there is a need to research and establish best-practice interaction design guidelines for digital health forms. In this study, we designed three different digital form styles to replace a paper form used for patient referrals to a PPCI service. To contribute to future digital form design guidelines in health care, the study also compares the usability of all three forms to analyze which form styles work best for health care professionals. However, measuring usability is difficult because usability does not refer to a single property; rather, it combines several attributes [
The aim is to compare the different digital form designs to evaluate which has the greatest usability.
The total study population consisted of 20 health care staff who were either cardiac nurses or research nurses.
Microsoft Surface Pro to display the digital forms and facilitate user interaction, a microphone to record the user’s think-aloud data, and screencasting software to video-record the user interactions with the digital forms
Questionnaires (System Usability Scale and User Experience Questionnaire) to measure usability and RStudio for data analysis
System Usability Scale usability score, usability errors, and task completion times
Counterbalanced experiment to avoid any learning bias (one possible ordering scheme is sketched after this list)
Typical patient scenarios were presented to the user to facilitate the form completions.
Summary analysis of System Usability Scale scores, User Experience Questionnaire results, task completion times, and error rates using descriptive statistics and boxplots
Hypothesis testing (
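The study does not specify its exact counterbalancing scheme; as a minimal sketch, assuming a full rotation of the 6 possible presentation orders of the 3 forms across the 20 participants, a schedule could be generated in R (the analysis tool named above). The names `forms` and `schedule` are illustrative:

```r
# Minimal counterbalancing sketch (assumed scheme, not the study's exact one):
# cycle all 6 permutations of the 3 forms across 20 participants so that
# each presentation order occurs 3 or 4 times.
forms <- c("single-page", "multipage", "conversational")
perms <- matrix(c(1, 2, 3,
                  1, 3, 2,
                  2, 1, 3,
                  2, 3, 1,
                  3, 1, 2,
                  3, 2, 1),
                ncol = 3, byrow = TRUE)
schedule <- perms[rep_len(1:6, 20), ]          # one row per participant
t(apply(schedule, 1, function(i) forms[i]))    # 20 x 3 matrix of form names
```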
This study involved the analysis and comparison of three different digital form designs that were developed as candidates for recording patient referrals to a PPCI service at Altnagelvin Hospital (Northern Ireland, the United Kingdom). This study only aims to compare the digital forms, as there are already studies that compare paper forms with digital or electronic forms [
The three different digital forms were developed using HTML5 and Cascading Style Sheets (CSS3), following the model-view-controller paradigm. An open-source scripting library was used to convert the web form into a conversational form [
Screenshots of a part of the single-page form.
Screenshots of the screens from the multipage form.
Screenshots of the conversational form.
Expected pros and cons of the three digital forms.
Form type | Pros | Cons
Single-page form | Easy to understand; common form style that meets expectations; user can view all questions and input fields expected of them; user can predict the work required to complete the form; easy to navigate to all information on a single page | High information rate; busy-looking screen with possible clutter; user can be distracted by the number of questions required; the screen can require more mental workload to interpret; information overload can result in visual hierarchy issues
Multipage form | Deconstructing a task into subtasks reduces cognitive load; less distracting for users; user can be guided and focused on a small set of related questions; creates a sense of progression | Additional interactions (clicks) to navigate to the different sections; misleads the user into thinking the form is shorter than it is; it might take longer to complete; user needs to navigate back to change answers from a previous subsection
Chatbot form | Easy to use; fewer distractions, given that only one question is presented per interaction; akin to everyday human interaction or to being interviewed, and hence engenders focus; less cognitive demand; novel | Not a common form style; editing previous input could be cumbersome and require many interactions; can seem too playful for formal settings such as medicine; preset sequence to follow
Participants identified as suitable and interested in participating were given a participant information sheet, and written informed consent was obtained by the author from all participants who agreed to take part in the study.
This study tested three different digital forms in a simulated setting, where each participant was given a brief tutorial on how to use the tablet PC (Microsoft Surface Pro) that hosted the digital forms. Each participant was provided with the same four simulated PPCI triage scenarios written on a sheet, as shown in
Flowchart depicting the usability testing session flow. SUS: System Usability Scale; UEQ: User Experience Questionnaire.
The researcher observed the participants while they completed the forms, and notes were taken to record usability issues. Form completion was recorded using screen recording software (FreeScreenRecorder by Screencast-O-Matic [
The SUS is a commonly used and validated questionnaire consisting of 10 items. The scoring of this questionnaire provides a usability score ranging from 0 to 100. An SUS score of >68 is considered above average, and anything <68 is considered below average. A study by Tullis and Stetson [
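As a concrete illustration, the standard SUS scoring procedure (alternating positively and negatively worded items, with the total scaled to 0-100) can be sketched in R; the function name and example responses below are hypothetical:

```r
# Standard SUS scoring sketch: 10 items on a 1-5 Likert scale.
# Odd items are positively worded, even items negatively worded.
sus_score <- function(responses) {
  stopifnot(length(responses) == 10, all(responses %in% 1:5))
  odd  <- responses[seq(1, 9, by = 2)] - 1    # item score = response - 1
  even <- 5 - responses[seq(2, 10, by = 2)]   # item score = 5 - response
  sum(odd, even) * 2.5                        # scale total (0-40) to 0-100
}

sus_score(c(4, 2, 5, 1, 4, 2, 4, 2, 5, 1))    # illustrative responses -> 85
```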
A customized, researcher-created postexperiment questionnaire was administered at the end of the session. This questionnaire had 21 usability-related questions that focused on the participants’ needs, preferred form types, and preferred features.
The recorded data were then analyzed to compare the usability and user experience of each form. For each participant, the process consisted of (1) the concurrent think-aloud protocol and a brief interview, (2) screen recording of the user interactions, and (3) usability evaluation of the final digital form prototypes (60 usability tests: 20 participants × 3 forms). Each participant was observed while they completed each digital form, and the screencasts were used to analyze and evaluate the users’ behavior.
The data were collected through observations made while the participants were interacting with the digital forms. We then computed the error rate, task completion time, and user satisfaction. For the error rate analysis, a list of possible errors was made for each form design, and the number of errors was then noted for each digital form against each user. The shortest task completion time and the lowest error rate can indicate the best form, eliciting the highest user satisfaction. User satisfaction was also covered more explicitly in the SUS and UEQ. The postexperiment questionnaire also asked the users about their preferred choice of digital form design.
Different statistical metrics were used, including the median, mean, and SD. The paired two-tailed
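As a minimal sketch of how such a paired comparison could be run in R, assuming each participant’s completion times (in seconds) are stored in participant order; the values below are illustrative, not the study data:

```r
# Hypothetical per-participant completion times (seconds), paired by order
single_page <- c(150, 171, 166, 158, 180, 162)
multipage   <- c(188, 205, 199, 176, 214, 190)

# Paired two-tailed t test for one pairwise form comparison
t.test(single_page, multipage, paired = TRUE, alternative = "two.sided")

# With several pairwise comparisons (see the table of P values below), a
# corrected threshold (eg, Bonferroni: .05 / number of comparisons) is one
# common guard against inflated type I error.
```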
Research governance permission was granted by the Western Health and Social Care Trust (WT 19/08, Integrated Research Application System 262557) and complied with the Declaration of International Research Integrity Association (
On the basis of the research, an SUS score of >68 is considered above average [
Boxplot for the average System Usability Scale score of each form. The single page had a mean System Usability Scale score of 76 (SD 15) and outperformed the usability of the multipage and conversational forms with mean System Usability Scale scores of 67 (SD 17) and 57 (SD 24), respectively. Even with a β coefficient of .015, the results are still significant.
The UEQ used in this study was modified from the original version by making it unidirectional and including the one-sided factors. The single-page form mostly had higher averages for the positive attributes than the other two digital forms. The conversational form scored higher averages on the negative attributes, which suggests that the conversational form had the least usability.
Bar chart showing positive attribute results of the User Experience Questionnaire for all three forms. The single-page form has higher averages for the positive attributes than those of the other two digital forms.
Bar chart showing negative attribute results of the User Experience Questionnaire for all three forms. The conversational form had higher averages for the negative attributes than those of the other two forms, which suggests that the conversational form had the least usability.
Task completion time refers to the total time a user takes to complete each form. Participants took the least time to complete the paper form. Among the three digital forms, the lowest mean time was recorded for the single-page form, followed by the conversational form; users took the longest to complete the multipage form.
On the other hand, the activator nurses, who took the least time to complete the paper form, took almost twice as long to complete the digital forms as the paper form (mean 165, SD 55 s vs mean 301, SD 68 s;
Boxplot for the average form completion time of each form. The primary percutaneous coronary intervention activator nurses took the least time on the paper form, as they currently use it for primary percutaneous coronary intervention referrals. However, the research nurses, who had no prior exposure to this paper form, took almost as much time as the activator nurses to complete the digital forms (mean 224, SD 54 s vs mean 298, SD 60 s;
Boxplot for the average form completion time of activators versus research nurses. (A) Activator nurses’ form completion time and (B) research nurses’ form completion time.
Form comparison | P value
Single-page form and multipage form | <.001
Single-page form and conversational form | .02
Single-page form and paper sheet | <.001
Multipage form and conversational form | .10
Multipage form and paper sheet | <.001
Conversational form and paper sheet | <.001
There was a weak correlation (
Scatterplot for the overall correlation between the System Usability Scale score and task completion time. There was a weak correlation (
Scatterplot for the overall correlation between the System Usability Scale score and each form's completion time. SUS: System Usability Scale.
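As a sketch of how this correlation could be computed in R, assuming one SUS score and one completion time per usability test; the use of the Pearson coefficient is an assumption (the coefficient type is not restated here), and the vectors are placeholders rather than the study data:

```r
# Placeholder data: one SUS score and completion time per usability test
sus_scores <- c(76, 82, 60, 55, 90, 71)
task_times <- c(160, 150, 210, 230, 140, 180)   # seconds

cor.test(sus_scores, task_times, method = "pearson")  # r and its P value
plot(task_times, sus_scores,
     xlab = "Task completion time (s)", ylab = "SUS score")
```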
Upon inspection of the video screen recordings, the use errors and their frequency were recorded. A use error can have 1 of 4 severity ratings according to Nielsen’s 4-star severity scale, that is, cosmetic, medium, serious, or critical. There were no critical use errors; however, there were many serious use errors in the conversational form. Of the multipage form’s errors, 69% were medium errors, whereas the single-page form had only 31% medium errors and very few cosmetic errors.
On the basis of this usability study, approximately 83 use errors (average severity 3.0) were discovered in the conversational form, 35 use errors (average severity 2.0, SD 0) were discovered in the multipage form, and 21 use errors (average severity 1.76, SD 0.44) were discovered in the single-page form. The severity of these use errors is shown in
Bar graph for each form’s error severity. The multipage form errors were 69% medium errors, whereas the single-page form had only 31% medium errors and very few cosmetic errors.
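A minimal sketch of how the per-form error counts and average severities could be tallied in R, mapping the scale to 1 (cosmetic) through 4 (critical); the severity vectors below are illustrative placeholders, not the recorded error logs:

```r
# Illustrative severity logs per form (1 = cosmetic, 2 = medium,
# 3 = serious, 4 = critical); no critical errors occurred in the study.
severity <- list(
  single_page    = c(1, 1, 2, 2, 1, 2),
  multipage      = c(2, 2, 2, 2, 2, 2),
  conversational = c(3, 3, 2, 3, 3, 3)
)

# Count, mean severity, and SD of severity for each form
sapply(severity, function(x) c(n = length(x), mean = mean(x), sd = sd(x)))
```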
Approximately half of the participants preferred the single-page form. In response to an open-ended question, the users mentioned that the single page was “easy to complete,” “easy to understand,” “well-marked and separated,” and “clearer” and that “all the information is available to see at once.” For the multipage form, the users said the “entire information isn’t available” and that they “don’t like to navigate.” For the conversational form, the users said that it was “unpredictable” and “difficult to understand and use” and that they “couldn’t go back easily to the options if they need to or want to.”
In a usability evaluation of three digital forms designed for PPCI referrals, conducted to better understand the usability needs of nurses, this study has shown that the single-page digital form outperformed the multipage and conversational forms. This is an interesting finding, as the conversational form had previously been used successfully to aid in different areas [
The correlation analysis between the SUS score and task time showed no strong relationship, indicating that task completion time alone cannot measure the usability of a system. All the standard usability metrics considered in this research indicated that the single-page digital form performed better than the multipage and conversational forms. Moreover, when answering an open-ended question in the final questionnaire, more than half of the participants chose the single-page form as their preferred choice. Some of the reasons for preferring the single-page form were that it is easy to complete, easy to understand, well marked and separated, and clear and that all the information is visible on one screen. For the multipage form, participants did not seem to like navigating between the pages. For the conversational form, participants found it unpredictable and difficult to understand and use; most importantly, they could not conveniently go back to change inputted data when they needed to.
Usability assessment and appropriate form design guidelines are vital for health care departments. If a form is not well designed, people have to think harder to complete it; they take longer, may miss or skip information, or may even enter wrong information. Time spent struggling with forms is time taken away from actual patient care. If wrong information is entered, any algorithms, data analyses, or dashboards built on those data will be wrong, and clinical strategies and decisions made at the board or hospital level on the basis of those data will be flawed simply because a digital form was not completed properly. Because such forms are used routinely and at high frequency, their usability is crucial: a system as simple as a form should not demand a high mental workload and should be as intuitive and simple as possible. Data quality in turn affects algorithm development and policy decision making; if the data are wrong, the resulting policies will be wrong as well. In this day and age, many decisions are made on the basis of data, so data can be the new oil, or a new snake oil if they are misleading or incorrect. The results of this study clearly show that the single-page form has better usability overall than its multipage and conversational counterparts. This has implications for form design moving forward but, in many ways, reinforces established user experience design guidelines for forms [
The digital forms were trialed at only one hospital with a small group of health care professionals, and the usability results may differ at other centers. However, ethical approval is in process to include another hospital site in the study and thereby increase the number of participants. The study was conducted in a simulated scenario in which the location and patient presentation were simulated; in real scenarios, participants would perhaps be under more pressure (eg, time pressure). Usability data were not recorded for the paper form, as the usability questionnaires (SUS and UEQ) are designed to assess digital interfaces. Moreover, health care staff are very familiar with paper forms, which might bias any comparisons made: they have already adopted paper systems and have become experts in paper form filling. Hence, it can be argued that comparing paper form completion with digital form completion is unfair because it compares expert use with novice use. Another key limitation is that single-page digital forms may be preferred simply because that format is also widely used and users are already familiar with it.
How will people complete digital forms in the future? This is an interesting question, especially in the era of artificial intelligence. Perhaps intelligent smart speakers will be used to complete forms, for example, an artificial intelligence algorithm that listens to the patient’s details and completes the form using natural language understanding. However, talking to a computer requires more effort than selecting options in a form. Further research is required to explore these ideas.
In conclusion, the digital single-page form outperformed the other two forms in almost all usability metrics. The mean SUS score for the single-page form was 76 (SD 15), and it had the shortest task completion time of the three digital forms. Moreover, in response to the open-ended question, the single-page form was also the preferred choice. However, this preference might change over time as multipage and conversational forms become more common. For example, the conversational form’s SUS scores showed greater variance, indicating a possible dichotomy among participants regarding the perceived usability and usefulness of the chatbot-style form.
Simulated patient scenarios provided for form filling.
Ethical approval certificate/letter.
EHR: electronic health record
PPCI: primary percutaneous coronary intervention
SUS: System Usability Scale
UEQ: User Experience Questionnaire
This research was supported by the European Union’s INTERREG VA Programme, managed by the Special EU Programmes Body. The views and opinions expressed in this paper do not necessarily reflect those of the European Commission or the Special EU Programmes Body.
All of the authors were responsible for study conception; the design, analysis, and interpretation of results; and the revision of the manuscript.
None declared.