Published on 05.02.2021 in Vol 8, No 1 (2021): Jan-Mar

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/21884.
Perceptual Gaps Between Clinicians and Technologists on Health Information Technology-Related Errors in Hospitals: Observational Study

Original Paper

1Department of Management Science and Systems, School of Management, State University of New York at Buffalo, Buffalo, NY, United States

2School of Medicine and Biomedical Sciences, State University of New York at Buffalo, Buffalo, NY, United States

*all authors contributed equally

Corresponding Author:

Theophile Ndabu, MBA

Department of Management Science and Systems

School of Management

State University of New York at Buffalo

203 Alfiero Center

Buffalo, NY, 14260

United States

Phone: 1 7166453271

Email: tnndabu@buffalo.edu


Background: Health information technology (HIT) has been widely adopted in hospital settings, contributing to improved patient safety. However, many types of medical errors attributable to information technology (IT) have negatively impacted patient safety. The continued occurrence of such errors is a reminder that HIT software testing and validation are not adequate to ensure error-free software functioning within the health care organization.

Objective: This pilot study aims to classify technology-related medical errors in a hospital setting using an expanded version of the sociotechnical framework to understand the significant differences in the perceptions of clinical and technology stakeholders regarding the potential causes of these errors. The paper also provides some recommendations to prevent future errors.

Methods: Medical errors were collected from previous studies identified in leading health databases. From the main list, we selected errors that occurred in hospital settings. Semistructured interviews with 5 medical and 6 IT professionals were conducted to map the events on different dimensions of the expanded sociotechnical framework.

Results: Of the 2319 identified publications, 36 were included in the review. Of the 67 errors collected, 12 occurred in hospital settings. The classification showed the “gulf” that exists between IT and medical professionals in their perspectives on the underlying causes of medical errors. IT experts consider technology as the source of most errors and suggest solutions that are mostly technical. However, clinicians assigned the source of errors within the people, process, and contextual dimensions. For example, for the error “Copied and pasted charting in the wrong window: Before, you could not easily get into someone else’s chart accidentally...because you would have to pull the chart and open it,” medical experts highlighted contextual issues, including the number of patients a health care provider sees in a short time frame, unfamiliarity with a new electronic medical record system, nurse transitions around the time of error, and confusion due to patients having the same name. They emphasized process controls, including failure modes, as a potential fix. Technology experts, in contrast, discussed the lack of notification, poor user interface, and lack of end-user training as critical factors for this error.

Conclusions: Knowledge of the dimensions of the sociotechnical framework and their interplay with other dimensions can guide the choice of ways to address medical errors. These findings lead us to conclude that designers need not only a high degree of HIT know-how but also a strong understanding of the medical processes and contextual factors. Although software development teams have historically included clinicians as business analysts or subject matter experts to bridge the gap, development teams will be better served by more immersive exposure to clinical environments, leading to better software design and implementation, and ultimately to enhanced patient safety.

JMIR Hum Factors 2021;8(1):e21884

doi:10.2196/21884

Keywords



Background

The widespread use of information technology (IT) has contributed to improved patient safety in the hospital setting [1-5]. However, many different kinds of medical errors attributable to the use of IT in health care have negatively impacted patient safety [6,7]. An estimated 40% of all patients who visit primary and ambulatory care experience adverse events [8]. These safety events may lead to an extended hospital stay, additional side effects, or distress, and in some cases death. In addition to the loss of life and health impairment, the consequences of adverse events include increased financial costs to patients and society at large [9].

In hospital settings, several benefits, including health care delivery improvement and reduction in medication errors, have been attained through the use of health information technology (HIT) [3]. However, new patient safety errors attributable to the use of HIT continue to be a significant issue [7]. For example, according to a recent study [10], in Pennsylvania alone, a total of 889 medication error reports listed HIT as a factor contributing to events submitted to the Pennsylvania Patient Safety Authority in the first 6 months of 2016. The study also shows that dose omission, wrong dosage, and extra dosage were the most commonly reported events. The most common HIT systems implicated in the events were the computerized prescriber order entry system, the pharmacy system, and the electronic medication administration record. Several government agencies and academic and clinical practitioner committees have been concerned about the unintended consequences of introducing IT in clinical environments. Several articles [9-11] report such adverse patient safety events related to HIT and emphasize the need for more cohesive HIT development processes to reduce the gulf of evaluation between medical and IT teams.

This pilot study seeks to classify patient safety events in hospital settings and to understand the differing perspectives of HIT designers and users concerning the potential causal factors of technology-related medical errors. In addition, the study suggests prescriptive measures to prevent the recurrence of errors. Understanding the perspectives of both medical and IT stakeholders could help resolve the root causes of medical errors, and the proposed classification could help medical and technology stakeholders work together through their different perspectives on the causes of HIT-related errors to identify likely solutions and ultimately design better HIT artifacts. To better understand these differences, we selected 12 archetype errors that occurred in a clinical setting from the list of errors collected through the literature review and examined them through the lens of sociotechnical theory from both clinical and IT systems perspectives. In the next section, we introduce the sociotechnical framework and present the proposed error classification. Following this, the Methods section details data collection and analysis. Subsequently, the results and discussion are presented before the Conclusions section.

Sociotechnical Framework

The sociotechnical theory posits that organizational performance depends on the interactions between social and technical factors, grouped into 4 pillars: technology, process, people, and environment [12]. Prior research suggests that developing applications that cater to end-user needs requires designers and developers to understand the workflow structures, organizational culture, and environment in which these systems will operate [13]. Hence, patient safety improvement is contingent on the joint optimization of social and technical factors in the hospital setting.

This paper creates a more detailed taxonomy by adding subcomponents of the 4 central pillars to the sociotechnical framework [12,13]. The expanded taxonomy allows for a better classification of errors and the development of more precise solutions. Furthermore, we classify the errors in terms of the causes based on the feedback of medical experts and IT professionals. Using the results of this classification process, we provide more in-depth insights into the significant differences in medical and clinical staff members’ and IT professionals’ perceptions regarding these errors and offer a prescription to mitigate them.

Several studies have used the sociotechnical framework to examine various aspects of HIT implementation and use, including human-computer interaction [14]; the impact of policy, infrastructure, and people on the quality of health information [15]; ergonomic and macroergonomic aspects of health technologies [16-20]; risk assessment of electronic medical record safety [18]; and usability factors [14,18]. The sociotechnical framework has also been used to classify patient safety events [21-23]. However, these studies classified each error only under the high-level dimension to which it mapped most strongly (Table 1 compares the 3 published papers closest to our efforts and details how this study differs). The sociotechnical framework suggests that multiple forces from multiple dimensions (and from different hierarchical levels within a particular dimension) are at work when errors occur [24]. As patient safety events occur in a complex environment, a classification is needed that considers the impacts of multiple dimensions of the framework on each patient safety event's occurrence. Table 1 summarizes how this paper differs from the closest related studies, which were included because their authors used the sociotechnical framework to classify medical errors [21,23] or HIT-related sentinel events [22].

Table 1. A comparison with previous studies based on the use of the sociotechnical framework.

Methodologies for error classification compared across studies:

  • Errors classified in 1 high-level dimension only (fitting one dimension excludes others)
  • Errors classified in 1 dimension and its subdimensions only (fitting one dimension excludes others)
  • Errors classified in multiple high-level dimensions
  • Classification based on multiple dimensions and their subcomponents
  • One error classified at a time

Studies (references) compared:

  • Safety huddles to proactively identify and address electronic health record safety [21]
  • Contribution of sociotechnical factors to health information technology–related sentinel events [22]
  • Exploring the sociotechnical intersection of patient safety and electronic health record implementation [23]
  • This study

aMethodology applicable to the study.

bMethodology not applicable to the study.

Medical error classifications have also been developed using other approaches. The Systems-Theoretic Accident Model and Processes framework has been used to classify medical errors into 3 broad categories: feedback, control action, and knowledge errors [25]. The Human Factors Classification Framework [26] has been adapted to health care to classify medical errors into 5 categories: decision errors, skill-based errors, perceptual errors, routine violations, and exceptional violations [27,28]. Other studies have developed taxonomies without the use of a particular framework [29-31]. Prior studies have not applied the sociotechnical framework to medical errors with the intent of exploring the root causes and the potential avenues through which the errors can be fixed. Furthermore, the dimensions of the sociotechnical frameworks described in the extant research literature do not consider the emergence of new technologies, such as cloud computing and n-tier architectures, or new management paradigms, including DevOps and microservices architecture. We adapted and extended the sociotechnical framework with additional dimensions that reflect these new trends in IT. A group of expert researchers in information systems and sociotechnical theory reviewed this model [32], and their feedback was incorporated to refine the classification model presented in Figure 1.

Figure 1. Error classification model. UI: user interface.

Proposed Classification

Sociotechnical theory emphasizes the interplay of the social and technical aspects of adopting and using technology [17,18,33]. The theory hinges on four basic constructs (technology, people, process, and environment) and the interaction between these constructs. In the expanded version of the sociotechnical framework, we detail the components of the technology dimension to include the IT infrastructure, which in turn comprises hardware, software, and apps. These also include emerging technologies, such as cloud computing, the internet of things, mobile apps, and the use of artificial intelligence, predictive and prescriptive analytics, and robotics. The technology dimension can also be partitioned based on the type of use, broadly classified as either administrative (including administrative IT and resource scheduling) or clinical. The need to investigate at this level of detail stems from the fact that the type of interaction varies based on the interacting subcomponents. Furthermore, the app layers can be viewed as comprising the user interface, middleware (including the logic layer), backend (including the logic layer), and data.

The process dimension includes administrative and clinical workflows. Administrative workflows related to IT include the collection, storage, processing, and presentation of information for more effective resource management, such as clinical and IT staff management, operating room scheduling, risk and safety management, billing and facility management, and inventory management to ensure the business management of the hospitals. The subdimensions of IT processes are software development, HIT implementation and maintenance, and training and support. Clinical processes include patient record management, clinical pathways, patient bed assignment, and physician notes. Some processes are both clinical and administrative; these include the inventory management of drugs and clinical supplies, surgery room and equipment scheduling, and patient discharge management. Processes in health care settings allow all stakeholders to perform tasks in a predetermined manner to obtain successful outcomes [24,34,35]. Patient safety errors manifest when there is a misalignment between the elements of IT and clinical processes.

The people dimension includes patients, clinical staff, and administrative staff. People interact with each other and with the technology available to them. The hospital employee space consists of providers with different competencies and clinical authorities and of administrative staff whose priorities often differ from those of clinical providers. For example, clinical staff members prioritize patients' clinical health, whereas IT personnel are more concerned with the processes involved in health care; inconsistencies in these priorities often lead to errors. As people interact with the entire work system, a mismatch between people and any other component increases the risk of harm to patients. Human errors are also a threat to patient safety [36]. Therefore, it is essential to build user interfaces and systems that consider the priorities and goals of the different types of users, goals that go beyond the purely functional and technical requirements of the job.

The environment dimension consists of the care setting (eg, ambulatory, emergency, and in-patient), regulatory factors (eg, compliance, privacy, and security), and culture. Culture stems from management style, organizational policy, and other systemic factors. Furthermore, different types of employees prioritize different goals, and conflicts in achieving these goals often manifest in the building, implementation, and functioning of systems. Patients receiving services are external to the health care organization, and their participation in the process is very important for effective health care service provisioning. In some cases, tasks must be performed by patients away from the health care organization, and patients' contextual environments and skills for performing the required tasks differ from those of health care providers [33,35]. Regulations can also have a constraining effect on the error-free functioning of all subsystems. A thorough classification of patient safety events should therefore consider specific areas of interaction between the environment dimension and all other dimensions. We use this expanded classification model to understand the gap in the mental models of clinical staff and technology professionals regarding the root causes of errors and how they should be addressed. We articulate our research design in the next section.
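
To illustrate the structure of the expanded classification model, the following sketch encodes a small subset of the dimensions and subdimensions described above as a nested mapping and shows how a single error can be mapped to several subdimensions at once, as the experts did in this study. The sketch is illustrative only; the dimension labels are abbreviated from Figure 1, and the helper function and example error identifier are hypothetical.

```python
# Illustrative subset of the expanded sociotechnical taxonomy.
# Dimension and subdimension labels are abbreviated from Figure 1;
# the validation helper and the example error ID are hypothetical.

TAXONOMY = {
    "technology": {
        "it_infrastructure": ["hardware", "system_software", "apps"],
        "app_layers": ["user_interface", "middleware", "backend", "data"],
        "type_of_use": ["administrative", "clinical"],
    },
    "process": {
        "administrative": ["billing", "scheduling", "inventory_management"],
        "it": ["software_development", "implementation_and_maintenance", "training_and_support"],
        "clinical": ["patient_record_management", "clinical_pathways", "physician_notes"],
    },
    "people": ["patients", "clinical_staff", "administrative_staff"],
    "environment": ["care_setting", "regulatory", "culture"],
}


def is_valid_subdimension(dimension: str, subdimension: str) -> bool:
    """Check that a (dimension, subdimension) pair exists in the taxonomy."""
    entry = TAXONOMY.get(dimension)
    if entry is None:
        return False
    if isinstance(entry, dict):
        return any(subdimension in leaves for leaves in entry.values())
    return subdimension in entry


# An error can be mapped to multiple subdimensions, as in Table 2.
error_mapping = {
    "error_01_copy_paste_wrong_chart": [
        ("people", "clinical_staff"),
        ("technology", "user_interface"),
        ("process", "training_and_support"),
    ]
}

for error_id, pairs in error_mapping.items():
    # Every expert mapping must point to a defined subdimension.
    assert all(is_valid_subdimension(d, s) for d, s in pairs), error_id
```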


Research Design

The research design comprises 2 main steps: developing a shortlist of IT-related patient safety issues and classifying the root causes of these medical errors through the sociotechnical lens using expert interviews. Figure 2 depicts the flow of the study.

Figure 2. Research flow.

Error Collection Using Literature Review

In this study, we first developed an extended sociotechnical framework with a finer level of granularity. Next, we systematically reviewed the literature on patient safety and medical errors in Ovid-MEDLINE, Embase, and Web of Science, which are leading medical databases, in addition to Google Scholar. The systematic review process shown in Figure 2 aligns with the commonly used steps of the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines [37], as depicted in several exemplar papers [38-40]. The searches were performed using the following search terms: (“Patient Safety” OR “Medical”) AND (“issue” OR “error”) AND (“health information technology” OR “information technology”). Initially, the title, abstract, and index terms were used to screen published journal papers, conference papers, proceedings, case studies, and book chapters. We also used ancestral search to locate potentially relevant articles. Subsequently, the shortlisted papers were reviewed in full. Two reviewers performed the screening independently and met regularly to discuss the inclusion of studies; a third reviewer was consulted when there was a discrepancy. Interrater reliability indicated high agreement (Cohen κ=0.95).
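
For readers interested in how such an agreement statistic is computed, the following minimal sketch calculates Cohen κ for 2 reviewers' include/exclude decisions. The decision lists shown are made-up placeholders rather than the actual screening data, and the function is a generic implementation, not the exact procedure used in this study.

```python
from collections import Counter


def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for 2 raters over the same items (binary or categorical labels)."""
    assert len(rater_a) == len(rater_b) and rater_a, "need paired, nonempty ratings"
    n = len(rater_a)
    # Observed agreement: proportion of items both raters labeled identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's marginal label frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return 1.0 if expected == 1 else (observed - expected) / (1 - expected)


# Placeholder screening decisions (1 = include, 0 = exclude), not the study data.
reviewer_1 = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
reviewer_2 = [1, 0, 0, 1, 0, 0, 0, 0, 0, 0]
print(round(cohen_kappa(reviewer_1, reviewer_2), 2))
```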

Studies were included if they addressed patient safety by identifying specific issues that occurred in health care settings and linked these errors to HIT. Studies were excluded if they were not available in full text in the final search; were not in English; or were reports, abstracts only, letters, or commentaries.

Expert Interviews

An invitation email to participate in the study was sent to the alumni of the University at Buffalo. The email contained the eligibility criteria consisting of ≥5 years of HIT experience and at least 1 IT-related professional certification. A separate invitation email mentioning the selection criteria was sent to medical experts through the Office of Business Coordination at the University at Buffalo. A minimum experience of 5 years working as a medical doctor or as a registered nurse was required to qualify for the interview. All participants who responded met the selection criteria and were included in the study.

To better understand the perspectives of different stakeholders, we conducted multiple semistructured interviews [41] with 6 IT and 5 medical experts to map the errors on the different dimensions of the expanded sociotechnical framework. Experts could map an error on multiple (or all) subdomains of the sociotechnical framework to reflect the different sociotechnical factors that could contribute to the error. Accounting for the different perspectives allowed us to understand how each group framed the predicates of the problem and to reflect on how each error could best be addressed. Interviewees were selected based on their domain experience, education, and industry certifications. The IT experts, who were recruited from the alumni list of the State University of New York at Buffalo, were software development professionals with a master’s degree and IT professional certifications, such as the Certified Scrum Master, Health Level Seven (HL7) control specialist, and Project Management Professional certifications. The minimum work experience cutoff for IT experts was 5 years in HIT, in addition to possessing at least 1 IT-related professional certification.

The IT experts interviewed had extensive IT experience (mean 10.33, SD 1.11 years), including significant HIT experience (mean 8.83, SD 2.03 years; see Multimedia Appendix 1 for brief profiles of the IT interviewees). The medical experts interviewed were physicians and registered nurses with broad primary care experience gained at multiple health care institutions across the United States and Canada; all are currently working with hospitals and institutions affiliated with the University at Buffalo (Multimedia Appendix 2). Medical experts had a mean experience of 16.6 (SD 7.33) years. The minimum and maximum years of HIT experience among IT experts were 5 and 12, respectively, and the work experience of medical experts ranged from 8 to 27 years. The questionnaire and interview process are detailed in Multimedia Appendix 3. Experts were asked to provide their opinions on why the selected errors (Multimedia Appendix 4 [42-48]) occurred and how the errors could be prevented. The extensive experience of both the IT and medical experts in their respective domains qualifies them to map medical errors on the sociotechnical framework. The study was approved in November 2019 (IRB# STUDY00003838).


Search Results

The literature search resulted in 344 articles, 141 of which were duplicates. After screening the remaining articles based on their content, we retained 36 articles [10,28,42-47,49-76] that met the 2 criteria set for the study. We then extracted 67 unique patient safety events from these articles, from which 12 specific issues related to IT use in the hospital setting were shortlisted; the remaining errors occurred outside a hospital setting and were excluded from the study. The process followed the PRISMA methodology [37], as detailed in Figure 3. The error descriptions include the error context, structured according to the problems, interventions, comparisons, and outcomes model commonly used in literature reviews [37]. The articles describing the errors contained a clear purpose, literature review, research methodology, results, and conclusions.

Figure 3. Data collection method.

Study Characteristics and Error Classification

In this study, experts categorized errors based on their opinion of where the source of each error lies. Experts were provided with the definitions of the elements of the framework and were informed that an error could result from multiple sources. They were asked to map each error at the lowest level of one or multiple dimensions of the sociotechnical framework. The authors then interacted with the experts to understand the reasons behind their mapping selections; these interactions included questions about the best ways to address the problems and prevent them from recurring. In line with the extant literature on data analysis and coding in qualitative research [77,78], the expert interviews were subsequently deconstructed into keywords and phrases and then grouped into ideas and concepts. The output of the analysis is summarized in the key observations below. Consider, for example, Error 1: “Copied and pasted charting in the wrong window: Before, you could not easily get into someone else's chart accidentally...because you would have to pull the chart and open it.”

Medical experts highlighted several contextual issues, such as the number of patients a health care provider is scheduled to see in a short time frame, unfamiliarity with a new electronic medical record system, nurse transitions around the time of the error, and confusion due to patients having the same name. They emphasized process controls, including failure modes, as a potential fix. The technology experts discussed the lack of notification, poor user interface, and lack of end-user training as critical factors in this error. Next, consider Error 2: “Incompatible data standards across multiple mobile applications led to the missing of vital data fields, which led to information loss.”

As with the first example, medical experts attributed this error to system software–related interoperability issues. They also highlighted the many changes in the International Classification of Diseases (ICD) during the transition from ICD-9 to ICD-10 as an example of a situation that could lead to such errors. Technology experts, however, emphasized data formats, data transfer protocols, and service-oriented architecture as potential causes of errors.

Although we have detailed only 2 instances here, the experts reviewed all 12 errors and identified the most likely set of dimensions to which each error could be attributed. The errors used in the study are listed in Multimedia Appendix 4, and the results of analyzing these data are presented in Table 2, followed by several key observations.

Table 2. Classification by medical and IT experts.
Errors | Classification by medical experts | Classification by ITa experts
Nurse was supposed to enter a prescription for Amoxicillin 250 mg PO q8h×7 days (21 dispensed). However, the nurse failed to change the default dosage amount and dispensed too much medication (30 dispensed)
  • UIb-clinical app implementation and maintenance
  • Clinical staff
  • UI-clinical app software development
  • Clinical staff
Copied and pasted charting in the wrong window: “Before, you could not easily get into someone else’s chart accidentally...because you would have to pull the chart and open it”
  • Clinical staff
  • Clinical app
  • Training
  • Clinical staff
  • In-patient
In a general practice ward, the doctor consulted a patient using another patient’s records and prescribed medications according to the wrong records. The patient died on the same day after taking the medication. No further details were available
  • Clinical staff
  • Clinical UI
  • Clinical middleware
  • Clinical UI
  • Implementation and maintenance
  • Staff-admin (IT)
The receptionist intended to alert the general practitioner via the practice software about a patient with chest pain but instead sent the message to himself. The patient later died from a myocardial infarction
  • UI-patient pathways
  • Clinical staff
  • UI-clinical app software development
  • Training and support
  • Patient pathways
A patient received only half of their usual quantity of blood pressure medication because a repeat prescription for the medication did not transfer to a new software system when the patient’s historical records were migrated. Because they did not have enough medication, the patient tried to stretch out the old dose by taking the medication on alternate days. The patient had a stroke but made a full recovery.
  • Software-systems
  • Patient pathways
  • Patient
  • Data-clinical
  • Software development
  • Implementation and maintenance
A child had a full body x-ray. Some of the images went missing from the archival system where they were digitized. The x-ray was repeated to acquire the missing images, re-exposing the child to high levels of radiation
  • Software-systems
  • Patient pathways
  • Data-clinical
A compound in high demand, such as rifampicin, was not listed in the computerized physician order entry system. The consequence was that the physician could not order rifampicin.
  • Data-clinical
  • Ancillary
  • In-patient
  • Culture
  • Data-clinical
  • Software development
  • Staff-admin (operations)
  • Culture
When an update is made to the frequency field on an existing prescription, the frequency schedule ID is not simultaneously updated on new orders sent to the pharmacy via (application)
  • Software-development
  • Clinical-people
  • Software-systems
  • Data-clinical
  • Staff-admin (IT)
  • Software-systems
Monitoring and eavesdropping on patient vital signs by hacking into the packet transfer from an Internet of Things device to the central system
  • Middleware
  • Maintenance
  • People-staff (operations)
  • Compliance
  • Security
  • System software
  • Data-clinical
  • Software development
  • Compliance
  • Security
Vulnerabilities of the hospital’s IOT devices were exploited to initiate a denial-of-service attack to bring down the hospital’s servers, which disrupted normal functioning
  • Hardware
  • Software
  • IT implementation
  • Compliance
  • Security
  • Compliance
  • Security
Use of portable devices that are not password protected makes patient records vulnerable to invasion of privacy
  • Data-clinical
  • Software-development
  • Maintenance
  • Compliance
  • Privacy
  • System software
  • Software-development
  • Security
Incompatible data standards across multiple mobile applications led to the missing of vital data fields which led to information loss
  • Software-systems
  • Software-development
  • Data-clinical

aIT: information technology.

bUI: user interface.


Principal Findings

Some of the crucial observations include the following: (1) the identified potential sources of the errors and solution areas differed considerably between clinicians and IT specialists; (2) both groups identified multiple factors as potential causes of the errors; (3) the clinicians often focused on postproduction issues (eg, implementation, maintenance, training, context, and the way the application is used) as causal factors; (4) IT experts focused on software functionality, software development, and technical implementation issues as causal factors; (5) on most occasions when IT experts identified an issue as a “data” problem, clinicians thought that the problem lay elsewhere, such as in the software system, software development, or patient pathways; (6) both groups were congruent on issues of compliance and security; and (7) IT experts rarely identified clinical pathways or workflows as an issue.
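
One illustrative way to quantify the divergence summarized in these observations is to compare, for each error, the sets of subdimensions selected by the 2 expert groups using a simple set-overlap (Jaccard) score. The sketch below is not part of the study's analysis; the example mappings are abbreviated placeholders loosely adapted from Table 2.

```python
def jaccard(set_a, set_b):
    """Set overlap between two classifications; 1.0 = identical, 0.0 = disjoint."""
    if not set_a and not set_b:
        return 1.0
    return len(set_a & set_b) / len(set_a | set_b)


# Placeholder mappings for a single error, loosely abbreviated from Table 2.
medical_classification = {"clinical_staff", "clinical_app", "training"}
it_classification = {"clinical_staff", "in_patient"}

# A low score indicates a large perceptual gap between the two groups.
print(f"agreement on this error: {jaccard(medical_classification, it_classification):.2f}")
```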

The classification of the identified medical errors using the framework is presented in Table 2. The continued occurrence of many errors is a reminder that current HIT software testing and validation do not seem adequate to ensure the error-free functioning of software within the health care organization. The attribution of the errors to different aspects of the sociotechnical framework by clinicians and IT professionals shows that technologists and clinicians generally differ in their perspectives on the factors that impact IT-related safety events. Software experts are often not acclimatized to the environment in which HIT software and tools are used, which could be a contributing cause of the problem.

Although IT and medical experts’ perceptions are similar regarding security and privacy, IT specialists tend to assume that the issues are software, hardware, or user interface related. In contrast, clinicians tend to consider environmental, contextual, and process factors as contributors to patient safety events. The benefit of such a classification is that it encourages the designers and developers who fix the errors to consider the artifact's environment and the people using the artifact. A key realization is that such errors will continue to occur if health IT system developers do not fully grasp the importance of technology functioning in a care delivery environment where patient needs are paramount.

A careful review of the IT experts’ classification of errors highlights the view that IT experts consider technology as the source of most errors and suggest solutions that are mostly technical. The IT experts highlighted software systems and software development as the top 2 sources of most errors, and their suggested fixes mostly revolved around the software. However, a common refrain that accompanied their answers was, “The doctor should double-check...” In contrast, clinicians tended, for the most part, to assign the source of errors to the people, process, and contextual (environmental) dimensions.

The difference in perspective could be explained by the fact that clinicians tend to deal with the system after implementation, whereas IT experts tend to look at the same problem from an IT development perspective. For example, when IT experts were asked how they would prevent a doctor from using the wrong chart when multiple charts are open (Error 1), the answer was always to restrict access to 1 open chart at a time. However, clinicians prefer having multiple windows open so that they can quickly consult with multiple patients in different rooms without having to close and reopen charts. For them, the issues are “How easy is it for a physician to realize the mistake?” and “Physicians should still be able to open multiple charts.” These differing perspectives between the designers and developers of the technology and its users can contribute to medical errors.

The development teams of clinical applications typically include clinicians who provide domain expertise. However, our study indicates that this may not be sufficient as IT experts do not fully grasp the clinical environment and how workloads and other patient-related variabilities impact the use of the software. Therefore, as a future investigation, we suggest that software companies immerse developers in clinical environments for a short period, so that the understanding of the environment is built into their psyche and translates into a more robust design.

HIT systems can be made less error prone if programmers and systems developers understand the health care organization's operating environment. Current systems lack fail-safe mechanisms that could prevent some of these errors. For example, consider the documented error, “the nurse was supposed to enter a prescription...the nurse failed to change the default amount and dispensed too much medication”; from a software perspective, better checks and warnings can be developed. In this specific instance, a system challenge asking the nurse to review the dosing amount could have prevented the problem. From a process perspective, nurses could be trained to reexamine the dosage, and creating a poka-yoke (such as a check-off box for the dose amount) would force nurses to check the dosing before refilling prescriptions. As the clinical and IT experts suggested slightly different predicates for the error, a solution that addresses the issue from both a technical perspective and a process and workforce-training perspective would provide multiple layers of defense. The different views expressed by IT and clinical experts can thus be used to create complementary technical and process solutions, yielding a more robust defense against these types of errors.
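
A fail-safe of the kind described above can be sketched as a simple predispense check that blocks an order when the entered quantity is still the unedited system default or does not match the prescription, forcing an explicit confirmation. The function, exception, and parameter names below are hypothetical; this is a minimal illustration of the poka-yoke idea under those assumptions, not an implementation from any specific HIT system.

```python
class DoseConfirmationRequired(Exception):
    """Raised when the order cannot proceed without an explicit clinician confirmation."""


def check_dispense_quantity(entered_qty, prescribed_qty, default_qty, confirmed=False):
    """Poka-yoke style check before a prescription is dispensed.

    Allows the order only when the entered quantity matches the prescription and is
    not simply the unedited system default; otherwise requires explicit confirmation.
    """
    if entered_qty == prescribed_qty and entered_qty != default_qty:
        return True  # quantity was reviewed and matches the prescription
    if not confirmed:
        raise DoseConfirmationRequired(
            f"Entered quantity {entered_qty} (system default {default_qty}) does not match "
            f"prescribed quantity {prescribed_qty}; please review and confirm."
        )
    return True


# Example: amoxicillin 250 mg PO q8h x 7 days -> 21 doses prescribed,
# but the default of 30 was left unchanged (the documented error above).
try:
    check_dispense_quantity(entered_qty=30, prescribed_qty=21, default_qty=30)
except DoseConfirmationRequired as challenge:
    print(challenge)  # the system challenge that could have prevented the error
```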

Limitations and Future Studies

The results of this study should be interpreted cautiously, as it has several limitations. The first is the small number of participants: only 11 interviews, comprising 5 medical providers and 6 HIT professionals, were conducted. Therefore, this study should be considered a pilot study suggesting differences in the mental models of clinical and technical staff, which potentially lead to ineffective systems analysis and ultimately manifest as errors in practice. In addition, both the IT and medical experts have, for the most part, acquired their education and expertise at affiliated institutions in the northeastern United States. Future studies should use a nationally representative sample to examine the hypothesis that medical experts are more likely to attribute medical errors to contextual factors, whereas IT experts are more likely to attribute them to technical factors.

Second, we shortlisted 12 unique errors that occurred in a hospital setting; the findings of this study cannot be generalized beyond that context. Furthermore, we extracted the errors used in this study from articles written in the English language. Future studies could examine errors that occurred in medical homes, patients’ homes, or other nonhospital settings or include studies written in other languages.

Third, the study did not examine errors that were discovered by HIT users before the occurrence of a patient safety event. Future studies should examine near-miss errors to determine their potential root causes and fixes using the lens of sociotechnical theory.

Conclusions

This study classifies medical errors gathered from the extant literature based on an expanded sociotechnical framework. Interviews with health care and IT experts reveal differing perspectives on why medical errors occur in clinical settings. Health care experts were more likely to attribute the source of an error to the implementation and use of an IT tool, whereas IT experts were more likely to identify software design and functionality as causal factors of medical errors. From the results of this study, we offer several error-prevention prescriptions that can be tested in future research. First, IT experts should observe the functioning of HIT postimplementation and collect metrics related to its impact on (1) physician consultation time, (2) physician efficiency, (3) the patient-physician relationship, (4) training needs, and (5) how the software fits into the workflow and culture of the organization. Second, software developers should be trained to be sensitive to provider and patient needs because their lack of exposure to postproduction issues and usage contexts leads to applications that do not cater to all user situations; understanding these situations may lead to better software constraints and improved user training. Although software development teams have historically included clinicians as business analysts or subject matter experts to bridge the gap, development teams will be better served by more immersive training and exposure to clinical environments, leading to better software design and software implementation strategies.

Authors' Contributions

TN, PM, R Sharman, and R Singh contributed equally.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Interviewees—information technology experts.

DOCX File , 15 KB

Multimedia Appendix 2

Interviewees—medical experts.

DOCX File , 13 KB

Multimedia Appendix 3

Interview process and questionnaire.

DOCX File , 13 KB

Multimedia Appendix 4

List of errors.

XLSX File (Microsoft Excel File), 11 KB

  1. Buntin M, Burke MF, Hoaglin MC, Blumenthal D. The benefits of health information technology: a review of the recent literature shows predominantly positive results. Health Aff (Millwood) 2011 Mar;30(3):464-471 [FREE Full text] [CrossRef] [Medline]
  2. Kruse CS, Beane A. Health information technology continues to show positive effect on medical outcomes: systematic review. J Med Internet Res 2018 Feb 05;20(2):41 [FREE Full text] [CrossRef] [Medline]
  3. Mishuris RG, Linder JA, Bates DW, Bitton A. Using electronic health record clinical decision support is associated with improved quality of care. Am J Manag Care 2014 Oct 01;20(10):445-452 [FREE Full text] [Medline]
  4. Jones S, Rudin RS, Perry T, Shekelle PG. Health information technology: an updated systematic review with a focus on meaningful use. Ann Intern Med 2014 Jan 07;160(1):48-54 [FREE Full text] [CrossRef]
  5. Menachemi N, Rahurkar S, Harle CA, Vest JR. The benefits of health information exchange: an updated systematic review. J Am Med Inform Assoc 2018 Sep 01;25(9):1259-1265 [FREE Full text] [CrossRef] [Medline]
  6. Kim M, Coiera E, Magrabi F. Problems with health information technology and their effects on care delivery and patient outcomes: a systematic review. J Am Med Inform Assoc 2017 Mar 01;24(2):246-250 [FREE Full text] [CrossRef] [Medline]
  7. Sittig D, Wright A, Coiera E, Magrabi F, Ratwani R, Bates DW, et al. Current challenges in health information technology-related patient safety. Health Informatics J 2020 Mar;26(1):181-189 [FREE Full text] [CrossRef] [Medline]
  8. Auraaen A, Slawomirski L, Klazinga N. The economics of patient safety in primary and ambulatory care. OECD Health Working Papers 2018 Nov 29;106:1-57 [FREE Full text] [CrossRef]
  9. Van Den Bos J, Rustagi K, Gray T, Halford M, Ziemkiewicz E, Shreve J. The $17.1 billion problem: the annual cost of measurable medical errors. Health Aff (Millwood) 2011 Apr;30(4):596-603 [FREE Full text] [CrossRef] [Medline]
  10. Lawes S, Grissinger M. Medication errors attributed to health information technology. Pennsylvania Patient Safety Advisory 2017 Mar;9(1):1 [FREE Full text]
  11. Kohn L, Corrigan J. To err is human: building a safer health system. Washington, DC: National Academy Press 2000. [CrossRef] [Medline]
  12. Cooper R, Foster M. Sociotechnical systems. American Psychologist 1971;26(5):467-474. [CrossRef]
  13. Berg M, Toussaint P. The mantra of modeling and the forgotten powers of paper: a sociotechnical view on the development of process-oriented ICT in health care. International J of Med Informatics 2003 Mar;69(2-3):223-234. [CrossRef]
  14. Sittig D, Singh H. A new sociotechnical model for studying health information technology in complex adaptive healthcare systems. Qual Saf Health Care 2010 Oct;19 Suppl 3:68-74 [FREE Full text] [CrossRef] [Medline]
  15. Craig S, Kodate N. Understanding the state of health information in Ireland: A qualitative study using a socio-technical approach. Int J Med Inform 2018 Jun;114:1-5. [CrossRef] [Medline]
  16. Woods D, Hollnagel E. Joint cognitive systems: patterns in cognitive systems engineering. In: Cognitive Ergonomics. Boca Raton, Florida: CRC Press; Mar 27, 2006.
  17. Carayon P, Karsh B, Gurses AP, Holden RJ, Hoonakker P, Schoofs Hundt A, et al. Macroergonomics in health care quality and patient safety. Reviews of Human Factors and Ergonomics 2013 Sep 26;8(1):4-54. [CrossRef]
  18. Sittig D, Singh H. Toward more proactive approaches to safety in the electronic health record era. Jt Comm J Qual Patient Saf 2017 Oct;43(10):540-547 [FREE Full text] [CrossRef] [Medline]
  19. Singh H, Upadhyay DK, Torretti D. Developing health care organizations that pursue learning and exploration of diagnostic excellence: an action plan. Acad Med 2020;95(8):1172-1178. [CrossRef]
  20. Rogers M, Sockolow PS, Bowles KH, Hand KE, George JE. Use of a human factors approach to uncover informatics needs of nurses in documentation of care. Int J Med Inform 2013 Nov;82(11):1068-1074 [FREE Full text] [CrossRef] [Medline]
  21. Menon S, Singh H, Giardina TD, Rayburn WL, Davis BP, Russo EM, et al. Safety huddles to proactively identify and address electronic health record safety. J Am Med Inform Assoc 2017 Mar 01;24(2):261-267 [FREE Full text] [CrossRef] [Medline]
  22. Castro GM, Buczkowski L, Hafner JM. The contribution of sociotechnical factors to health information technology–related sentinel events. The Joint Commission Journal on Quality and Patient Safety 2016 Feb;42(2):70-AP3. [CrossRef]
  23. Meeks DW, Takian A, Sittig DF, Singh H, Barber N. Exploring the sociotechnical intersection of patient safety and electronic health record implementation. J Am Med Inform Assoc 2014 Feb 01;21(e1):28-34 [FREE Full text] [CrossRef] [Medline]
  24. Donabedian A. The quality of medical care. Science 1978 May 26;200(4344):856-864. [CrossRef] [Medline]
  25. Mason-Blakley F, Habibi R, Weber J, Price M. Assessing stamp EMR with electronic medical record related incident reports: case study: manufacturer and user facility device experience database. In: Proceedings of 2017 IEEE International Conference on Healthcare Informatics (ICHI). 2017 Presented at: 2017 IEEE International Conference on Healthcare Informatics; 23-26 Aug. 2017; Park City, UT, USA   URL: https://doi.org/10.1109/ICHI.2017.97 [CrossRef]
  26. Feyer A, Williamson AM. The role of work practices in occupational accidents. In: Proceedings of the Human Factors Society Annual Meeting. 2016 Aug 08 Presented at: Human Factors and Ergonomics Society Annual Meeting Proceedings p. 1100-1104   URL: https://doi.org/10.1177/154193129103501517 [CrossRef]
  27. Mitchell R, Williamson A, Molesworth B. Application of a human factors classification framework for patient safety to identify precursor and contributing factors to adverse clinical incidents in hospital. Appl Ergon 2016 Jan;52:185-195 [FREE Full text] [CrossRef] [Medline]
  28. Mitchell R, Williamson A, Molesworth B. Use of a human factors classification framework to identify causal factors for medication and medical device-related adverse clinical incidents. Safety Science 2015 Nov;79:163-174 [FREE Full text] [CrossRef]
  29. Taib I, McIntosh AS, Caponecchia C, Baysari MT. A review of medical error taxonomies: a human factors perspective. Safety Science 2011 Jun;49(5):607-615 [FREE Full text] [CrossRef]
  30. Zhang J, Patel VL, Johnson TR, Shortliffe EH. A cognitive taxonomy of medical errors. J Biomed Inform 2004 Jun;37(3):193-204 [FREE Full text] [CrossRef] [Medline]
  31. Doumbouya MB, Kamsu-Foguem B, Kenfack H, Foguem C. Argumentative reasoning and taxonomic analysis for the identification of medical errors. Engineering Applications of Artificial Intelligence 2015 Nov;46:166-179. [CrossRef]
  32. Ndabu T, Sharman R, Mulgund P, Garg R. Design principles for healthcare applications: a patient safety focus. 2019 Jun 04 Presented at: Design Science Research in Information Systems and Technology; 2019, June 4; Worcester, Massachussetts   URL: https://drive.google.com/file/d/1Iu2SpWllKF3OsjZrBTSs9RACTJhNzz6y/view
  33. Kushniruk A. Evaluation in the design of health information systems: application of approaches emerging from usability engineering. Computers in Biology and Medicine 2002 May;32(3):141-149. [CrossRef]
  34. Lee H, Choi B. Knowledge management enablers, processes, and organizational performance: an integrative view and empirical examination. J Manage Inf Syst 2014 Dec 23;20(1):179-228. [CrossRef]
  35. Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci 2016 Mar 10;11(1):33 [FREE Full text] [CrossRef] [Medline]
  36. Samy GN, Ahmad R, Ismail Z. Security threats categories in healthcare information systems. Health Informatics J 2010 Sep;16(3):201-209 [FREE Full text] [CrossRef] [Medline]
  37. Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med 2009 Jul 21;6(7):e1000097 [FREE Full text] [CrossRef] [Medline]
  38. Chan WS, Leung AY. Use of social network sites for communication among health professionals: systematic review. J Med Internet Res 2018 Mar 28;20(3):117 [FREE Full text] [CrossRef] [Medline]
  39. Mulgund P, Sharman R, Anand P, Shekhar S, Karadi P. Data quality issues with physician-rating websites: systematic review. J Med Internet Res 2020 Sep 28;22(9):15916 [FREE Full text] [CrossRef] [Medline]
  40. McLennan S, Strech D, Reimann S. Developments in the frequency of ratings and evaluation tendencies: a review of german physician rating websites. J Med Internet Res 2017 Aug 25;19(8):299 [FREE Full text] [CrossRef] [Medline]
  41. Galletta A. Mastering the semi-structured interview and beyond: from research design to analysis and publication. In: New York University Scholarship Online. Oxfordshire, United Kingdom: Oxford University Press; 2013.
  42. Cheung KC, van der Veen W, Bouvy ML, Wensing M, van den Bemt PMLA, de Smet PAGM. Classification of medication incidents associated with information technology. J Am Med Inform Assoc 2014 Feb;21(e1):63-70 [FREE Full text] [CrossRef] [Medline]
  43. Kushniruk AW, Triola MM, Borycki EM, Stein B, Kannry JL. Technology induced error and usability: the relationship between usability problems and prescription errors when using a handheld application. Int J Med Inform 2005 Aug;74(7-8):519-526. [CrossRef] [Medline]
  44. Leyden J. Medical device vuln allows hackers to falsify patients' vitals. McAfee: Patient monitoring systems open to hack attacks. 2018 Aug 14.   URL: https://www.theregister.co.uk/2018/08/14/patient_monitor_hack/ [accessed 2020-08-11]
  45. Magrabi, Baker M, Sinha I, Ong MS, Harrison S, Kidd MR, et al. Clinical safety of England's national programme for IT: a retrospective analysis of all reported safety events 2005 to 2011. Int J Med Inform 2015 Mar;84(3):198-206 [FREE Full text] [CrossRef] [Medline]
  46. Myers RB, Jones SL, Sittig DF. Review of Reported Clinical Information System Adverse Events in US Food and Drug Administration Databases. Appl Clin Inform 2011;2(1):63-74 [FREE Full text] [CrossRef] [Medline]
  47. Williams P, Woodward A. Cybersecurity vulnerabilities in medical devices: a complex environment and multifaceted problem. MDER 2015 Jul:305 [FREE Full text] [CrossRef]
  48. Ratanawongsa N, Matta GY, Bohsali FB, Chisolm MS. Reducing misses and near misses related to multitasking on the electronic health record: observational study and qualitative analysis. JMIR Hum Factors 2018 Feb 06;5(1):e4 [FREE Full text] [CrossRef] [Medline]
  49. Ammenwerth E, Schnell-Inderst P, Machan C, Siebert U. The effect of electronic prescribing on medication errors and adverse drug events: a systematic review. J Am Med Inform Assoc 2008;15(5):585-600 [FREE Full text] [CrossRef] [Medline]
  50. Ash J, Sittig DF, Campbell EM, Guappone KP, Dykstra RH. Some unintended consequences of clinical decision support systems. AMIA Annu Symp Proc 2007 Oct 11:26-30 [FREE Full text] [Medline]
  51. Ash J, Sittig DF, Poon EG, Guappone K, Campbell E, Dykstra RH. The extent and importance of unintended consequences related to computerized provider order entry. J Am Med Inform Assoc 2007;14(4):415-423 [FREE Full text] [CrossRef] [Medline]
  52. Bates D, Cohen M, Leape LL, Overhage JM, Shabot MM, Sheridan T. Reducing the frequency of errors in medicine using information technology. J Am Med Inform Assoc 2001;8(4):299-308 [FREE Full text] [CrossRef] [Medline]
  53. Bates DW. The costs of adverse drug events in hospitalized patients. JAMA 1997 Jan 22;277(4):307 [FREE Full text] [CrossRef]
  54. Borycki E, Dexheimer JW, Cossio C, Gong Y, Jensen S, Kaipio J, et al. Methods for addressing technology-induced errors: the current state. Yearb Med Inform 2018 Mar 06;25(01):30-40 [FREE Full text] [CrossRef]
  55. Campbell E, Sittig DF, Ash JS, Guappone KP, Dykstra RH. Types of unintended consequences related to computerized provider order entry. J Am Med Inform Assoc 2006 Sep 01;13(5):547-556 [FREE Full text] [CrossRef]
  56. Clarke A, Adamson J, Watt I, Sheard L, Cairns P, Wright J. The impact of electronic records on patient safety: a qualitative study. BMC Med Inform Decis Mak 2016 Jun 04;16:62 [FREE Full text] [CrossRef] [Medline]
  57. Coiera E, Ash J, Berg M. The unintended consequences of health information technology revisited. Yearb Med Inform 2018 Mar 06;25(01):163-169 [FREE Full text] [CrossRef]
  58. Colpaert K, Claus B, Somers A, Vandewoude K, Robays H, Decruyenaere J. Impact of computerized physician order entry on medication prescription errors in the intensive care unit: a controlled cross-sectional trial. Crit Care 2006 Feb;10(1):R21 [FREE Full text] [CrossRef] [Medline]
  59. Cullen D, Sweitzer BJ, Bates DW, Burdick E, Edmondson A, Leape LL. Preventable adverse drug events in hospitalized patients: a comparative study of intensive care and general care units. Crit Care Med 1997 Aug;25(8):1289-1297 [FREE Full text] [CrossRef] [Medline]
  60. Harrison A, Siwani R, Pickering BW, Herasevich V. Clinical impact of intraoperative electronic health record downtime on surgical patients. J Am Med Inform Assoc 2019 Oct 01;26(10):928-933 [FREE Full text] [CrossRef] [Medline]
  61. Nebeker JR, Hoffman JM, Weir CR, Bennett CL, Hurdle JF. High rates of adverse drug events in a highly computerized hospital. Arch Intern Med 2005 May 23;165(10):1111-1116 [FREE Full text] [CrossRef] [Medline]
  62. Karsh B, Weinger MB, Abbott PA, Wears RL. Health information technology: fallacies and sober realities. J Am Med Inform Assoc 2010;17(6):617-623 [FREE Full text] [CrossRef] [Medline]
  63. Kaushal R, Shojania KG, Bates DW. Effects of computerized physician order entry and clinical decision support systems on medication safety: a systematic review. Arch Intern Med 2003 Jun 23;163(12):1409-1416 [FREE Full text] [CrossRef] [Medline]
  64. Koppel R, Metlay JP, Cohen A, Abaluck B, Localio AR, Kimmel SE, et al. Role of computerized physician order entry systems in facilitating medication errors. JAMA 2005 Mar 09;293(10):1197-1203 [FREE Full text] [CrossRef] [Medline]
  65. Kumar P, Lee HJ. Security issues in healthcare applications using wireless medical sensor networks: a survey. Sensors (Basel) 2012;12(1):55-91 [FREE Full text] [CrossRef] [Medline]
  66. La Pietra L, Calligaris L, Molendini L, Quattrin R, Brusaferro S. Medical errors and clinical risk management: state of the art. Acta Otorhinolaryngol Ital 2005 Dec;25(6):339-346 [FREE Full text] [Medline]
  67. Lapetina E, Armstrong EM. Preventing errors in the outpatient setting: a tale of three states. Health Aff (Millwood) 2002;21(4):26-39 [FREE Full text] [CrossRef] [Medline]
  68. Magrabi F, Ammenwerth E, Hyppönen H, de Keizer N, Nykänen P, Rigby M, et al. Improving evaluation to address the unintended consequences of health information technology. Yearb Med Inform 2018 Mar 06;25(01):61-69 [FREE Full text] [CrossRef]
  69. Martin G, Ghafur S, Cingolani I, Symons J, King D, Arora S, et al. The effects and preventability of 2627 patient safety incidents related to health information technology failures: a retrospective analysis of 10 years of incident reporting in England and Wales. The Lancet Digital Health 2019 Jul;1(3):127-135 [FREE Full text] [CrossRef]
  70. Metzger J, Welebob E, Bates DW, Lipsitz S, Classen DC. Mixed results in the safety performance of computerized physician order entry. Health Aff (Millwood) 2010 Apr;29(4):655-663 [FREE Full text] [CrossRef] [Medline]
  71. Pontefract S, Hodson J, Slee A, Shah S, Girling AJ, Williams R, et al. Impact of a commercial order entry system on prescribing errors amenable to computerised decision support in the hospital setting: a prospective pre-post study. BMJ Qual Saf 2018 Mar 23;27(9):725-736 [FREE Full text] [CrossRef]
  72. Schiff G. Medical error: a 60-year-old man with delayed care for a renal mass. JAMA 2011 May 11;305(18):1890-1898 [FREE Full text] [CrossRef] [Medline]
  73. Turner P, Kushniruk A, Nohr C. Are we there yet? Human factors knowledge and health information technology - the challenges of implementation and impact. Yearb Med Inform 2017 Aug;26(1):84-91 [FREE Full text] [CrossRef] [Medline]
  74. Vélez-Díaz-Pallarés M, Álvarez Díaz AM, Gramage Caro T, Vicente Oliveros N, Delgado-Silveira E, Muñoz García M, et al. Technology-induced errors associated with computerized provider order entry software for older patients. Int J Clin Pharm 2017 Aug;39(4):729-742 [FREE Full text] [CrossRef] [Medline]
  75. Wang Y, Coiera E, Gallego B, Concha OP, Ong MS, Tsafnat G, et al. Measuring the effects of computer downtime on hospital pathology processes. J Biomed Inform 2016 Feb;59:308-315 [FREE Full text] [CrossRef] [Medline]
  76. Zheng K, Abraham J, Novak L, Reynolds T, Gettinger A. A survey of the literature on unintended consequences associated with health information technology: 2014–2015. Yearb Med Inform 2018 Mar 06;25(01):13-29 [FREE Full text] [CrossRef]
  77. Flick U. An introduction to qualitative research. University of Michigan 2002:1-310 [FREE Full text]
  78. Gonzalez M. The coding manual for qualitative research: a review. The Qualitative Report 2016;21(8):1546 [FREE Full text]


HIT: health information technology
ICD: International Classification of Diseases
IOT: Internet of Things
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
UI: user interface


Edited by B Price; submitted 29.06.20; peer-reviewed by H Hao, O Ben Assuli, A Gupta, E Kasaeyan Naeini, D Lyell; comments to author 19.08.20; revised version received 06.11.20; accepted 17.12.20; published 05.02.21

Copyright

©Theophile Ndabu, Pavankumar Mulgund, Raj Sharman, Ranjit Singh. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 05.02.2021.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Human Factors, is properly cited. The complete bibliographic information, a link to the original publication on http://humanfactors.jmir.org, as well as this copyright and license information must be included.