Background: The needs of the emergency department (ED) pose unique challenges to modern electronic health record (EHR) systems. A diverse caseload of high-acuity, high-complexity, and ambulatory presentations, all requiring multiple transitions of care, creates a rich environment in which to critically examine EHRs.
Objective: This investigation aims to capture and analyze the perspectives of EHR end users on the strengths, limitations, and future priorities for EHRs in the setting of the ED.
Methods: In the first phase of this investigation, a literature search was conducted to identify 5 key usage categories of ED EHRs. Using these key usage categories, a modified Delphi study was then conducted with a group of 12 panelists with expertise in both emergency medicine and health informatics. Across 3 rounds of surveys, panelists generated and refined a list of strengths, limitations, and key priorities.
Results: The findings from this investigation highlight panelists’ preference for features that maximize the functionality of basic clinical tasks over features of disruptive innovation.
Conclusions: By capturing the perspectives of end users in the ED, this investigation highlights areas for the improvement or development of future EHRs in acute care settings.
Modern electronic health record (EHR) systems face difficulties meeting the unique needs of the emergency department (ED) [- ]. High volumes of patients through the ED drive documentation burden; high-acuity cases demand efficient deployment of care measures; diagnostic uncertainty increases the need for clinical decision support tools; and the interdisciplinary, collaborative environment drives a need for EHRs to support efficient transitions of care [ ]. In addition to these challenges, changes to the field of emergency medicine over the last several decades have increased the need for highly efficient and capable information systems. As the complexity of patients’ presentations to the ED increases, measures of departmental crowding rise [ ]. The complexity and nuance of treatment plans further increase the need to leverage digital health tools in the management of complex patients to improve clinical decision-making and patient outcomes, albeit at the cost of increasingly complex digital systems [ - ]. The current COVID-19–mediated health human resource crisis has only exacerbated these challenges.
The International Organization for Standardization defines usability as “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use”. In the context of the ED, the specified goals of an EHR’s end users may take on a variety of perspectives, given the different demands of this clinical space. A study evaluating the user-centered design processes of 11 EHR developers found that more than half had limited to inadequate interactions with clinicians during the development of their products [ ]. Despite the complexity of the unique needs of an ED EHR, there is a gap in the literature examining the perspective of the end user in the emergency medicine setting.
Delphi methods are a validated survey technique for establishing consensus opinion from a panel of experts. The traditional Delphi process involves 3 rounds of information gathering: an initial round of open-ended, qualitative questions followed by 2 rounds of Likert-scale rankings that allow for relative prioritization [ ]. This process may be modified by introducing an initial set of parameters to narrow the scope of discussion [ - ]. A modified Delphi method offers the benefit of focusing discussion on specific attributes of a given problem. Delphi methods are unique in their ability to handle mixed types of information, both qualitative and quantitative in nature. They have previously been employed in the emergency medicine setting across several areas of investigation: role definition of allied health team members, the development of violence screening criteria, the establishment of violence reduction strategies, and the selection of key performance indicators [ - ]. Delphi methods therefore offer a validated means of synthesizing diverse perspectives on the current state of, and future improvements to, ED EHRs.
To support hospital systems and practitioners in developing future procurement criteria and prioritizing modifications, additions, or upgrades to their existing EHRs, we completed a systematic assessment of end user needs and priorities in the ED. This study aims to understand the nuances of physician end users’ perspectives on the ideal ED EHR.
Identification of Key Usage Categories
In phase 1 of our study, 2 independent reviewers completed a review of academic literature on MEDLINE to build a list of usage categories of EHRs. The reviewers also searched gray literature through web-based hand searches for topics related to information systems in acute care settings. After an iterative review of literature relating to both emergency medicine settings and EHRs, 5 usage categories were developed inductively by the 2 reviewers. The findings were discussed with a working group of 4 investigators with expertise in emergency medicine, health systems, and health informatics. The working group agreed on 5 proposed key usage categories, which were used as inputs to phase 2 to narrow the focus of discussion.
Establishing Group Consensus Through Delphi Methods
Phase 2 used Delphi methods involving sequential rounds of surveys and data dissemination to experts in both emergency medicine and information systems regarding their perspectives on each of the 5 usage categories. Expert panelists were recruited through purposive sampling, beginning with 4 investigators identifying candidates with expertise in both the clinical environment of the ED and health informatics at 6 tertiary- and quaternary-care centers across southwestern Ontario, including 3 level 1 trauma centers. The identified candidates were then invited to suggest other potential informants. In total, 12 expert panelists with extensive experience in both emergency medicine and health information systems were recruited across several hospital systems. The panelists spanned 3 separate disciplines (7 of 12 in emergency medicine, 3 of 12 in pediatric emergency medicine, and 2 of 12 in general internal medicine). Several panelists held multiple leadership roles in their departments, with 4 of 12 acting as either chief or deputy chief and 7 of 12 acting as department leads in quality and safety, virtual care, artificial intelligence and machine learning, or quality improvement. Several panelists also performed adjacent clinical duties, with 4 of 12 serving as trauma team leaders. Two panelists also held C-level positions in medical informatics at their respective hospital systems. All panelists were associated with the University of Toronto in teaching and academic roles.
The Delphi study was conducted in 3 rounds of surveys. Surveys were administered using the Research Electronic Data Capture (REDCap 12.0.29) tools hosted at the University of Toronto [ , ]. To reduce bias in both survey responses and response analysis, the identity of all panelists was kept anonymous throughout the Delphi rounds. Panelists and investigators were blinded to the authorship of panelists’ responses, and panelists were not aware of the identity of other members of the Delphi panel until the conclusion of the study. The outputs from each round were analyzed by 2 independent reviewers, and consensus was established before findings were circulated to panelists between rounds.
The first-round survey involved qualitative information gathering through free-text responses, which were analyzed using NVivo (version 12). First, responses were coded deductively using the usage categories defined in phase 1 of the study. Second, sentiment coding was performed using NVivo’s sentiment analysis, with manual adjustment and recoding as necessary based on consensus between the 2 independent reviewers. Outputs were circulated to panelists for review. The second-round survey gathered quantitative information on the perceived importance of first-round outputs using Likert scales, as well as qualitative free-text responses about areas of disagreement from the first round. The quantitative outputs from the second-round survey were analyzed using Microsoft Excel (MSO version 2205; Microsoft Inc) to generate descriptive statistics for the measured variables, and the qualitative outputs were circulated to the panelists. The third-round survey focused on establishing a ranked list of priorities from the second-round outputs with the highest perceived importance, resulting in a ranked list of priorities for each usage category.
Phase 2 received the approval of the Research Ethics Board through the University of Toronto (protocol #00040996).
In total, the perspectives captured by the expert panel spanned 6 separate hospital sites and 5 separate EHRs. Across all 3 rounds of surveys, there was full retention of the original cohort of 12 expert panelists, with no loss to follow-up between rounds. Using the 5 key usage categories established by the working group in phase 1 of the project (), the first round of surveys gathered free-form responses about the current needs of each category and generated a list of 10 features per usage category, for a total of 50 features. Through the second-round survey, the panelists narrowed the list to 25 features across the key usage categories. Finally, in the third-round survey, the panelists prioritized the top 5 features in each usage category relative to one another, for a total of 25 priorities ( ). Analysis of free-text responses produced statements of strengths and weaknesses for each category ( ). Several panelists raised ideas that may fall under the term of potential disruptive innovation, defined by Clayton Christensen as “an innovation that makes things simpler and more affordable, and ‘technology’ is a way of combining inputs of materials, components, information, labor, and energy into outputs of greater value” [ ]. Based on the defined priorities and the panelists’ free-text responses, possible features and innovations have been mapped to a typical journey through the ED, as a conceptualization of what an EHR might look like with these suggestions implemented ( ).
|Usage category|Definition|Example features|
|---|---|---|
|Information input|The methods by which patient information is added or modified by care providers through multiple mediums [- ]|Mobile device access, dictation support, and multidisciplinary access|
|Digital health tools|Features that augment or streamline the provision of care by providers [- ]|Clinical decision support and computerized physician order entry|
|Usability|The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use [, - ]|Personalized dashboards, customizable quick picks within order sets, and inbox and task management|
|Clinical workflow|EHRa features that impact patient flow through the ED [- ]|Multidisciplinary communication and tools for communicating with external care providers after a visit to the emergency department|
|Research and data analytics|EHR features that allow for the ability to investigate research questions or conduct quality improvement studies [- ]|Artificial intelligence and machine learning algorithms or adherence to interoperability standards|
aEHR: electronic health record.
Key priorities defined by Delphi outputs (ranked 1-5, with 1 indicating the highest priority and 5 the lowest).
Information input
- Support for multiauthor documentation
- Include the ability to input picture documentation
- Integrate digital ambient scribes to expedite note taking
- Enable quick picks or user favorites for easily accessed orders
- Auto-populate fields with information that has already been given during the visit (ie, triage assessment, consults from other services) or already available (ie, past visits or community health record databases)
Digital health tools
- Streamlined governance structures to support pushing and pulling data from an electronic health record (EHR)
- Integration of digital ambient scribes to expedite documentation time and order set suggestions
- Identification of high-risk patients (ie, poor prognosis and sepsis alerts)
- Order entry and clinical decision support that builds on existing history for a given patient and continues to build on this history for subsequent visits
- Embed clinical tools such as clinical practice guidelines or common risk stratification tools
Usability
- Improve inbox and task management within the EHR by allowing users to customize the layout of their inbox
- Streamline mobile access options that prioritize information input, similar to eCommerce or food delivery applications
- Implementation of customizable home screen
- Streamline access to other sources of information (ie, community health record databases and previous medication reconciliations)
- Streamline the number of required systems for different tasks or minimize disruption to workflow through improved integration
Clinical workflow
- Support of patients beyond the hospital setting, such as discharge instructions with prescriptions sent by email or SMS
- Support for uploading documentation templates
- Access imaging results within the EHR
- Ability to communicate with others both inside the hospital setting (ie, paging consults, porter services, and housekeeping) and beyond the hospital setting (ie, community physicians, and emergency medical services)
- Automatic data pulls from previous clinical documentation rather than manual chart review
Research and data analytics
- Improved governance structures that afford more flexibility to the end user with respect to access
- Increase information access using role-based access (ie, quality improvement lead, chief, and research roles), allowing for expedited data pulls and enabling queries for simple questions
- Enhance standardization of coded information (ie, diagnosis, chief complaints, and patient outcome) within sites and across sites
- Embedded quality improvement tools
- Embedded search engines to query and trend simple questions
Category I: Information Input
Overall, panelists preferred that current EHRs improve on existing capabilities before attempting potential disruptive innovations. Panelists specifically ranked digital ambient scribes, which process information from a patient–physician interview into a note in an attempt to reduce documentation burden, and auto-population of documentation from other sources of clinical information lower than basic functionality such as multiauthor documentation and support for documenting other forms of media. As a strength, panelists noted that EHRs have streamlined the collation and standardization of information. A limitation of current information input capabilities is the lack of support for multiauthor documentation, which increases the need to repeatedly gather and document information already collected by other members of the patient’s care team; this drives documentation burden and creates inefficiencies.
Category II: Digital Health Tools
Panelists largely believed that human factors, as opposed to the technical capacities of current EHRs, limit the implementation of digital health tools such as machine learning algorithms that provide clinical decision support. Furthermore, the priorities list shows that panelists prioritized tools supporting clinicians in acute care settings, such as identifying high-risk patients, over pulling previous information from other sources such as past charts or clinical portals. Panelists mostly expressed that EHRs have streamlined the ability to conduct repetitive, previously tedious tasks. However, they stated that innovation requires large amounts of coordination and health human resources, so although the potential for implementation may exist, there may not currently be the appetite or means to sustain such change.
Category III: Usability
The priorities of end users in this category revealed 2 schools of thought that may at first seem conflicting. On the one hand, there was interest in increased customizability within ED EHRs, such as customizable quick picks and inbox management. On the other hand, there was also an argument for adaptation on the part of the end user to the features and limitations of the EHR. Overall, panelists believed that EHRs have increased the standardization of care delivery through order sets vetted by central decision support teams, ensuring that orders meet current care standards. However, in their current form, EHRs are limited in the customization options they provide for end users, even with respect to personal workflow features such as inbox and task management, or “look and feel” customizations such as the layout of a given dashboard.
Category IV: Clinical Workflow
Panelists again prioritized basic functionality (ie, discharge planning and interdisciplinary communication) over disruptive innovation. Although EHRs have increased the ease of collaboration among teams in the ED through the collation of documentation from triage, panelists still raised concerns about the limitations of interoperability between hospital systems and other systems such as primary care EHRs. Additionally, even within a single hospital system, it was difficult to communicate with other services that did not use the same EHR or charting method (ie, different clinical systems or paper charting).
Category V: Research and Data Analytics
Overall, panelists expressed that fluid access to and usability of information remain limited. An undeniable strength of the EHR is that it has augmented the ability to collect, store, and access structured data. However, panelists identified that the ability to access these data in a meaningful way is still limited by the format of stored data. Although it is possible to access large volumes of information, the standardization of information input is lacking, such that any information sought for research purposes still requires manual recoding. Suggestions in this realm included improving drop-down menus to standardize documentation input.
The key usage categories developed in our investigation and the panelists’ priorities determined by Delphi outputs span several steps of a patient’s journey through the ED (). These priorities highlight the balancing act that must occur in each usage category in the development and deployment of ED EHRs. With respect to information input, support for multiauthor documentation helps reduce redundancy of information gathering and input, and support for innovations helps reduce documentation burden. With respect to digital health tools, improved governance structures could support the development and deployment of innovations that aid decision-making. With respect to usability, an EHR optimized for the ED would offer customizability options for workflow while maintaining strong standardization for the deployment of care, such as order sets. With respect to clinical workflow, support for communication beyond the hospital helps ensure efficient and safe patient discharges, while consolidated information systems ensure efficient access to completed investigations. With respect to research and data analytics, improved accessibility allows end users to contribute more to the development of new knowledge and useful clinical insights.
In his article titled “Why Doctors Hate Their Computers,” Gawande writes of EHRs: “I’ve come to feel that a system that promised to increase my mastery over my work has, instead, increased my work’s mastery over me.” His assertion centers mainly on the collection of large amounts of unused information from a patient–physician encounter, which drives documentation burden and decreases patient–physician interaction time. Previous studies have estimated that ED physicians may spend as much as 25% of the total time caring for a single patient on documentation [ ]. Aligned with the previous literature and clinical experiences of documentation burden, Gawande highlights a key issue: EHRs can decrease efficiency and become a burden rather than a valuable tool.
These concerns align with the findings of this investigation, with panelists broadly prioritizing functionality over disruptive innovation, and issues such as interoperability and the reduction of documentation burden prioritized across several usage categories. For example, with respect to information input, support for multiauthor documentation and picture integration was prioritized over features such as digital ambient scribes or auto-population from past documentation. Another example is seen in panelists’ priorities with respect to clinical workflow, where discharge communication methods were prioritized over auto-population of patient information from previous documentation. Panelists were sampled from a variety of care settings employing several different EHRs, suggesting that no single EHR vendor comprehensively captures the priorities identified in this investigation.
By examining the discrepancies between the panelists’ identified priorities and their qualitative responses on strengths and limitations, it is possible to identify areas for impactful improvement. For example, with respect to digital health tools, streamlined governance structures were identified as both a top priority () and a limitation ( ). Another usage category demonstrating this was research and data analytics, where panelists identified streamlined governance structures and increased role-based access as priorities ( ) and identified privacy as a limiting factor in gathering information ( ). Integrating this information identifies areas of high priority and can inform where system administrators can best optimize their own EHRs or build evidence-informed criteria for future acquisitions.
Compared to the deployment of Delphi methods for other emergency medicine clinical questions, the modifications in this investigation optimized for depth of discussion within defined usage categories. The specific modification to the traditional process entailed defining the 5 usage categories through a literature review, which subsequently served as inputs to the Delphi model. Other investigations either integrate the literature review as one of the 3 traditional rounds or rely on free-text responses as a means of providing a focus for discussion [, ]. A trade-off of this modification is that it prevents panelists from suggesting their own mental schema of EHR usage categories; however, this trade-off was made to achieve a deeper understanding of priorities within discrete categories. An additional benefit of a preliminary literature review is that focused discussion ensured concrete outputs from each round, which may have contributed to the complete retention of panelists across the 3 rounds of the Delphi process. Overall, through a preliminary literature review and a Delphi process with narrow targets based on prior inputs, the modified Delphi method strikes an appropriate balance between breadth and depth in the examination of ED EHRs.
One potential limitation of this study is the generalizability of its findings. Panelists were familiar with both ED care settings and health informatics at tertiary-care hospitals in southern Ontario, all with enterprise-wide deployments of their hospital EHR. This may lead to panelist-specific prioritization of clinically adjacent activities such as academic research or data organization. Subspecialty interests may introduce additional variance to the captured perspectives. Furthermore, this investigation focused on capturing the perspectives of physicians as end users and does not capture the perspectives of other disciplines that engage with an ED EHR.
Improving EHRs to effectively meet the unique demands of the ED requires a thorough understanding of the priorities of end users. A modified Delphi approach allows an in-depth analysis of the perspectives of expert panelists within discretely defined usage categories. Capturing the perspectives of an expert panel from tertiary and quaternary care centers across southwestern Ontario served by diverse EHR vendors, the findings of this study highlight end-user prioritization of functionality over disruptive innovation. At the provider level, these findings can prompt meaningful reflection and discussion with department leadership about how an EHR can fit local needs. At the institution level, they have implications for choosing future EHRs and adapting existing systems. At the developer level, they may further sensitize developers to the preferences of end users in high-acuity settings. Future discussions of EHR improvement should gather the perspectives of allied health professionals who also engage with EHRs, as well as of patients, who are the beneficiaries of improvements to information systems. Furthermore, comparing the perspectives gathered in the ED with those from other areas of the hospital would establish commonalities and common pain points and enhance our understanding of the information system preferences of end users.
The authors would like to thank expert panelists for their participation and contribution of their insights.
The data sets generated during and/or analyzed during the current study are available from the corresponding author on reasonable request.
The authors confirm contribution to the paper as follows: research conception and design: MY, AA, TJ, and SM; project implementation: MY and SM; manuscript preparation: MY, AA, TJ, and SM. All authors reviewed and approved the final version of the manuscript. The authors received no financial support for the research, authorship, or publication of this paper.
Conflicts of Interest
- Casalino E, Wargon M, Peroziello A, Choquet C, Leroy C, Beaune S, et al. Predictive factors for longer length of stay in an emergency department: a prospective multicentre study evaluating the impact of age, patient's clinical acuity and complexity, and care pathways. Emerg Med J 2014;31(5):361-368. [CrossRef] [Medline]
- Shim RS, Druss BG, Zhang S, Kim G, Oderinde A, Shoyinka S, et al. Emergency department utilization among Medicaid beneficiaries with schizophrenia and diabetes: the consequences of increasing medical complexity. Schizophr Res 2014;152(2-3):490-497 [FREE Full text] [CrossRef] [Medline]
- Moy AJ, Aaron L, Cato KD, Schwartz JM, Elias J, Trepp R, et al. Characterizing multitasking and workflow fragmentation in electronic health records among emergency department clinicians: using time-motion data to understand documentation burden. Appl Clin Inform 2021;12(5):1002-1013 [FREE Full text] [CrossRef] [Medline]
- Janke A, Venkatesh A. 117EMF trends in high-intensity billing and visit complexity of treat-and-release emergency department visits in the US, 2006-2018. Ann Emerg Med 2021;78(4):S47-S48. [CrossRef]
- Rowe BH, McRae A, Rosychuk RJ. Temporal trends in emergency department volumes and crowding metrics in a western Canadian province: a population-based, administrative data study. BMC Health Serv Res 2020;20(1):356 [FREE Full text] [CrossRef] [Medline]
- Fernandes M, Vieira SM, Leite F, Palos C, Finkelstein S, Sousa JMC. Clinical decision support systems for triage in the emergency department using intelligent systems: a review. Artif Intell Med 2020;102:101762. [CrossRef] [Medline]
- Horng S, Sontag DA, Halpern Y, Jernite Y, Shapiro NI, Nathanson LA. Creating an automated trigger for sepsis clinical decision support at emergency department triage using machine learning. PLoS One 2017;12(4):e0174708 [FREE Full text] [CrossRef] [Medline]
- Raja AS, Ip IK, Prevedello LM, Sodickson AD, Farkas C, Zane RD, et al. Effect of computerized clinical decision support on the use and yield of CT pulmonary angiography in the emergency department. Radiology 2012;262(2):468-474 [FREE Full text] [CrossRef] [Medline]
- ISO 9241-11:2018(en) Ergonomics of human-system interaction - part 11: usability: definitions and concepts. International Organization for Standardization. 2018. URL: https://www.iso.org/obp/ui/#iso:std:iso:9241:-11:ed-2:v1:en [accessed 2022-01-02]
- Ratwani RM, Fairbanks RJ, Hettinger AZ, Benda NC. Electronic health record usability: analysis of the user-centered design processes of eleven electronic health record vendors. J Am Med Inform Assoc 2015;22(6):1179-1182. [CrossRef] [Medline]
- Helmer O. Systematic use of expert opinions. RAND Corporation. 1967. URL: https://www.rand.org/pubs/papers/P3721.html [accessed 2023-02-17]
- Mitzman J, King AM, Fastle RK, Hopson LR, Hoyle JD, Levasseur KA, et al. A modified Delphi study for development of a pediatric curriculum for emergency medicine residents. AEM Educ Train 2017;1(2):140-150 [FREE Full text] [CrossRef] [Medline]
- Greenberg A, Angus H, Sullivan T, Brown AD. Development of a set of strategy-based system-level cancer care performance indicators in Ontario, Canada. Int J Qual Health Care 2005;17(2):107-114. [CrossRef] [Medline]
- Khan Y, Brown AD, Gagliardi AR, O'Sullivan T, Lacarte S, Henry B, et al. Are we prepared? The development of performance indicators for public health emergency preparedness using a modified Delphi approach. PLoS One 2019;14(12):e0226489 [FREE Full text] [CrossRef] [Medline]
- Ebrahimi M, Mirhaghi A, Mazlom R, Heydari A, Nassehi A, Jafari M. The role descriptions of triage nurse in emergency department: a Delphi study. Scientifica 2016;2016:5269815. [CrossRef]
- Morphet J, Griffiths D, Plummer V, Innes K, Fairhall R, Beattie J. At the crossroads of violence and aggression in the emergency department: perspectives of Australian emergency nurses. Aust Health Rev 2014;38(2):194-201. [CrossRef] [Medline]
- Paek SH, Jung JH, Kwak YH, Kim DK, Ryu JM, Noh H, et al. Development of screening tool for child abuse in the Korean emergency department: using modified Delphi study. Medicine (Baltimore) 2018;97(51):e13724 [FREE Full text] [CrossRef] [Medline]
- Wakai A, O'Sullivan R, Staunton P, Walsh C, Hickey F, Plunkett PK. Development of key performance indicators for emergency departments in Ireland using an electronic modified-Delphi consensus approach. Eur J Emerg Med 2013;20(2):109-114. [CrossRef] [Medline]
- Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap): a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009;42(2):377-381 [FREE Full text] [CrossRef] [Medline]
- Harris PA, Taylor R, Minor BL, Elliott V, Fernandez M, O'Neal L, REDCap Consortium. The REDCap consortium: building an international community of software platform partners. J Biomed Inform 2019;95:103208 [FREE Full text] [CrossRef] [Medline]
- Microsoft Excel: version 2205. Microsoft. URL: https://mspoweruser.com/office-version-2205-for-windows-insiders/ [accessed 2023-02-17]
- Christensen CM, Grossman JH, Hwang J. The Innovator's Prescription: A Disruptive Solution for Health Care. New York: McGraw-Hill Education; 2017.
- Cimino JJ. Improving the electronic health record: are clinicians getting what they wished for? JAMA 2013;309(10):991-992 [FREE Full text] [CrossRef] [Medline]
- Weiskopf NG, Weng C. Methods and dimensions of electronic health record data quality assessment: enabling reuse for clinical research. J Am Med Inform Assoc 2013;20(1):144-151 [FREE Full text] [CrossRef] [Medline]
- Chan KS, Fowles JB, Weiner JP. Review: electronic health records and the reliability and validity of quality measures: a review of the literature. Med Care Res Rev 2010;67(5):503-527. [CrossRef] [Medline]
- Kaufman DR, Sheehan B, Stetson P, Bhatt AR, Field AI, Patel C, et al. Natural language processing-enabled and conventional data capture methods for input to electronic health records: a comparative usability study. JMIR Med Inform 2016;4(4):e35 [FREE Full text] [CrossRef] [Medline]
- Handler JA, Feied CF, Coonan K, Vozenilek J, Gillam M, Peacock PR, et al. Computerized physician order entry and online decision support. Acad Emerg Med 2004;11(11):1135-1141 [FREE Full text] [CrossRef] [Medline]
- Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ 2005;330(7494):765 [FREE Full text] [CrossRef] [Medline]
- Bennett P, Hardiker NR. The use of computerized clinical decision support systems in emergency care: a substantive review of the literature. J Am Med Inform Assoc 2017;24(3):655-668 [FREE Full text] [CrossRef] [Medline]
- Bright TJ, Wong A, Dhurjati R, Bristow E, Bastian L, Coeytaux RR, et al. Effect of clinical decision-support systems: a systematic review. Ann Intern Med 2012;157(1):29-43 [FREE Full text] [CrossRef] [Medline]
- Khairat S, Burke G, Archambault H, Schwartz T, Larson J, Ratwani RM. Perceived burden of EHRs on physicians at different stages of their career. Appl Clin Inform 2018;9(2):336-347 [FREE Full text] [CrossRef] [Medline]
- Tang PC, Patel VL. Major issues in user interface design for health professional workstations: summary and recommendations. Int J Biomed Comput 1994;34(1-4):139-148. [CrossRef] [Medline]
- Linder JA, Schnipper JL, Tsurikova R, Melnikas AJ, Volk LA, Middleton B. Barriers to electronic health record use during patient visits. AMIA Annu Symp Proc 2006;2006:499-503 [FREE Full text] [Medline]
- Kaipio J, Lääveri T, Hyppönen H, Vainiomäki S, Reponen J, Kushniruk A, et al. Usability problems do not heal by themselves: national survey on physicians' experiences with EHRs in Finland. Int J Med Inform 2017;97:266-281 [FREE Full text] [CrossRef] [Medline]
- Fujita K, Onishi K, Takemura T, Kuroda T. The improvement of the electronic health record user experience by screen design principles. J Med Syst 2019;44(1):21. [CrossRef] [Medline]
- Wright A, Sittig DF, Ash JS, Bates DW, Feblowitz J, Fraser G, et al. Governance for clinical decision support: case studies and recommended practices from leading institutions. J Am Med Inform Assoc 2011;18(2):187-194 [FREE Full text] [CrossRef] [Medline]
- Spellman Kennebeck S, Timm N, Farrell MK, Spooner SA. Impact of electronic health record implementation on patient flow metrics in a pediatric emergency department. J Am Med Inform Assoc 2012;19(3):443-447 [FREE Full text] [CrossRef] [Medline]
- Mathison D, Chamberlain J. Evaluating the impact of the electronic health record on patient flow in a pediatric emergency department. Appl Clin Inform 2011;2(1):39-49 [FREE Full text] [CrossRef] [Medline]
- Salmasian H, Landman AB, Morris C. An electronic notification system for improving patient flow in the emergency department. AMIA Jt Summits Transl Sci Proc 2019;2019:242-247 [FREE Full text] [Medline]
- Verma A, Wang AS, Feldman MJ, Hefferon DA, Kiss A, Lee JS. Push-alert notification of troponin results to physician smartphones reduces the time to discharge emergency department patients: a randomized controlled trial. Ann Emerg Med 2017;70(3):348-356. [CrossRef] [Medline]
- Kannampallil T, Li Z, Zhang M, Cohen T, Robinson DJ, Franklin A, et al. Making sense: sensor-based investigation of clinician activities in complex critical care environments. J Biomed Inform 2011;44(3):441-454 [FREE Full text] [CrossRef] [Medline]
- Vankipuram A, Traub S, Patel VL. A method for the analysis and visualization of clinical workflow in dynamic environments. J Biomed Inform 2018;79:20-31 [FREE Full text] [CrossRef] [Medline]
- Kruse CS, Goswamy R, Raval Y, Marawi S. Challenges and opportunities of big data in health care: a systematic review. JMIR Med Inform 2016;4(4):e38 [FREE Full text] [CrossRef] [Medline]
- Kruse CS, Stein A, Thomas H, Kaur H. The use of electronic health records to support population health: a systematic review of the literature. J Med Syst 2018;42(11):214 [FREE Full text] [CrossRef] [Medline]
- Murdoch TB, Detsky AS. The inevitable application of big data to health care. JAMA 2013;309(13):1351-1352. [CrossRef] [Medline]
- Gawande A. Why doctors hate their computers. The New Yorker. 2018. URL: https://www.newyorker.com/magazine/2018/11/12/why-doctors-hate-their-computers [accessed 2022-09-01]
- Füchtbauer LM, Nørgaard B, Mogensen CB. Emergency department physicians spend only 25% of their working time on direct patient care. Dan Med J 2013;60(1):A4558 [FREE Full text] [Medline]
Abbreviations
ED: emergency department
EHR: electronic health record
REDCap: Research Electronic Data Capture
Edited by A Kushniruk; submitted 29.09.22; peer-reviewed by KH Kim, C Slightam, D Chrimes; comments to author 27.11.22; revised version received 16.01.23; accepted 11.02.23; published 10.03.23.
Copyright
©Matthew Yip, Alun Ackery, Trevor Jamieson, Shaun Mehta. Originally published in JMIR Human Factors (https://humanfactors.jmir.org), 10.03.2023.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Human Factors, is properly cited. The complete bibliographic information, a link to the original publication on https://humanfactors.jmir.org, as well as this copyright and license information must be included.