Published on 1.4.2022 in Vol 9, No 2 (2022): Apr-Jun

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/33630.
Evaluation of the First Year(s) of Physicians Collaboration on an Interdisciplinary Electronic Consultation Platform in the Netherlands: Mixed Methods Observational Study


Original Paper

1Department of General Practice and Elderly Care Medicine, University of Groningen, Groningen, Netherlands

2Prisma, Siilo Holding BV, Amsterdam, Netherlands

3See Acknowledgments

*these authors contributed equally

Corresponding Author:

Sanne M Sanavro, MD

Department of General Practice and Elderly Care Medicine

University of Groningen

PO Box 196

Groningen, 9700 AD

Netherlands

Phone: 31 50 3616731

Fax: 31 50 3632964

Email: sannesanavro@hotmail.com


Background: The complexity of health problems and the aging of the population place an ongoing burden on the health care system, with the general practitioner (GP) acting as the gatekeeper in primary care. In GPs' daily practice, collaboration with specialists and the exchange of knowledge from secondary care play a crucial role in this system. However, communication between primary and secondary care has shortcomings for health care workers who want to practice sustainable, patient-centered health care. Therefore, a new digital interactive platform was developed: Prisma.

Objective: This study aims to describe the development of a digital consultation platform (Prisma) to connect GPs with hospital specialists via the Siilo application and to evaluate the first 2 years of use, including consultations, topic diversity, and number of participating physicians.

Methods: We conducted a mixed methods observational study, analyzing qualitative and quantitative data for cases posted on the platform between June 2018 and May 2020. Any GP can post questions to an interdisciplinary group of secondary care specialists, with the platform designed to facilitate discussion and knowledge exchange for all users.

Results: In total, 3674 cases were posted by 424 GPs across 16 specialisms. Most questions and answers concerned diagnosis, nonmedical treatment, and medication. The median response time was 76 minutes, with medians per tile category ranging from 44 to 252 minutes. On average, 3 users engaged with each case (up to 7 specialists). Almost half of the internal medicine cases received responses from at least two specialisms in secondary care, contrasting with about one-fifth for dermatology. Of note, the growth in consultations was steepest for dermatology.

Conclusions: Digital consultations offer the possibility for GPs to receive quick responses when seeking advice. The interdisciplinary approach of Prisma creates opportunities for digital patient-centered networking.

JMIR Hum Factors 2022;9(2):e33630

doi:10.2196/33630


In the Dutch health care system, general practitioners (GPs) have a coordinating role as generalists, functioning as gatekeepers to secondary care. This model requires that patients initially consult a GP who provides expert generalist medical care for their health care problem and considers the need for referral to more specialist care.

Unfortunately, pressures on the health care system have increased due to the growth in both the chronicity and the complexity of health problems [1,2]. Although GPs care for over 95% of the medical problems that present during consultations, referral to secondary care has also increased, resulting in greater health care costs and growing waiting lists [3,4]. These issues could be addressed by providing GPs with closer support from secondary care, assuming there are effective routes for knowledge exchange [5-9]. However, the most commonly used tools for communication between primary and secondary care have important shortcomings. For example, GPs and hospital specialists are often not available at the same time, meaning that telephone conversations can be interruptive. Although e-consultations remove the need for simultaneous availability, they are limited by being monodisciplinary, one-on-one, and mostly noninteractive [6,10-13]. Digital response times may also vary by specialism. By contrast, team-based case collaboration on a patient-centered network of health care professionals could facilitate communication and knowledge transfer [14-16]. The secure Siilo app offers a useful platform to host such a service [17,18].

In this study, we describe the development of the Dutch Prisma platform within the secure Siilo app and evaluate the usage and consultations in the first 2 years since its introduction, including the diversity of topics and number of physicians involved.


Study Design

We performed a retrospective mixed methods study using quantitative information from the Prisma platform and a qualitative evaluation of consecutive cases posted on the platform from its inception in July 2018 to May 2020.

The Prisma Platform

The Prisma platform initially facilitated digital interprofessional consultation for patients with orthopedic problems, but more recently, it has expanded to include other specialties. GPs with full access to the closed digital environment of the platform generate cases by providing anonymized patient information with a question. All GPs and specialist users are connected in so-called tiles by specialty (eg, orthopedics, internal medicine, palliative care) to facilitate engagement by consultants with complementary expertise (eg, rheumatologists, orthopedic surgeons, sports medicine physicians, and radiologists participate via the orthopedics tile). All users can engage with each tile and upload attachments or links to relevant information, such as laboratory results, pictures, or guidelines. The main language used on the platform is Dutch.

Two GP groups are active on this platform: 1 with full access (able to generate cases and respond to others) and 1 with a read-only account. Specialists participated voluntarily, separate from their hospital work and without reimbursement for their activity on the platform. Because specialists were not reimbursed, the number of GPs was limited during the development phase to avoid overloading them. All users, both GPs and specialists, were located in various regions of the Netherlands. Specialists preferably respond within 24-48 hours by answering questions, seeking more information, or engaging in discussion. All GPs with access to the platform can read and respond to posted cases. In this way, the platform allows for a dynamic exchange of information and learning to support the GP in daily practice. Throughout the process, the GP remains responsible for the care provided to the patient and decides, in consultation with the patient, how to proceed with further treatment.

Data Collection

A data analyst at Siilo provided pseudonymized details for all consecutive cases, replacing usernames with a job title and a number (eg, GP-1, GP-2, neurologist-1). Each post was summarized as a user code, timestamp, and verbatim transcript, and these were grouped by case for each tile. Data were analyzed qualitatively and quantitatively. As we performed a retrospective descriptive study, we did not predefine our sample size.
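To make the structure of this export concrete, the following minimal sketch shows how usernames could be replaced with role-based codes and how posts could be grouped per case. The column names (user_name, job_title, case_id, tile, timestamp, text) and the use of Python with pandas are illustrative assumptions on our part, not the actual Siilo export format or tooling.

import pandas as pd

def pseudonymize(posts: pd.DataFrame) -> pd.DataFrame:
    """Replace each username with a job title plus a running number (eg, GP-1, neurologist-1)."""
    posts = posts.copy()
    # Number users consecutively within each job title, in order of first appearance
    first_seen = posts.drop_duplicates("user_name").copy()
    first_seen["rank"] = first_seen.groupby("job_title").cumcount() + 1
    code_map = {
        row.user_name: f"{row.job_title}-{row.rank}" for row in first_seen.itertuples()
    }
    posts["user_code"] = posts["user_name"].map(code_map)
    return posts.drop(columns=["user_name"])

def group_by_case(posts: pd.DataFrame) -> pd.DataFrame:
    """Summarize each post as user code, timestamp, and verbatim text, grouped per tile and case."""
    return (
        posts.sort_values("timestamp")[["tile", "case_id", "user_code", "timestamp", "text"]]
        .set_index(["tile", "case_id"])
    )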

Qualitative Analysis

Text files were imported into the Atlas.ti program [19] for qualitative assessment by a research group comprising 20 senior medical students (coders) supervised by an internist (SS), a medical sociologist (DJ), a GP epidemiologist (MB), and a senior researcher (HW). The Prisma affiliate (PK) was not involved in this phase.

We used a predefined coding tree to structure the qualitative assessment (Multimedia Appendix 1). Before applying this to all cases, a random sample of 10 cases was initially coded by all coders. The results of this preliminary coding were then checked in pairs and discussed in 5 subgroups with 2 supervisors. Coders were actively invited to discuss the applicability of codes and to add new codes if needed. After this, coders were grouped by tile and at least 50 cases per tile were coded in duplicate with mutual blinding. This was followed by group discussion in consensus meetings per subgroup, after which the remaining cases were coded.

The coding tree comprised the following: basic patient characteristics, such as age, gender, and comorbidity; the topic of the question; and both the type of question and the type of answer (eg, diagnostic, therapeutic, or referral for both). Codes for symptoms and diseases followed the International Classification of Primary Care (ICPC) [20], with multiple codes permitted.

Quantitative Analysis

All codes were imported into IBM SPSS (IBM Corp.) for quantitative analysis. We merged the 16 tiles into 5 categories based on similarities and group sizes: “internal medicine” included internal medicine, infectious disease, palliative care, and medically unexplained physical symptoms; “observation” included gastroenterology, neurology, pulmonology, rheumatology, and cardiology; “surgical” included orthopedics, urology, traumatology, and ear, nose, and throat disease; “female/child” included gynecology and pediatrics; and “dermatology” as a single category. The tile for psychiatry was analyzed and published separately and is therefore excluded from this analysis [21].
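To illustrate this step, the sketch below applies the merge as a simple lookup from tile to analysis category. The English tile labels and the Python/pandas code are our own assumptions (the published analysis was performed in SPSS), but the grouping itself follows the description above.

import pandas as pd

# Tile-to-category lookup reproducing the merge described above (psychiatry analyzed separately)
TILE_TO_CATEGORY = {
    "internal medicine": "internal", "infectious disease": "internal",
    "palliative care": "internal", "medically unexplained physical symptoms": "internal",
    "gastroenterology": "observation", "neurology": "observation",
    "pulmonology": "observation", "rheumatology": "observation", "cardiology": "observation",
    "orthopedics": "surgical", "urology": "surgical",
    "traumatology": "surgical", "ear, nose, and throat disease": "surgical",
    "gynecology": "female/child", "pediatrics": "female/child",
    "dermatology": "dermatology",
}

def add_category(cases: pd.DataFrame) -> pd.DataFrame:
    """Map each case's tile onto one of the 5 analysis categories; unmapped tiles are dropped."""
    cases = cases.copy()
    cases["category"] = cases["tile"].map(TILE_TO_CATEGORY)
    return cases.dropna(subset=["category"])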

An overview of activity on the platform is displayed by plotting the number of GPs (active users and read-only accounts) and the number of cases against time. We estimated the number of users, number of specialisms, number of specialists, and the response time for each case based on user codes and timestamps, and we analyzed the code frequencies for age, gender, case topic (based on the ICPC code), question type, and answer type for each category. Descriptive data were presented as percentages of all cases or as means and SDs. Finally, we used a Sankey diagram to show the linkage between questions and answers.
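As a concrete illustration of two of these computations, the sketch below derives the time to first response per case from the post timestamps and builds a question-to-answer Sankey diagram. The data layout and the use of pandas and Plotly are illustrative assumptions on our part; the published statistics were computed in SPSS.

import pandas as pd
import plotly.graph_objects as go

def response_times_minutes(posts: pd.DataFrame) -> pd.Series:
    """Minutes from the first post of each case (the GP's question) to the first subsequent post."""
    posts = posts.sort_values("timestamp")
    first_post = posts.groupby("case_id")["timestamp"].min()
    # Any later post counts as a response here; restricting to specialist replies would
    # additionally require filtering on the user codes
    replies = posts[posts["timestamp"] > posts["case_id"].map(first_post)]
    first_reply = replies.groupby("case_id")["timestamp"].min()
    return (first_reply - first_post).dropna().dt.total_seconds() / 60

def question_answer_sankey(pairs: pd.DataFrame) -> go.Figure:
    """Sankey diagram linking question types to answer types (expects columns: question, answer, n)."""
    questions = pairs["question"].unique().tolist()
    answers = pairs["answer"].unique().tolist()
    return go.Figure(go.Sankey(
        node=dict(label=questions + answers),
        link=dict(
            source=[questions.index(q) for q in pairs["question"]],
            target=[len(questions) + answers.index(a) for a in pairs["answer"]],
            value=pairs["n"].tolist(),
        ),
    ))

Under these assumptions, the median and IQR of the response time would then follow directly from, for example, response_times_minutes(posts).quantile([0.25, 0.5, 0.75]).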


Descriptive Data

The data set started with 25,954 messages for 4013 cases; of these, 1872 messages belonging to 339 cases were excluded. First, we excluded 292 cases because of data extraction errors (n=34), because their groups were too small and difficult to categorize (geriatrics, n=5; ophthalmology, n=40), or because they had already been analyzed in a separate study (psychiatry, n=213) [21]. Next, we divided the data within the research team and analyzed the remaining 3721 cases. We excluded another 47 cases because of wrong tile placement (n=19), double case placement (n=10), technical errors (n=8), missing coding (n=7), withdrawal by the GP (n=2), or missing data (n=1) (Multimedia Appendix 2). The 3674 included cases were posted by 424 different GPs (median 9 cases per GP); for 97 GPs (22.9%), the first post was a response to another GP's case, whereas 327 (77.1%) first posted a new case.

Growth of the Prisma platform over time is shown as the number of GPs (active users and read-only accounts; Figure 1), the total number of cases, and the number of cases per tile (Figure 2). The number of cases per category was 677 for internal, 674 for observation, 860 for surgical, 875 for female/child, and 588 for dermatology. Figures 3 and 4 show the number of specialists and specialisms involved per tile category, respectively. For all categories, except dermatology (196/588, 33.3%), most cases included more than 2 users per case. For the internal, observation, and surgical categories, 3 or more specialisms were involved per case in 46.6% (317/680), 32.3% (217/672), and 40.7% (350/860), respectively. In the internal and observation categories, 4 or more health care professionals were engaged per case in 57.2% (389/680) and 54.0% (363/672), respectively.

Figure 1. Platform use; number of active and read-only GPs on the platform. GP: general practitioner.
Figure 2. Overall cases of network activity and network activity by tile category. ENT: ear, nose, throat; GYN: gynaecology; MUPS: medically unexplained physical symptoms; PAL: palliative care; UROL: urology.
Figure 3. Number of users involved per case. Data are illustrated in 5 tile categories.
Figure 4. Number of specialisms involved per case. Data are illustrated in 5 tile categories.

Case characteristics are presented in Multimedia Appendix 3. No answer was given for 35 cases, with the median time to first response being 76 minutes (IQR 17-320) for the other cases. The shortest response time was seen in the surgery category (median 44 minutes) and the longest was in the dermatology tile (median 252 minutes). Overall, 3508/3674 (95.48%) cases contained specific patient information or patient-specific questions, with the remaining 166 (4.52%) cases including questions that were not specific to the patient. Slightly more than half of all queries concerned females (1948/3674, 53.02%), except for those in the surgical tile where there was a slight male majority (437/860, 50.8%). GPs did not report gender in 8.92% (313/3508) of the patient-specific cases. They also posted a question about more than 1 patient in 4 cases (eg, family members or several patients with the same complaint). Patient age ranged from newborn to 101 years (mean 39.9 years) and the mean age differed by tile category. The GP did not report age for 701 cases.

Topics discussed covered the full range of ICPC codes (Multimedia Appendix 4). The 3 main topics by ICPC code were in the skin, musculoskeletal, and general symptom domains.

Type of Questions and Answers

Among the 3674 cases, we identified 6691 different questions (mean 1.8 per case) and 10,922 answers (mean 3.03 per case). Multimedia Appendix 5 shows the type of question and answers posted.

Questions concerned (differential) diagnosis in 50.90% (1870/3674) of cases, appropriate nondrug treatment in 33.15% (1218/3674), and drug treatment in 27.60% (1014/3674). Notably, the focus of questions differed between tile categories. Most questions concerned diagnosis in the internal (358/677, 52.9%), observation (361/674, 53.6%), and dermatology (424/588, 72.1%) categories; most concerned treatment in the surgical category (431/860, 50.1%); and most concerned medication in the female/child category (378/875, 43.2%).

The Sankey diagram in Figure 5 illustrates the dynamics between the type of question and the type of answer. We have illustrated only the 9 most common combinations (used more than 100 times), including any other answer type or combination in the "other" group. Consistent with the type of question asked by GPs, most answers concerned (differential) diagnosis, often combined with responses about referral, further diagnostics, or a combination of these 3 responses. However, the type of question posed by GPs did not always lead to answers within the same topic; for example, questions about referral often led to advice about how to proceed (eg, perform further diagnostics and refer, follow up in general practice, or start therapy and refer). In this way, one can see that a simple referral question can lead to a variety of advice (Multimedia Appendices 6-8).

Figure 5. Sankey diagram of dynamics from questions to answers, 9 largest groups.

Principal Findings

This mixed methods study has shown the growth and evolution of a digital interdisciplinary consultation platform over almost 2 years. Posted questions not only covered a broad spectrum of the population by age and sex but also covered a wide variety of specialist topics. Of note, there was a steep increase in the number of cases for dermatology, which could be explained by existing familiarity with tele-dermatology in Dutch primary care [10] or potentially highlight a practice weakness among GPs.

In most cases, 2 or more users engaged with the GP who initiated the question. An exception to this was the dermatology tile, in which it was typical for only 1 other user to respond. The number of involved specialisms also differed between tiles, being largest for internal medicine. This illustrates a novelty of this approach compared with other consultation formats where a GP only has contact with 1 medical specialist. This approach is in line with the future vision to build primary and secondary care networks around the patient [16,22,23].

The short response times suggest that the Prisma platform facilitates rapid and efficient consultation. This contrasts with telephone consultations, which are often hampered by mutual unavailability. Our data indicate that answers are given to most questions by the end of a GP’s working day so that patient care is not delayed for more than a few hours.

Although it is difficult to compare our study with previous studies because of differences in the design of the platforms analyzed, the response time outcomes are superior to those in previous studies [4,6,11,24]. It should be considered that these outcomes may partly reflect the initial enthusiasm of engaged specialists.

The differences in question type between tile categories may indicate differences in work content. Internal medicine, observation, and dermatology focused on diagnosis; surgery focused on treatment; and female/child focused on medication. An alternative hypothesis could be that different specialisms have specific needs of GPs in the treatment process.

The Sankey graph in Figure 5 and Multimedia Appendices 6-8 illustrate the dynamics between questions asked by GPs and answers given by specialists. The large number of questions related to diagnosis had multiple combinations with other questions, reflecting the complexity of evaluation (eg, when the diagnosis is unclear, the next step is also uncertain). Overall, (differential) diagnosis was the most frequently used theme, but it does not appear as a separate group in the graph because it was mostly used in combination with other themes. By comparison, questions on medication were most often posed as single questions and showed a clear dynamic toward single answers.

The dynamics of referral questions are also interesting, with only a minority of these questions receiving a single answer about referral. For example, we found combinations of advice for additional diagnostics in primary care or advice to refer together with explanations about the diagnosis. We hypothesize that medical specialists used this platform not only to ensure adequate referral but also to share knowledge. There was also a mismatch between referral questions and answers: not all questions about referral led to answers about referral, and vice versa (ie, referral advice was sometimes given without a specific request).

We found similarities and differences when comparing our findings with the limited preceding research on electronic consultations [15]. In this earlier research, most questions for hematology and rheumatology concerned diagnosis, while questions in the infectious disease and dermatology categories typically concerned therapy. Another study, focusing specifically on internal medicine in a hospital in the Netherlands, involved one-on-one electronic consultations and revealed "diagnostic tools" to be the most common answer [6].

Limitations

First, the large sample size and the categorization mean that a more detailed analysis by specialty is missing from this study. Second, because the questions posted by GPs lacked structure, complete data on patient characteristics cannot be guaranteed; however, this did not impair the content analysis. Third, text coding was done by 20 different coders, which might have resulted in interobserver variation in interpretation, despite our efforts to minimize this through teamwork. Finally, the data in this analysis were observational in nature, preventing us from drawing firm conclusions on either observed correlations or patient outcomes.

Future Research

This evaluation focused on the activities of health care professionals, but to date, the patient perspective has not been analyzed. Although the platform performs well in supporting the needs of the GP for further assessment, treatment, and when needed, more appropriate referral to specialists, we do not know how these relate to needs, experiences, and outcomes in patient cohorts. To generate and implement a novel health care collaboration on a large scale, time and cost-efficiency calculations will also be indispensable [25]. In our study, the response time was more rapid than previously reported for e-consultations [6,24], which have already been shown to reduce not only waiting times for GPs and patients but also costs for patients and waiting lists for hospitals [26]. We are currently conducting a stepped-wedge randomized controlled trial to evaluate the impact of the Prisma platform on patient outcomes and referrals to specialists.

Concerning the content of questions posted on the Prisma platform, an in-depth analysis could still be interesting and useful. Gaps in support for GPs could be uncovered by exploring diagnostic uncertainties (between noncomplex symptoms that meet ICPC diagnostic criteria and practice guidelines), common reasons for referral, and the impact of regional agreements [27]. These gaps could potentially be filled by creating a database of the information collected on the platform, which would help GPs ask questions and search for possible answers based on prior responses.

Conclusion

This observational research shows that a new digital platform facilitated rapid and interactive communication between GPs and specialists for nonurgent questions. This platform is clearly distinguished from one-to-one consultations by facilitating the involvement of multiple physicians. The platform supports the transfer of knowledge from medical specialists to GPs while allowing different viewpoints from relevant experts.

Acknowledgments

We are grateful to the Department of General Practice, University Medical Centre Groningen, and to Prisma for providing technical support. Prisma Platform Study Group: Ineke Knijp, Ruben BR de Boer, Lisa Havinga, Iris R Vroom, Esmee L van der Geest, Maaike A Hulshof, Daphne BM Visser, Frank J Dorgelo, Ivar Maatje, Hugo Quaedvlieg, Lianne Sijbring, Anne van der Meer, Esmée Fath el Bab, Lucinda E Haaze, Willeke Schelhaas, Susan JM Bergamin, Kimberly Boerma, Anne J Lammers, Dajana Erceg, Tamara Schouten, and Steven N Koning. We thank Dr Robert Sykes for providing technical editing services for the final drafts of this manuscript.

Conflicts of Interest

PK is the founder of Prisma and provided the text messages of the platform for this research, but was not involved in the data analysis process of this work.

Multimedia Appendix 1

Code tree (in Dutch).

PDF File (Adobe PDF File), 165 KB

Multimedia Appendix 2

Flowchart inclusion cases.

PNG File , 24 KB

Multimedia Appendix 3

Characteristics of cases posted in different tiles.

DOC File , 37 KB

Multimedia Appendix 4

International Classification of Primary Care codes used in cases.

DOC File , 47 KB

Multimedia Appendix 5

Type of questions and type of answers.

DOC File , 53 KB

Multimedia Appendix 6

Sankey diagram for differential diagnostic questions. DT: drug treatment; NDT: nondrugs treatment.

PNG File , 1867 KB

Multimedia Appendix 7

Sankey diagram for drug treatment questions.

PNG File , 2742 KB

Multimedia Appendix 8

Sankey diagram for referral questions.

PNG File , 390 KB

  1. van Oostrom SH, Picavet HSJ, de Bruin SR, Stirbu I, Korevaar JC, Schellevis FG, et al. Multimorbidity of chronic diseases and health care utilization in general practice. BMC Fam Pract 2014 Apr 07;15:61 [FREE Full text] [CrossRef] [Medline]
  2. Schäfer WLA, Boerma W, Spreeuwenberg P, Schellevis F, Groenewegen P. Two decades of change in European general practice service profiles: conditions associated with the developments in 28 countries between 1993 and 2012. Scand J Prim Health Care 2016;34(1):97-110 [FREE Full text] [CrossRef] [Medline]
  3. Kroneman M, Boerma W, van den Berg M, Groenewegen P, de Jong J, van Ginneken E. Netherlands: Health System Review. Health Syst Transit 2016 Mar;18(2):1-240 [FREE Full text] [Medline]
  4. Liddy C, Moroz I, Afkham A, Keely E. Sustainability of a Primary Care-Driven eConsult Service. Ann Fam Med 2018 Mar;16(2):120-126 [FREE Full text] [CrossRef] [Medline]
  5. Berendsen A, Benneker W, Meyboom-de Jong B, Klazinga N, Schuling J. Motives and preferences of general practitioners for new collaboration models with medical specialists: a qualitative study. BMC Health Serv Res 2007 Jan 05;7:4 [FREE Full text] [CrossRef] [Medline]
  6. Muris D, Krekels M, Spreeuwenberg A, Blom M, Bergmans P, Cals JWL. General practitioners' use of internal medicine e-consultations. Ned Tijdschr Geneeskd 2020 Feb 10;164:24-31. [Medline]
  7. Wagner EH, Glasgow RE, Davis C, Bonomi AE, Provost L, McCulloch D, et al. Quality Improvement in Chronic Illness Care: A Collaborative Approach. The Joint Commission Journal on Quality Improvement 2001 Feb;27(2):63-80. [CrossRef]
  8. de Bever S, Bont J, Scherpbier N. Strengthening general practice by extending specialty training? Br J Gen Pract 2019 Apr 25;69(682):222-223. [CrossRef]
  9. Greenwood-Lee J, Jewett L, Woodhouse L, Marshall D. A categorisation of problems and solutions to improve patient referrals from primary to specialty care. BMC Health Serv Res 2018 Dec 20;18(1):986 [FREE Full text] [CrossRef] [Medline]
  10. Tensen E, van der Heijden JP, Jaspers M, Witkamp L. Two Decades of Teledermatology: Current Status and Integration in National Healthcare Systems. Curr Dermatol Rep 2016;5:96-104 [FREE Full text] [CrossRef] [Medline]
  11. Liddy C, Moroz I, Mihan A, Nawar N, Keely E. A Systematic Review of Asynchronous, Provider-to-Provider, Electronic Consultation Services to Improve Access to Specialty Care Available Worldwide. Telemed J E Health 2019 Mar;25(3):184-198 [FREE Full text] [CrossRef] [Medline]
  12. Soriano Marcolino M, Minelli Figueira R, Pereira Afonso Dos Santos J, Silva Cardoso C, Luiz Ribeiro A, Alkmim M. The Experience of a Sustainable Large Scale Brazilian Telehealth Network. Telemed J E Health 2016 Nov;22(11):899-908 [FREE Full text] [CrossRef] [Medline]
  13. Kwok J, Olayiwola J, Knox M, Murphy E, Tuot D. Electronic consultation system demonstrates educational benefit for primary care providers. J Telemed Telecare 2017 Jun 14;24(7):465-472 [FREE Full text] [CrossRef]
  14. McIntyre T, Kelly E, Clarke T, Green C. Design and implementation of an acute Trauma and Orthopaedic ePlatform (TOP) referral system utilising existing secure technology during the COVID-19 pandemic. Bone & Joint Open 2020 Jun 01;1(6):293-301 [FREE Full text] [CrossRef]
  15. Waugh M, Voyles D, Thomas M. Telepsychiatry: Benefits and costs in a changing health-care environment. Int Rev Psychiatry 2015;27(6):558-568 [FREE Full text] [CrossRef] [Medline]
  16. Miller R, Scherpbier N, van Amsterdam L, Guedes V, Pype P. Inter-professional education and primary care: EFPC position paper. Prim Health Care Res Dev 2019 Oct 04;20:E138 [FREE Full text] [CrossRef]
  17. Security Whitepaper. Siilo.   URL: https://www.siilo.com/assets/downloads/Siilo-Security-Whitepaper.pdf [accessed 2021-09-16]
  18. Ezra O, Toren A, Tadmor O, Katorza E. Secure Instant Messaging Application in Prenatal Care. J Med Syst 2020 Feb 22;44(4):73 [FREE Full text] [CrossRef] [Medline]
  19. Friese S. Atlas.ti 8, Windows User Manual. Atlasti. 2015.   URL: https://downloads.atlasti.com/docs/branding/atlasti_brochure_v9_EN_interactive_202110.pdf?_ga=2.83602454.1041389821.1647378949-1120635778.1647378949 [accessed 2022-03-15]
  20. International Classification of Primary Care, Second Edition (ICPC-2). ICPC. 2014.   URL: https://www.transhis.nl/wp-content/uploads/2014/12/icpc-2-2pager-nederlands.pdf [accessed 2022-03-15]
  21. Bock NW, Wouters H, Lammers AJ, Blanker MH. Online Consultations Between General Practitioners and Psychiatrists in the Netherlands: A Qualitative Study. Front Psychiatry 2021;12:775738 [FREE Full text] [CrossRef] [Medline]
  22. Reddy S. Integrated health care: it’s time for it to blossom. Aust. Health Review 2016;40(4):428-430 [FREE Full text] [CrossRef]
  23. World Health Organization (WHO). WHO global strategy on people-centred and integrated health services. WHO. Geneva, Switzerland: WHO; 2015.   URL: https://apps.who.int/iris/bitstream/handle/10665/155002/WHO_HIS_SDS_2015.6_eng.pdf?sequence=1&isAllowed=y [accessed 2022-03-20]
  24. Ahmed S, Kelly Y, Behera T, Zelen M, Kuye I, Blakey R, et al. Utility, Appropriateness, and Content of Electronic Consultations Across Medical Subspecialties. Annals of Internal Medicine 2020 May 19;172(10):641-647 [FREE Full text] [CrossRef]
  25. Liddy C, Drosinis P, Deri Armstrong C, McKellips F, Afkham A, Keely E. What are the cost savings associated with providing access to specialist care through the Champlain BASE eConsult service? A costing evaluation. BMJ Open 2016 Jun 23;6(6):e010920 [FREE Full text] [CrossRef] [Medline]
  26. Keely E, Liddy C, Afkham A. Utilization, benefits, and impact of an e-consultation service across diverse specialties and primary care providers. Telemed J E Health 2013 Oct;19(10):733-738 [FREE Full text] [CrossRef] [Medline]
  27. Dekhuijzen PNR, Smeele IJM, Smorenburg SM, Werkgroep Ketenzorg COPD. [Guideline for the non-pharmacological treatment of COPD]. Ned Tijdschr Geneeskd 2006 Jun 03;150(22):1233-1237. [Medline]


GP: general practitioner
ICPC: International Classification of Primary Care


Edited by A Kushniruk; submitted 21.09.21; peer-reviewed by A Tannoubi, AR Mohseni; comments to author 19.10.21; revised version received 11.11.21; accepted 20.01.22; published 01.04.22

Copyright

©Sanne M Sanavro, Henk van der Worp, Danielle Jansen, Paul Koning, Marco H Blanker, Prisma Platform Study Group. Originally published in JMIR Human Factors (https://humanfactors.jmir.org), 01.04.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Human Factors, is properly cited. The complete bibliographic information, a link to the original publication on https://humanfactors.jmir.org, as well as this copyright and license information must be included.