Real-time Remote Expert-guided Echocardiography by Medical Students

Fourteen medical students were recruited to perform remotely guided echocardiography after a formal invitation posted on a website and by email correspondence. No selection was applied beyond the single exclusion criterion: no ultrasound experience other than what is taught in their curriculum. One student was excluded because of COVID-19 restrictions. All students had completed a cardiac anatomy course as part of their education, and seven of the 13 participants had also completed a brief practical ultrasound course, including echocardiography. Thirteen healthy volunteers were recruited from acquaintances and the student group; 11 of the 13 participating students also served as healthy volunteers after performing their own remote-guided study echocardiography.

The students were randomly assigned to two groups for the first task of obtaining a parasternal long-axis 2D cine loop. Group one received remote guidance through a smartphone videoconference, and group two through dedicated remote guidance software on the ultrasound machine. In the second task, all students were guided using the remote guidance software to obtain five specific ultrasound images from the study protocol: (1) a parasternal long-axis colour Doppler cine loop of the mitral and aortic valves, (2) a parasternal short-axis 2D cine loop of the left ventricle, (3) an apical four-chamber 2D cine loop, (4) an apical four-chamber colour Doppler cine loop of the mitral valve, and (5) pulsed-wave Doppler of mitral inflow, including measurements of E- and A-wave velocities (Fig. 1).

Fig. 1 Graphical representation of study task flow

The instructor (HS) provided a 30-min introductory lecture on echocardiography and the basic settings of the ultrasound machine. Immediately afterwards, the students began performing echocardiography according to the protocol.

The students were remotely guided by the first author (HS), a resident doctor in paediatrics with 1 year of echocardiography training. The guide was positioned in a separate room, without direct contact with the students. All voice communication took place over standard mobile phones. Echocardiographic images were streamed in real time to the guide’s laptop via a 4G network. Based on the guide’s continuous assessment of the ultrasound stream, real-time instructions on probe movement were given to the students according to predefined movement terminology. The aim was to acquire the correct images and complete the study protocol within an acceptable time frame. For the smartphone group in task 1, a Zoom (Zoom Video Communications, Inc., San Jose, CA, USA) videoconference was set up, with a phone with a 5.7-inch screen mounted on a separate stand and showing the screen of the ultrasound machine; in this setup, the guide could not see the probe position during the examination. The remote guidance software group received guidance through Reacts (Philips, Amsterdam, the Netherlands), software dedicated to remote guidance. This software streams both the ultrasound image and the feed from a separate webcam showing the probe position to the guide’s laptop, allowing guidance on probe placement. The setup is illustrated in more detail in the Graphical Abstract. All healthy volunteers were placed in the left lateral decubitus position to optimise the acoustic windows.

The time spent on each acquisition was measured from when the probe touched the volunteer’s skin to when the store button was pressed. The image loops were recorded using retrospective capture for the students and prospective capture for the reference echocardiography. No upper time limit was set. The images were stored as raw DICOM data on the ultrasound machine and transferred to the hospital’s echocardiographic server, where they were accessed for quality evaluation. The diagnostic image quality of each projection was rated by two independent, experienced sonographers on a scale of 0–3, where 0 was «not usable quality», 1 «bad quality», 2 «medium quality», and 3 «good quality». For each image, the two sonographers also answered a yes/no questionnaire on whether the selected cardiac structures could be assessed.
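As an illustration only, the sketch below shows one way such per-acquisition data (acquisition time, the two raters’ 0–3 quality scores, and the yes/no usability answers) could be recorded; the class and field names are hypothetical and not taken from the study’s data capture forms.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical record for one stored image loop; field names are
# illustrative and not taken from the study's data capture forms.
@dataclass
class AcquisitionRecord:
    student_id: int
    view: str                  # e.g. "apical four-chamber 2D"
    acquisition_time_s: float  # probe on skin -> store button pressed
    quality_rater_1: int       # 0 = not usable, 1 = bad, 2 = medium, 3 = good
    quality_rater_2: int
    usable_rater_1: bool       # yes/no: selected structures assessable
    usable_rater_2: bool

    def mean_quality(self) -> float:
        return mean([self.quality_rater_1, self.quality_rater_2])

record = AcquisitionRecord(1, "apical four-chamber 2D", 142.0, 2, 3, True, True)
print(record.mean_quality())  # 2.5
```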

Reference examinations were performed by an experienced echocardiographer (ML) and served as the gold standard for all measurements. All measurements of left ventricular fractional shortening were derived from the 2D parasternal short-axis images and performed by ML 3 months after the examinations using EchoPAC software v.204 (General Electric Company, Boston, MA, USA). All image reviewers were blinded to the examination type. End-diastole was defined by the R-wave of the QRS complex, and the systolic measurement was made on the frame with the smallest left ventricular diameter. All measurements were performed in triplicate, and the mean value was calculated. Finally, the fractional shortening measurements were reviewed for correctness by a second expert (HB), blinded to patient ID and examination type, and in a few cases corrected before statistical analysis.
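Fractional shortening follows the standard definition FS (%) = (LVIDd − LVIDs) / LVIDd × 100. The sketch below illustrates the calculation on hypothetical triplicate diameters, assuming the diameters are averaged before FS is computed (one plausible reading of the triplicate-mean step).

```python
from statistics import mean

def fractional_shortening(lvidd_mm: float, lvids_mm: float) -> float:
    """FS (%) = (LVIDd - LVIDs) / LVIDd * 100, the standard definition."""
    return (lvidd_mm - lvids_mm) / lvidd_mm * 100.0

# Hypothetical triplicate diameters (mm); the diameters are averaged before
# FS is computed -- one plausible reading of the triplicate-mean step.
diastolic = [46.1, 45.8, 46.4]
systolic = [30.2, 29.8, 30.5]
fs = fractional_shortening(mean(diastolic), mean(systolic))
print(f"FS = {fs:.1f} %")  # ~34.6 %
```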

Equipment

The ultrasound machine was a Philips EPIQ 7G with cardiac software, release 7.0.5 (Koninklijke Philips, Amsterdam, the Netherlands). Image reading was performed with ComPACS (MediMatic Srl, Genova, Italy) and the EchoPAC software plugin v.203 (General Electric Company, Boston, MA, USA). The webcam used with the Philips EPIQ to show the probe position during remote guidance was a Logitech C920S Pro HD webcam, maximum resolution 1080p/30 fps and 720p/30 fps, 3-megapixel camera (Logitech, Lausanne, Switzerland). The smartphone used for remote guidance was a Sony Xperia L3 with a dual camera: 13 MP, f/2.2, 26 mm (wide), 1/3.0″, PDAF, plus 2 MP, f/2.4 (depth) (Sony Corporation, Tokyo, Japan). The remote guidance software used on the Philips EPIQ was Reacts/Collaboration Live (Innovative Imaging Technologies Inc., Montreal, Quebec, Canada; Koninklijke Philips, Amsterdam, the Netherlands). The smartphone was connected to the guide’s laptop via Zoom. Both the smartphone and the ultrasound machine guidance software were connected to an ASUS ROG Strix G laptop (ASUSTeK Computer Inc., Taipei, Taiwan). The network router was a Huawei H138-380 wireless 4G router (Huawei Technologies, Shenzhen, China). Data were analysed using STATA 16.0 (StataCorp LLC, College Station, TX, USA).

Student experience

Student experience was assessed with a questionnaire answered on a 6-point Likert scale. The questionnaire covered the students’ subjective ability to solve the task, their evaluation of communication with the guide, their ability to orient themselves in the ultrasound image, their ability to obtain the correct image, and their level of stress or relaxation during the remote-guided examination.

Statistics

Student demographics were analysed using a two-sample Student’s t-test with bootstrapping to compare the number of completed semesters between the student groups. The proportion of students who had taken the ultrasound course as part of their education was compared with a test of proportions.
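A minimal sketch of these comparisons is shown below, assuming a pooled-resampling bootstrap of the mean difference (the paper does not specify which bootstrap variant was run in Stata) and a two-sample z-test of proportions; all values and group sizes are hypothetical.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.proportion import proportions_ztest

rng = np.random.default_rng(0)

# Hypothetical semester counts per guidance group (not the study's data).
semesters_g1 = np.array([6.0, 8, 7, 9, 6, 8, 7])
semesters_g2 = np.array([7.0, 6, 8, 7, 9, 8])

# Ordinary two-sample t-test of the mean number of semesters.
t_stat, p_t = stats.ttest_ind(semesters_g1, semesters_g2)

# Bootstrap p-value for the mean difference: resample the pooled data with
# replacement (null of no group difference) and recompute the difference.
diff_obs = semesters_g1.mean() - semesters_g2.mean()
pooled = np.concatenate([semesters_g1, semesters_g2])
n1 = semesters_g1.size
boot = np.empty(10_000)
for i in range(boot.size):
    resample = rng.choice(pooled, size=pooled.size, replace=True)
    boot[i] = resample[:n1].mean() - resample[n1:].mean()
p_boot = np.mean(np.abs(boot) >= abs(diff_obs))
print(f"t = {t_stat:.2f}, parametric p = {p_t:.3f}, bootstrap p = {p_boot:.3f}")

# Test of proportions: students with a prior ultrasound course per group.
course_counts = np.array([4, 3])  # hypothetical counts with the course
group_sizes = np.array([7, 6])
z, p_prop = proportions_ztest(course_counts, group_sizes)
print(f"proportion test: z = {z:.2f}, p = {p_prop:.3f}")
```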

To evaluate the agreement and variation in left ventricular fractional shortening between student- and reference-acquired images, we used a two-sample Student’s t-test, a Bland–Altman plot, and the coefficient of variation. Image quality was compared using a two-sample Student’s t-test. Agreement between reference and student visualisation of structures was analysed using Cohen’s kappa.
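The agreement analyses could look roughly like the sketch below, using hypothetical paired fractional-shortening values and hypothetical yes/no visualisation answers; the coefficient-of-variation definition shown (within-pair SD relative to the overall mean) is one common choice and may differ from the one used in the study.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import cohen_kappa_score

# Hypothetical paired fractional-shortening values (%) from student- and
# reference-acquired images of the same volunteers.
student = np.array([33.1, 35.4, 30.2, 36.8, 32.5, 34.0, 31.7])
reference = np.array([34.0, 34.8, 31.0, 35.9, 33.2, 34.5, 30.9])

# Bland-Altman: bias (mean difference) and 95% limits of agreement.
diff = student - reference
bias, sd = diff.mean(), diff.std(ddof=1)
limits = (bias - 1.96 * sd, bias + 1.96 * sd)
plt.scatter((student + reference) / 2, diff)
for level in (bias, *limits):
    plt.axhline(level, linestyle="--")
plt.xlabel("Mean of the two methods (%)")
plt.ylabel("Student minus reference (%)")
plt.savefig("bland_altman.png")

# Coefficient of variation: within-pair SD relative to the overall mean
# (one common definition for paired method comparison).
cv = (sd / np.sqrt(2)) / np.mean((student + reference) / 2) * 100

# Cohen's kappa for the yes/no "structure could be assessed" answers.
reference_yes = [1, 1, 0, 1, 1, 0, 1]  # hypothetical
student_yes = [1, 1, 0, 1, 0, 0, 1]    # hypothetical
kappa = cohen_kappa_score(reference_yes, student_yes)
print(f"bias={bias:.2f}%, LoA={limits}, CV={cv:.1f}%, kappa={kappa:.2f}")
```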

The two-sided significance level was set at 0.05.
