Development of a novel observed structured clinical exam to assess clinical ultrasound proficiency in undergraduate medical education

Logistics discussion

Ultimately, we integrated the clinical ultrasound OSCE into a bootcamp for medical students about to start their clinical clerkships. Although our OSCE is not comprehensive, it assesses what we believe to be core uses for clinical ultrasound, regardless of specialty. A major strength of the OSCE was the breadth of skills assessed; compared with previous ultrasound OSCEs, evaluating such a broad set of skills is novel. Our OSCE assessed professionalism, including introductions and hand washing, as well as technical and diagnostic sonography skills such as imaging indication, probe selection, imaging mechanics and ergonomics, image acquisition, interpretation, pathology identification, and troubleshooting techniques. We believe that adding these components provides a fuller assessment of a sonographer's skill set, and that image acquisition should not be the sole item evaluated.

It should be noted that this examination was administered at a well-resourced, major urban academic medical school. Our ultrasound TA program, availability of SPs, ultrasound faculty, fellowship program, and dedicated teaching sonographer all contributed to this effort. Not all schools will have these resources readily available, which could be a barrier to implementation. The budget to administer the exam consisted solely of the SPs' time; no other expenses were incurred.

Scoring and results discussion

The scoring of the OSCE is an important topic of discussion and can inform how the results of this pilot and exam are used. The binary approach to scoring was chosen to reduce subjectivity and allow fast, straightforward scoring. Critical diagnoses were incorporated to ensure that students can acquire and interpret key pathology that would require intervention. It is essential that students recognize crucial pathology to avoid misdiagnoses. This point is vital because not all supervising physicians are facile with clinical ultrasound and some may rely on a trainee's interpretation of the result; supervising physicians unfamiliar with ultrasound should not take this approach. Providers should use this new tool in accordance with their comfort level and obtain a comprehensive or radiology ultrasound when needed.

The TA program and training session were major strengths that strove to decrease subjectivity in the scoring of the OSCE. The binary checklist scoring system helped decrease variability in scoring among examiners, as seen in the results (Fig. 3); however, we did not formally assess inter-rater reliability. Additionally, all the TAs are well-versed in clinical ultrasound, participated in a training session on how to administer the OSCE, and had an examiner guide on scoring to reference. A factor that likely influenced scores and could be modified was the timing of the OSCE relative to the didactic session. Not surprisingly, students who had didactics prior to the OSCE had an advantage. Although this was unavoidable during our initial pilot, it could be modified in future implementations of the exam. Similarly, exam performance improved as the week went on, likely because of student discussion of the exam, which was discouraged but probably still occurred.

Many students did not finish the exam, and most of the blank responses were in the lung section, which was the last section of the exam. Future iterations of the exam could randomize question order to determine whether the lung section was left blank because of difficulty or timing. Although examinees had the opportunity to skip questions and return to them, it is unclear whether the number of blank responses indicates that there was not enough time for the OSCE, that the OSCE was too long or too difficult, or a combination of these. Notably, 83.0% of students missed at least one critical diagnosis. The longitudinal ultrasound curriculum covers many topics, and alongside all of their other coursework, ultrasound may not be a top priority for students, or they may not have had significant exposure to these critical diagnoses. In addition, although the assessment was formative and not graded, many students appeared to get stuck trying to acquire one view or answer a particular question instead of skipping it and moving on. This could indicate that the students were unfamiliar with this type of time-pressured assessment or were unsure of the contents of the OSCE. A checklist of tested items could be provided in advance for future OSCE administrations.

We propose a scoring system with a minimum passing score, such as 75% (30 out of 40 checklist items). Students must also identify each critical diagnosis item to pass. Importantly, students will receive a dedicated one-on-one feedback session at the end of each OSCE to identify strengths and weaknesses and develop ways to improve. Ideally, these clinical ultrasound OSCEs would be integrated throughout the preclinical curriculum and could serve as a springboard for other related lessons [3].

The OSCEs in the preclinical curriculum would be formative OSCEs (FOSCEs), which are not formally scored but offer students an opportunity to hone their skills and receive feedback. Just before clinical clerkships begin, the OSCE would be formally scored with the proposed scoring regimen; this would be the summative OSCE (SOSCE). Although student performance on this pilot OSCE was lower than expected, it is likely that with the above curriculum almost all students would 'pass.' We propose that students be allowed to use their handheld device during clerkships only after receiving a passing score on the summative OSCE. Given the plethora of opportunities during the preclinical years to take FOSCEs and improve their skills, it is unlikely any students would fail. This would serve as a mechanism to ensure that students have the skills and knowledge to use clinical ultrasound safely.

Future directions and conclusion

Ultimately, this pilot study was a success. We successfully developed and implemented a novel OSCE incorporating our desired metrics, which met the main goal of the study. Of course, there were limitations, as previously discussed, and there is much work to be done in this area, with room for improvement and expansion. One example is the creation of a more generalizable exam that could be administered in medical schools with fewer resources. Ideally, formative OSCEs would be integrated seamlessly within the preclinical curriculum and into sessions involving anatomy, problem-based learning, pathology, and physical exam skills. This will require support from medical school deans and administration.

Future directions include studying the reliability and validity of the assessment. Another goal could be to conduct the proposed OSCE at other institutions to create a more standardized assessment. Major barriers to this type of assessment are time and resources. As technology rapidly evolves, virtual reality and/or augmented reality simulators, along with artificial intelligence, may help reduce faculty burden and aid in OSCE administration, allowing for more frequent formative OSCEs.

In summary, CUS is increasingly included in UME curricula; however, proficiency assessment tools are limited. OSCEs have repeatedly been shown to be a valuable assessment tool in medicine, and our OSCE could be adapted to assess students' readiness to use clinical ultrasound prior to clinical clerkships. Our proposed system of repeated formative assessment followed by a final summative assessment will help students identify their CUS strengths and weaknesses and continuously strive to improve their knowledge, skills, and confidence as they begin exploring the field of clinical ultrasound.
