When University of Florida (UF) radiologists began searching for a more objective way to assess residents’ preparedness for independent coverage of emergency and critical care imaging rotations, they drew inspiration from an unlikely source: flight school. Before student pilots take to the sky, they typically practice their skills in a flight simulator. The simulator looks identical to an actual cockpit and tests students’ abilities to navigate an aircraft through conditions they might encounter in the air, including fog, rain, turbulence, and mechanical malfunctions. The exercise helps determine whether students are ready for the responsibility of piloting a real-life aircraft.
Following the flight school model, UF has developed “Simulation in Emergent and Critical Care Imaging” — a web-based program that tests residents by simulating a typical call rotation in an emergency room (ER). “We’re putting residents in a virtual-reality simulator and throwing them cases to see how they would manage the multiple tasks of independent responsibility,” says Linda Lanier, MD, associate professor and associate chair in the department of radiology in the UF College of Medicine. “Just like pilot simulations, the residents have to practice what they would do in certain circumstances. It mimics their shift in every way.”
UF began developing the simulation as an in-house teaching tool approximately three years ago. Now the university is working with the ACR to deliver its simulation product to radiology resident programs nationwide through the ACR Radiology Curriculum Management System (RCMS), an existing cloud-based platform that allows radiologists to collect, develop, distribute, and access educational content and simulations. “The ACR helped integrate the simulation into the web platform so that we could mass distribute this program,” Lanier explains. “The goal is for radiology programs to chart what their residents miss during the simulation and use that information to alter their curriculum.”
A Thorough Assessment
In the past, radiology programs have evaluated residents’ proficiencies through multiple-choice exams and exercises that require residents to give differential diagnoses on a handful of images. Anthony A. Mancuso, MD, professor and chair in the department of radiology in the UF College of Medicine, thinks such tests are unreliable and biased. “Multiple choice exams are inadequate ways to test doctors,” he asserts. “The multiple-choice format cannot test the essential critical thinking, problem solving, and reliable consulting skills that radiologists must possess to materially contribute to medical decision-making. Such tests are also inherently open to gaming and guessing at answers, so it becomes more about how good of a test taker a person is than the real practice of radiology.”
To resolve those issues and objectively evaluate residents, the UF team set out to create a program that simulates the tasks residents actually perform during ER call rotations. Lanier says that the team decided to focus on emergent and critical care radiology because ER rotations are a “baptism by fire” for residents who have never been engaged in independent coverage before. “Mistakes in the emergency room have immediate consequences,” she explains. “If you miss something really critical, the patient could die or be paralyzed forever, so it’s important to know that residents are ready for that responsibility.”
The team identified 700 disease scenarios that residents must be prepared to see in the ER, such as appendicitis and trauma. It then pulled actual ER cases from its teaching file that reflect those entities. After developing methods for anonymizing and displaying the cases, the team used the simulation to test its own residents. It then loaded the simulation onto a server, which it physically shipped to two other programs — one affiliated with UF and the other not — to test their residents. “We tried to deliver it over the web, but at that time, we didn’t have the expertise to create a single web-based imaging viewer that was compatible with each institution’s extensive Internet firewall,” Lanier notes. Residents did, however, enter their responses into a web-based system for UF to grade, and the results helped those programs identify areas where residents required additional training.
Following that pilot project, UF began working with the ACR to incorporate its simulation product into the College’s existing RCMS platform. RCMS will allow institutions nationwide and around the world to access the UF simulation through the web using an ACR credential, a modern web browser, and an Internet connection, says Jose Cayere, the ACR’s director of product development. UF and the ACR conducted a market viability study to gauge interest in the program and assess its commercial readiness. Eight radiology programs participated in the study and used the UF-ACR jointly developed simulation delivery system. “All of the participating program directors have indicated that the simulation is one of many tools they are using to determine the sequence in which their residents take independent coverage,” Lanier says. “Generally, the highest scoring residents are assigned to independent rotations first, giving those who didn’t score well time to receive remediation training.”
Aligns With ACGME
As UF began developing the simulation, the Accreditation Council for Graduate Medical Education (ACGME), an organization responsible for accrediting residency education programs, launched its Milestones initiative, which outlines competency-based benchmarks that residents and fellows must meet throughout their training. As part of that initiative, the council has also called for more objective ways to confirm that residents meet the milestones. Mancuso says that UF’s simulation achieves that goal, especially when it comes to addressing the ACGME’s requirement that residents must demonstrate an “entrustable professional activity” (EPA) during the third year. “For the radiology EPA, residents must show that they can function independently without direct in-house supervision, and we think the best way to demonstrate that is through simulation,” he says.
How It Works
Each simulation takes eight hours, just like a standard ER shift. During that time, residents receive 65 cases at random and must interpret the complete Digital Imaging and Communications in Medicine (DICOM) image sets for those cases. Residents must decide whether each case is normal or abnormal — a task that Lanier says is a unique feature of the simulation because traditional training methods do not test residents’ abilities to call cases normal. When residents call cases abnormal, they must identify the key abnormalities and indicate how they would communicate the findings to the referring physician. Residents currently type their responses into the system, but Cayere says the ACR has integrated medical voice recognition to allow residents to dictate their responses.
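The shift mechanics described above — a fixed-length session, 65 cases drawn at random, and a normal/abnormal call with findings and a communication plan for each — could be modeled roughly as follows. This is a hypothetical Python sketch of the data involved, not UF’s or the ACR’s actual implementation; every class, field, and function name here is illustrative.

```python
import random
from dataclasses import dataclass

@dataclass
class Case:
    """One case in the simulation pool (illustrative fields only)."""
    case_id: str
    category: str       # e.g., "neuro", "gastrointestinal"
    is_abnormal: bool   # ground-truth label held by the grader, not shown to the resident

@dataclass
class Response:
    """What a resident records for each case during the simulated shift."""
    case_id: str
    called_abnormal: bool
    findings: str = ""       # key abnormalities, free text
    communication: str = ""  # how the resident would notify the referring physician

def run_shift(case_pool, n_cases=65, seed=None):
    """Draw n_cases at random from the pool, as in one simulated 8-hour shift."""
    rng = random.Random(seed)
    return rng.sample(case_pool, n_cases)
```

A program could then pair each drawn `Case` with the resident’s `Response` and forward the pairs to the grading side, which is where the hand-grading and NLP work described below comes in.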
Once residents have completed the simulation, UF scores their responses and sends the scores and a performance analysis to the institution. Currently, UF’s radiology department hand-grades the exams within a custom scoring module that streamlines scoring, recording, and reporting and automatically generates the score reports. UF is working with the ACR to streamline the initial report grading itself using natural language processing (NLP), which is designed to recognize acceptable phrases that are typically used in ideal reports. “A hundred residents might need to be scored in a week, so that’s 6,500 responses the faculty has to read and evaluate, which gets to be quite a job,” Cayere says. “Technologically, the NLP is ready to perform the automated grading now, but it still needs a little more time to ‘learn’ the various ways a radiologist may report findings before being deployed in a high-stakes assessment environment. Until then, UF will continue to grade the exams by hand while we run the NLP in the background and compare the results to the faculty’s actual grading of the assessment.”
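Cayere’s description suggests the NLP grader works by recognizing acceptable phrases in a resident’s report. As a toy illustration of that phrase-matching idea, a minimal scorer might look like the sketch below. Real clinical NLP must also handle synonyms, negation, and word order; the function name, threshold, and substring matching here are all invented for illustration.

```python
def score_report(report, acceptable_phrases, threshold=0.5):
    """Toy phrase matcher: fraction of expected phrases found in the report.

    Returns (coverage, passed). A real grading system would handle synonyms,
    negation ("no evidence of appendicitis"), and misspellings; this sketch
    only does case-insensitive substring matching.
    """
    text = report.lower()
    hits = sum(1 for phrase in acceptable_phrases if phrase.lower() in text)
    coverage = hits / len(acceptable_phrases) if acceptable_phrases else 0.0
    return coverage, coverage >= threshold
```

Running such a matcher in the background alongside faculty grading, as Cayere describes, would let the program measure how often the automated score agrees with the human one before trusting it in a high-stakes setting.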
UF and the ACR plan to take the simulation live in 2015. Institutions will contact UF to schedule their simulations and pay the fee to access it. “The intent is to charge institutions to access the simulation,” Lanier explains. “But to this point, the program’s development has been paid for out of our department’s funds — my salary and the salaries of other staff members who have been working on this.” The ACR has also put a great deal of work and resources into the program and will receive a portion of the revenue it generates.
As the product comes online, UF will continue to load cases into RCMS that correspond with each of the 700 disease entities it has identified and will continue to develop the pool of “need-to-know” case content. The university uploads the cases to the ACR server using Transfer of Images and Data™ (TRIAD), an image and data exchange platform developed by the ACR that automatically encrypts sensitive patient information for transfer. Uploaded cases can be used to build a simulation around an institution’s specific needs. “If an institution would like a simulation with 20 percent gastrointestinal, 10 percent genitourinary, and 14 percent neuro, and half of those neuro cases have to be vascular, the platform will collect simulation cases from its repository that fit those specific criteria and present them to the institution as a custom assessment,” Cayere says.
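The custom-assessment logic Cayere describes — filling an exam to requested category percentages and topping it up from the general repository — could be sketched as follows. This is an illustrative Python sketch under assumed data shapes; the function, the category labels, and the simple dict-based repository are not the platform’s actual API, and nested constraints (such as half of the neuro cases being vascular) are omitted for brevity.

```python
import random

def build_custom_simulation(repository, mix, total=65, seed=None):
    """Assemble a case list matching requested category proportions.

    `repository` is a list of case records (dicts with at least a "category"
    key); `mix` maps category -> fraction of the exam, e.g.
    {"gastrointestinal": 0.20, "genitourinary": 0.10, "neuro": 0.14}.
    Leftover slots are filled at random from the rest of the repository.
    Assumes each category pool is large enough to satisfy its quota.
    """
    rng = random.Random(seed)
    chosen = []
    for category, fraction in mix.items():
        pool = [c for c in repository if c["category"] == category]
        chosen.extend(rng.sample(pool, round(total * fraction)))
    # Fill the remaining slots from cases not already selected.
    remainder = [c for c in repository if c not in chosen]
    chosen.extend(rng.sample(remainder, total - len(chosen)))
    rng.shuffle(chosen)
    return chosen
```

For Cayere’s example mix, 20 percent of a 65-case exam works out to 13 gastrointestinal cases, with the genitourinary and neuro quotas computed the same way and the rest drawn from the general pool.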
UF is also developing a curriculum that corresponds with the simulation and remediation materials that institutions can use to help students with low scores. The remediation materials include videos of radiologists taking users through the cases step-by-step and demonstrating the proper way to call the case and develop the report. The curriculum and remediation materials will be available to institutions that sign up for the simulations, along with links to additional content, including educational cases on specific disease entities and traumas. Cayere says some of that content will come from UF and the ACR, while other links will feature material from the web and other partner institutions and groups.