
It is widely believed that artificial intelligence (AI) and machine learning programs will eventually proliferate throughout radiology. But few of these “smart” technologies are commercially available for clinical practice today, and even where they are available, some radiologists are reluctant to adopt them for fear they could one day supplant them, although many experts agree that won’t happen any time soon.1

Other radiologists, like those at the University of Virginia (UVA) Health System, recognize the value of AI and its related technologies to streamline their workflow and detect findings that they may be unable to see on their own. They understand that these intelligent tools and algorithms, which can learn to identify findings on imaging studies, will empower radiologists to deliver more effective and efficient care, much like new modalities and quality standards have in the past. So when UVA’s radiologists got an opportunity to test a beta version of a software application that leverages AI to detect findings on certain imaging studies, they eagerly embraced it.

Now, the app is allowing UVA’s radiologists to focus on acute findings, while helping them identify pertinent incidental findings that they might not have seen in the past and provide additional information to referring physicians for improved patient care. “We believe that computer algorithms have the long-term potential to help radiologists perform their daily clinical work by enhancing their abilities to interpret large numbers of complex medical images,” says Cree M. Gaskin, MD, professor and chief of musculoskeletal imaging and intervention, associate chief medical informatics officer, and vice chair of informatics and operations at UVA. “We became interested in this particular application because it already offers several clinically relevant algorithms in a system that integrates well with our PACS.”

The App

The app UVA’s radiologists are trialing for research and evaluation purposes is the Radiology Assistant from Israel-based medical technology company Zebra Medical Vision. The software draws on an extensive library of anonymized imaging cases to inform its algorithms. In beta testing, the app includes algorithms that identify five findings on chest, abdomen, and pelvis CT scans: coronary calcium scores, pulmonary emphysema, liver steatosis, spine compression fractures, and bone mineral density. Soon, the app will also include algorithms that detect additional findings on body imaging, as well as findings on breast and brain imaging.

UVA’s radiologists began using the app this spring and are already reporting benefits from its integration. Specifically, the app is reminding the radiologists to include these relevant incidental findings in their reports, which is helpful given the large volume of cases they read each day and is particularly useful for residents who are learning to read exams and report findings. What’s even more impactful is that, in the case of bone mineral density, the app is helping the radiologists identify findings that they couldn’t see before.

“We’re familiar with noting coronary calcium, pulmonary emphysema, spine compression fractures, and steatosis of the liver, but we were previously unable to comment on bone density unless osteopenia or fractures were visible, meaning the disease was already advanced,” explains Arun Krishnaraj, MD, MPH, associate professor and chief of the body imaging division and vice chair of quality and safety at UVA. “Now we’re including low bone density because the app is prompting us even before we can visually identify findings of the disease. It’s like having an extra set of eyes to help us provide additional information to referring physicians for optimal patient care.”

A Measured Approach

While UVA’s radiologists are regularly using the app now, Gaskin and his team took a measured approach to implementing it. Initially, they installed the app on a workstation that only Gaskin could access in the musculoskeletal division. From there, he loaded chest, abdomen, and pelvis CTs to see how the app would respond and how it would impact the radiologists’ workflow. “We didn’t want to put something on all of the workstations that would be distracting or cumbersome,” Gaskin explains. “Isolating the initial implementation allowed us to resolve any problems before pulling in anyone else.”

Once it was clear the app would be easy to use, Gaskin began showing it to the department’s body imaging radiologists. He emphasized that the tool would integrate seamlessly into their workflow, while enhancing their quality and contribution to patient care. “Showing tools like this in the flesh allows the radiologists to see how unobtrusive they are and how they can add value by helping to identify findings that radiologists couldn’t see otherwise,” Gaskin says.

After observing the app in action, the radiologists understood its potential to improve care and agreed to install it on their workstations. “I’ve been talking with my colleagues and residents a lot about AI and machine learning, because I want everyone to be aware of this emerging technology and understand that it’s just like any new modality or tools that we use,” says Krishnaraj, who points to the ACR Bulletin machine learning special section as a valuable educational resource. “So when the radiologists in my division saw the app, they generally already had this mindset that this technology is going to make us better, more efficient, and more accurate.”

How It Works

To the radiologists, the app appears as a simple icon on their workstations, while the identification of these specific findings happens automatically in the background. The process starts when image acquisition is complete and the CT scanner sends the images to the PACS. The PACS then forwards copies of the images to an onsite server dedicated to the app, which uses algorithms to read the images and then stores its findings for each relevant study on the server.
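For readers who want a concrete picture of that background routing, the minimal sketch below illustrates the idea in Python. Every name in it (AppServer, run_algorithm, FINDING_TYPES) is hypothetical and is not part of Zebra Medical Vision’s actual software; it simply shows how studies forwarded from the PACS could be analyzed and cached ahead of interpretation.

```python
# Illustrative sketch of the background workflow described above.
# All names (AppServer, run_algorithm, FINDING_TYPES) are hypothetical and are
# not part of the vendor's actual software.

FINDING_TYPES = [
    "coronary_calcium",
    "pulmonary_emphysema",
    "liver_steatosis",
    "spine_compression_fracture",
    "bone_mineral_density",
]


class AppServer:
    """Receives image copies forwarded by the PACS and caches results per study."""

    def __init__(self):
        self.results = {}  # study_uid -> {finding_type: "normal" | "abnormal"}

    def receive_study(self, study_uid, images):
        # Analyze each finding type as soon as the PACS forwards the images,
        # so results are already stored when a radiologist opens the study.
        self.results[study_uid] = {
            finding: run_algorithm(finding, images) for finding in FINDING_TYPES
        }

    def lookup(self, study_uid):
        # Called by the PACS-side plugin when the radiologist opens the study.
        return self.results.get(study_uid)


def run_algorithm(finding, images):
    # Placeholder for the vendor's proprietary model; returns "normal" or "abnormal".
    raise NotImplementedError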

When a radiologist opens one of these studies in PACS, the app recalls the stored results and uses a color-coded system to relay the findings to the radiologist. If all of the findings are normal, the icon turns green, and the radiologist takes no further action with the app. If any of the findings are abnormal, the icon turns red, prompting the radiologist to click on the icon to review the results. “With the color-coding, you can look out of the corner of your eye and see whether any abnormalities are detected,” Gaskin explains. “No time is lost to observe the color, and only a single click is necessary to review abnormal results.”
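Conceptually, the color-coding is a simple aggregation over the stored per-finding results. The short sketch below is illustrative only and reuses the hypothetical names from the previous sketch; the vendor’s actual implementation is not public.

```python
# Illustrative only: how the green/red icon logic described above could be expressed.

def icon_color(study_results):
    """Return 'green' when every finding is normal, 'red' if any finding is abnormal."""
    if study_results is None:
        return "gray"  # no stored results for this study (assumption, not from the source)
    return "red" if any(v == "abnormal" for v in study_results.values()) else "green"


# Example: a study with one abnormal finding turns the icon red, prompting the
# radiologist to click through and review the details.
example = {
    "coronary_calcium": "normal",
    "pulmonary_emphysema": "normal",
    "liver_steatosis": "normal",
    "spine_compression_fracture": "normal",
    "bone_mineral_density": "abnormal",
}
assert icon_color(example) == "red"
```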

After reviewing the app’s findings, the radiologist can decide whether to include them in her report. For example, if the app indicates that a patient’s bone density is abnormal and the patient has no history of osteoporosis or other bone-density issues, the radiologist might recommend a dedicated DEXA scan for further evaluation. But if the patient’s medical record shows that she has already been diagnosed with osteoporosis, the radiologist would not include the recommendation in her report because the referrer and patient are likely already aware of the condition and treatment is likely already underway. “The app aids in detection of specific findings, but the radiologist remains the clinical expert,” Gaskin says.
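The reporting decision Gaskin describes boils down to a rule of thumb: recommend follow-up only when the flagged finding is both abnormal and new to the patient. The hypothetical sketch below expresses that rule; in practice the app does not make this call, the radiologist does.

```python
# Hypothetical illustration of the reporting rule described above; the app itself
# does not make this decision -- the radiologist does.

def recommend_dexa(bone_density_flag, known_osteoporosis):
    """Suggest a dedicated DEXA scan only when the app flags low bone density
    and the medical record shows no existing osteoporosis diagnosis or treatment."""
    return bone_density_flag == "abnormal" and not known_osteoporosis


# A patient with newly flagged low bone density and no prior diagnosis gets the
# recommendation; a patient already under treatment does not.
assert recommend_dexa("abnormal", known_osteoporosis=False) is True
assert recommend_dexa("abnormal", known_osteoporosis=True) is False
```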

Industry Partner

In testing the app, the radiologists are providing feedback to the vendor about its functionality and interface. They also intend to track cases in which they’ve recommended DEXA scans to see whether patients get the scans and whether the results match the app’s findings. “If the app incidentally detects low bone density and a DEXA scan confirms its existence, the patient can start treatment to prevent fractures and other symptoms that could lead to a diminished quality of life downstream,” Krishnaraj explains. “Now, suddenly, we’ve impacted that patient’s life, and we’ve helped lower the costs to the health care system as a whole by addressing the issue much earlier than we would have in the past.”

UVA’s radiologists are hopeful about the app’s capabilities and look forward to implementing additional algorithms as they become available. They view their work in assisting with the development of this and other advanced technologies (UVA’s radiologists are also part of the IBM Watson Health Imaging cooperative) as part of their responsibility to ensure patients receive the best possible care. “We’re interested in advancing technology and improving care, so it makes sense for us to partner with industry to achieve that,” Gaskin says. “We understand what tools are needed to improve care and our industry partners have the expertise to develop these tools. Both sides need each other.”

While some radiologists may be apprehensive about integrating such advanced technologies into their workflow, UVA’s radiologists are enthusiastic about adopting them and encourage others to do the same. “AI, machine learning, and computational assessment algorithms aren’t something to be avoided but rather embraced,” Gaskin says. “These tools have a real potential to enhance, but not replace, what radiologists do, allowing us to expand our expertise beyond traditional image interpretation to deliver better and more affordable care to our patients.”

Endnote

1. Modern Healthcare. “Artificial Intelligence Takes on Medical Imaging.” http://www.modernhealthcare.com/article/20170708/TRANSFORMATION03/170709944. Accessed June 13, 2017.

Next Steps

• Educate your team about AI, machine learning, computational assessment algorithms, and other advanced technologies, emphasizing their potential to enhance the radiologist’s role in patient care.

• Cultivate partnerships with industry leaders who are developing advanced AI technologies, and consider becoming a testing site.

• Introduce the technologies in a thoughtful way to ensure radiologists’ workflow is not disrupted in the process.

Join the Discussion

Want to join the discussion about how artificial intelligence will elevate the radiologist’s role in patient care? Let us know your thoughts on Twitter at #imaging3.

Have a case study idea you’d like to share with the radiology community? Please submit your idea to http://bit.ly/CaseStudyForm.

Jenny Jones, Imaging 3.0 Project Specialist