ACR Bulletin

Covering topics relevant to the practice of radiology

Choosing AI

Physicians are integrating AI into radiology practices of all sizes, helping them streamline their work and enhance patient care.

AI is not replacing us, but it is going to fundamentally change how we practice.

—Nina E. Kottler, MD, MS
March 23, 2023

Radiologists are increasingly using AI and machine learning across all types of practice settings, including community hospitals, private practices and larger academic institutions. The ultimate goal of integrating AI into a radiology group's systems is to improve clinical workflow and patient care while enhancing efficiency.

“We have been very pleased with our AI tools, which mostly use workflow automation for productivity,” says Adam B. Prater, MD, MPH, medical director of data science and analytics at Radiology Associates of North Texas and former director of imaging informatics at Grady Memorial Hospital. As a member of the ACR Commission on Informatics, Prater is part of the College’s movement to help radiology teams understand how AI can fit into their operations through resources including the ACR Data Science Institute® (DSI).

Prater’s practice uses an AI tool that summarizes a report as the radiologist is dictating and then creates the impression. “The AI tool customizes to the individual radiologist,” he says, “so it gets better over time.” The implementation of AI has been a success, he says. “There may not be as much traditional ROI in terms of how much more productive it makes our radiologists, but we have found that our radiologists report being less burned out when they have it,” Prater says. “That little bit of brain rest that you get multiple times during a shift — especially from a complicated study — is almost like having a resident sitting next to you who knows what you want to say. It’s kind of a happiness quotient.”

VENDORS AND NEEDS

The market is full of medical AI algorithms, which employ machine learning (ML), techniques that let a computer learn patterns from data rather than follow explicitly programmed rules. The FDA has cleared more than 500 medical AI tools to date, and the overwhelming majority, almost 400 of them, apply to radiology. Choosing which product to employ can be daunting. “We have an ambitious goal to implement several AI algorithms, both homegrown and commercial, over the next year,” says Gloria L. Hwang, MD, associate chair for clinical performance improvement in the radiology department at Stanford University. “We have several AI algorithms up and running already because our department saw a clinical need and believed that AI algorithms were likely the best way to meet our needs.”

The department at Stanford continues to bring in new AI technology strategically. “Many vendors come to us with shiny tools,” Hwang says. “Before considering the tools, we ask our department members to step back and ask themselves, ‘What problem are we trying to solve?’ That means considering whether the AI tool satisfies a priority clinical need, whether the implementation requirements align with the infrastructure we have, and whether the AI solution is better than non-AI solutions to meet that need. If the answer to all those questions is yes, we take the additional step of looking at competitors to make sure we are implementing the best option. This is a lot of work up front, but when well executed, an AI-based solution can become an indispensable part of the clinical workflow.”

All of these AI tools need to improve patient care quality and safety, Hwang emphasizes. “If this is not an expected outcome of AI, it has a limited lifespan in the healthcare environment.”

AI AND INTEGRATION

As early adopters of the technology ask for what they need and guide developers in creating new use cases, the selection of AI tools available to all sizes of radiology operations continues to grow. The possibilities seem endless.

“Part of why AI tools have generated so much interest in medical imaging is the promise of faster acquisition of data, less noise, less artifact and higher image fidelity. These things promise new ways to get patient images that previously did not exist,” says Richard J. Bruce, MD, vice chair of informatics for the University of Wisconsin-Madison radiology department. “AI is also very patient-centric. It can lead to improvements in the patient workflow — with fewer exams and less radiation.”

Until you venture into using AI, you won’t know what can be helpful, Bruce says. “There are two ways we bring AI into the clinical space,” he says. “One is to look at AI companies as potential partners. This means we have to provide value and they have to provide value.”

The other way is developing AI internally. “That route has a lot of moving pieces — some of which we won’t be good at,” Bruce says. “We have to ask how we will actually package algorithms and integrate them with other clinical tools — with little or no help.

“Not so long ago, the number of commercially available algorithms was quite small. We saw this, however, as a tsunami that was coming,” Bruce says. “Even if we didn’t have the data to suggest that we needed a given algorithm, we knew we should start getting experience with AI that would inform future decisions.”

The University of Wisconsin has taken the stance, he says, that when working with vendors, it is worth giving almost everything a try.

“We have implemented several algorithms where staff feedback has been that the false positive rate is far too high — and that it wastes time and is not helpful,” Bruce says. At the other end of the spectrum, there are a couple of tools the organization has deployed that elicited feedback labeling AI as a game changer.

“Driving the improvement process forward for patients and radiologists is what is most important to us,” Bruce says. “What happens when an algorithm does not perform the way you want it to or expect it to?” Rather than conceding that AI is not meeting needs and discontinuing its use, the University of Wisconsin has reached out to its vendor partner and presented internal data to adjust and adapt.

“We have provided the vendor with quite a large volume of cases of data that demonstrated suboptimal performance in certain scenarios,” Bruce says. “This took us almost a year of significant effort, but the result was the vendor updating its algorithm. I think they learned a lot in the process about assumptions they made regarding underlying data. At the end of the day, that is a success story.”

From a technical perspective, the market has been maturing and there are multiple platforms available, so more radiologists are jumping into AI. But they need to be prepared for what is still in many ways uncharted territory.

“The technical barriers to integration have improved significantly,” Bruce says. “However, there still are not widespread, universally adopted standards for how algorithms might plug into a given tool.” Standards could address how AI tools can be delivered into the radiologists’ workflow and ecosystem, he says. They could address issues with PACS integration or the electronic records system or how results are sent downstream.

“A standard has not emerged, and so there remain many challenges,” Bruce says. “One of our biggest challenges is that five years from now we may be looking at thousands of algorithms. We have to figure out how we could manage that scale.”

EFFICIENCY AND COST

The reasons for using AI are just as varied as the technology solutions themselves. Among them: handling an increasing workload with fewer people.

“Where we stand now is with a shortage of radiologists — with a demand for imaging services growing and the complexity of imaging services increasing,” Hwang says. “Something has got to give. If there is anything out there to offload some of the more tedious aspects, then by all means we need to find solutions.”

The question is, can AI do it better than a human? If not, it may still be more cost-effective, or simply better, to have a human do the task, Hwang says. “You have a big gaping hole that needs to be filled, but the solution may not be using AI as a more efficient, faster way to solve a problem,” she says.

At the end of the day, you have this shiny new algorithm, but it’s not filling the hole. “Then you still have the hole and an expensive AI tool integrated into your system,” she says.


“If you make things more efficient with the same level of care — and if it costs less to use this algorithm than to hire one more radiologist — that might make sense for your practice,” Hwang says. “Or, if you can avoid a delay in care due to a missed finding — which is costly to the patient and potentially legally costly to a hospital or practice — that would be an important consideration in securing an AI tool.”

If you adopt AI, you must be prepared internally or through a third party to be able to identify how the AI behaves in your environment. “I would say take the hard sell by vendors with a grain of salt if you’re going to move forward and adopt AI,” Hwang says. “This is not a trivial undertaking. There should be a strong clinical need and you must find the right vendor partner. Put your energy into making each vendor work for you — shop around — but don’t spread yourself too thin in trying to adopt every AI solution that gets thrown your way.”

STAKEHOLDERS AND CHALLENGES

The FDA has cleared more than 20 AI algorithms for breast imaging, says Manisha Bahl, MD, MPH, a breast imager at Massachusetts General Hospital and associate professor of radiology at Harvard Medical School. Choosing which ones to implement is key because the process of adopting AI technology is not simple.

The steps involved in clinical implementation of an AI product include identifying all stakeholders, selecting the appropriate product to purchase, evaluating it with a local data set, integrating it into the workflow, and monitoring its performance over time. Despite the potential benefits of improved quality and increased efficiency with AI, several barriers, such as high costs and liability concerns, may limit its widespread implementation.

“One of the first steps involved in the AI implementation process is to identify stakeholders,” Bahl says. “Stakeholders can be a large group — including end users like radiologists, other clinicians, technologists, clinical leadership, IT staff, data scientists, AI experts, compliance and legal representatives and ethics experts.”

One challenge is that not all of the various players will be convinced of the need for AI. “We have yet to demonstrate the ROI to some stakeholders because AI research in breast imaging up to this point is largely based on retrospective reader studies and retrospective simulation studies, and we haven't yet studied the impact of AI on what is most important to us, which is patient outcomes,” Bahl says.

“AI in breast imaging may help us improve patient outcomes through higher cancer detection rates, lower false-negative rates and lower false-positive rates,” she says. “AI could also improve our efficiency by detecting and characterizing lesions, auto-reporting normal exams and prepopulating reports.”

Smaller operations can be more nimble in deciding to adopt AI. But they will face other challenges, she says: “Many practices may not currently have the capability to support and manage AI in a scalable and sustainable manner. An individual practice, healthcare organization or enterprise must have processes in place for algorithm selection, workflow integration and quality assurance.” In some cases, a hybrid governance structure involving both the radiology practice and hospital leadership may be appropriate, she adds.

“There are certainly differences among practice types. In terms of barriers, implementation costs can be high for an AI product. Also, the fee structure for many AI products is based on use, although flat rates are available,” Bahl says. In addition to investing in the product, other costs to consider include infrastructure updates or improvements, as well as product user training.

RADIOLOGY AND COORDINATION

Radiology Partners in El Segundo, Calif., deployed AI algorithms several years ago and has seen positive results. “I am a huge fan of AI,” says Nina E. Kottler, MD, MS, the practice’s associate chief medical officer in clinical AI and an associate fellow at the Stanford Center for AI in Medicine & Imaging. “We have gained experience in identifying use cases that provide value for our patients and our practice, selecting AI products, piloting vendors’ algorithms, and creating our own,” Kottler says.

“Our radiologists have adapted to using AI tools and have integrated them into their clinical workflow,” Kottler says. “The business case for implementing AI is not as variable from practice to practice as you might think. Variability comes into play when you are talking to different radiology stakeholders — a radiology practice versus a fee-for-service hospital system versus an integrated, value-based hospital system versus an outpatient imaging center. Value is in the eye of the beholder.”

Beyond finding a business case to afford the cost of innovation, groups need to be prepared for the next steps. AI models need to be evaluated and implemented. There are two main components to any AI implementation, and both should be evaluated before a vendor and model are chosen: technical and clinical.

“The technical implementation means you must figure out a way to get the right data into the right AI algorithm, then to get the relevant AI results to the relevant clinical applications so they are integrated with the radiologist workflow,” she says. “This process needs to happen before the radiologist opens the study.” Participating in a pilot gives you time to figure out that process.

The second part is the clinical implementation, which involves evaluating the accuracy of the AI system to ensure the product is high-quality for your patients and that the radiologists will use it. The work needed to optimize engagement by radiologists is generally underestimated. With any new tool, you must get radiologists to accept it, and there is a lot of education and change management that goes along with that process, Kottler says.

“It is best to jump in and get your feet wet,” she says. It is also helpful to talk to people who have implemented AI to see what technical and clinical barriers they needed to overcome, especially if that group is on a scale similar to your own, Kottler suggests. “Try using standards so that everyone isn’t recreating the wheel, so to speak. And have an AI champion in your practice who can begin to collect teaching cases for continuous education and the measurement of potential ROI.”

Kottler says there are two aspects of AI she’s excited about. “One is something that we can do today — using AI to standardize our data,” she says. “Humans are highly variable, and because humans are creating our healthcare data, it tends to be variable and unstructured. Computer systems, on the other hand, are highly structured. A combination of computer vision and natural language processing, or NLP, can be applied to structure unstructured data and map it to a national standard.”

If you think about how AI fundamentally works, you will find it’s good at adding structure to unstructured data, she says. “We looked at our database, maybe two years ago, of how many study names we had, across our thousands of hospitals, for an X-ray of the wrist. We thought it would be maybe 100, but it turns out we have more than 500 different ways of naming a wrist X-ray,” Kottler says.

Imagine the exponential number of ways you could name each series of a CT scan, she says. “It’s the Wild West out there in terms of series names.” AI could help standardize not just the procedure name, but the series names as well, she says.

The other aspect Kottler finds exciting is the idea of using AI to do things humans can’t do today. That can mean everything from using imaging (pixel data) to make personalized predictions for malignancy risk on screening examinations, to combining information from genomics, radiomics, molecular imaging and other data to provide a personalized lesion evaluation and optimal treatment options.

“Regardless of the use case, I see AI augmenting the human,” Kottler says. “I am a strong believer that the integration of human and AI is our future, period.”

Those in the know about the technology are putting out a call to action that everyone in the specialty should be learning about AI tools. “If we are not doing that, we are not going to be in control of our own future,” she says. “AI is not replacing us, but it is going to fundamentally change how we practice.”

ENDNOTES

1. Bahl M. Artificial intelligence in clinical practice: implementation considerations and barriers. J Breast Imaging. 2022;4(6):632–639.

Author: Chad E. Hudnall, senior writer, ACR Press