September 10, 2021

AI-Powered Best Practice Recommendation Program

One radiology group leveraged artificial intelligence to significantly improve adherence to evidence-based guidelines.
  • Radiology Partners Inc. wanted to expand its best practice recommendation (BPR) program and knew it needed artificial intelligence (AI) to help.
  • With no data scientists in house at the time, the radiologists worked with an outside vendor before adding an in-house team.
  • The practice now has AI algorithms for seven BPRs, helping it achieve 100% adherence to some guidelines and more than 90% adherence to others.

In 2015, Radiology Partners Inc. began rolling out a best practice recommendation (BPR) program to help its radiologists more consistently use evidence-based guidelines, such as the ACR Appropriateness Criteria®, to make population-health-focused follow-up care recommendations. The group started with just three BPRs, which the radiologists manually referenced each time they read a corresponding case. The approach reduced reporting variability, as the radiologists made follow-up care recommendations based on guidelines known to improve care and decrease costs, but it was labor intensive.

The group knew that if it wanted to expand the program to include more BPRs, its radiologists would need help. That’s when it turned to artificial intelligence (AI). “We wanted the radiologists to have a digital assistant to help them use and apply the BPRs as we scaled the program,” says Nina Kottler, MD, MS, associate chief medical officer for clinical AI and vice president of clinical operations at Radiology Partners. “That meant creating an AI program that uses natural language processing to understand what the radiologists are saying as they dictate their reports and automatically identify the appropriate follow-up recommendations for each pathology. We looked around, and that kind of AI system didn’t exist, so we decided to create it.”
Nina Kottler, MD, MS, associate chief medical officer for clinical artificial intelligence (AI) and vice president of clinical operations at Radiology Partners, has led the integration of AI throughout the practice.
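To make the idea concrete, here is a minimal, hypothetical sketch of the kind of rule-based natural language processing step Kottler describes: scan the dictated text for a finding and surface matching follow-up language. The regular expression, the 1 cm size threshold, and the recommendation wording are illustrative assumptions only, not Radiology Partners' algorithm or actual guideline criteria; a production system would rely on trained NLP models rather than a single pattern.

```python
import re
from typing import Optional

# Hypothetical rule for illustration: flag the incidental thyroid nodule BPR when
# the dictation mentions a thyroid nodule with a measured size. The pattern,
# threshold, and recommendation text are NOT actual guidelines.
THYROID_PATTERN = re.compile(
    r"thyroid\s+nodule[^.]*?(\d+(?:\.\d+)?)\s*(mm|cm)", re.IGNORECASE
)

def suggest_thyroid_bpr(report_text: str) -> Optional[str]:
    """Return a follow-up suggestion if the dictation matches the rule, else None."""
    match = THYROID_PATTERN.search(report_text)
    if not match:
        return None
    size, unit = float(match.group(1)), match.group(2).lower()
    size_cm = size / 10 if unit == "mm" else size
    if size_cm >= 1.0:  # hypothetical threshold
        return (
            f"Incidental thyroid nodule measuring {size_cm:.1f} cm: "
            "consider dedicated thyroid ultrasound per the practice BPR."
        )
    return "Incidental thyroid nodule below threshold: no follow-up suggested."

if __name__ == "__main__":
    dictation = "There is an incidental thyroid nodule measuring 14 mm in the right lobe."
    print(suggest_thyroid_bpr(dictation))
```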
With the help of a team of data scientist consultants, the group began developing and implementing the AI system in 2017. Since then, the system has allowed the group to expand its BPR program to include seven recommendations that address abdominal aortic aneurysm, lung nodules, incidental thyroid nodules, ovarian cysts, inferior vena cava filters, COVID-19, and adrenal nodules. The program has helped the group increase adherence to some guidelines by more than 80 percentage points, results that are encouraging the group to further expand the program and its use of clinical AI.

“We’re using evidence-based research to determine which follow-up recommendations our radiologists should provide to referring clinicians and, ultimately, to patients. This widespread standardization of care wouldn’t be possible without AI,” says Kelly Denney, who started out as an AI consultant on the project and is now Radiology Partners’ director of data science and clinical analytics. “The technology gives our radiologists the extra support they need to drive added value in radiology.”

Forming a Partnership

When Radiology Partners decided that it needed AI to expand its BPR program, it didn’t have any data scientists on staff. Instead, the group’s information technology team enlisted the help of a vendor. Working with the consultants, Kottler and her colleagues explained that they needed an AI system to identify the appropriate BPR based on each radiologist’s dictation. To start, the consultants worked on a BPR for incidental thyroid nodules, but things didn’t go as expected. “When the consultants showed us their initial presentation and the direction they were headed, I said, ‘This is all wrong,’” Kottler explains. “I talked to the head of our IT team and told him it wasn’t working and that we were going to have to fire the vendor because they didn’t get it.”

Before dismissing the vendor and starting over, however, Kottler reconsidered how her team had been working with the consultants. “I realized that they are brilliant data scientists, but they don’t know anything about healthcare or radiology, and they probably don’t even know what the thyroid does,” Kottler says. “Instead of giving them requirements and telling them what to do, we realized that we had to create a partnership with them, so I actually flew out to their office in Columbus, Ohio, and spent time talking with them about how we work and what we were trying to achieve.”
Kelly Denney, director of data science and clinical analytics for Radiology Partners, is part of a team that helped build the group’s BPR algorithms.
Over three days, Kottler taught the consultants about anatomy, physiology, and thyroid function. She also discussed the language that radiologists use when dictating their reports and reviewed the best practice guidelines, why they were created, and their purpose. “That gave the consultants an idea of why we were creating this AI system, which was way more important than telling them specifically what to create,” Kottler says. In turn, the consultants taught the radiologists more about how natural language processing and AI work. “We all learned a lot about the necessary components to build a successful solution,” Denney says.

From there, the consulting team delivered an AI algorithm for the incidental thyroid nodule BPR that exceeded the radiologists’ expectations. “We decided from then on that was how we were going to work with the consulting team: Either I or one of my colleagues would spend a few days with them before we even started on a new BPR, orienting them with the anatomy and walking them through the best practice and why it was created,” Kottler says. “It was an iterative process of educating them about what we were doing and the meaning behind it so that they could apply that understanding to create an AI system that provided value to our work.”

The approach was such a success that after a year of working with the radiology team on a consulting basis, the AI team asked to join the radiology practice full time. “We asked Radiology Partners to hire us to continue the work on the tool that we’d created,” explains Denney, adding that the tool requires long-term management for sustained accuracy and updates as the BPRs evolve. “The collaborative working relationship we’d built with Nina and others from Radiology Partners was exactly the type of culture that made us want to come to work every day because it wasn’t really work — it was fun and rewarding.”

Radiology Partners agreed to hire the team. It now has eight data scientists and four clinical analysts on staff, and it plans to add more to further expand the BPR program. “Our chief executive officer and chief operating officer are strong believers in the value of the AI tool and understand the value of having a team of people on staff who are familiar with the tool and who can maintain it and update it appropriately,” says Kottler, noting that all AI tools require maintenance. “The business case was clear: The tool helps our radiologists adhere to our BPRs, resulting in improved population-health management and better patient follow-up.”

Integrating the Algorithms

Once Kottler and the consultants successfully developed the first AI algorithm, they turned their attention to integrating the tool into the radiologists’ workflow. This involved working with the radiology group’s IT team to ensure that the radiology workstations had the proper permissions to run the AI algorithm. “We needed to send the algorithm out to the workstations and incorporate it into the radiologists’ startup menu,” Kottler explains. “You would think that would be easy, but it actually takes some backend work to make sure your architecture is set up to deploy the algorithm. You have to get the tool on their workstations before you can roll it out clinically.”

As the team integrated the algorithm into one of Radiology Partners’ practices, Kottler began educating the radiologists about the tool. To start, she delivered a presentation to the radiology leaders at that practice. During the recorded presentation, Kottler walked through slides, gave an overview of the technology, made the case for why the group needed it, and answered questions. After that, she shared the recording with her team and then worked with them to conduct one-on-one, in-person training with each radiologist, making sure to follow up a few days later to see how they were doing. The approach worked well but required a lot of time and resources, Kottler says.
Kent Hutson, MD, CPE, director of innovation clinical operations for Radiology Partners, says that the algorithms integrate seamlessly into the radiologists’ workflow for improved patient care.
To make the training more manageable across Radiology Partners’ network of more than 60 local practices, especially during the COVID-19 pandemic, Kottler and her team began using a virtual meeting platform to train trainers, who would in turn train the radiologists as the group deployed its tool to more of its local practices. “We combined a clinical trainer with an IT trainer, and they worked remotely to train each radiologist on their actual workstations,” Kottler says. Now that the radiologists are generally familiar with the tool, Kottler and her team are moving away from formal training sessions toward training videos that the radiologists watch on their own. “Each video is five to seven minutes long, and each one demonstrates how a certain functionality of the AI tool works and what assumptions the tool is making,” Kottler explains. “Then after the radiologists pass a quiz, we roll the tool out for in-practice use.”

Generating Buy-In

As Kottler and her team deploy each additional algorithm, they run a pilot program to gather feedback from the radiologists about the tool’s performance, accuracy, and efficiency. A feedback mechanism within the AI system makes it easy for the radiologists to comment on the algorithms. “When we get that feedback, we reply to the radiologists to let them know that we’ve heard them and that we will make any necessary changes,” Kottler says. “We also conduct a one-question survey that asks the radiologists to rate from zero to 10 how likely they are to recommend the product to a friend, client, or colleague. We’ve gotten really high scores from the radiologists.”
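The one-question survey Kottler mentions is the standard 0-to-10 “likelihood to recommend” question. The article reports only that scores were high, so the sketch below simply illustrates one common way such ratings are summarized as a Net Promoter Score; the example ratings are invented for illustration.

```python
from typing import Iterable

def net_promoter_score(ratings: Iterable[int]) -> float:
    """Summarize 0-10 'likelihood to recommend' ratings as a Net Promoter Score.

    By convention, 9-10 are promoters, 7-8 are passives, and 0-6 are detractors;
    the score is the percentage of promoters minus the percentage of detractors.
    """
    ratings = list(ratings)
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical ratings for illustration only.
print(net_promoter_score([10, 9, 9, 8, 10, 7, 9, 10]))  # 75.0
```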

In addition to soliciting feedback from the radiologists, Kottler and her team work with radiologist champions at each of Radiology Partners’ practices. To recruit these champions, Kottler and her team share a list of responsibilities with practice leaders and ask them to identify radiologists who they think would be a good fit for the role. The responsibilities include educating radiologists about the available programs, encouraging the radiologists to use the AI algorithms, helping to track how well each radiologist is adhering to the BPRs, and following up with radiologists who fail to consistently apply the BPRs.

“People tend to respect and listen to the people they work with directly, so you need local practice leaders to follow up with the radiologists to make sure they’re using the algorithms,” Kottler says. “To make them effective, you need to arm those champions with information about who is and who is not using the tool, specifically for those rads who could improve on BPR adherence. Then, the champions can encourage those rads to use the tool. This type of local rad-to-rad discussion tends to be very effective.”

For the radiologists, using the tool is relatively simple, says Kent Hutson, MD, CPE, director of innovation clinical operations for Radiology Partners. As the radiologists dictate their reports, the algorithms work in the background, listening for terms that correlate with the BPRs. Depending on the radiologist’s preferences, the tool’s window will slide into view as soon as it identifies a correlating BPR, or it will wait until the radiologist gets to the impression section of their report and then pop up with the BPR follow-up information. “At that point, the radiologist looks at the recommendation and clicks a thumbs up button, indicating that they agree with the algorithm, before inserting the recommendation into their report. If they disagree with the generated recommendation, they click a thumbs down button, and leave the information out of their report,” Hutson explains.
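Hutson’s description boils down to an accept-or-reject step at the impression stage. The sketch below is a schematic stand-in for that interaction, not the actual tool’s interface or API; the class, field, and function names, as well as the sample recommendation text, are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class BprSuggestion:
    """A follow-up recommendation surfaced while the radiologist dictates (hypothetical model)."""
    bpr_name: str        # e.g., "Abdominal aortic aneurysm"
    recommendation: str  # follow-up language the tool proposes

def resolve_suggestion(suggestion: BprSuggestion, accepted: bool, impression: str) -> str:
    """Mirror the thumbs-up/thumbs-down step: append the recommendation on accept,
    leave the impression unchanged on reject (a real system would also log the
    rejection as feedback for the AI team)."""
    if accepted:
        return f"{impression}\n{suggestion.recommendation}"
    return impression

if __name__ == "__main__":
    suggestion = BprSuggestion(
        bpr_name="Abdominal aortic aneurysm",
        recommendation="Recommend follow-up ultrasound in 12 months per the practice BPR.",
    )
    print(resolve_suggestion(suggestion, accepted=True,
                             impression="IMPRESSION: 3.2 cm infrarenal abdominal aortic aneurysm."))
```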

Focusing on Impact

The BPR program has helped the radiologists leverage evidence-based guidelines more consistently. For instance, Kottler says that before the first Radiology Partners practice instituted the program, its radiologists followed the abdominal aortic aneurysm guidance 4% of the time, the ovarian cyst guidance 4% of the time, and the incidental thyroid nodule guidance 56% of the time. Two weeks after the first practice implemented the AI-enabled BPR program, that group's adherence to the BPR for abdominal aortic aneurysm increased to 92%, adherence to the BPR for ovarian cysts increased to 100%, and adherence to the BPR for thyroid nodules increased to 99%. “This program is helping us reduce variation and improve quality across the board for all of our radiologists,” Hutson says. “These value-enabling activities are what we need to pursue in healthcare. That’s where we want to go with medicine in general. We want quality improvement initiatives that are evidence based and applied transparently so that clinicians can act on the best evidence.”
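Read as percentage-point changes in guideline adherence, the before-and-after figures in the paragraph above look like this. The short script simply restates the article’s numbers; how the practice actually counts eligible and adherent cases is not detailed here.

```python
# Before/after adherence rates reported in the article (percent of eligible cases).
adherence = {
    "Abdominal aortic aneurysm": (4, 92),
    "Ovarian cyst": (4, 100),
    "Incidental thyroid nodule": (56, 99),
}

for bpr, (before, after) in adherence.items():
    print(f"{bpr}: {before}% -> {after}% (+{after - before} percentage points)")
```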

Recognizing that these results would have been unlikely without AI, Radiology Partners began expanding its use of the technology in 2019. While the group’s internal AI team focused on the BPR program, Radiology Partners worked with a vendor to pilot two algorithms for detecting and triaging intracranial hemorrhage and pulmonary embolism, two potentially life-threatening conditions.

The results of the pilot showed that the algorithms helped the radiologists detect 2.4% more intracranial hemorrhage findings and 4.4% more pulmonary embolism findings, and the radiologists gave the tools a satisfaction rating of 8.7 out of 10. “We ran both of those detection algorithms over six months across a large part of our practice,” Kottler says. “We saw great results across the board and decided that this was something that we wanted to make available to everyone.” In the spring of 2021, the group entered into a contract to offer the vendor’s seven FDA-cleared algorithms across its network. (For a list of FDA-cleared algorithms, visit the ACR Data Science Institute’s AI Central.)

While these new algorithms assist with image interpretation, Radiology Partners is also adding AI for non-interpretive activities. For instance, the group is piloting a natural language processing algorithm that automatically creates the impression section of radiology reports for X-ray, CT, and MRI, as well as an algorithm designed to help the radiologists provide more coordinated care by ensuring that cases requiring follow-up actually receive it. “Up to this point, we’ve been mainly focused on using AI as an assistant to help the radiologist as they interpret exams. But now we’re looking forward and backward in that workflow, from the order all the way through to the interpretation and then even follow-up and peer learning, to see where else AI can drive value,” Denney explains. “For AI to be valuable, it has to improve patient care. It’s not just a cool use of technology; it’s what matters to our patients.”

With this in mind, Kottler encourages more radiologists to get comfortable using AI to ensure that these tools have a positive impact on patient care. “This will change our profession,” she says. “As radiologists, we will become the information experts who provide context for the massive amount of data that AI and other information-extracting technologies (radiomics, genomics, molecular imaging, etc.) present and who make the data actionable for referring physicians and patients. We should be driving that change. To do that, we must take the wheel and embrace this technology because we can’t drive anything from the backseat.”

Creative Commons

AI-Powered Best Practice Recommendation Program by American College of Radiology is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Based on a work at www.acr.org/imaging3. Permissions beyond the scope of this license may be available at www.acr.org/Legal.


Share Your Story

Have a case study idea you’d like to share with the radiology community? Share it with the Imaging 3.0 team at imaging3@acr.org.

Now It's Your Turn

Follow these steps to begin integrating AI into your group’s workflow for improved patient care, and tell us how you did at imaging3@acr.org or on Twitter at #Imaging3.

  • Identify a use case for AI in your practice and look to see whether the tools exist for your use case.
  • Collaborate closely with data scientists to ensure the AI tools provide the value you need to improve patient care.
  • When implementing algorithms, collect radiologist feedback and respond to the feedback to help generate buy-in.

Author

Jenny Jones, Imaging 3.0 Manager
