ACR Bulletin

Understanding Your Data: The Next Step To Better Quality
Tapping into the ACR's National Radiology Data Registry (NRDR) network unlocks benchmarks for the best patient care.

Without participating in the NRDR’s registries, we would have no benchmarks.

—Vikki M. Casey, BS, CPHQ
March 01, 2024

There is always room for operational improvement. In medicine, paying attention to what could and should be done better can be the difference between exceptional patient care and potentially harmful outcomes. Sharing best practices, lessons learned and focused strategies around quality improvement is critical to honing processes and performance. For siloed radiology groups operating as if they are on an island without accessible data, benchmarks can build bridges.

Data registries are often overlooked or underused by radiologists who want to optimize quality patient care. The ACR National Radiology Data Registry® (NRDR®) has served for years as a data warehouse to assist participating facilities with their quality improvement and performance efforts by comparing their data with that of other radiology sites, both locally and nationally. Contributing your own data and gaining access to what other facilities are doing can prove invaluable when establishing and fine-tuning care delivery and quality improvement projects.

The NRDR houses the Dose Index Registry (DIR), CT Colonography Registry (CTC), Lung Cancer Screening Registry (LCSR), National Mammography Database (NMD), 3D Printing Registry (3DP), General Radiology Improvement Database (GRID), Clinical Decision Support Registry (CDSR) and the Qualified Clinical Data Registry (QCDR). One of the NRDR’s primary goals is to increase the value of participation in these registries through easy access and a friendly user experience that reveals the power of collective data feedback.

A facility’s data, once integrated into the registries, is made available for exploring a wide range of questions about clinical and operational procedures as radiologists look to identify new areas of quality improvement. 

Before diving into the data, there are a few questions you should ask: How do we monitor and execute our existing performance-based programs? Do we have dedicated quality improvement staff? Who on staff could best use registry data to flag potential issues and facilitate solutions? Which registry or registries potentially match our needs?

Getting Started

“When people talk about quality, data and registries, it can conjure up an image of a lot of work with questionable benefits. Radiologists and their teams tend to focus primarily on the delivery of clinical care to patients, with fewer resources focused on reviewing their performance to either validate the quality of that care or to find gaps on which to focus improvement efforts,” says Ella A. Kazerooni, MD, MS, FACR, NRDR steering committee chair and professor of radiology and internal medicine at the University of Michigan. “I want to flip the script on that, because ensuring the quality of what you do is fundamental to your job and to delivering high-quality, safe care to our patients. It’s not an add-on, it’s not a sidecar. If you aren’t validating the quality of care you are providing, you don’t know how good it is or where you may have gaps in quality that can lead to patient harm.”

I believe that operational excellence, or being outstanding at what we do in radiology, is not viable unless you are measuring your performance and comparing data.

—Ben Wandtke, MD, MS

You need to determine what is right for your practice in terms of registry use, Kazerooni says. Figure out who you should involve from your team to look at the data and how it might make a positive difference. “Our goal is to bring registry data closer to the practices, not make it seem like a huge and distant administrative burden. It is intended to give them real-time information and interaction and tools for quality improvement,” she says.

“Using the Lung Cancer Screening Registry, for example, helped our own practice identify outliers in radiation exposure,” Kazerooni says. “Instead of making it a human problem, we asked questions and problem-solved as a team, developing structural changes in CT technologist workflow combined with education. Registry data can show us how we are performing in getting patients back for their annual lung cancer screenings compared to other practices on a national or regional level, or matched to practice type. The new ACR QI step-by-step templates are designed around key performance indicators like this to help practices get those numbers up so that the value of annual screening can be realized for our patients and their families.”

Kazerooni recalls participating in a statewide Blue Cross Blue Shield of Michigan cardiac CT registry in the early days of coronary CT that helped track factors such as radiation dose and appropriate use. “You don’t see radiation, because it’s odorless and tasteless, unless you’re watching the numbers,” she recalls. “By looking at the radiation doses sent into the registry from practices across the state, we identified a practice that had higher-than-expected radiation doses, but they didn’t know because they weren’t looking. When we found this, we were able to work with physicians and staff at that site and their CT manufacturer to put in place an imaging protocol within the recommended levels. That was one of my first big aha moments with registry data — if you don’t look, you don’t know, and you may be placing patients at risk of harm.”

Kazerooni recognizes that different registries require different amounts of time and work for participation — and that can be a concern for practices that might be interested in submitting and using the benchmarking data. “The Dose Index Registry is probably the easiest one for practices to get started with because it basically sets things up so that their CT scanners automatically send information to the registry,” she says. “It’s not a manual collect, enter and send process, so there is a low barrier to participation and it uses fewer staff resources.”
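
For a sense of what that automated flow involves, here is a minimal Python sketch that walks a CT Radiation Dose Structured Report (RDSR), the file a scanner produces after each exam, and pulls out the dose indices a registry submission carries. The file name and the extracted concept names are illustrative, and in practice sites rely on ACR-provided transfer software rather than scripts like this:

    from pydicom import dcmread

    def walk_content(items, found):
        """Recursively collect named numeric measurements from an SR content tree."""
        for item in items:
            name_seq = getattr(item, "ConceptNameCodeSequence", None)
            value_seq = getattr(item, "MeasuredValueSequence", None)
            if name_seq and value_seq:
                name = name_seq[0].CodeMeaning            # e.g., "Mean CTDIvol", "DLP"
                found.setdefault(name, []).append(float(value_seq[0].NumericValue))
            if "ContentSequence" in item:                 # descend into nested containers
                walk_content(item.ContentSequence, found)
        return found

    rdsr = dcmread("ct_dose_report.dcm")                  # hypothetical RDSR file
    indices = walk_content(rdsr.ContentSequence, {})
    for name in ("Mean CTDIvol", "DLP"):
        print(name, indices.get(name))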

Other registries require more human investment. “You will need to look at and collect your own data for the Lung Cancer Screening Registry and the National Mammography Database, for example,” Kazerooni says. “But there are tools to help you send data to the ACR. There are EHRs that can help and IT tools you can purchase for lung screening, for instance, that we have validated to ensure that they are clinically useful in tracking patient screening and submitting information to the registries.”

In February, the ACR secured a $100,000 planning grant to improve diagnostic performance feedback through its clinical registries. The College plans to use the award to develop a registry, built on the existing LCSR, focused on the management of actionable incidentally detected pulmonary nodules (AIPNs). Lung cancer screening and AIPN management are complementary strategies for the early detection of lung cancer.

Making use of these registries is an investment in engaging the people who can help you improve care delivery and quality and safety practices, Kazerooni says. “Still, it’s one thing to get your data into a registry and another to use what you find from other facilities to change something around quality improvement based on your findings. Sometimes there is no team member specifically responsible for translating benchmark data into quality improvement — and physicians often don’t have the time.”

Defining Dose

Because state health departments oversee radiation safety and maintain radiation dose requirements and protocols, staying on top of dose benchmarks is fundamental to the quality and safety of radiology. “This has been probably the most active and exciting time for the DIR since it was originally developed. We have just added fluoroscopy as the newest DIR module — interventional fluoroscopy — and it is fully up and running now,” says A. Kyle Jones, PhD, steering committee chair for the Dose Index Registry and a professor in the Department of Imaging Physics at the University of Texas MD Anderson Cancer Center.

DIR data are used to establish national dose index benchmarks, and the registry also supports reporting requirements for the Merit-based Incentive Payment System (MIPS). The DIR provides feedback reports that allow facilities to compare their CT, fluoroscopy and digital radiography dose indices with those of all DIR participants by standardized exam, procedure name and age group.
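
As a toy illustration of the comparison those feedback reports make, the following Python sketch computes percentile benchmarks of a dose index grouped by exam name and age group, then lines one facility's medians up against them. The CSV file and column names are hypothetical; actual DIR reports are produced by the ACR from pooled registry data.

    import pandas as pd

    exams = pd.read_csv("dir_extract.csv")   # columns: exam_name, age_group, ctdivol_mgy, facility_id

    # Pooled benchmarks: median and 75th percentile per exam/age stratum.
    benchmarks = (
        exams.groupby(["exam_name", "age_group"])["ctdivol_mgy"]
             .quantile([0.5, 0.75])
             .unstack()
             .rename(columns={0.5: "median", 0.75: "p75"})
    )

    # One facility's medians alongside the pooled benchmarks.
    ours = (
        exams[exams["facility_id"] == "OUR_SITE"]
        .groupby(["exam_name", "age_group"])["ctdivol_mgy"]
        .median()
        .rename("our_median")
    )
    print(benchmarks.join(ours))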

Jones says they are also nearing the end of the diagnostic fluoroscopy pilot and the pilot for digital radiography, which will become the newest modules added to the DIR. “We have gone from one module, which was CT four years ago, to three new ones — with the two pilots expected to be up and running by the end of spring. We’re in the process of developing a pilot for nuclear medicine as well,” Jones says.

“We are not only expanding the registry, but improving the quality,” Jones says. “The CT group is working on adding support for combined procedures, which has long been kind of a pain point for people participating in the registry. They are also working on introducing indication-based common IDs, which would be a major shift.” 

Right now, the common ID is based on how a procedure is named at participating sites, Jones says. The shift to indication-based IDs would make the data much more useful. “If you think of a CT of the abdomen or pelvis, for example, it could be routine or it could be a liver protocol at a cancer center,” Jones says. “For those two indications, the way that you would perform the CT is very different — with very different expectations for the dose indices. You can imagine that it will be much more useful for sites to be able to map their data and analyze it along that indication dimension.”
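
A small sketch of why that shift matters, with all names and IDs invented for illustration: under name-based mapping, two local exams with very different dose expectations collapse into one benchmark stratum, while an indication-aware mapping keeps them apart.

    # Name-based mapping: both local exams fall into the same stratum.
    name_based = {
        "CT ABD PELVIS W": "CT-ABD-PELVIS",
        "CT LIVER 3-PHASE": "CT-ABD-PELVIS",   # same body region, very different dose profile
    }

    # Indication-based mapping: (procedure, indication) pairs stay distinct.
    indication_based = {
        ("CT ABD PELVIS W", "routine"):       "CT-ABD-PELVIS-ROUTINE",
        ("CT LIVER 3-PHASE", "liver lesion"): "CT-ABD-PELVIS-LIVER-PROTOCOL",
    }

    print(indication_based[("CT LIVER 3-PHASE", "liver lesion")])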

Jones says the CT group has also been considering how image quality metrics could be incorporated into the registry data and is working to establish bidirectional communication within the DIR. “We want the data to be more user-friendly and accessible, and we want to be more flexible by giving participating sites more freedom in how they use the benchmark data that is available,” he says. An example of bidirectional communication would be allowing a site to display benchmarks from the registry within its own internal system applications — in the context of its own site-specific data.
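
From the site side, that kind of bidirectional exchange could look like the hedged Python sketch below: pull benchmark values from a registry API and flag local numbers against them inside an in-house dashboard. The endpoint, token and JSON shape are all hypothetical; the DIR does not currently publish this exact interface.

    import requests

    API = "https://registry.example.org/api/benchmarks"    # hypothetical endpoint

    resp = requests.get(
        API,
        params={"exam": "CT-ABD-PELVIS-ROUTINE", "metric": "ctdivol_mgy"},
        headers={"Authorization": "Bearer <site-token>"},  # placeholder credential
        timeout=30,
    )
    resp.raise_for_status()
    bench = resp.json()                                    # e.g., {"median": 11.2, "p75": 14.8}

    our_median = 13.6                                      # from the site's own dose software
    flag = "above p75" if our_median > bench["p75"] else "within range"
    print(f"Our median {our_median} vs. pooled median {bench['median']}: {flag}")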

“What we’re really working toward here is making the experience as automated as possible, and then making the analysis as meaningful as possible,” Jones says. “We are doing really well on the second part, and we’re making good progress on the first part. Once the DIR is completely automated from beginning to end, beyond making initial connections on the facility side, the mapping and quality control is all done on the ACR side. This would make the DIR unique and greatly grow the user base for its benchmark data.”

Capturing Benchmarks

The registries can be used to compare data across subspecialties, geographic regions or even branches of the same organization. 

“Without participating in the NRDR’s registries, we would have no benchmarks,” says Vikki M. Casey, BS, CPHQ, safety and quality project manager for Providence Health & Services (Oregon region) and an active member of the ACR’s NRDR committee. “The reports we get through the NRDR registries are invaluable.”

The GRID, for example, provides aggregated information that allows participating facilities to compare turnaround times, patient wait times, incident rates and other process and outcome measures with other facilities and practices of comparable size and type. 

“Before the GRID report, it would take me days to put together information for our 13 facilities,” Casey says. “Now it is just a matter of extracting the data from the registry. I have what I need within 24 hours, and we have been on autopilot here since our initial registry set-up. It is incredibly significant that the data reporting allows me to look across all of my facilities, showing me the big picture. This is not like back in the day when I was creating PDFs for each facility to compare them.” 

The registry reports are especially important when dealing with the four radiology groups that serve facilities in the Oregon region, Casey says. “From an operations perspective, I can look across the entire region — and I can choose filters to dial it down to only the facilities that a radiology group may be interested in and run a data report, flawlessly,” she says.
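
The kind of slicing Casey describes is straightforward to picture. This sketch filters one region's GRID-style extract down to the facilities a particular radiology group covers, then computes per-facility median turnaround times; the file, column names and facility codes are invented for illustration.

    import pandas as pd

    grid = pd.read_csv("grid_extract.csv")     # columns: facility, modality, tat_hours

    group_a_sites = {"FAC-01", "FAC-04", "FAC-09"}         # hypothetical coverage list
    subset = grid[grid["facility"].isin(group_a_sites)]

    # Per-facility, per-modality median turnaround for the group's report.
    print(subset.groupby(["facility", "modality"])["tat_hours"].median())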

Internally, the reports create an opportunity to address areas of improvement. “We might hear that there was some kind of event — that CTs are taking an hour, as an example — and I can pull up those reports and demonstrate what other facilities are doing compared to how we’re doing it,” Casey says. These findings can also serve to keep you on track, she says. “I can go back to my radiology groups, and say, ‘From 2019 to 2023 we really decreased our times and were exceeding most of the registry participants in terms of turnaround time. Now we’re kind of sliding.’ That sort of discovery starts a conversation that may never have happened.”

NRDR participation depends a great deal on a correct understanding of the registries’ value and ease of use, Casey says. “The ease-of-use part may seem intimidating, but it’s not. The ACR has brought the use of registries for quality improvement a long way over the past 15 years,” she says. “The College not only provides documentation for participation, it produces webinars and someone from the College is always available to help. The support is top-notch, and you never feel like you’re trying to work through the process on your own.”

Rethinking Quality

Staying on top of quality improvement hinges on using all resources at your disposal — and realizing that your situation is probably not unique. “I know a lot of us are striving to be the best, but you should focus on projects that are meaningful across the specialty — efforts that likely address the biggest issues we’re all facing,” says Ben C. Wandtke, MD, MS, chief of diagnostic imaging at FF Thompson Hospital and associate professor in the Department of Imaging Sciences at University of Rochester Medical Center. Wandtke is also director of the UR Medicine CT lung screening program and vice chair of quality and safety.

“Registry participation provides a great opportunity for the field to standardize and to make performance improvement more efficient by measuring ourselves against each other in meaningful, important areas,” Wandtke says. 

“My first experience with the NRDR was about a decade ago, using the DIR. At the time, we were trying to optimize CT radiation dose for our patients, but it was difficult in the absence of benchmark data to see what was good enough and what was best practice,” Wandtke recalls. “We tried to optimize our CT protocols without the DIR, and we were simply guessing. Once we started participating in the registry and looking at benchmark data from similar groups that were dose conscious, it gave us a tremendous amount of confidence and guidance — to better understand what we were doing and how to be successful.”

Beyond the DIR, Wandtke’s group has participated in the NRDR lung cancer screening registry for about eight years. As with the other registries, reports come out each quarter and are promptly reviewed by the group’s lung cancer screening team. 

“I’ll tell you as vice chair for quality and safety, we do a lot of performance improvement projects,” Wandtke says. “One of the things we have learned is that you can spend about half of your time on a project just trying to gather accurate data. The registries provide quality-controlled and updated data. You don’t have to waste your time on data gathering when using the NRDR, freeing up more time for your improvement teams to study the problem and implement changes.”

When you embark on performance improvement projects related to metrics and data that are coming out of the ACR-supported registries — whether it’s lung cancer screening, the DIR or the GRID registry — the data is there for you, it is benchmarked, and you can set goals more easily, Wandtke says. “It helps you evaluate your current state of operations, think about the root causes and find opportunities for potential interventions for making positive changes.”

Because registry data can be filtered, participating facilities can compare what they are doing to the activities of other sites that are relative peers. “I’m in a rural environment in a smaller hospital setting, for instance,” Wandtke says. “You can filter the data and break it down by large academic health systems or those in more urban areas. This is tremendously powerful.”

While there are small differences and variations around the country based on location or practice style, radiology groups are largely dealing with many of the same problems, Wandtke says. “If you have been active in the quality community or operations community of radiology for a while, you realize that we are all dealing with a similar set of challenges. Once you start to measure performance in these areas and compare them to other sites, it can really open your eyes — you cannot hide from the fact that you may not be performing at an optimal level in a certain area,” he says. “And if you want to hang your hat on being the best in the country at something, you’d better have the data to prove it.”

Being on top should never be assumed, Wandtke says. As Casey pointed out, the registries can help your group avoid a backslide after enjoying long-term performance improvements. “It is very common that you improve a process by 30% or 50% while you are actively working on it,” Wandtke says. “Then you shift your attention to the next project and may neglect monitoring the improvements you’ve made. If you check the data a year later, you might be right back where you started.”
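
Automating that year-later check is simple once quarterly registry numbers are on hand. Here is a minimal sketch that compares recent quarters of a metric against the best level an improvement project achieved and flags a backslide; the numbers and the 15% tolerance are illustrative.

    # Quarterly turnaround times in hours, oldest first (illustrative data).
    quarterly_tat = [30.1, 24.5, 20.2, 19.8, 20.5, 21.9, 24.0, 26.3]

    baseline = min(quarterly_tat)             # best level reached during the project
    recent = sum(quarterly_tat[-2:]) / 2      # average of the two latest quarters

    if recent > baseline * 1.15:              # tolerate 15% drift before flagging
        print(f"Backslide: recent {recent:.1f}h vs. best {baseline:.1f}h; refocus here.")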

Avoid wasted efforts, Wandtke cautions. “When registry data reports are coming in on a regular basis for your review, make sure your practice hasn’t slipped, and determine whether you need to refocus your attention on that area,” he says. “This is particularly relevant for the DIR, where there are new scanners coming into the system and application upgrades that may inadvertently reset protocols.”

Getting the most from the NRDR registries takes commitment — quality improvement is not a one-off. “It really comes down to your willingness and ability to work on continuous quality improvement or continuous performance improvement,” Wandtke says. “I believe that operational excellence, or being outstanding at what we do in radiology, is not viable unless you are measuring your performance and comparing data.”

Achieving excellence through registry participation does not necessarily translate into an excessive time commitment, Wandtke says. “We meet once per quarter when reports come out and then supervisors spend several hours reviewing the data so they are ready to present a plan for how to address any issues identified in the report,” he says. “The benefits greatly outweigh the time and effort.”

It is important to understand the data that is available to you — what it says about other radiology groups and how it can be applied to meet your own quality and performance goals. 

“I think the overarching message is that we all could be and should be doing better with the tools at our disposal,” Wandtke says. “If you are not using these registries, you are missing a great opportunity to improve your facility’s performance in delivering the best patient care.”


New NRDR Video Series

Hear from radiology leaders about the value of participating in the National Radiology Data Registry (NRDR) in a new video series available on the NRDR resources page. NRDR committee chairs reflect on why they champion NRDR participation and dedicate their time and expertise to registry advancements. They also highlight examples of how using registry data in their own organizations helped identify and resolve quality issues.

The series, Harnessing the Value of NRDR Participation: Perspectives of the Registry Chairs, features videos from the registry chairs.

Author: Chad Hudnall, senior writer, ACR Press