Sara Siddiqui, BA, MS4 from Northwestern University Feinberg School of Medicine, contributed this piece.
Radiologists, data scientists, and research institutions are now training AI and machine-learning models to interpret medical images, from chest radiographs to CT scans. These tools offer tremendous promise in streamlining workflows and improving diagnostic precision. But as AI advances, we are called to pause and ask deeper questions about the technology being built, and whether it serves all patients equitably or inadvertently reinforces old biases in new, invisible ways.
Dr. Melissa Davis, MD, MBA, Associate Professor of Radiology and Biomedical Imaging at Yale University1, has spent years at the intersection of equity and innovation. “There are papers coming out right now that can tell the race of patients based off of an X-ray,” Dr. Davis explains. “If we’re going to perpetuate our biases back to providers, we’re going to amplify the problems on the other side.”
Radiology is often mistaken for a specialty remote from patient care, but Dr. Davis reminds us that even from behind the screen, implicit bias can shape clinical judgment. Visual cues on imaging, like a patient's hairstyle or jewelry, may influence a radiologist's interpretation. "I can tell if someone has braids in their hair or is wearing a certain set of earrings; that makes me have some sort of set notion about who that person is," explains Dr. Davis. "Although we can say 'the lungs are clear' when we see pathology, we use our biases to generate the differential, and it may not be right."
While radiology is often imagined as "color-blind" given its reliance on grayscale radiographs, both human and machine interpretations are shaped by history, context, and representation. AI algorithms are trained on massive datasets, but when those datasets underrepresent Black and Brown patients, the resulting tools perform less accurately on images from those populations, risking delayed diagnoses or incomplete care. The ACR Data Science Institute® AI Lifecycle Support resources2 offer a framework for implementing AI in radiology while maintaining quality assurance.
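One concrete way this kind of performance gap surfaces is by disaggregating a model's accuracy across patient subgroups rather than reporting a single overall number. The sketch below is a minimal, hypothetical illustration using synthetic labels and predictions (no real model or patient data); the subgroup names and numbers are invented for demonstration only.

```python
# Hypothetical illustration: disaggregating a classifier's accuracy by
# demographic subgroup to surface performance gaps that a single overall
# metric would hide. All data here are synthetic.

def subgroup_accuracy(y_true, y_pred, groups):
    """Return {group: accuracy} computed over paired labels/predictions."""
    totals, correct = {}, {}
    for t, p, g in zip(y_true, y_pred, groups):
        totals[g] = totals.get(g, 0) + 1
        correct[g] = correct.get(g, 0) + (t == p)
    return {g: correct[g] / totals[g] for g in totals}

# Synthetic example: overall accuracy looks decent, but the model
# performs far worse on the underrepresented group "B".
y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 1, 1, 0, 0, 1, 1]
groups = ["A", "A", "A", "A", "A", "B", "B", "B"]

print(subgroup_accuracy(y_true, y_pred, groups))
# → {'A': 1.0, 'B': 0.3333333333333333}
```

Here the aggregate accuracy is 75%, yet group B's accuracy is only 33% — exactly the kind of gap that routine quality-assurance review of deployed AI tools is meant to catch.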
As AI becomes integrated into radiology workflows, Dr. Davis encourages a collaborative mindset. “You can’t think about these issues unless it’s part of your life,” she notes. “If you can’t see it, how are you going to mitigate against it?”
AI has enormous potential to enhance patient care, but it is important to acknowledge the biases carried by these tools. The path forward may include diversifying the datasets used to train algorithms, including perspectives from historically underrepresented communities in the development process, and advocating for equity in both technology and the field of radiology itself.