New AI Tool Mimics Radiologist Gaze to Read Chest X-Rays

Ngan Le, assistant professor of computer science and computer engineering, studies AI and computer vision.
Photo by Russell Cothren

Artificial intelligence can scan a chest X-ray and determine whether an abnormality is fluid in the lungs, an enlarged heart or cancer. But being right is not enough, said Ngan Le, a U of A assistant professor of computer science and computer engineering: we should also understand how the computer reaches its diagnosis. Most AI systems, however, are black boxes whose "thought process" even their creators cannot explain.

“When people understand the reasoning process and limitations behind AI decisions, they are more likely to trust and embrace the technology,” Le said.

Le and her colleagues developed a transparent and highly accurate AI framework for reading chest X-rays called ItpCtrl-AI, short for interpretable and controllable artificial intelligence.

The team explained their approach in “ItpCtrl-AI: End-to-end interpretable and controllable artificial intelligence by modeling radiologists’ intentions,” published in the current issue of Artificial Intelligence in Medicine.

The researchers taught the computer to look at chest X-rays the way a radiologist does. They recorded radiologists' gaze as they reviewed chest X-rays, capturing both where they looked and how long they focused on a specific area. The heat map built from that eye-gaze dataset showed the computer where to search for abnormalities and which sections of the image needed less attention.
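The article does not include code, but the general idea of turning recorded fixations into a gaze heat map can be sketched roughly as follows. This is an illustration only, not the ItpCtrl-AI implementation: the function name, image size, coordinate convention and smoothing width are assumptions made for the example.

import numpy as np
from scipy.ndimage import gaussian_filter

def gaze_heatmap(fixations, image_shape=(224, 224), sigma=15):
    # Illustrative sketch: turn (x, y, duration) fixations into a
    # normalized heat map. Values here are assumptions, not details
    # taken from the published paper.
    heatmap = np.zeros(image_shape, dtype=np.float32)
    for x, y, duration in fixations:
        # Weight each fixation by how long the radiologist dwelled there.
        heatmap[int(y), int(x)] += duration
    # Smooth point fixations into soft regions of attention.
    heatmap = gaussian_filter(heatmap, sigma=sigma)
    # Scale to [0, 1] so the map can guide where a model focuses.
    if heatmap.max() > 0:
        heatmap /= heatmap.max()
    return heatmap

# Example: three fixations (x, y, seconds) on a 224-by-224 chest X-ray.
example_fixations = [(60, 80, 1.2), (150, 90, 0.6), (112, 160, 2.0)]
hm = gaze_heatmap(example_fixations)

In a setup like this, longer dwell times produce hotter regions, so the resulting map tells the model which parts of the image the radiologist treated as important and which received less attention.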

Creating an AI framework that uses a clear, transparent method to reach conclusions — in this case a gaze heat map — helps researchers adjust and correct the computer so it can provide more accurate results. In a medical context, transparency also bolsters the trust of doctors and patients in an AI-generated diagnosis.

“If an AI medical assistant system diagnoses a condition, doctors need to understand why it made that decision to ensure it is reliable and aligns with medical expertise,” Le said.

A transparent AI framework is also easier to hold accountable, a legal and ethical concern in high-stakes areas such as medicine, self-driving vehicles and financial markets. Because doctors know how ItpCtrl-AI works, they can take responsibility for its diagnoses.

“If we don’t know how a system is making decisions, it’s challenging to ensure it is fair, unbiased or aligned with societal values,” Le said.

Le and her team, in collaboration with the MD Anderson Cancer Center in Houston, are now working to refine ItpCtrl-AI so it can read more complex, three-dimensional CT scans.

The first author on the paper is Trong-Thang Pham, a Ph.D. student in Le's Artificial Intelligence and Computer Vision Lab. Other authors include Jacob Brecheisen, a U of A undergraduate at the time of the research, and radiologist Arabinda Choudhary of the University of Arkansas for Medical Sciences.

About the University of Arkansas: As Arkansas' flagship institution, the U of A provides an internationally competitive education in more than 200 academic programs. Founded in 1871, the U of A contributes more than $3 billion to Arkansas’ economy through the teaching of new knowledge and skills, entrepreneurship and job development, discovery through research and creative activity while also providing training for professional disciplines. The Carnegie Foundation classifies the U of A among the few U.S. colleges and universities with the highest level of research activity. U.S. News & World Report ranks the U of A among the top public universities in the nation. See how the U of A works to build a better world at Arkansas Research and Economic Development News.
