Will AI replace doctors who read X-rays, or just make them better than ever?

The Koios DS Smart Ultrasound software, used to get a second opinion on mammography images, is seen on a computer screen, Wednesday, May 8, 2024, at Mount Sinai hospital in New York. In the near term, experts say AI will work like autopilot systems on planes — performing important navigation functions, but always under the supervision of a human pilot. (AP Photo/Mary Altaffer)

WASHINGTON (AP) — How good would an algorithm have to be to take over your job?

It’s a new question for many workers amid the rise of ChatGPT and other AI programs that can hold conversations, write stories and even generate songs and images within seconds.

For doctors who review scans to spot cancer and other diseases, however, AI has loomed for about a decade as more algorithms promise to improve accuracy, speed up work and, in some cases, take over entire parts of the job. Predictions have ranged from doomsday scenarios in which AI fully replaces radiologists, to sunny futures in which it frees them to focus on the most rewarding aspects of their work.

That tension reflects how AI is rolling out across health care. Beyond the technology itself, much depends upon the willingness of doctors to put their trust — and their patients’ health — in the hands of increasingly sophisticated algorithms that few understand.

Even within the field, opinions differ on how much radiologists should be embracing the technology.

“Some of the AI techniques are so good, frankly, I think we should be doing them now,” said Dr. Ronald Summers, a radiologist and AI researcher at the National Institutes of Health. “Why are we letting that information just sit on the table?”

Summers’ lab has developed computer-aided imaging programs that detect colon cancer, osteoporosis, diabetes and other conditions. None of those have been widely adopted, which he attributes to the “culture of medicine,” among other factors.

Radiologists have used computers to enhance images and flag suspicious areas since the 1990s. But the latest AI programs can go much further, interpreting the scans, offering a potential diagnosis and even drafting written reports about their findings. The algorithms are often trained on millions of X-rays and other images collected from hospitals.

Across all of medicine, the FDA has OK’d more than 700 AI algorithms to aid physicians. More than 75% of them are in radiology, yet just 2% of radiology practices use such technology, according to one recent estimate.

For all the promises from industry, radiologists see a number of reasons to be skeptical of AI programs: limited testing in real-world settings, lack of transparency about how they work and questions about the demographics of the patients used to train them.

“If we don’t know on what cases the AI was tested, or whether those cases are similar to the kinds of patients we see in our practice, there’s just a question in everyone’s mind as to whether these are going to work for us,” said Dr. Curtis Langlotz, a radiologist who runs an AI research center at Stanford University.

To date, all the programs cleared by the FDA require a human to be in the loop.

In early 2020, the FDA held a two-day workshop to discuss algorithms that could operate without human oversight. Shortly afterwards, radiology professionals warned regulators in a letter that they “strongly believe it is premature for the FDA to consider approval or clearance” of such systems.

But European regulators in 2022 approved the first fully automatic software that reviews and writes reports for chest X-rays that look healthy and normal. The company behind the app, Oxipit, is submitting its U.S. application to the FDA.

The need for such technology in Europe is urgent, with some hospitals facing monthslong backlogs of scans due to a shortage of radiologists.

In the U.S., that kind of automated screening is likely years away. Not because the technology isn’t ready, according to AI executives, but because radiologists aren’t yet comfortable turning over even routine tasks to algorithms.

“We try to tell them they’re overtreating people and they’re wasting a ton of time and resources,” said Chad McClennan, CEO of Koios Medical, which sells an AI tool for thyroid ultrasounds, the vast majority of which show no cancer. “We tell them, ‘Let the machine look at it, you (review and) sign the report and be done with it.’”

Radiologists tend to overestimate their own accuracy, McClennan says. Research by his company found physicians viewing the same breast scans disagreed with each other more than 30% of the time on whether to do a biopsy. The same radiologists even disagreed with their own initial assessments 20% of the time, when viewing the same images a month later.

About 20% of breast cancers are missed during routine mammograms, according to the National Cancer Institute.

And then there’s the potential for cost savings. On average, U.S. radiologists earn over $350,000 annually, according to the Department of Labor.

In the near term, experts say AI will work like autopilot systems on planes — performing important navigation functions, but always under the supervision of a human pilot.

That approach offers reassurance to both doctors and patients, says Dr. Laurie Margolies of the Mount Sinai hospital network in New York, which uses Koios’ breast imaging AI to get a second opinion on breast ultrasounds.

“I will tell patients, ‘I looked at it, and the computer looked at it, and we both agree,’” Margolies said. “Hearing me say that we both agree, I think that gives the patient an even greater level of confidence.”

The first large, rigorous studies testing AI-assisted radiologists against those working alone give hints at the potential improvements.

Initial results from a Swedish study of 80,000 women showed a single radiologist working with AI detected 20% more cancers than two radiologists working without the technology.

In Europe, mammograms are reviewed by two radiologists to improve accuracy. But Sweden, like other countries, faces a workforce shortage, with only a few dozen breast radiologists in a country of 10 million people.

Using AI instead of a second reviewer decreased the human workload by 44%, according to the study.

Still, the study’s lead author says it’s essential that a radiologist make the final diagnosis in all cases.

If an automated algorithm misses a cancer, “that’s going to be very negative for trust in the caregiver,” said Dr. Kristina Lang of Lund University.

The question of who could be held liable in such cases is among the thorny legal issues that have yet to be resolved.

One result is that radiologists are likely to continue double-checking all AI determinations, lest they be held responsible for an error. That’s likely to wipe out many of the predicted benefits, including reduced workload and burnout.

Only an extremely accurate, reliable algorithm would allow radiologists to truly step away from the process, says Dr. Saurabh Jha of the University of Pennsylvania.

Until such systems emerge, Jha likens AI-assisted radiology to someone who offers to help you drive by looking over your shoulder and constantly pointing out everything on the road.

“That’s not helpful,” Jha says. “If you want to help me drive then you take over the driving so that I can sit back and relax.”

___

The Associated Press Health and Science Department receives support from the Howard Hughes Medical Institute’s Science and Educational Media Group and the Robert Wood Johnson Foundation. The AP is solely responsible for all content.
