An AI bot scored nearly as well as doctors on a radiology exam — but it's not ready to replace humans yet

Dec 24, 2022, 02:04 IST
Business Insider
AI did almost as well as radiologists on a medical exam, per a recent study. picture alliance/Getty
  • An AI bot took the exam radiologists in the UK have to pass before finishing training.
  • The AI candidate passed two of 10 mock exams, while humans passed four of 10.

Some good artificial intelligence may be just what the doctor ordered, or at least it may be someday soon.

An AI system recently passed a radiology exam, scoring an average overall accuracy of 79.5% across 10 mock tests, researchers in the UK found. That's not bad, considering the human radiologists in the study averaged 84.8%.

The researchers, who published their study in the British Medical Journal, wanted to determine whether an AI bot could pass the "rapid reporting" part of the Fellowship of the Royal College of Radiologists (FRCR) exam, which radiologists in the UK must pass to complete their training.

The rapid reporting section of the three-part exam requires candidates to interpret 30 radiographs in 35 minutes. To pass, candidates have to report at least 90% of the radiographs correctly. The AI scored above 90% on two of the 10 tests it took.

"This part of the examination is designed to 'stress test' candidates for speed and accuracy, providing a mixture of challenging normal and abnormal cases typically referred by general practice and the emergency department for radiological interpretation in clinical practice," the researchers said.


The researchers used 10 mock FRCR rapid reporting exams for the study. The AI candidate was Smarturgences, a commercially available tool developed by the French AI company Milvue. The study also included 26 radiologists who had taken and passed the real FRCR exam within the previous year.

When non-interpretable images, those that the AI does not register at all, were excluded, the AI candidate had an average overall accuracy of 79.5% and passed two of the 10 mock exams. The radiologists had an average accuracy of 84.8% and passed four of the 10 mock exams, on average.

On images that the radiologists typically diagnosed correctly, the AI got the diagnosis right 91% of the time. On images that most radiologists diagnosed incorrectly, the AI was wrong only 50% of the time, correctly diagnosing radiographs of hands, carpal bones, and feet.

The AI candidate needs more training in analyzing areas it considers 'non-interpretable,' like the abdomen and axial skeleton.

While the AI candidate had "relatively high" accuracy, it was the highest-scoring candidate on only one of the mock exams.


"The artificial intelligence candidate would still need further training to achieve the same level of performance and skill of an average recently FRCR qualified radiologist," the researchers said, adding that it still needs training on "subtle musculoskeletal abnormalities."
