92% AI sensitivity in bone fracture detection

A team of UK researchers has reviewed 42 existing studies comparing the diagnostic performance of AI and clinicians in fracture detection.

A group of researchers has found that the sensitivity of artificial intelligence (AI) for detecting bone fractures is between 91 and 92 percent. The finding comes from a study published in the journal ‘Radiology’, in which the authors conclude that AI is an effective tool for fracture detection with the potential to help doctors in busy emergency departments. The researchers found no statistically significant difference between the performance of clinicians and that of the AI.
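
For readers unfamiliar with the metric, sensitivity is the proportion of actual fractures that a model correctly flags: true positives divided by the sum of true positives and false negatives. The short Python sketch below is purely illustrative, with made-up counts rather than figures from the Radiology review, and simply shows how a sensitivity of around 91 to 92 percent would be computed.

    # Minimal sketch of how sensitivity (recall) is computed for a fracture detector.
    # The counts used here are invented for illustration, not data from the review.

    def sensitivity(true_positives: int, false_negatives: int) -> float:
        """Sensitivity = TP / (TP + FN): the share of real fractures the model catches."""
        return true_positives / (true_positives + false_negatives)

    # Example: of 1,000 radiographs that truly show a fracture, a model that
    # flags 915 and misses 85 has a sensitivity of 91.5 percent.
    print(f"{sensitivity(915, 85):.1%}")  # -> 91.5%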

Missed or delayed diagnosis of fractures on X-rays is a common error that can have serious consequences for the patient. Lack of timely access to expert opinion, as growth in imaging volumes continues to outpace radiologist recruitment, only makes the problem worse.

AI can help address this problem by acting as an aid to radiologists, helping to speed up and improve the diagnosis of fractures. To learn more about the technology's potential in fracture detection, a team of researchers from the United Kingdom reviewed 42 existing studies comparing the diagnostic performance of AI and clinicians. Of the 42 studies, 37 used X-rays to identify fractures and five used CT.

Artificial intelligence, comparable to clinicians' performance

“We found that the AI worked with a high degree of accuracy, comparable to clinicians’ performance,” explains the study’s lead author, Rachel Kuo, of the Botnar Research Centre in the Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences in Oxford. She adds that “it is important to note that we found this to be the case when the AI was validated using independent external datasets, suggesting that the results may be generalizable to the general population.”

“The results of the study point to several promising educational and clinical applications for AI in fracture detection,” says Kuo. She also stresses that “it could reduce the rate of early misdiagnosis in challenging circumstances in the emergency setting, including cases where patients may sustain multiple fractures. It has potential as an educational tool for novice clinicians.”

“It could also be useful as a ‘second reader’, giving clinicians reassurance that they have made the correct diagnosis or prompting them to take another look at the images before treating patients,” says Kuo.

The researcher warns that research on AI fracture detection is still at a very early, preclinical stage. Only a minority of the studies she and her colleagues reviewed evaluated clinicians’ performance with AI assistance, and there was only one example in which an AI was evaluated in a prospective study in a clinical setting.

“It remains important that clinicians continue to exercise their own judgment. AI is not infallible and is subject to bias and error,” Kuo concludes.

