Researchers at Northwestern Medicine have developed an AI tool that analyzes chest radiographs with accuracy comparable to, and for certain medical conditions even exceeding, that of radiologists.
The study presents a tool developed specifically to aid emergency department radiologists facing heavy workloads, and to offer guidance to clinicians who work in environments where no radiologist is available on call.
The objective was to leverage the team's clinical expertise and experience developing AI tools for clinical integration, using their institutional data to address this issue, stated first author Jonathan Huang, a student in the Medical Scientist Training Program (MSTP) at Northwestern. The team developed a model that enhances the interpretation of X-rays and assists physicians in analyzing medical images. The model automatically generates text reports from the images, which helps expedite clinical workflows and boost overall efficiency.
The researchers built the model using 900,000 chest X-rays and their accompanying radiologist reports. The tool was trained to generate a report for each image, describing the relevant clinical findings and their significance in the same language and style as a human radiologist.
The research team then tested the model on 500 chest X-rays obtained from a Northwestern Medicine emergency department, comparing the model's output to the initial interpretations made by radiologists and teleradiologists in the clinical setting.
Huang explained that the objective was to assess the effectiveness of the AI model in an emergency department setting. This is especially important because emergency physicians often lack access to on-site radiologists who can offer guidance while they are attending to patients. The scenario represents a practical clinical use case in which an AI model could meaningfully support human decision-making.
A group of five board-certified emergency medicine physicians was then asked to rate each AI-generated report on a scale of one to five, with a rating of five indicating full agreement with the tool's interpretation and no need for any changes to the report's wording.
Throughout the study, the AI demonstrated the ability to pinpoint X-rays with worrisome clinical findings and to generate high-quality imaging reports. The study also found no significant difference in accuracy between reports generated by radiologists and those generated by the AI.
Compared with the on-site radiologists, the AI demonstrated a sensitivity of 84% and a specificity of 98%, indicating its capacity to accurately detect abnormalities. The original teleradiologist reports for the same task had a sensitivity of 91% and a specificity of 97%.
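For readers unfamiliar with these metrics, the sketch below shows how sensitivity and specificity are computed from a confusion matrix. The counts used are hypothetical illustrations chosen to reproduce the AI's reported 84% and 98% figures; they are not the study's actual data.

```python
# Illustrative sketch: computing sensitivity and specificity from
# confusion-matrix counts. All counts below are hypothetical.

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    sensitivity = tp / (tp + fn)   # fraction of true abnormals detected
    specificity = tn / (tn + fp)   # fraction of true normals correctly cleared
    return sensitivity, specificity

# Hypothetical example: 84 of 100 abnormal X-rays flagged, and 392 of 400
# normal X-rays correctly cleared, matching the reported 84% / 98%.
sens, spec = sensitivity_specificity(tp=84, fn=16, tn=392, fp=8)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")
# prints "sensitivity=84%, specificity=98%"
```

In words: high sensitivity means few abnormal X-rays are missed, while high specificity means few normal X-rays are incorrectly flagged.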
Notably, in a few instances the AI tool detected findings that the radiologists had missed, including a pulmonary infiltrate on one X-ray.
The researchers say this is the first time a generative AI model has been used to produce chest X-ray reports.
Senior author Mozziyar Etemadi, MD, PhD, assistant professor of Anesthesiology and of Biomedical Engineering at the McCormick School of Engineering, pointed out that AI tools in radiology have typically been designed for narrow purposes, including some his team has built in the past. Consider, for example, an AI model that evaluates a mammogram to gauge the presence or absence of cancer. In that case, the model gives clinicians complete information about an image and delivers accurate diagnoses, often surpassing the capabilities of some doctors.
The research team plans to expand the model's capabilities to include the interpretation of MRIs, ultrasounds, and CT scans. They hope the tool will ultimately prove valuable to clinics facing workforce shortages.
The goal is for the tool to become the radiologist's assistant, relieving them of the monotony of their work, stated Etemadi.
Other institutions are also exploring AI tools for medical imaging.
Researchers at Johns Hopkins Medicine recently announced a machine learning model that can estimate percent necrosis (PN), the percentage of a tumor that is dead and no longer active, in patients with intramedullary osteosarcoma.
Calculating PN precisely after chemotherapy is essential for assessing treatment effectiveness and estimating a patient's survival prognosis. However, doing so requires pathologists to analyze whole-slide images (WSIs) of bone tissue, which is a time-consuming process.
The model demonstrated success in evaluating these WSIs, although with certain limitations.