How do clinicians perceive use of AI to triage brain MRI?

New research from a top London hospital suggests that AI can be implemented effectively to triage brain MRI scans, streamlining the patient pathway and relieving clinician workload.

"This survey demonstrates that clinicians overall support the implementation of AI tools, and value the importance of transparency in AI and its ability to explain its decision-making, which is a key factor for its successful implementation into clinical workflows," noted Dr. Munaib Din, radiology registrar at Guy's and St Thomas' NHS Foundation Trust and King's College London, and colleagues in an article posted on 6 January by the European Journal of Radiology (EJR).

The research confirms that clinician input and feedback throughout AI development are essential. Including information on the AI model's training, performance, and explainability (through visual explanations in the form of saliency maps) increased clinicians' confidence in an AI tool for triaging brain MRI scans, they added.

In the first quarter of 2024, the researchers undertook a survey of stakeholder clinicians who are closely involved with, and have expertise in, the clinical workflow of neuroimaging (specifically MRI of the brain). All responses were anonymous and voluntary.

Of the 133 respondents from 34 centers across the U.K., 66% (88/133) were male, 63% (84/133) were senior (consultant grade), and 37% (49/133) were junior (specialist trainee grade). A total of 41% (55/133) were neuroradiologists, 31% (41/133) were general radiologists, and 28% (37/133) were neurologists. The median age group was 35-44.

The authors found that 32% (43/133) of participants felt familiar or very familiar with the use of AI tools in imaging to support decision-making, while 61% (81/133) reported that they have never used them in medical practice. Only a minority of participants (18%, 24/133) felt confident in the efficacy of AI tools to support them. Despite this, there is optimism for AI's role in their future work, with 61% (81/133) of participants feeling either optimistic or very optimistic.

"We noted a positive correlation between how familiar a participant was with the use of AI in imaging, and their optimism about its role in future work," they wrote. "Senior clinicians tended to be more familiar and confident with the use of AI in clinical practice."

Responses varied between specialties. Neuroradiologists routinely use AI tools to support decision-making more than general radiologists or neurologists do, which may reflect widespread adoption following randomized controlled stroke trials that incorporated AI tools within the clinical pathway, according to Din and colleagues. Both groups of radiologists felt more familiar with the use of AI in imaging as a decision-support tool than neurologists did.

AI as a first reader

For AI employed in radiology workstreams in general, more participants (50%, 66/133) appeared to prefer AI as a "first reader" that highlights any abnormalities to the radiologist, rather than as a "second reader" (38%, 50/133) that reviews scans after a radiologist, they continued. "When focusing explicitly on the specific use-case of an AI tool being used as a dedicated triage tool to prioritize the order in which MRI brain scans are reviewed, participant responses to questions on overall benefit were mixed, but largely neutral (23%, 31/133) or positive (67%, 89/133). 71% (94/133) of respondents preferred the use of an AI-assisted triage tool for MRI brain scans compared to the current system without triage."

A total of 74% (99/133) of participants felt there was scope for improvement in the way MRI brain scans are prioritized in current healthcare settings, as typically they are reported chronologically within patient classes (such as inpatient/outpatient and urgent/nonurgent lists). Of the sample, 67% (89/133) thought an AI triage tool might improve the clinical pathway. Furthermore, 47% (63/133) of participants stated they would feel comfortable or very comfortable with the use of AI in triaging MRI brain scans without radiologist input.

Half of the participants (50%, 67/133) stated that extra information on the AI model would make their previous answers more positive. Participant confidence in the AI tool improved when individual components of this further information were considered in isolation: the number of training and validation cases, the model's performance, and validation on unseen data.

The authors also questioned the group on the acceptability of the risks associated with the use of this AI tool in MRI brain triage: 50% (67/133) believed that the false negative risk is acceptable, and 61% (81/133) thought that the benefits of such a triage tool outweigh the risks. Less experienced senior clinicians (those with up to five years of experience as a consultant) were more willing to accept the risks associated with the AI tool.

Most participants (82%, 108/133) positively affirmed the importance of the MRI brain triage tool in providing its reasoning or explanation. "This highlights the importance of providing clinicians with information on how AI models come to their decisions. The findings suggest that clinical AI triaging tools should maintain the concept of explainability as a core priority throughout their development and implementation process to ensure that clinicians feel confident in its application in patient care," the authors wrote.

There was a more heterogeneous spread of responses regarding how comfortable clinicians would feel when explaining the AI decisions to patients, with 48% (63/133) feeling either neutral or not confident. "We do acknowledge that clinicians may rarely be in a situation where they explain how image triage works, especially as radiologists tend to have less patient interaction compared to other clinicians," they noted.

Finally, neurologists felt less confident than neuroradiologists in the ability of an AI tool to effectively prioritize scan reporting. "It is plausible that the discrepancy results from neurologists having less exposure than their radiologist colleagues to imaging AI tools or having fewer concerns regarding reporting backlogs. It is uncertain whether this difference in perspective between radiologists and neurologists would impact the clinical translation of this particular AI application, as both groups work closely together in facilitating the patient pathway."

Follow-up studies

The researchers are now looking at the generalizability of the MRI triage tool across the U.K. in 44 sites, corresponding author Dr. Thomas Booth told AuntMinnieEurope.com on 6 January. "This study is the U.K.'s largest neurology study, where 37,000 patients have been recruited prospectively to ensure the MRI triage tool performance is acceptable in hospitals with different populations and different scanners. This analytical validation study will report in Q1-2 2025."

The team is also evaluating the usability of the MIDI tool in an ongoing study, which will report in the first quarter of 2025, noted Booth, who is a consultant neuroradiologist at King's College Hospital NHS Foundation Trust.

"Clinical validation -- when the tool is embedded in the department workflow -- is the next study to prove workflow and health economic benefit, and this is planned for 2025," he said.

The technology has been expanded beyond what was assessed in the EJR study, Booth pointed out. "Here our framework could also serve as a clinical decision support tool, not only by suggesting findings to radiologists and detecting errors in provisional reports, but also by retrieving and displaying examples of pathologies from historical examinations that could be relevant to the current case based on textual descriptors. This pre-print is under revision in a journal and we hope it will be published in Q1 2025."

You can read the full EJR article here.
