12 key points to remember for AI deployment

Liz Carey, Feature Writer

For anyone confused about who bears responsibility for regulatory compliance of AI as a medical device (AIaMD) in the EU, help is here in the form of a new consensus paper from the AI Working Group of the European Society of Radiology (ESR).

The paper adds to understanding of post-market clinical follow-up (PMCF) and the regulatory landscape of radiology AI deployment, providing 12 recommendations for physicians navigating the complexities, noted lead author Dr. Renato Cuocolo, associate professor of radiology at the University of Salerno in Campania, Italy, and colleagues.

Only 29% of medical AI deployers -- hospitals, radiology departments, or individual clinicians who use an AI system during diagnosis or treatment -- consider themselves familiar with EU medical device regulation (MDR) and post-market surveillance (PMS) requirements, they explained. Yet radiologists share PMS responsibilities and are expected to contribute to ongoing monitoring for safety and performance.

"Radiologists need to familiarize themselves with AIaMD EU regulations due to shared PMS responsibilities and current ambiguities," the group said. "Current laws lack imaging-specific guidance."Diagram summarizing the interplay between AI as a medical device provider, deployers, and post-market surveillance regulations in the EU.Diagram summarizing the interplay between AI as a medical device provider, deployers, and post-market surveillance regulations in the EU.Figure and caption courtesy of Insights into Imaging.

Cuocolo and colleagues explained that the EU MDR (2017/745) was not originally tailored to AI components and does not set out requirements specific to AIaMD use. The EU AI Act (Regulation (EU) 2024/1689) is not necessarily any clearer, and its novelty may also explain the lack of widespread awareness among radiologists, they said.

To bridge the gap, members of the ESR eHealth and Informatics Subcommittee and ESR AI Working Group developed a series of 12 consensus statements intended to provide clear guidance to physicians and other healthcare professionals currently using an AIaMD in practice or planning to do so in the future. They included the following, among others:

  1. To allow effective PMS, providers must supply the AI medical device's baseline accuracy metrics, including uncertainty measures (e.g., 95% confidence intervals). These should be clearly visible to deployers on the PMS platform to facilitate the detection of underperformance or other issues (see the sketch after this list).
  2. To facilitate deployers' compliance with PMCF duties, the platform used for PMS should preferably also allow user feedback to be recorded. This information should also be accessible, through the same platform, to other physicians within the same institution to raise local awareness of critical issues with an AI medical device.
  3. Providers may employ shared software platforms to optimize access to multiple AI medical devices. In this case, the use of interoperable PMS standards is recommended to facilitate the aggregation of data from all AI medical devices. For example, a unified user interface at the software platform level for monitoring all devices delivered through that platform would be preferable to siloed PMS systems within each device's dedicated interface.
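The paper does not prescribe how baseline accuracy or its uncertainty should be calculated or displayed. As a rough illustration of the kind of "accuracy with a 95% confidence interval" figure a provider might surface to deployers on a PMS platform, the Python sketch below computes a Wilson score interval for a binomial accuracy estimate; the function name and case counts are hypothetical.

```python
# Illustrative sketch only: the consensus paper does not specify a computation
# method. This shows one common way to report a baseline accuracy metric with
# a 95% confidence interval (Wilson score interval). Names and numbers here
# are hypothetical.
from math import sqrt

def wilson_interval(correct: int, total: int, z: float = 1.96) -> tuple[float, float]:
    """Return the Wilson score confidence interval for a proportion (default 95%)."""
    if total <= 0:
        raise ValueError("total must be positive")
    p = correct / total
    denom = 1 + z**2 / total
    center = (p + z**2 / (2 * total)) / denom
    margin = (z / denom) * sqrt(p * (1 - p) / total + z**2 / (4 * total**2))
    return center - margin, center + margin

# Hypothetical example: 460 correct calls out of 500 validation cases.
accuracy = 460 / 500
low, high = wilson_interval(460, 500)
print(f"Baseline accuracy: {accuracy:.3f} (95% CI {low:.3f}-{high:.3f})")
```

Reporting an interval alongside the point estimate gives deployers a reference band: sustained local performance drifting below the lower bound is a signal worth flagging through the PMS channel.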

The advice is "a first step towards establishing the necessary guidelines and awareness amongst our community to improve consistency in the deployment and monitoring of AI as medical devices, while also allowing physicians to have a clear reference standard when approaching these novel technologies, which present unique challenges compared to previous practice," the group wrote.

A secondary aim of the paper was to provide expert opinion to the EU regulatory bodies.

Find the complete paper here.
