5 ways the FDA promises to regulate AI-related medical devices

As AI-related medical devices continue to saturate the healthcare market, regulatory agencies like the FDA are struggling to keep up with a new category of technology. But, writing in the Journal of the American College of Radiology this October, ACR Data Science Institute CMO Bibb Allen, MD, outlined several steps the FDA is taking to ensure these devices remain safe and effective.

Clearance for class III medical devices—those subject to the highest level of regulatory control—typically requires sponsors to submit clinical data supporting their tech, Allen said, so officials have clear proof of the safety and efficacy of the device. Devices with no legally marketed, substantially equivalent predicates are also automatically classified as class III, regardless of the risk they pose.

“This could have been the pathway for artificial intelligence algorithms,” Allen wrote. “However, the FDA has recently revamped the de novo request process, which allows the developer of a low- to moderate-risk device without a predicate to submit a request to the FDA to make a risk-based classification of the device into class I or II.”

Once that de novo request is granted, the device can then serve as a predicate for 510(k) premarket notification for similar devices in the future, which is how a good chunk of AI software has been cleared to date.

Apart from the de novo overhaul, Allen said the FDA is developing several initiatives “that will likely affect the regulation of AI products in the United States.” This is what they look like:

1. The Medical Device Development Tools program

The FDA proposed this program—a pathway for the agency to qualify tools that medical device sponsors could use in the development and evaluation of their devices—last August. For a tool to pass qualification, the FDA must determine that it “produces scientifically plausible measurements and works as intended within the specified context of use.”

According to Allen, the tools can be developed by private groups or by sponsors themselves, and will be helpful in the approval process for AI algorithms and other less-than-tangible software products.

2. The National Evaluation System for Health Technology, or NEST

NEST exists to move medical devices from their nascent stages to market as quickly as possible, while “strategically and systematically leveraging real-world evidence and applying advanced analytics to data tailored to the unique data needs and innovation cycles of medical devices.”

The FDA claims it’ll make that happen by shifting to more active surveillance, thereby improving its ability to detect safety issues. NEST is also designed to leverage real-world data with the goal of generating better, more widely applicable evidence representative of a diverse U.S. population.

3. The ACR Data Science Institute and Lung-RADS Assist

Allen said the NEST Coordinating Center chose the Lung-RADS Assist: Advanced Radiology Guidance, Reporting and Monitoring program to demonstrate its new approach to evaluating AI algorithms. The project, sponsored by the ACR Data Science Institute, is a method for validating and monitoring AI algorithms built to detect and classify lung nodules in lung cancer screening programs as defined by Lung-RADS.

“The demonstration will use real-world data to assess end-to-end workflow from the deployment of an AI algorithm in a radiology reporting system through the capture of performance metrics within a national registry,” Allen wrote.

“This example of a public-private partnership may serve as a model for how AI algorithms can be monitored in clinical practice to ensure ongoing patient safety while establishing a pathway to increase the efficiency of the FDA premarket review process.”

4. The Software Precertification Program

The pilot Software Precertification Program was designed to provide qualified developers with an efficient premarket pathway for software-based medical devices. Allen said the program, which will eventually be bundled with the NEST and Medical Device Development Tools platforms, evaluates developers’ capacity to monitor and respond to their products’ real-world performance.

The precertification program is also expected to help establish mechanisms for validating AI algorithms in the future.

5. Public-private partnerships

“Although these U.S. regulatory programs may seem somewhat disjointed, in all of its activities, the FDA seems to be working to streamline the review process for AI applications in healthcare,” Allen wrote. “However, even with the streamlined premarket processes described herein, developers will still need to demonstrate efficacy, patient safety and a process for postmarket surveillance of ongoing effectiveness using real-world data.”

Regulatory agencies like the FDA are “ill-equipped” to do those things internally, he said, and the review process is already burdened by the number of algorithms submitted for approval. Public-private partnerships, like those between regulatory agencies and medical specialty societies, can facilitate the premarket review and data collection processes.