In April 2019, the US Food and Drug Administration (FDA) released a discussion paper, Proposed Regulatory Framework for Modifications to Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD), which proposed a novel regulatory framework for artificial intelligence (AI)-based medical devices. The public docket closed on June 3, 2019, and FDA received over one hundred comments from manufacturers, industry associations, and other interested parties. The comments vary in their support for FDA's framework but largely urge FDA to align with external stakeholders that are already developing industry standards and to clarify the Agency's expectations under the proposed framework.

FDA Proposed Approach

In its discussion paper, FDA recognized that its current approach to regulating medical devices, which assumes devices that are static in nature and change only through planned, discrete modifications, is ill-suited for AI algorithms. Under the current framework, for example, a change in an AI algorithm arising from real-world use could trigger FDA premarket review, depending on the significance of the modification and the risk it poses to patients. The consequence would be that whenever the algorithm learns or adapts (which ideally it would with every use), the manufacturer would have to ask FDA to clear (or approve) the change. That scenario is unworkable, both for manufacturers and for FDA. The discussion paper therefore introduced a framework that accounts for the adaptive nature of AI- and machine learning (ML)-based technologies and proposed a streamlined approach that should lessen the regulatory burden on industry.

The framework proposed a total product lifecycle approach based on four principles: (1) good machine learning practices (GMLP) from software development through distribution; (2) an initial premarket review that would include a predetermined plan for modifications; (3) a risk-management approach to modifications after premarket review; and (4) post-market monitoring and reporting of product performance. At the heart of FDA's proposed framework, and its most novel aspect, is the second principle: the option for manufacturers to submit a plan of predetermined modifications during the initial premarket review of an AI- or ML-based device.

The plan, called a "predetermined change control plan," would disclose the changes anticipated from the software's adaptive learning and the methodology for implementing those changes in a controlled manner. In other words, the plan would lay out a roadmap, or region, of potential changes as the machine learns and would describe the methods in place to control the risks anticipated with those algorithm changes. Changes within the scope of the plan would not require FDA premarket review. Changes outside the scope of the plan that lead to a new intended use (for example, diagnosing a different disease) would require premarket review. Changes outside the plan that do not create a new intended use would trigger a "focused" FDA premarket review of the plan. This approach would eliminate the burdensome requirement that a company seek FDA clearance for every significant software change.

Public Comments

The public comments vary in their support of FDA's proposed framework. Some asked FDA to expand the scope of the framework to include software in a medical device, as opposed to the current scope, which covers only software as a medical device. Many commenters asked for more clarity around terms used in the discussion paper, including foundational terms such as "machine learning" (as opposed to the broader terms artificial or augmented intelligence) and terms such as "locked down" and "continuous learning" algorithms. Numerous comments requested more examples to illustrate the Agency's expectations, such as examples of changes to AI/ML software that would (or would not) trigger premarket review. Some commenters noted that certain principles, including GMLP, are either premature given the nascent state of the industry or should be harmonized with standards already being developed by external standards-setting organizations.

A few comments noted that the framework's incorporation of elements of the software pre-certification program run by the Center for Devices and Radiological Health (CDRH) is confusing, because the "pre-cert" program has not been fully evaluated or adopted and its results have not been shared outside the few companies that participated in the pilot program. One comment observed that FDA did not discuss bias as a significant risk for ML software with clinical applications: the output of ML software is only as good as its inputs, yet clinical trials systematically include or exclude patients with certain characteristics, and insurance records capture information only from those with access to the health care system. Another comment noted that the monitoring transparency FDA requests under the fourth principle is unrealistic, given privacy barriers and restrictions on access to patient information.

One comment requested coordination not only with external stakeholders but also with other FDA centers, such as the Center for Drug Evaluation and Research (CDER) and the Center for Biologics Evaluation and Research (CBER). In particular, drug companies are increasingly using AI/ML technology during drug discovery to identify new biomarkers, incorporating software into combination products regulated by CDER, and using continuous learning algorithms in clinical decision support software to recommend specific patient therapies. FDA has not issued guidance in these areas, and the comment urged FDA to discuss how AI/ML-based software could be used outside of a medical device premarket application, such as in support of clinical trial design or patient recruitment.

Moving Forward

While the industry largely encourages FDA to react quickly to evolving technology so as not to stifle innovation, the Agency will have to take time to more clearly define foundational terms in this complex area and clarify its expectations for software developers and medical device companies.
