1. Lifecycle regulation and evaluation of artificial intelligence and machine learning-based medical devices
- Author
- Vokinger, Kerstin Noëlle, Hwang, Thomas J, Kesselheim, Aaron S, Cohen, Glenn, Minssen, Timo, Price II, Nicholson W, Robertson, Christopher, and Shachar, Carmel
- Abstract
Between 2017 and 2018, the FDA cleared fourteen AI- and ML-based software products as medical devices. This chapter analyzes how the FDA cleared these products and discusses how a lifecycle-based framework for regulating AI/ML-based software would address their distinctive characteristics. One key concern is the limited evidence of safety and effectiveness available at the time of market entry. For the post-approval period, manufacturers and the FDA should work together to generate a list of industry-wide allowable changes and modifications that software can employ to adapt in real time to new data; such changes would fall within a "safe harbor" and thus not necessarily require premarket review by the FDA. Even anticipated changes, however, may accumulate into an unanticipated divergence in the software's eventual performance, so appropriate guardrails are needed as software evolves over time. Finally, AI/ML is often criticized as a "black box" that is neither well understood by nor well explained to users. Given the inherent opacity of AI/ML-based software, the FDA should require a high standard of transparency to allow patients and clinicians to make informed decisions.
- Published
- 2022