stakeholders, including, but not limited to, developers who are exploring and developing AI
systems, regulators and policymakers who might be in the process of identifying approaches
to manage and facilitate AI systems, manufacturers who design and develop AI-enabled
medical devices, and health practitioners who deploy and use such medical devices and AI
systems. This Deliverable contains considerations in six general topic areas: documentation
and transparency; total product lifecycle approach and risk management; intended use and
analytical and clinical validation; data quality; privacy and data protection; and engagement
and collaboration. Stakeholders are invited to take into account the considerations detailed in
this Deliverable as they continue to develop frameworks and best practices for the use of AI in
healthcare and therapeutic development.
A.2.2 DEL 2.1: Mapping of IMDRF essential principles to AI for health software
Summary: AI for health (AI4H) software provides a number of new aspects that have not been
considered when developing the regulatory framework for Software as a Medical Device
(SaMD), as described by the International Medical Device Regulators Forum (IMDRF) Essential
Principles (EPs) in "Essential Principles of Safety and Performance of Medical Devices and IVD
Medical Devices", IMDRF Good Regulatory Review Practices Group, IMDRF GRRP WG/N47
FINAL, 31 October 2018.
This document provides a suggested mapping of the EPs to the related aspects of AI4H software.
Its purpose is to cover all aspects considered in the regulation of SaMD and to determine
whether, and if so, how they are applicable to AI4H.
In clause 7.1 the IMDRF EPs are evaluated for their applicability to AI4H. This reduces the
number of relevant EPs to six: General (5.1), Clinical Evaluation (5.2), Medical Devices and IVD
Medical Devices that Incorporate Software or are Software as a Medical Device (5.8), Labeling
(5.10), Protection against the Risks posed by Medical Devices and IVD Medical Devices intended
by the Manufacturer for use by Lay Users (5.12) and Performance Characteristics (7.2). The key
concepts of these EPs are extracted in clause 7.2 and grouped with related AI4H concepts in
clause 7.3. Finally, in clause 7.4 the explicit mapping from EPs to AI4H is presented.
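The clause structure described above amounts to a mapping from individual EPs to related AI4H concepts. Purely as an illustrative sketch, not content from the Deliverable itself, such a mapping could be represented as a simple data structure; the EP clause numbers below follow the IMDRF N47 citation above, while the AI4H concept names are hypothetical placeholders.

```python
# Illustrative sketch only: one possible representation of the kind of
# EP-to-AI4H mapping described in clauses 7.2-7.4 of DEL 2.1.
# The AI4H concept strings are hypothetical placeholders, not the
# Deliverable's actual mapping.

from dataclasses import dataclass, field

@dataclass
class EssentialPrinciple:
    clause: str                                   # IMDRF N47 clause number, e.g. "5.2"
    title: str                                    # EP title as given in IMDRF N47
    ai4h_concepts: list[str] = field(default_factory=list)  # grouped AI4H concepts

# Example entries (AI4H concepts are illustrative assumptions)
ep_mapping = [
    EssentialPrinciple("5.1", "General",
                       ["risk management across the model lifecycle"]),
    EssentialPrinciple("5.2", "Clinical Evaluation",
                       ["clinical validation of model outputs"]),
    EssentialPrinciple("5.8", "Software as a Medical Device",
                       ["training and test data management", "model versioning"]),
]

def concepts_for(clause: str) -> list[str]:
    """Return the AI4H concepts grouped with a given EP clause."""
    for ep in ep_mapping:
        if ep.clause == clause:
            return ep.ai4h_concepts
    return []

print(concepts_for("5.2"))  # ['clinical validation of model outputs']
```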
A.2.3 DEL 2.2: Good practices for health applications of machine learning:
Considerations for manufacturers and regulators
Summary: This Technical Paper recommends a set of good machine learning (ML) practice
guidelines to manufacturers and regulators of data-driven, artificial intelligence (AI) based
healthcare solutions, covering comprehensive requirements analysis and streamlined
conformity assessment procedures for continual product improvement in an iterative and
adaptive manner. The guidelines give top priority to patient safety, focus on a streamlined
process for risk minimization and quality assurance for AI/ML-based health solutions, and
seek to establish transparency and accountability across all the processes involved in such
solutions. The proposed set of good machine learning practices adopts, extends and leverages
the best practices and recommendations provided by internationally recognized medical device
regulatory bodies such as the International Medical Device Regulators Forum (IMDRF) and the
US Food and Drug Administration (FDA). These guidelines do not contain any legally binding
or statutory requirements applicable to any specific regulatory framework or specific
geographic jurisdiction.
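To make the transparency and accountability aims above concrete, the sketch below shows one hypothetical way a manufacturer might record the key artefacts of each model update for later conformity assessment. The record fields and function name are illustrative assumptions, not requirements from this Technical Paper, the IMDRF, or the FDA.

```python
# Hypothetical sketch: a minimal audit record for one training/update cycle
# of an AI/ML-based health solution. Field names are illustrative
# assumptions, not regulatory requirements.

import hashlib
import json
from datetime import datetime, timezone

def make_audit_record(model_version: str, dataset_id: str,
                      metrics: dict, approved_by: str) -> dict:
    """Bundle the key facts of one model update into a tamper-evident record."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "dataset_id": dataset_id,
        "validation_metrics": metrics,   # e.g. sensitivity/specificity on a held-out set
        "approved_by": approved_by,      # person accountable for the release
    }
    # A hash over the record contents supports later integrity checks.
    record["record_hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

print(make_audit_record("1.3.0", "dataset-2024-07",
                        {"sensitivity": 0.94}, "QA lead"))
```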