The EU AI Act – Imaging and Diagnostics
The EU’s draft AI Act could significantly disrupt the regulation of AI-powered imaging and diagnostic technology that is already governed by the medical device and in vitro diagnostic medical device regulations. Partners Brian McElligott and James Gallagher consider the impact the EU AI Act is expected to have on certain parts of the medtech sector.
The EU AI Act is set to become law later this year, and providers of imaging and diagnostic artificial intelligence (AI) technology now need to begin considering its potential impacts on the regulation of their technology. Many manufacturers in this space are only just getting to grips with recently updated medical device and in vitro diagnostic medical device law, and fresh regulatory obligations relating to the use of AI in those devices are now appearing on the horizon. Regulators and notified bodies in the same space will also need to sit up and take notice of their new obligations in what is a novel technical area.
AI’s impact on the imaging and diagnostic market
AI has the power to significantly grow the imaging and diagnostic technology market. It brings into our homes and daily lives the power to track, monitor and detect a range of illnesses. Paired with appropriate treatment, we may all live longer and healthier lives. For the same reasons, AI also has the power to democratise essential imaging and diagnostic technology, potentially reaching billions of people who would otherwise go untreated.
The World Health Organization estimates that two-thirds of the world’s population do not have access to essential radiology services, including the most basic X-rays. Pairing AI, such as Samsung’s S-Detect, with essential imaging and diagnostic tools is proving very successful for low-resource hospitals with significant shortages of medical professionals. S-Detect is a commercially available AI that has demonstrated diagnostic accuracy in detecting breast cancer across multiple studies.
Increased compliance obligations and conformity thresholds under the EU AI Act
Manufacturers operating in this space, including software manufacturers, are well aware of their obligations under EU medical device legislation and plan well in advance of product launches for an arduous compliance programme. The thresholds of those compliance programmes are about to be ramped up under the EU AI Act.
Under the AI Act, imaging and diagnostic tools that are themselves AI systems or those deploying AI systems as safety components will likely be classed as high-risk. This means they will be subject to a new conformity assessment regime specific to AI.
Manufacturers must be able to demonstrate compliance with seven detailed requirements. These include:
- Record-keeping
- Transparency and the provision of information to users, and
- Human oversight
This new compliance obligation will be in addition to the general safety and performance requirements provided for in Annex I of the Medical Devices Regulation (EU) 2017/745 (MDR). Annex I lays out general requirements related to software medical devices but remains silent on AI specifically.
Pivot required for manufacturers
Manufacturers will also need to pivot their existing competencies in preparing and maintaining technical documentation for medical devices towards meeting the new technical documentation requirements under the AI Act. For example, careful consideration will need to be given to:
- The description and presentation of the intended purpose of the AI system
- The methods and steps performed in the development of the AI system, and
- Data requirements, including a description of training methodologies and techniques, as well as information about the provenance, scope, and main characteristics of the training data sets
Some solace can be found in the fact that the new laws will not require manufacturers to deal with a new regulator. The intention is that the existing regulator in each Member State will take on the additional role of overseeing these AI medical devices’ compliance with the new AI Act requirements.
Comment
The market approval process for software medical devices and in vitro diagnostic medical devices, such as imaging and diagnostic tools, has always been a challenge. The proposed AI Act increases the scope of that challenge for both manufacturers and regulators in this space. All of this means additional costs and resource pressures for all parties involved. The balancing motivation is ensuring the trustworthiness of AI technology in a sophisticated sector where the risk of harm is significant. The EU is determined to set the global standard for safety in this space. While there is great potential to make this life-saving technology available to a far greater portion of the population, it is still necessary to ensure the safety of those using it.
For more information, contact a member of our Artificial Intelligence team.
The content of this article is provided for information purposes only and does not constitute legal or other advice.