Regulatory Update

EU Guidance MDCG 2025-6 Clarifies Compliance for Medical Device AI

The EU has released new guidance clarifying how the Artificial Intelligence Act (AIA) applies to AI-enabled medical devices already regulated under MDR and IVDR. MDCG 2025-6 outlines what qualifies as high-risk Medical Device AI (MDAI) and how manufacturers can integrate AIA requirements—such as bias mitigation, transparency, and cybersecurity—into existing quality systems.

Published on:
July 22, 2025

In June 2025, the Medical Device Coordination Group (MDCG) released joint guidance with the Artificial Intelligence Board (AIB) outlining how the EU’s Artificial Intelligence Act (AIA) applies to AI-enabled medical devices under the Medical Device Regulation (MDR) and In Vitro Diagnostic Regulation (IVDR). MDCG 2025-6 is an FAQ-style guidance that clarifies how manufacturers, notified bodies, and regulators should navigate the combined regulatory obligations for Medical Device Artificial Intelligence (MDAI), particularly for systems considered high-risk under the AIA. The AIA entered into force in August 2024; most of its obligations apply from August 2, 2026, while high-risk AI systems covered by MDR/IVDR conformity assessment have until August 2, 2027 to comply.

Key takeaways from the guidance include:

  • “Provider” under the AIA should be understood as the “manufacturer” under the MDR/IVDR
  • MDR/IVDR classification determines whether an AI system is considered high-risk under the AIA
  • Application of the MDR/IVDR and the AIA for medical devices is simultaneous and complementary  
  • AIA obligations can be integrated into MDR/IVDR systems (e.g., in clinical and performance evaluation testing)
  • Transparency, explainability, and human oversight are now legal requirements
  • Bias mitigation, functional traceability, and cybersecurity are critical compliance areas

What Is MDAI?

First, the guidance clarifies that all references to ‘manufacturer’ within the meaning of the MDR/IVDR should be understood as references to ‘provider’ in accordance with the AIA.

MDAI refers to AI systems developed for medical purposes, including standalone software, diagnostic algorithms, and embedded systems. These can fall under MDR, IVDR, or be considered accessories under either regulation.

The AIA introduces new rules to mitigate AI-specific risks, such as bias and cybersecurity vulnerabilities, without replacing or redefining MDR/IVDR requirements. The result is a dual compliance model: high-risk MDAI must comply with both the medical device requirements and the AIA.

Conditions for High-Risk Classification

An MDAI system is considered high-risk under the AIA if both of the following conditions are met:

  • It functions as a safety component or is itself a medical device
  • It is subject to third-party conformity assessment under MDR or IVDR

The AIA’s high-risk designation does not change the device’s MDR/IVDR classification. In-house MDAI developed and used only within health institutions is typically not considered high-risk if no notified body is involved in its conformity assessment. For the same reason, MDAI in Class I devices that do not require notified body involvement generally does not qualify as high-risk.

AI-specific quality management system, risk, and documentation requirements

The guidance encourages manufacturers to integrate AIA obligations into their existing MDR/IVDR quality management systems (QMS), risk controls, and documentation. This includes:

  • Quality Management System and Risk Management: AI-specific requirements on data bias, fundamental rights, and robustness should be embedded into existing QMS processes
  • Technical Documentation: A single technical file must address both AI and device-specific obligations
  • Transparency and Human Oversight: High-risk MDAI must allow human intervention, provide explanations of outputs, and include clear user instructions
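
The transparency and human-oversight obligations above lend themselves to a simple engineering pattern: route low-confidence outputs to a clinician instead of acting on them automatically. The sketch below is illustrative only and assumes a scikit-learn-style model exposing predict_proba; the threshold, model interface, and review queue are hypothetical placeholders, not requirements from MDCG 2025-6.

    # Illustrative sketch of a human-oversight gate; names and threshold are hypothetical.
    REVIEW_THRESHOLD = 0.80  # assumed cut-off below which a clinician must review the case

    def predict_with_oversight(model, case_features, review_queue):
        """Return an automated result only when confidence is high; otherwise defer to a human."""
        probabilities = model.predict_proba([case_features])[0]
        label = int(probabilities.argmax())
        confidence = float(probabilities.max())
        explanation = f"Predicted class {label} with confidence {confidence:.2f}"
        if confidence < REVIEW_THRESHOLD:
            # Low confidence: queue for human review and explain why no automated result was given.
            review_queue.append({"features": case_features, "explanation": explanation})
            return {"status": "needs_human_review", "explanation": explanation}
        return {"status": "automated", "label": label, "explanation": explanation}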

Additional testing and validation requirements

Under the AIA, high-risk MDAI must undergo testing for accuracy, robustness, and cybersecurity. These requirements are in addition to the clinical and performance evaluation already expected under the MDR/IVDR to demonstrate clinical evidence, device interoperability, and adequate user information. In practice, this means the AIA should be considered in the Clinical Evaluation Plan (CEP) or Performance Evaluation Plan (PEP) under the MDR/IVDR.
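
As a rough illustration of how accuracy and robustness testing could be scripted alongside the CEP/PEP, the sketch below measures accuracy on a held-out validation set and repeats the measurement under small input perturbations. The model object, acceptance thresholds, and noise level are assumptions for illustration, not figures from the guidance.

    # Illustrative accuracy/robustness checks for an MDAI model; thresholds are assumed, not prescribed.
    import numpy as np

    def accuracy(model, features, labels):
        """Share of correct predictions on a held-out clinical validation set."""
        predictions = model.predict(features)
        return float(np.mean(predictions == labels))

    def robustness_under_noise(model, features, labels, noise_scale=0.01, seed=0):
        """Repeat the accuracy check with small input perturbations to probe robustness."""
        rng = np.random.default_rng(seed)
        perturbed = features + rng.normal(0.0, noise_scale, size=features.shape)
        return accuracy(model, perturbed, labels)

    def run_aia_checks(model, features, labels, accuracy_target=0.95, robustness_target=0.90):
        """Record both metrics so the results can be cited in the CEP or PEP."""
        results = {
            "accuracy": accuracy(model, features, labels),
            "robust_accuracy": robustness_under_noise(model, features, labels),
        }
        results["pass"] = (results["accuracy"] >= accuracy_target
                           and results["robust_accuracy"] >= robustness_target)
        return results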

Traceability and post-market monitoring

In addition to the supply chain traceability under the MDR/IVDR, the AIA introduces functional traceability, which requires the automated logging of system performance and behavior throughout the lifecycle. These logs are intended to detect unwanted bias, monitor system performance, and flag potential cybersecurity threats. Manufacturers must also establish post-market monitoring systems that track potential interactions with other AI systems.
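
One possible shape for such functional-traceability logging is a thin wrapper around each inference call that records a structured audit event. This is a minimal sketch under assumed names (the model interface, log destination, and field set are illustrative, not taken from the AIA or the guidance); a real implementation would also need retention, integrity, and data-protection controls.

    # Illustrative lifecycle-logging wrapper; field names and model interface are assumptions.
    import json
    import logging
    import time
    import uuid

    logging.basicConfig(filename="mdai_trace.log", level=logging.INFO, format="%(message)s")

    def traced_inference(model, model_version, inputs):
        """Run inference and write one structured audit record per call."""
        record = {
            "event_id": str(uuid.uuid4()),
            "timestamp": time.time(),
            "model_version": model_version,
            "input_summary": {"n_features": len(inputs)},  # summarise rather than log raw patient data
        }
        start = time.perf_counter()
        output = model.predict([inputs])[0]
        record["latency_ms"] = round((time.perf_counter() - start) * 1000, 2)
        record["output"] = str(output)
        logging.info(json.dumps(record))  # these records feed post-market monitoring and bias review
        return output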

Navigating EU requirements for AI-enabled devices

The AIA adds a new layer of regulatory oversight to an already complex framework. If your device has AI capability, Pure Global can help you determine your requirements in the EU and plan your roadmap to compliance with the AIA. Learn more about our end-to-end EU MDR and IVDR consulting support.

