
EU AI Act Compliance Checklist

Assess your AI system against 25 key requirements from the EU AI Act. Track compliance across 5 categories and get prioritized recommendations.
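The weighted score can be sketched as follows. The category weights (25/25/25 questions per the document: 25%, 25%, 20%, 20%, 10%) are taken from the section headings below; the per-question rule, that each "yes" counts equally within its category, is an assumption about how the checklist tool aggregates answers, not something the document states.

```python
# Hedged sketch of the checklist's weighted scoring.
# Weights come from the document; the equal-weight-per-question rule is assumed.

WEIGHTS = {
    "Risk Classification": 25,
    "Transparency": 25,
    "Documentation": 20,
    "GDPR": 20,
    "Monitoring": 10,
}
QUESTIONS_PER_CATEGORY = 5  # 25 questions across 5 categories


def compliance_score(yes_counts: dict) -> float:
    """Return a 0-100 score from the number of 'yes' answers per category."""
    score = 0.0
    for category, weight in WEIGHTS.items():
        answered_yes = yes_counts.get(category, 0)
        score += weight * answered_yes / QUESTIONS_PER_CATEGORY
    return score


# All 25 questions answered "yes" gives the full score of 100.
print(compliance_score({c: 5 for c in WEIGHTS}))  # 100.0
```

Answering only the Monitoring questions, for example, would yield at most 10 points, reflecting that category's 10% weight.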

0 of 25 answeredScore: 0/100

0

Risk Classification

0

Transparency

0

Documentation

0

GDPR

0

Monitoring

Risk Classification

(25% weight)

Have you classified your AI system's risk level per Article 6?

Determine whether your system falls into the unacceptable, high, limited, or minimal risk category.

Art. 6

If high-risk, have you registered in the EU database (Art. 6(2))?

High-risk AI systems must be registered before being placed on the market.

Art. 6(2)

Do you have a risk management system in place (Art. 9)?

A continuous risk management process must be established and maintained.

Art. 9

Have you assessed if your system is listed in Annex III high-risk categories (Art. 7)?

Check if your AI system falls under the high-risk use cases listed in Annex III.

Art. 7

Have you documented the intended purpose and foreseeable misuse?

Per Annex III, the intended purpose and reasonably foreseeable misuse must be documented.

Annex III

Transparency

(25% weight)

Do users know they are interacting with an AI system (Art. 13)?

AI systems must be designed to ensure users are informed they are interacting with AI.

Art. 13

Is the system's operation sufficiently transparent (Art. 13(1))?

Users must be able to interpret the system's output and use it appropriately.

Art. 13(1)

Is AI-generated or manipulated content clearly labeled (Art. 52)?

Deep fakes and AI-generated content must be labeled as such.

Art. 52

Do you provide information about the system's capabilities and limitations (Art. 13(3))?

Instructions for use must include system capabilities, limitations, and known risks.

Art. 13(3)

Are emotion recognition or biometric categorization systems disclosed (Art. 52(1))?

Users must be informed when subjected to emotion recognition or biometric categorization.

Art. 52(1)

Documentation

(20% weight)

Do you maintain technical documentation per Article 11?

Technical documentation must be drawn up before the system is placed on the market.

Art. 11

Are system logs recorded automatically (Art. 12)?

High-risk AI systems must have automatic logging capabilities.

Art. 12
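Automatic logging of this kind can be illustrated with a small wrapper that records every prediction event as it happens. This is a minimal sketch in the spirit of the requirement; the field names and wrapper design are illustrative assumptions, not the Act's prescribed log schema.

```python
# Illustrative sketch of automatic event logging for an AI system.
# Field names and the wrapper pattern are assumptions for demonstration.
import json
import logging
from datetime import datetime, timezone

logger = logging.getLogger("ai_system_events")
logging.basicConfig(level=logging.INFO)


def logged_predict(model_fn, input_ref, payload):
    """Run a prediction and automatically record the event as a JSON log line."""
    result = model_fn(payload)
    logger.info(json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "input_ref": input_ref,  # reference to the input data, for traceability
        "output": result,        # the system's output for this event
    }))
    return result


# Usage: wrap the model call so every invocation is logged automatically.
decision = logged_predict(lambda x: x > 0.5, "req-001", 0.7)
```

Because logging happens inside the wrapper rather than at each call site, records cannot be skipped by accident, which is the point of "automatic" logging.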

Does your documentation cover all elements listed in Annex IV?

Documentation must include system description, purpose, design, testing, and risk management.

Annex IV

Do you maintain documentation of training, validation, and testing datasets (Art. 11(1))?

Dataset documentation must cover design choices, collection processes, and data preparation steps.

Art. 11(1)

Do you have a quality management system (Art. 18)?

Providers must establish a quality management system ensuring compliance.

Art. 18

GDPR

(20% weight)

Have you conducted a Data Protection Impact Assessment (GDPR Art. 35)?

DPIA is required when processing is likely to result in high risk to individuals.

GDPR Art. 35

Do you provide clear privacy notices (GDPR Art. 13-14)?

Individuals must be informed about how their data is processed by the AI system.

GDPR Art. 13-14

Is data protection built into the system design (GDPR Art. 25)?

Privacy by design and by default must be implemented.

GDPR Art. 25

Have you appointed a Data Protection Officer if required (GDPR Art. 37)?

A DPO is required, among other cases, for large-scale systematic monitoring or large-scale processing of special categories of personal data.

GDPR Art. 37

Do you maintain records of processing activities (GDPR Art. 30)?

Records must be maintained of all personal data processing activities.

GDPR Art. 30

Monitoring

(10% weight)

Do you have post-market monitoring in place (Art. 9(9))?

A post-market monitoring system must be established proportionate to the risk level.

Art. 9(9)

Can you report serious incidents to authorities (Art. 61)?

Providers must report serious incidents to market surveillance authorities.

Art. 61

Do you have a system for reporting malfunctions (Art. 62)?

A process must exist for reporting malfunctions and non-compliance.

Art. 62

Can national authorities access your system for auditing (Art. 72)?

Market surveillance authorities must be able to access the AI system for compliance checks.

Art. 72

Do you regularly test and update your risk management measures (Art. 9(8))?

Risk management measures must be tested and updated throughout the system's lifecycle.

Art. 9(8)