GDPR + EU AI Act for AI Applications: Double Obligation 2026

Practitioner note: This is not legal advice. For specific situations, consult a qualified attorney or compliance officer.

TL;DR

  • Two regulations apply in parallel: GDPR (Art. 22, 35) and EU AI Act (Art. 14, 27) overlap but neither absorbs the other
  • DPIA + FRIA double assessment required for high-risk AI processing personal data
  • Human oversight required by both Art. 22 GDPR and Art. 14 EU AI Act — design once, document twice
  • CJEU C-634/21 (SCHUFA) classifies AI scoring as automated individual decision-making under Art. 22
  • Two supervisory authorities: the data protection authority (GDPR) plus the national AI market surveillance authority (EU AI Act). Notify both in case of an incident

1. Where GDPR and EU AI Act intersect

The GDPR governs personal data; the EU AI Act governs AI systems. When an AI system processes personal data, both regulations apply in parallel. The key intersections are Art. 22 GDPR (automated decisions) with Art. 14 EU AI Act (human oversight), Art. 35 GDPR (DPIA) with Art. 27 EU AI Act (Fundamental Rights Impact Assessment, FRIA), and Art. 9 GDPR (special categories) with Art. 10(5) EU AI Act (training data).

2. DPIA + FRIA double assessment

For high-risk AI that processes personal data, both a DPIA (Data Protection Impact Assessment) and a FRIA (Fundamental Rights Impact Assessment) are required. They cover overlapping but not identical content: the DPIA focuses on data subject risk, the FRIA on broader fundamental-rights impact (non-discrimination, freedom of expression, dignity). A combined template is recommended; full merging is not legally required.
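One practical way to manage the overlap is to track the combined template as a coverage structure, where each section is tagged with the regime(s) it serves. A minimal Python sketch — the section titles and regime tags below are illustrative, not an exhaustive legal checklist:

```python
from dataclasses import dataclass, field

# Illustrative regime tags (not legal terms of art).
DPIA = "GDPR Art. 35 (DPIA)"
FRIA = "EU AI Act Art. 27 (FRIA)"

@dataclass
class Section:
    title: str
    regimes: set        # which assessment(s) this section serves
    completed: bool = False

@dataclass
class CombinedAssessment:
    sections: list = field(default_factory=list)

    def missing(self, regime):
        """Titles of sections still open for the given regime."""
        return [s.title for s in self.sections
                if regime in s.regimes and not s.completed]

# Example template: shared sections appear once, regime-specific ones are tagged.
template = CombinedAssessment(sections=[
    Section("Processing description & purpose", {DPIA, FRIA}),
    Section("Necessity and proportionality", {DPIA, FRIA}),
    Section("Risks to data subjects", {DPIA}),
    Section("Impact on fundamental rights (non-discrimination, dignity)", {FRIA}),
    Section("Mitigation and safeguards", {DPIA, FRIA}),
])
```

The point of the structure: `missing(DPIA)` and `missing(FRIA)` give two separate completeness views over one document, matching "combined template, two assessments".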

3. Art. 22 GDPR meets Art. 14 EU AI Act

Both articles require human oversight of consequential decisions. In C-634/21 (SCHUFA, December 2023), the CJEU confirmed that AI-generated credit scores qualify as automated individual decision-making under Art. 22 GDPR. Practical design: the human reviewer must have actual authority and substantive review capacity — rubber-stamping fails both regulations.
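The "actual authority" requirement can be made concrete in the decision workflow itself: the AI output is only a suggestion, and a decision can be finalized only through a documented human review with its own rationale. A minimal sketch (all names and fields are hypothetical):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AiRecommendation:
    subject_id: str
    score: float
    suggested_outcome: str

@dataclass
class HumanReview:
    reviewer_id: str
    outcome: str       # may differ from the AI suggestion — that is the point
    rationale: str     # substantive reasoning, not a rubber stamp

def finalize_decision(rec: AiRecommendation, review: HumanReview) -> dict:
    """Finalize only with a documented human review; the human outcome prevails."""
    if not review.rationale.strip():
        raise ValueError("Review without rationale is rubber-stamping")
    return {
        "subject": rec.subject_id,
        "ai_suggestion": rec.suggested_outcome,
        "final_outcome": review.outcome,   # human decision, not the AI score
        "reviewed_by": review.reviewer_id,
        "reviewed_at": datetime.now(timezone.utc).isoformat(),
    }
```

Designing the workflow this way lets the same audit trail serve both Art. 22 GDPR and Art. 14 EU AI Act documentation: design once, document twice.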

4. Special categories and training data

Art. 9 GDPR restricts processing of special categories. Art. 10(5) EU AI Act creates a narrow exception for using special-category data to detect and correct bias in high-risk AI systems, subject to strict safeguards: necessity, proportionality, security measures, documented justification.

5. Dual supervision and incident reporting

The data protection authority supervises GDPR aspects; the national AI authority supervises EU AI Act aspects (in Germany expected at the BNetzA — Federal Network Agency). On a serious incident, notify both within their respective windows: 72 hours under Art. 33 GDPR; the EU AI Act sets its own deadlines for serious incidents (Art. 73), depending on the incident type.
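Both clocks start from the same moment of awareness, so the two deadlines can be tracked together. A small sketch — the 72-hour figure comes from Art. 33 GDPR; the AI Act window is left as a parameter because the applicable Art. 73 deadline depends on the incident type:

```python
from datetime import datetime, timedelta, timezone

# Fixed by Art. 33 GDPR. The AI Act window is NOT hardcoded here:
# determine the Art. 73 deadline that applies to your incident type.
GDPR_WINDOW = timedelta(hours=72)

def notification_deadlines(awareness: datetime, ai_act_window: timedelta) -> dict:
    """Compute both supervisory deadlines from the moment of awareness."""
    return {
        "data_protection_authority": awareness + GDPR_WINDOW,
        "ai_authority": awareness + ai_act_window,
    }
```

Running both deadlines off one timestamp avoids the common failure mode of treating the two notifications as separate, sequential processes.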

6. Practical implementation

Build a combined DPIA+FRIA template, link AI inventory entries to records of processing, and train staff on both regulations. The acceptable-use policy (AUP) should reference both regimes so that the same control statements satisfy both audits.
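The inventory-to-RoPA link can be enforced with a trivial consistency check in whatever tooling holds the registers. A sketch with hypothetical record shapes:

```python
# Hypothetical register entries for illustration only.
ai_inventory = [
    {"system": "credit-scoring-v2", "high_risk": True, "ropa_id": "ROPA-017"},
    {"system": "chat-assist", "high_risk": False, "ropa_id": None},
]
records_of_processing = {
    "ROPA-017": {"purpose": "creditworthiness assessment"},
}

def unlinked_high_risk(inventory, records):
    """High-risk AI systems missing a link to a record of processing."""
    return [e["system"] for e in inventory
            if e["high_risk"] and e.get("ropa_id") not in records]
```

A check like this, run on every inventory change, keeps the GDPR register and the AI Act inventory from drifting apart — the precondition for a single set of control statements satisfying both audits.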

Summary

AI applications processing personal data face a true double obligation. Treat it as one design exercise with two documentation outputs: combined assessment, single human-oversight design, dual supervisory communication path. Most SMEs underestimate the FRIA scope — start there, then layer the DPIA on top.


Frequently Asked Questions

Is a combined assessment sufficient?
Recommended but not mandatory. Each legal act still requires its own specific content.
Who must be notified in case of an incident?
Both supervisory authorities, in parallel: the competent data protection authority within 72 hours (Art. 33 GDPR), and the AI market surveillance authority within the EU AI Act's own deadlines (Art. 73).
