EU AI Act Compliance: 12-Month Roadmap Through December 2027

Practitioner note: This article is practice-oriented compliance documentation, not legal advice. We are a compliance specialist, not a law firm. For legally binding guidance, please consult a licensed attorney.

TL;DR

  • Three phases: NOW (AI Literacy + Prohibitions), 08/2026 (Transparency + FRIA + Annex III legally binding), 12/2027 (only if the Digital Omnibus proposal is adopted)
  • Digital Omnibus proposal of 19 November 2025 [VOLATILE]: the trilogue is ongoing; postponement of Annex III to 02 December 2027 has NOT yet been adopted — until adoption, 02 August 2026 remains legally binding
  • 90 % of SMEs are pure deployers — no provider obligations
  • AI system inventory is the starting point — per system: risk class, role, obligations
  • Fines up to EUR 35 million / 7 % of turnover for prohibited practices (Article 5)

1. Status of the EU AI Act 05/2026 + Digital Omnibus Proposal

The AI Act (Regulation (EU) 2024/1689) has been in force since 01 August 2024. Application is phased. The European Commission presented the Digital Omnibus proposal on 19 November 2025 — as of 02 May 2026 the trilogue is ongoing and a postponement of the high-risk deadlines has NOT been adopted. The following applies:

| Obligation | Date of application |
| --- | --- |
| Article 4 (AI Literacy) | 02 February 2025 — already in force |
| Article 5 (Prohibited practices) | 02 February 2025 — already in force |
| GPAI (Articles 53–55) | 02 August 2025 — already in force |
| Article 50 (Transparency, watermarking) | 02 August 2026 — unchanged |
| Article 27 FRIA (credit scoring + public bodies) | 02 August 2026 — unchanged |
| Annex III High-Risk AI | 02 August 2026 legally binding (DO proposal: 02 December 2027 — not adopted) |
| Annex I (regulated products) | 02 August 2027 (DO proposal: 02 August 2028 — not adopted) |

If you do not want to build your own technical documentation under Annex IV from scratch, the EU AI Act Kit provides a complete template including risk management and declaration of conformity.
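The phased application dates above can be checked programmatically. A minimal sketch in Python, with the dates transcribed from the table (the DO-proposal postponements are deliberately not reflected, since they have not been adopted):

```python
from datetime import date

# Application dates as listed in the table above. The Digital Omnibus
# postponements are NOT included because they are not yet adopted.
APPLICATION_DATES = {
    "Article 4 (AI Literacy)": date(2025, 2, 2),
    "Article 5 (Prohibited practices)": date(2025, 2, 2),
    "GPAI (Articles 53-55)": date(2025, 8, 2),
    "Article 50 (Transparency, watermarking)": date(2026, 8, 2),
    "Article 27 FRIA": date(2026, 8, 2),
    "Annex III High-Risk AI": date(2026, 8, 2),
    "Annex I (regulated products)": date(2027, 8, 2),
}

def obligations_in_force(as_of: date) -> list[str]:
    """Return the obligations whose application date has already passed."""
    return [name for name, d in APPLICATION_DATES.items() if as_of >= d]
```

As of 02 May 2026, this returns the three obligations already in force (Articles 4 and 5 plus the GPAI rules); the August 2026 block is not yet included.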

2. Phase 1 (NOW): AI Literacy + Prohibitions

In force since 02 February 2025 — no transition period remaining.

3. Phase 2 (08/2026): Transparency + FRIA

Short answer: from 02 August 2026, two central obligations apply. Article 50 requires machine-readable labelling of AI-generated audio, video, image and text content, explicit deepfake disclosure, and AI notices for chatbots. Article 27 requires a FRIA for public bodies and private actors with public tasks — in particular credit scoring (Annex III No. 5b) and life/health insurance scoring (No. 5c).
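Article 50 requires a machine-readable marking but does not mandate one concrete schema. A minimal illustrative sketch of attaching a provenance record to generated content as JSON metadata — the field names here are our own, not prescribed by the Act:

```python
import json
from datetime import datetime, timezone

def make_ai_disclosure(system_name: str, content_type: str, is_deepfake: bool) -> str:
    """Build an illustrative machine-readable AI-provenance record.

    The schema is an assumption for illustration only: Article 50 requires
    machine-readable marking of AI-generated content but does not fix
    these field names.
    """
    record = {
        "ai_generated": True,
        "generating_system": system_name,
        "content_type": content_type,        # "audio" | "video" | "image" | "text"
        "deepfake_disclosure": is_deepfake,  # explicit disclosure, Art. 50
        "created_utc": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(record)
```

In practice you would attach such a record via an established provenance mechanism (e.g. embedded metadata or a sidecar file) rather than inventing your own schema; the sketch only shows the kind of information a marking has to carry.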


4. Phase 3 (High-Risk AI Annex III from 02 August 2026)

The Annex III high-risk obligations apply from 02 August 2026 (legally binding). The Digital Omnibus proposal of 19 November 2025 would postpone this to 02 December 2027, but the trilogue is ongoing and the postponement has not yet been adopted.

5. AI System Inventory — the Starting Point

Short answer: every EU AI Act compliance effort starts with an AI system inventory. Per system, six fields must be captured: name and provider, use case, role (provider/deployer/both), risk class (prohibited under Article 5 / high-risk under Annex III / GPAI / minimal), data categories processed including special categories, as well as those responsible for oversight and escalation.

Document the following per AI system:

  1. Name + provider: ChatGPT/Microsoft Copilot/in-house solution
  2. Use case: marketing copy, HR screening, code generation
  3. Role: provider / deployer / both
  4. Risk class: prohibited (Article 5) / high-risk (Annex III) / GPAI / minimal
  5. Data categories: which personal data / special categories?
  6. Oversight: who monitors? Which escalation path?
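The six fields above can be captured as one record per AI system. A minimal sketch as a Python dataclass (the type and field names are our own, chosen to mirror the list):

```python
from dataclasses import dataclass, field
from enum import Enum

class Role(Enum):
    PROVIDER = "provider"
    DEPLOYER = "deployer"
    BOTH = "both"

class RiskClass(Enum):
    PROHIBITED = "prohibited (Article 5)"
    HIGH_RISK = "high-risk (Annex III)"
    GPAI = "GPAI"
    MINIMAL = "minimal"

@dataclass
class AISystemRecord:
    """One inventory row covering the six fields listed above."""
    name: str                   # 1. name
    provider: str               #    + provider
    use_case: str               # 2. use case
    role: Role                  # 3. provider / deployer / both
    risk_class: RiskClass       # 4. risk class
    data_categories: list[str] = field(default_factory=list)  # 5. incl. special categories
    oversight_owner: str = ""   # 6. who monitors
    escalation_path: str = ""   #    which escalation path
```

Example row: `AISystemRecord("ChatGPT", "OpenAI", "marketing copy", Role.DEPLOYER, RiskClass.GPAI, ["no personal data"], "CISO", "report to DPO")`.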

6. Provider vs. Deployer

Short answer: providers develop or place an AI system on the market and bear the principal obligations (conformity assessment, technical documentation Annex IV, EU database registration, risk management); around 10 % of SMEs are providers. Deployers use an AI system under their own responsibility (~90 % of SMEs) and must ensure intended use, review input data relevance, retain logs and, where applicable, carry out a FRIA. In the event of a substantial modification (Article 25), the deployer transitions into the provider role.

| | Provider | Deployer |
| --- | --- | --- |
| Who? | Anyone who develops or places an AI system on the market | Anyone who uses an AI system under their own responsibility |
| SME share | ~10 % | ~90 % |
| Principal obligations | Conformity assessment, technical documentation, EU database registration, risk management | Intended use, input data relevance review, log retention, FRIA where applicable |
| Transition to provider | | Upon substantial modification (Article 25) |
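The decision logic in the table can be sketched as a small helper. This is a deliberate simplification of Art. 3 Nos. 3-4 and Art. 25 for illustration; real classification needs case-by-case legal review:

```python
def classify_role(develops_or_markets: bool,
                  substantial_modification: bool = False) -> str:
    """Simplified role classification per the table above.

    develops_or_markets: develops the system or places it on the market
    under one's own name (Art. 3 No. 3).
    substantial_modification: a substantial modification (Article 25)
    turns a deployer into a provider.
    """
    if develops_or_markets or substantial_modification:
        return "provider"
    return "deployer"
```

A typical SME using ChatGPT internally is `classify_role(develops_or_markets=False)`, i.e. a deployer; repackaging a GPAI API as its own marketed product flips the result to provider.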

7. 12-Month Action Plan

Short answer: the 12-month plan starts with the AI inventory, Article 5 screening and AI Literacy training (months 1-2), followed by an Acceptable Use Policy (M3), GPAI compliance review (M4-5), Article 50 transparency workflow (M6), FRIA for credit scoring/public tasks (M7), risk management system under Article 9 if you are a provider (M8-9), technical documentation Annex IV (M10-11) and audit simulation plus managing director briefing on 12/2027 readiness (M12).

| Month | Action |
| --- | --- |
| 1-2 | AI system inventory + Article 5 screening + AI Literacy training |
| 3 | Acceptable Use Policy for AI, provider/deployer clarification per system |
| 4-5 | Review GPAI inventory + Code of Practice compliance |
| 6 | Article 50 transparency workflow (watermarking, chatbot notices) |
| 7 | FRIA for relevant use cases (credit scoring, public tasks) |
| 8-9 | Risk management system (Article 9) if provider |
| 10-11 | Prepare technical documentation Annex IV |
| 12 | Audit simulation, managing director briefing on 12/2027 readiness |

8. Budget & Effort

Short answer: effort scales with role: pure deployers in the SME segment are at around 0.2 FTE over 4 months (EUR 5,000-20,000 plus training), high-risk deployers at 0.5 FTE × 6 months (additional EUR 15,000-40,000), high-risk providers at 1 FTE × 12 months including conformity assessment (EUR 50,000-200,000) and GPAI providers at 2-5 FTE × 12 months (EUR 250,000-1,000,000).

| Role | Initial effort (FTE) | Cost estimate |
| --- | --- | --- |
| Pure deployer (SME) | 0.2 FTE × 4 months | EUR 5,000-20,000 + training |
| High-risk deployer | 0.5 FTE × 6 months | +EUR 15,000-40,000 |
| High-risk provider | 1 FTE × 12 months + conformity assessment | EUR 50,000-200,000 |
| GPAI provider | 2-5 FTE × 12 months | EUR 250,000-1,000,000 |

Sources

As of: 02 May 2026

  1. Regulation (EU) 2024/1689 (AI Regulation) — EUR-Lex full text, German version (as of: 02 May 2026)
  2. AI Act Article 99 — fine tiers 35M/15M/7.5M (Service Desk) (as of: 02 May 2026)
  3. AI Act Article 27 — FRIA obligation (applies from 02 August 2026)
  4. AI Act Article 51 — GPAI threshold 10²⁵ FLOPs
  5. European Commission — Digital Omnibus proposal (as of: 02 May 2026, trilogue ongoing)
  6. GPAI Code of Practice (Commission) (as of: 02 May 2026)

Frequently Asked Questions

What do I need to do NOW (as of 05/2026)?
Three immediate obligations: (1) AI Literacy under Art. 4 - training obligation in force since 02 February 2025. (2) Prohibition screening under Art. 5 - ensure that no AI system in use engages in prohibited practices. (3) GPAI inventory - which GPAI models do we use, and what are our obligations as a deployer? An AI system inventory serves as the basis.
Which obligations would be postponed by the Digital Omnibus?
As of 02 May 2026: The European Commission presented the Digital Omnibus proposal on 19 November 2025 - the trilogue is ongoing, NOT yet adopted. Planned: Annex III (high-risk AI in HR, education, law enforcement) postponed from 02 August 2026 to 02 December 2027; Annex I (regulated products) to 02 August 2028. UNCHANGED: Art. 4, Art. 5, GPAI obligations, Art. 50 transparency. Until adoption, 02 August 2026 remains the legally binding deadline.
When must a FRIA be carried out?
Art. 27 EU AI Act: for public bodies as well as for private entities carrying out public tasks (e.g. credit scoring under Annex III No. 5b, life/health insurance scoring under 5c). Deadline 02 August 2026 - NOT affected by the Digital Omnibus proposal (19 November 2025). Use in HR recruiting (Annex III No. 4a) from 02 August 2026 (DO proposal: postponement to 12/2027 - not yet adopted).
Provider or deployer - which one am I?
If you develop or place an AI system on the market under your own name, you are a provider (Art. 3 No. 3). If you use it 'under your own responsibility' (e.g. ChatGPT for internal purposes), you are a deployer (Art. 3 No. 4). 90% of SMEs are pure deployers.
Substantial Modification - when do I become a provider?
Under Art. 25 EU AI Act, in the event of a substantial change to the intended purpose or functionality, OR when placing the system on the market under your own name. Example: a RAG system built on a GPAI API and marketed as an 'AI legal assistant' - provider obligations apply.
Is ISO 42001 sufficient as evidence of AI Act compliance?
ISO 42001 (AI Management System) covers a large portion - in particular Art. 9 (risk management) and Art. 17 (QMS). Not included: conformity assessment (Art. 43), EU database registration (Art. 49), FRIA (Art. 27), GPAI Code of Practice. A sensible building block, but not a full substitute.
What does AI Act compliance cost?
Pure deployer (90% of SMEs): EUR 5,000-20,000 initial setup plus AI Literacy training (approximately EUR 3,000-10,000). High-risk deployer: an additional EUR 15,000-40,000 for FRIA, logging, and technical documentation. High-risk provider: EUR 50,000-200,000 (conformity assessment). Compliance-Kit EU AI Act Kit: one-off EUR 490-1,490.
Fines - how high?
Three tiers: Art. 99(3) prohibitions under Art. 5 up to EUR 35 million / 7% of turnover. Art. 99(4) high-risk / operator violations including Art. 4 AI Literacy up to EUR 15 million / 3%. Art. 99(5) incorrect or incomplete information to authorities up to EUR 7.5 million / 1%. SME/start-up privilege (Art. 99(6)): the lower value applies.
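The three tiers can be expressed as a small calculation: normally the higher of the fixed amount and the turnover percentage applies; under the SME/start-up privilege (Art. 99(6)) it is the lower. A sketch with the tier values from the answer above (tier keys are our own labels):

```python
# Fine tiers per Art. 99: (fixed ceiling in EUR, percent of worldwide annual turnover)
FINE_TIERS = {
    "prohibited_practices": (35_000_000, 7),   # Art. 99(3)
    "high_risk_violations": (15_000_000, 3),   # Art. 99(4)
    "incorrect_information": (7_500_000, 1),   # Art. 99(5)
}

def max_fine(tier: str, annual_turnover_eur: int, is_sme: bool = False) -> int:
    """Ceiling of the fine for a tier.

    Normally the higher of fixed amount and turnover share applies;
    for SMEs/start-ups the lower value applies (Art. 99(6)).
    """
    fixed, pct = FINE_TIERS[tier]
    turnover_based = annual_turnover_eur * pct // 100
    return min(fixed, turnover_based) if is_sme else max(fixed, turnover_based)
```

For a large undertaking with EUR 1 bn turnover, the Article 5 ceiling is EUR 70 m (7 % beats the fixed EUR 35 m); for an SME with EUR 10 m turnover it drops to EUR 700,000.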
