Building an AI System Inventory 2026: 8-Step Guide
Practitioner note: This article is practice-oriented compliance documentation, not legal advice. We are a compliance specialist, not a law firm. For legally binding information, consult a licensed attorney.
TL;DR
- An AI inventory is a prerequisite for Art. 4 (AI Literacy) + Art. 26 (deployer obligations)
- 8-step build: Discovery → classification → GPAI tagging → risk → FRIA trigger → provider/deployer → training → update
- Minimum columns: 12 (tool name, provider, use case, GPAI yes/no, data class, Annex III, risk, FRIA, AI literacy, approval date, owner, update)
- Update cycle: on change + quarterly review
- Fine exposure: up to EUR 15 million / 3 % (Art. 99(4) — high-risk/operator obligations); up to EUR 7.5 million / 1 % (Art. 99(5) — false information to authorities)
1. Why an AI inventory is mandatory
Even though the EU AI Act does not contain an explicit "inventory" article, an inventory is the operational minimum standard for several obligations:
- Art. 4 (since 02 February 2025) — AI literacy requires that staff be trained on the AI systems actually in use. Without an inventory, targeted training is impossible.
- Art. 5 — Prohibited practices must be ruled out, which requires a complete overview of the systems in use.
- Art. 26 — Deployer obligations for high-risk AI. Without classification, these obligations cannot be met.
- Art. 99 — Fines are tiered: up to EUR 35 million / 7 % (prohibitions under Art. 5), up to EUR 15 million / 3 % (high-risk and operator obligations), up to EUR 7.5 million / 1 % (false information to authorities) — and the burden of demonstrating compliance lies with the operator.
2. 8-step guide
| Step | Activity | Output |
|---|---|---|
| 1. Discovery | Employee survey, IT systems scan, licence review | Raw list >=20 tools |
| 2. AI system yes/no | Art. 3 test per tool | Cleaned list |
| 3. GPAI tagging | Model classification (ChatGPT, Claude, Gemini = GPAI) | GPAI column |
| 4. Annex III check | 8 areas, Art. 6(3) derogation filter | High-risk tag |
| 5. FRIA trigger | For high-risk + certain deployers (Art. 27) | FRIA to-do list |
| 6. Provider/deployer role | Document per tool | Role column |
| 7. AI literacy training per tool | Individual training obligations | Training plan |
| 8. Update cycle | Quarterly review date | Owners, deadlines |
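Steps 2–5 of the table above can be sketched as a simple decision flow. This is an illustrative simplification only, not a legal test: the field names (`is_ai_system`, `annex_iii_area`, `deployer_in_scope_art27`) and the shape of the entry dict are assumptions for this example, and the real Art. 3 / Annex III / Art. 27 assessments require human judgment per tool.

```python
# Illustrative sketch of steps 2-5: Art. 3 test, GPAI tag,
# Annex III check, FRIA trigger. Field names are assumptions,
# not official identifiers from the Regulation.

ANNEX_III_AREAS = {1, 2, 3, 4, 5, 6, 7, 8}  # the 8 high-risk areas of Annex III

def classify(tool: dict) -> dict:
    """Derive a risk class and FRIA to-do from a raw discovery entry."""
    if not tool.get("is_ai_system"):                      # step 2: Art. 3 test
        return {**tool, "risk": "out of scope", "fria_required": False}
    risk = "minimal"
    if tool.get("annex_iii_area") in ANNEX_III_AREAS:     # step 4: Annex III
        risk = "high"
    elif tool.get("gpai"):                                # step 3: GPAI tag
        risk = "limited"                                  # Art. 50 transparency
    # step 5: FRIA only for high-risk + deployers in Art. 27 scope
    fria = risk == "high" and tool.get("deployer_in_scope_art27", False)
    return {**tool, "risk": risk, "fria_required": fria}

entry = {"name": "HR recruiting software", "is_ai_system": True,
         "gpai": False, "annex_iii_area": 4, "deployer_in_scope_art27": True}
print(classify(entry)["risk"])  # high
```

In practice the output of this pass feeds the GPAI, risk, and FRIA columns of the inventory; borderline cases should still be escalated to a human reviewer.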
3. Excel template columns (minimum standard)
- Tool name + version
- Provider (manufacturer)
- Use case (one short sentence)
- GPAI yes/no (model ID if known)
- Data categories processed
- Annex III area (1-8 or "none")
- Risk class (prohibited / high / limited / minimal)
- FRIA status (not required / open / completed)
- AI literacy training (date)
- Approved by / date
- Responsible business unit
- Last update + next review
Template as Excel + 12 pre-filled SME examples in the EU AI Act Kit.
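If you prefer to generate the template programmatically rather than maintain it by hand, the 12-column minimum standard can be written out as CSV (which Excel opens directly). The column labels below mirror the list above but are only a suggested naming convention, not prescribed by the AI Act.

```python
# Sketch: generating the 12-column minimum inventory template as CSV.
# Column labels mirror the list above; exact names are a suggestion.
import csv
import io

COLUMNS = [
    "tool_name_version", "provider", "use_case", "gpai",
    "data_categories", "annex_iii_area", "risk_class", "fria_status",
    "ai_literacy_training_date", "approved_by_date",
    "responsible_unit", "last_update_next_review",
]

def write_template(rows: list[dict]) -> str:
    """Return the inventory as CSV text; missing cells stay empty."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

csv_text = write_template([{
    "tool_name_version": "ChatGPT Enterprise",
    "provider": "OpenAI",
    "use_case": "General text generation",
    "gpai": "yes",
    "risk_class": "limited",
}])
```

Keeping the inventory as a plain CSV/Excel file also makes the quarterly diff between review dates trivial to produce.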
4. 12 SME examples of typical AI tools (as of 04/2026)
| Tool | Use case | GPAI | Risk | Obligation |
|---|---|---|---|---|
| ChatGPT Enterprise | General text generation | yes | limited (Art. 50 transparency) | AUP, AI literacy |
| Microsoft 365 Copilot | Office assistance | yes (GPT-4) | limited | AUP, data classification |
| Claude (Anthropic) | Research, code support | yes | limited | AUP, AI literacy |
| HR recruiting software (with ML score) | Candidate scoring | no | HIGH (Annex III, 4) | FRIA, bias test |
| Payroll AI (anomaly detection) | Error detection | no | minimal | documentation only |
| DeepL Pro | Translation | no (NMT) | minimal | documentation only |
| Grammarly | Language optimisation | no | minimal | documentation only |
| Zendesk AI (chatbot) | Customer support | yes (GPAI-based) | limited (Art. 50) | AUP, bot disclosure |
| Otter.ai (transcription) | Meeting transcription | yes (Whisper-based) | minimal | documentation only |
| Sales forecast tool | Revenue forecasting | no | minimal | documentation only |
| Credit scoring software | Credit decision recommendation | variable | HIGH (Annex III, 5) | FRIA, bias test |
| Spam filter (classic) | Email filter | no (Bayes) | minimal | documentation only |
5. Update cycle + triggers
- Quarterly: review all entries, add new tools, remove outdated ones
- Trigger-based: Annex III change, trilogue decisions on the Digital Omnibus proposal (19 November 2025, as of 02 May 2026 NOT adopted), new supervisory authority FAQs
- For every tool procurement: AI check before purchase, IT must tag "AI yes/no"
- On contract renewal: review provider AUP, update data flow
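The quarterly cycle above is easy to enforce mechanically: flag every entry whose last update is more than roughly one quarter old. This is a minimal sketch assuming the inventory stores an ISO-format `last_update` date; the 92-day interval is an approximation of "quarterly", not a statutory deadline.

```python
# Sketch: flagging inventory entries whose quarterly review is overdue.
# Assumes an ISO "last_update" date per entry; 92 days ~ one quarter.
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=92)  # roughly one quarter

def overdue(last_update_iso: str, today: date) -> bool:
    """True if the entry has gone more than one quarter without review."""
    return date.fromisoformat(last_update_iso) + REVIEW_INTERVAL < today

print(overdue("2026-01-10", date(2026, 5, 2)))  # True: over a quarter old
```

Running a check like this before each review meeting turns the update cycle from a calendar reminder into a verifiable control.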
6. Audit preparation
- Inventory <6 months old
- Per tool: approval workflow traceable
- FRIAs in place for high-risk use cases from 02 August 2026 (Digital Omnibus proposal of 19 November 2025: postponement to 02 December 2027 — not yet adopted)
- AI literacy training logbook with knowledge quiz
- Acceptable Use Policy <12 months old
- Data breach procedure also documented for AI tools
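The freshness thresholds in this checklist (inventory under 6 months, AUP under 12 months) can be checked automatically. The function names and date fields below are assumptions for illustration; adapt them to however your inventory records approval and policy dates.

```python
# Sketch: a minimal audit-readiness check against the freshness
# thresholds above (inventory < 6 months, AUP < 12 months old).
# Function and field names are illustrative assumptions.
from datetime import date

def months_old(d: date, today: date) -> int:
    """Whole calendar months between d and today."""
    return (today.year - d.year) * 12 + (today.month - d.month)

def audit_ready(inventory_date: date, aup_date: date, today: date) -> list[str]:
    """Return a list of findings; an empty list means both thresholds pass."""
    findings = []
    if months_old(inventory_date, today) >= 6:
        findings.append("inventory older than 6 months")
    if months_old(aup_date, today) >= 12:
        findings.append("AUP older than 12 months")
    return findings

print(audit_ready(date(2025, 9, 1), date(2025, 7, 1), date(2026, 5, 2)))
```

A non-empty findings list is a signal to refresh the affected document before the next supervisory contact, not a substitute for the qualitative checks in the list above.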
Sources
- Regulation (EU) 2024/1689 (AI Act) — Art. 4, 6, 26, 99 (EUR-Lex DE) (as of: 02 May 2026)
- AI Act Art. 99 — fine tiers (Service Desk)
- AI Act Annex III — 8 high-risk areas
- European Commission — Digital Omnibus proposal (as of: 02 May 2026, trilogue ongoing)
- EU AI Office — GPAI Code of Practice (as of: 02 May 2026)