Section 22 AGG: Reversal of the burden of proof in discrimination cases — what employers need to know
TL;DR
- Section 22 AGG: The applicant only has to prove indicia — the employer must produce full proof of non-discrimination
- BAG trend: Statistical indicia of discrimination — including from algorithmic selection outputs — trigger the reversal of the burden of proof
- Typical damages: 1-3 gross monthly salaries per claimant (Section 15(2) AGG caps compensation at 3 months' salary where the applicant would not have been hired even in a discrimination-free procedure); the amount depends on the individual case
- The mandatory reversal of the burden of proof makes preventive documentation indispensable
- 5 preventive measures: job ad audit, selection documentation, AI bias test, anonymisation, HR training
1. Wording of Section 22 AGG
"Where, in the event of a dispute, one party proves indicia which suggest disadvantage on one of the grounds referred to in Section 1, the other party bears the burden of proof that there has been no breach of the provisions on protection against disadvantage." — Section 22 AGG (German General Equal Treatment Act)
Distribution of the burden of proof:
- Applicant: indicia (low threshold)
- Employer: full proof of non-discrimination (high threshold)
2. What qualifies as "indicia"?
BAG (Federal Labor Court) case law recognises the following as indicia:
- Wording of job advertisements (e.g. "young, dynamic team")
- Statistical anomalies in selection procedures
- Discriminatory statements in the interview (recording, witnesses)
- Temporal correlations (termination after pregnancy)
- AI tool outputs with documented discrimination
3. Rebuttal by the employer
Short answer: Once the claimant has produced indicia, the employer must provide full proof of non-discrimination — typically through a documented AGG-compliant job advertisement, selection criteria defined in writing in advance, objective scoring sheets per applicant, attendance lists for the interviews and a written, reasoned selection decision. For AI-assisted selection additionally: bias test results and audit logs.
Full-proof requirements:
- Documented job advertisement — neutrally worded, AGG-compliant
- Selection criteria defined — in writing before applications are received
- Scoring sheets per applicant — objective points-based assessments
- Attendance list for interviews — who attended which interview, and when
- Reasoning for the selection decision — written note, dated
- For AI tools: bias test results + logs
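The scoring-sheet requirement can be illustrated with a minimal sketch. All criteria, weights, and names below are hypothetical examples for illustration, not prescribed by the AGG; the legally relevant point is that the criteria are fixed in writing before applications arrive and applied uniformly:

```python
from dataclasses import dataclass, field

# Hypothetical criteria and weights, fixed in writing BEFORE
# applications are received (see the list above).
CRITERIA_WEIGHTS = {
    "relevant_experience": 0.4,
    "technical_skills": 0.4,
    "language_skills": 0.2,
}

@dataclass
class ScoringSheet:
    applicant_id: str  # use an ID, not a name (anonymised screening)
    scores: dict = field(default_factory=dict)  # criterion -> points 0..10

    def total(self) -> float:
        # Weighted total over the predefined criteria only; criteria
        # added after the fact would undermine the documentation's value.
        return sum(CRITERIA_WEIGHTS[c] * self.scores.get(c, 0)
                   for c in CRITERIA_WEIGHTS)

sheet = ScoringSheet("A-017", {"relevant_experience": 8,
                               "technical_skills": 7,
                               "language_skills": 9})
print(round(sheet.total(), 2))  # weighted sum: 0.4*8 + 0.4*7 + 0.2*9
```

A dated, identically structured sheet per applicant is what later allows the employer to show that the same objective yardstick was applied to everyone.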
4. Statistical indicia (AI!)
Statistical anomalies in algorithmic selection outputs are recognised as indicia within the meaning of Section 22 AGG — the indicia threshold is low:
- Significant ratio deviations along protected characteristics (age, gender, origin, disability)
- Systematic exclusion of a comparison group (e.g. applicants > 50)
- Black-box models without audit/logic disclosure (cf. CJEU C-203/22 on the scope of the right to information)
- Correlated proxy variables without documented mitigation
- Purely algorithmic rejection without human review (Art. 14, 26 EU AI Act)
BAG 8 AZR 300/24 (23 October 2025, Daimler Truck) illustrates the trend: there, a single comparator case sufficed as an indicium in the equal-pay context — algorithmic bias findings should carry all the more weight as indicia.
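One common heuristic for spotting the kind of ratio deviation listed above is a selection-rate comparison between groups (the "four-fifths rule" known from US practice; German law sets no fixed numerical threshold, so treat this purely as a screening signal, with hypothetical numbers):

```python
def selection_rate(selected: int, applicants: int) -> float:
    """Share of a group's applicants who passed the selection step."""
    return selected / applicants if applicants else 0.0

def disparate_impact_ratio(rate_group: float, rate_reference: float) -> float:
    """Ratio of a protected group's selection rate to the reference group's."""
    return rate_group / rate_reference if rate_reference else 0.0

# Hypothetical numbers: 200 applicants under 50 (60 invited),
# 100 applicants over 50 (12 invited).
r_under_50 = selection_rate(60, 200)   # 0.30
r_over_50 = selection_rate(12, 100)    # 0.12
ratio = disparate_impact_ratio(r_over_50, r_under_50)  # ~0.40

# A ratio far below 1.0 is exactly the kind of statistical anomaly
# that can serve as an indicium under Section 22 AGG.
print(round(ratio, 2))
```

Running such a check on every algorithmic selection stage, and logging the result, feeds directly into the bias-test documentation required for rebuttal.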
5. 8 key BAG rulings
Short answer: The reversal of the burden of proof under Section 22 AGG is shaped by central BAG rulings: BAG 8 AZR 300/24 (23 October 2025, Daimler Truck) — a single male comparator suffices as an indicium; BAG 8 AZR 488/19 (21 January 2021) — median salary as a statistical indicium; BAG 8 AZR 450/21 (16 February 2023) — "better negotiating skills" is not a sufficient justification; CJEU C-203/22 (27 February 2025) — right to information on the logic of automated decisions.
| Ruling | Statement | Damages |
|---|---|---|
| BAG 8 AZR 300/24 (23 October 2025) | Pair comparison equal pay (Daimler Truck) — a single male comparator suffices as an indicium | case-dependent |
| BAG 8 AZR 450/21 (16 February 2023) | "Better negotiating skills" does not justify a pay differential | case-dependent |
| BAG 8 AZR 488/19 (21 January 2021) | Indicium: lower median female pay → reversal of the burden of proof | case-dependent |
| BAG 8 AZR 21/24 (19 September 2024) | No compensation for "AGG hoppers" (abuse of rights) | n/a |
| BAG 8 AZR 209/21 (8 May 2025) | Workday data transfer within the corporate group: GDPR damages possible | case-dependent |
| CJEU C-203/22 (27 February 2025) | Right to information on the logic of automated decisions (Art. 15(1)(h) GDPR) | n/a |
6. Practical guide
Short answer: Eight practical steps minimise the Section 22 AGG risk: job advertisement audit with AGG-compliant language, selection criteria fixed in writing before applications are received, scoring sheets per applicant, mandatory bias testing for AI tools (CNIL guidelines + Art. 10(5) EU AI Act), annual HR training, anonymised initial screening, structured interviews with score sheets and retention of documentation for 3 years.
- Job advertisement audit with AGG-compliant language
- Selection criteria in writing BEFORE applications
- Scoring sheets per applicant
- For AI: mandatory bias testing (CNIL + EU AI Act Art. 10(5))
- Annual HR team training
- Anonymised initial screening (remove photo, name, date of birth)
- Structured interviews with score sheets
- Retain documentation for 3 years (AGG limitation period + buffer)
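The anonymised-initial-screening step can be sketched as a simple field filter. The field names are hypothetical; real applicant-tracking systems use their own schemas:

```python
# Fields that may reveal protected characteristics (Section 1 AGG) and
# should be hidden from the initial screening view. Hypothetical schema.
BLOCKED_FIELDS = {"photo", "name", "date_of_birth", "nationality", "gender"}

def anonymise_for_screening(application: dict) -> dict:
    """Return a copy of the application with identifying fields removed."""
    return {k: v for k, v in application.items() if k not in BLOCKED_FIELDS}

application = {
    "name": "Example Applicant",
    "date_of_birth": "1970-01-01",
    "photo": "<binary>",
    "cv_summary": "10 years of logistics experience",
    "certificates": ["forklift licence"],
}
screened = anonymise_for_screening(application)
print(sorted(screened))  # ['certificates', 'cv_summary']
```

Keeping the blocked-field list explicit and versioned also makes it easy to show, later, exactly which attributes the screeners could not have seen.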
Sources
- General Equal Treatment Act (AGG), Sections 1, 3, 7, 15, 22 — full text, gesetze-im-internet.de (as of 2026-05-02)
- BAG 8 AZR 300/24 (23 October 2025) — pairwise comparison equal pay (Daimler Truck) (as of 2026-05-02)
- BAG 8 AZR 488/19 — median pay as an indicium (Section 22 AGG / Section 3(2) EntgTranspG) (as of 2026-05-02)
- CJEU C-203/22 (CK Telekommunikation) — burden of proof and right to information on algorithmic decisions (as of 2026-05-02)
- Directive (EU) 2023/970 — Pay Transparency Directive (as of 2026-05-02)
- Federal Anti-Discrimination Agency (Antidiskriminierungsstelle des Bundes) — antidiskriminierungsstelle.de (as of 2026-05-02)