ARTICLE 4 COMPLIANCE

AI Literacy isn't optional. It's Article 4.

Track who's trained, who's overdue, and who needs what — by role, department, and AI system. Exportable compliance reports for every audit.

What Counts as "Sufficient AI Literacy"?

Article 4 doesn't prescribe a specific curriculum. Instead, it requires organisations to take "measures" that ensure sufficient literacy, considering three factors:

Factor 1: The Person's Background

Technical knowledge, experience, education, and existing training. A software engineer operating an ML pipeline needs different training than a recruiter using an AI screening tool. The obligation is proportionate—not everyone needs the same depth.

Factor 2: The Context of Use

What the AI system does, how it's used in the organisation's workflows, and what decisions it influences. Using ChatGPT for internal brainstorming requires less training depth than using an AI system for credit scoring or employment decisions that directly affect people's lives.

Factor 3: The Affected Persons

Who is impacted by the AI system's outputs—customers, employees, job candidates, students, patients, or the general public. When vulnerable groups are affected (minors, elderly, disabled persons), the literacy standard is higher because mistakes carry greater consequences.

Klarvo's approach: We map training tiers to these three factors. When you add an AI system to your inventory and assign operators, Klarvo auto-suggests the appropriate training tier based on the system's classification, affected groups, and the operator's role. This ensures proportionate, defensible training coverage.
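The mapping from the three Article 4 factors to a training tier can be pictured as a simple rule table. The sketch below is illustrative only; the tier names match those later on this page, but the function, inputs, and classification values are invented for the example and are not Klarvo's actual data model:

```python
# Illustrative sketch of proportionate tier selection per the three
# Article 4 factors. Inputs and tier names are hypothetical examples.

def suggest_training_tier(role: str, risk_class: str,
                          vulnerable_groups_affected: bool) -> str:
    """Suggest a training tier from the person's role and the system's classification."""
    if role in ("reviewer", "approver"):
        return "Reviewers & Approvers"
    if role == "operator":
        # Context of use + affected persons raise the required depth
        if risk_class == "high-risk" or vulnerable_groups_affected:
            return "High-Risk AI Operators"
        return "AI Tool Operators"
    # Everyone else gets baseline awareness training
    return "All Staff AI Basics"

print(suggest_training_tier("operator", "high-risk", False))
# -> High-Risk AI Operators
```

The point of the rule order is proportionality: the highest-stakes factor present (risk class or vulnerable affected groups) wins, so no operator is assigned less depth than their context demands.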

POWERED BY KLARVOENGINE

Training tiers assigned automatically

KlarvoEngine identifies which AI systems require operator training when it classifies your systems. The training tracker automatically assigns the right training tier based on the classification — operators of high-risk systems get advanced training requirements, general staff get awareness-level training.

Key Features

Role-Based Assignment

Assign different training tiers to operators, reviewers, and general staff automatically. When a user is assigned 'AI operator' for a system, advanced training is auto-assigned.

Completion Tracking

See who's completed what, who's overdue, and who needs reminders. Dashboard shows completion percentages by role, department, and AI system.

Automatic Reminders

Email notifications for incomplete training and upcoming re-certification deadlines. Configurable escalation: reminder → manager notification → compliance alert.
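The escalation chain amounts to a threshold function over days overdue. A minimal sketch, assuming example thresholds (the day counts here are illustrative, not Klarvo defaults, and the page notes they are configurable):

```python
# Hypothetical escalation schedule: reminder -> manager notification
# -> compliance alert. Thresholds are example values only.

def escalation_step(days_overdue: int) -> str:
    """Return the escalation action for a training item this many days overdue."""
    if days_overdue >= 30:
        return "compliance alert"
    if days_overdue >= 14:
        return "manager notification"
    if days_overdue >= 1:
        return "reminder"
    return "none"

print(escalation_step(20))  # -> manager notification
```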

Audit-Ready Reports

Export completion reports as evidence for auditors and procurement. Reports include: who completed, when, what material, quiz scores (if applicable), and attestation status.

Annual Refresh

Automatic re-certification scheduling keeps training current. Set annual or quarterly refresh cycles. Track version changes in training materials.

Policy Acknowledgement

Track AI acceptable use policy sign-offs alongside training completion. Combine training + policy attestation into a single evidence export for auditors.

Gap Matrix

Visual grid showing every team member against every training type. Instantly spot who's missing what. Red highlights for gaps, green for completed, amber for in-progress.
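Conceptually, the grid is a lookup of (person, training) pairs against completion records, defaulting to a gap when no record exists. A minimal sketch with invented names and a deliberately simple record shape:

```python
# Sketch: derive a person-by-training status matrix from completion
# records. Names, statuses, and record shape are illustrative only.

records = {
    ("Alice", "AI Basics"): "completed",
    ("Alice", "Operator Training"): "in-progress",
    ("Bob", "AI Basics"): "completed",
}

people = ["Alice", "Bob"]
trainings = ["AI Basics", "Operator Training"]

# Any (person, training) pair without a record is a visible gap
matrix = {
    person: {t: records.get((person, t), "missing") for t in trainings}
    for person in people
}

print(matrix["Bob"]["Operator Training"])  # -> missing
```

The "missing" default is what makes the matrix useful: gaps surface automatically rather than having to be recorded explicitly.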

Compliance PDF Report

Export a branded 4-page PDF for auditors: executive summary with completion rates, full records table, Article 4 compliance notes, and expiry warnings. One click, audit-ready.

Role-Based Training Tiers

Different roles need different training depths. Here's the framework Klarvo uses, aligned with Article 4's proportionality principle:

Tier 1: All Staff AI Basics
Audience: Everyone in the organisation
Covers: What AI is, how the company uses it, approved tools, AI acceptable use policy, how to report concerns, basic understanding of EU AI Act obligations
Why it's required: Article 4 requires measures ensuring 'sufficient AI literacy' of staff dealing with AI. This tier covers baseline awareness for all employees.
Duration: 30-60 mins

Tier 2: AI Tool Operators
Audience: Staff who directly use AI systems in their daily work
Covers: System-specific training, oversight procedures, escalation protocols, logging requirements, data input quality checks, output review processes, incident reporting
Why it's required: Article 4 and Article 26(7) require that persons assigned human oversight have the necessary competence, training, and authority. Operators need system-specific depth.
Duration: 2-4 hours

Tier 3: High-Risk AI Operators
Audience: Staff operating high-risk AI systems (e.g., HR screening, credit scoring)
Covers: All operator content plus Annex III category awareness, deployer obligations under Article 26, human oversight authority, suspension procedures, serious incident recognition, log review protocols
Why it's required: Deployers of high-risk AI must ensure oversight staff have competence and authority proportionate to the risk level and potential impact on affected persons.
Duration: 4-6 hours

Tier 4: Reviewers & Approvers
Audience: Compliance owners, classification reviewers, sign-off authorities
Covers: Classification methodology, evidence assessment, FRIA oversight, audit preparation, regulatory updates, governance reporting
Why it's required: Those responsible for classification decisions and compliance sign-off need deeper understanding of the regulatory framework, Annex III criteria, and Article 27 FRIA requirements.
Duration: 4-6 hours

How It Works

1. Add Training

Upload training materials (PDF, video links) or link to external courses. Optionally add quiz questions.

2. Assign by Role

Match training tiers to staff roles. Auto-assign when users are added as AI system operators.

3. Track & Remind

Monitor completion by person, role, and department. Automatic reminders until complete.

4. Export Reports

Generate audit-ready completion evidence linked to LIT-01, LIT-02, LIT-03 controls.

What Evidence Auditors Expect for Article 4

When auditors or procurement teams ask about AI literacy, they want to see structured records—not a vague statement that "we train our staff."

Training programme documentation

What training exists, what it covers, and how tiers map to roles

Completion records per person

Who completed, when, which training module, and any assessment results

Policy acknowledgements

AI acceptable use policy sign-offs with timestamps

Re-certification schedule

Proof that training is refreshed regularly, not a one-time exercise

Coverage by AI system

Which staff operate which AI systems and what training they received for each

Contractor/third-party coverage

Evidence that external parties operating AI on your behalf are also trained

Klarvo exports all of this as a single evidence report linked to your AI literacy controls (LIT-01, LIT-02, LIT-03).
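As a rough sketch of what such a per-person evidence row might look like when flattened to CSV (the field names are hypothetical, not Klarvo's actual export schema):

```python
# Illustrative per-person completion record serialised to CSV,
# covering the fields auditors typically expect. Schema is invented.
import csv
import io

rows = [
    {"person": "Alice", "training": "High-Risk AI Operators",
     "completed_on": "2025-03-10", "score": "92%",
     "policy_ack": "2025-03-10"},
]

buf = io.StringIO()
writer = csv.DictWriter(
    buf, fieldnames=["person", "training", "completed_on", "score", "policy_ack"])
writer.writeheader()
writer.writerows(rows)

print(buf.getvalue().splitlines()[0])
# -> person,training,completed_on,score,policy_ack
```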

Related Resources

AI Literacy Guide (Article 4)

Deep dive into Article 4 requirements with practical implementation steps and real-world examples.

Read Guide

AI Acceptable Use Policy Template

Ready-to-use policy template for staff AI usage. Track acknowledgements alongside training completion.

Get Template

Frequently Asked Questions

What is AI literacy under Article 4?

Article 4 of the EU AI Act requires providers and deployers to take measures to ensure 'sufficient AI literacy' of their staff and other persons dealing with the operation and use of AI systems on their behalf. This must take into account the technical knowledge, experience, education, and training of those persons, as well as the context and the persons or groups on which the AI system is to be used.

When do AI literacy requirements apply?

AI literacy obligations under Article 4 applied from 2 February 2025. This is one of the earliest EU AI Act obligations to take effect. Organisations should already have measures in place.

What training should we provide?

The EU AI Act doesn't prescribe specific training curricula. Instead, it requires 'sufficient' literacy proportionate to the role. At minimum: AI basics for all staff, system-specific training for operators (especially for high-risk systems), and governance training for reviewers/approvers. The key is that training is documented, role-appropriate, and regularly refreshed.

How do we prove compliance with Article 4?

Keep records of: what training was provided (materials, content, duration), who completed it (per role and department), when they completed it, any assessment results, and policy acknowledgements. Klarvo's Training Tracker generates all of this as an exportable evidence report linked to your Article 4 control (LIT-01, LIT-02, LIT-03 in our control library).

Do staff need external certification?

No. The EU AI Act does not require external certification or accredited courses. Internal training programmes with documented completion are sufficient, provided they are appropriate to the role and context. What matters is that you can demonstrate the training was relevant, delivered, and tracked.

How often should training be refreshed?

The Act doesn't specify a refresh frequency, but best practice is annual re-certification for all tiers, with ad-hoc updates when systems change, new AI tools are deployed, or regulatory guidance is updated. Klarvo automates refresh scheduling and sends reminders before certifications expire.
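Computing the next re-certification date from a completion date is simple calendar arithmetic. A minimal sketch, assuming an annual cycle and clamping the day for shorter target months (this is not Klarvo's actual scheduler):

```python
# Minimal sketch: next re-certification date for an N-month refresh
# cycle, clamping the day-of-month for shorter months. Illustrative only.
import calendar
from datetime import date

def next_recertification(completed: date, months: int = 12) -> date:
    """Advance `completed` by whole months, clamping the day if needed."""
    total = completed.month - 1 + months
    year, month = completed.year + total // 12, total % 12 + 1
    # e.g. completing on 31 Jan with a February target clamps to 28/29
    day = min(completed.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

print(next_recertification(date(2025, 2, 2)))  # -> 2026-02-02
```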

What about contractors and third parties?

Article 4 covers 'other persons dealing with the operation and use of AI systems on their behalf'—this includes contractors, temporary staff, and outsourced teams who operate AI systems for you. They should receive appropriate training, and completion should be tracked as evidence.

Training tracking is included in every paid plan.

Classify your AI systems with KlarvoEngine. Training requirements are assigned automatically. No credit card to start.

No credit card
Free plan forever
Upgrade anytime