AI Act: 62% of Companies Have No AI Inventory

The EU AI Act requires a complete inventory of AI systems before August 2, 2026. The problem: 62% of companies have none. That number is not a delay — it is a blind spot.

62% of Companies Do Not Know What They Are Deploying

According to a 2026 EY report compiled by SoftwareSeni, only 38% of organizations maintain a complete inventory of AI applications in production. The remaining 62% are flying blind.

This is not an administrative technicality. The AI Act, whose obligations for high-risk systems apply from August 2, 2026, treats the inventory as an absolute prerequisite.

No inventory means no classification. No classification means no documentation. No documentation means no compliance. The chain is linear and unforgiving.

Put plainly: a company that does not know what it is deploying cannot comply with the law. Even if it wanted to.

The Inventory: The Foundation Layer Nobody Builds

The AI Act requires deployers to identify, classify, and document every AI system in use — especially those touching sensitive domains: recruitment, credit, healthcare, education, critical infrastructure. These are the systems listed in Annex III, subject to the tightest requirements.

But before you can classify, you have to list. And listing is already the first problem.

Think of a warehouse no one has ever fully walked through. You have a rough sense of what came in, but not what is still there, in what state, or since when. That is exactly the situation most large organizations are in with their AI tools: systems arrive, others get tested and forgotten, teams deploy without involving IT.

The appliedAI study on 106 enterprise AI systems illustrates the point: 18% are clearly high-risk, 42% clearly low-risk, and 40% cannot be unambiguously classified. Even with an existing inventory, classification stays murky for a large share of systems.

Shadow AI Is Sabotaging the Inventory Effort

The real blind spot is shadow AI.

According to Microsoft's 2025 Work Trend Index, 71% of employees use unapproved AI tools at work. These tools appear on no official inventory. A company that thinks it manages 5 AI systems may actually have 50.

A 2026 EY survey confirms it: 52% of AI initiatives at the departmental level operate without formal IT approval. More than half of operational deployments are, in practice, invisible to compliance teams.

Deloitte finds that only 21% of companies have a mature governance model for their AI agents, while 75% plan to deploy more in the next two years. Organizations are building faster than they are governing. Under the AI Act, that gap becomes direct financial exposure.

The GDPR Parallel: Companies Procrastinate, Then Fines Arrive

We have seen this movie before.

In May 2018, the GDPR became applicable after a two-year transition period. In the weeks before the deadline, law firms were running at capacity and emergency DPO hires were queuing up. Many companies were not ready, and some admitted it, betting that enforcement would be slow.

They were right in the short term. GDPR enforcement took time to get organized. But it came: cumulative fines across the EU now run into the billions of euros.

The AI Act follows the same pattern: adopted in 2024, a two-year transition, an August 2026 deadline, and the same signals of surface-level compliance. Policy documents on paper, but no real inventory.

The difference is the size of the fines. The GDPR caps at 20 million euros or 4% of global turnover, whichever is higher. The AI Act goes up to 35 million or 7%. And if an AI system also processes personal data, which is common, the two regimes stack. The combined ceiling can reach 55 million euros for a single infringement.
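
To make the arithmetic concrete, here is a minimal sketch of how that exposure adds up. It assumes, as the 55 million figure above implies, that each regime's ceiling is the higher of the fixed amount and the turnover percentage, and that the two ceilings simply add for a single infringement.

```python
# Illustrative only: ceilings follow the figures cited above (AI Act: 35M EUR
# or 7% of global turnover; GDPR: 20M EUR or 4%, whichever is higher), and the
# combined number assumes the two regimes simply add.

def max_exposure(turnover_eur: float) -> dict:
    ai_act = max(35_000_000, 0.07 * turnover_eur)
    gdpr = max(20_000_000, 0.04 * turnover_eur)
    return {"ai_act": ai_act, "gdpr": gdpr, "combined": ai_act + gdpr}

# For a company with 200M EUR turnover, the fixed amounts dominate:
# 35M + 20M = 55M EUR, the combined ceiling cited above. For much larger
# turnovers, the percentage-based caps take over instead.
print(max_exposure(200_000_000))
```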

Asymmetric Enforcement: A False Comfort

There is another signal that could mislead: as of March 2026, only 8 of the EU's 27 member states have designated a national contact point for AI Act enforcement — seven months after the legal deadline for doing so.

It is fair to ask whether enforcement will be truly uniform. The honest answer: probably not, at least initially.

But "they are not enforcing yet" is not a compliance strategy. It is a bet. Companies that bet on slow GDPR enforcement were sometimes right for two or three years. Some are paying those fines today, with interest.

The question is not "will we get caught?" It is "do we want to take this risk with a documented track record of non-compliance?"

Where to Start: The First 3 Steps of an AI Inventory

If your organization does not yet have an AI inventory, here is what you can do right now, without waiting for a formal compliance project.

Step 1: Survey active tools. Ask each department to list every AI tool in use, approved or not. Give them a week and a guarantee: the goal is the inventory, not punishment. The volume will probably surprise you.

Step 2: Prioritize by use case. Systems touching HR, credit decisions, healthcare, or individual assessments are the ones that fall under Annex III. They go to the front of the documentation queue.

Step 3: Flag the ambiguous systems. For tools no one is sure how to classify, document the uncertainty. A list of unclassified systems with a justification is already the beginning of a good-faith record.

This is not full compliance. It is triage. But it is the only way to turn a blind spot into a manageable problem.
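
As a sketch of what that triage can look like in practice, here is a minimal inventory record with a priority rule. The field names, the domain list, and the priority function are illustrative assumptions, not AI Act terminology.

```python
# Minimal sketch of an inventory record for the three steps above.
from dataclasses import dataclass, field

# Illustrative shorthand for the sensitive domains discussed in this article.
ANNEX_III_DOMAINS = {"recruitment", "credit", "healthcare", "education",
                     "critical_infrastructure"}

@dataclass
class AISystemRecord:
    name: str
    owner_department: str
    domains: set = field(default_factory=set)  # business areas the system touches
    it_approved: bool = False                  # False captures shadow AI (step 1)
    classification: str = "unclassified"       # "high", "low", or "unclassified"
    classification_notes: str = ""             # document the uncertainty (step 3)

def triage_priority(record: AISystemRecord) -> int:
    """Lower number = document first."""
    if record.domains & ANNEX_III_DOMAINS:
        return 0  # potential Annex III exposure goes to the front of the queue (step 2)
    if record.classification == "unclassified":
        return 1  # ambiguous systems get a written justification next (step 3)
    return 2

# Example: a shadow-AI screening tool surfaced by the departmental survey (step 1).
cv_screener = AISystemRecord("CV screening assistant", "HR",
                             domains={"recruitment"}, it_approved=False)
print(triage_priority(cv_screener))  # 0: Annex III domain, document first
```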


The AI Act did not create the problem of AI governance. It just added a deadline and a price tag.

Frequently asked questions

What AI inventory does the EU AI Act require?
The AI Act requires deployers to list, classify, and document every AI system in use, especially those touching recruitment, credit, healthcare, or education (Annex III). Without an inventory, compliance is impossible.
When is the AI Act deadline for high-risk systems?
Compliance obligations for high-risk AI systems come into force on August 2, 2026. Organizations without an AI inventory have very little time left.
What is shadow AI and why does it create problems?
Shadow AI refers to AI tools used without IT approval. According to Microsoft, 71% of employees use unapproved AI tools at work. These systems appear on no official inventory, making AI Act compliance impossible.
What fines does the AI Act impose?
The AI Act provides for fines up to 35 million euros or 7% of global turnover. When an AI system also processes personal data, AI Act and GDPR regimes stack, pushing the combined ceiling to 55 million euros.
Where should a company start building an AI inventory?
Three priority steps: survey all active tools by department, prioritize systems touching HR, credit, and health (Annex III), then document ambiguous systems with a classification justification as evidence of good faith.