EU AI Act Technical Auditor
GDPR was brutal.
Now, the EU AI Act can cost you
€35,000,000
That is the maximum fine for prohibited AI practices under Article 5 of the EU AI Act. High-risk AI systems must meet conformity requirements by August 2026. I classify your AI systems and document Article 4 AI literacy.
EU AI Act Compliance
Most companies do not know if their AI systems are high-risk.
Deploying AI without proper classification creates regulatory risk.
Most high-risk obligations begin applying from August 2026, but diligence pressure and remediation cost hit earlier.
Prohibited practices face the largest penalties. Classification mistakes still create procurement friction, investor concern, and expensive remediation work long before a fine arrives.
I map your product against Annex III of the EU AI Act and deliver a written risk position your CTO, legal team, and investors can use.
>70%
of companies say they are not prepared for the EU AI Act.
Do not spend €150,000 or more fixing compliance gaps discovered during due diligence or regulatory review. Early classification costs a fraction of that.
Technical Audit
Software cannot find your AI risk.
But a technical audit can.
Compliance software
- Checks what you declare
- Misses AI hidden in APIs and pipelines
- Produces generic reports
- Assumes you understand the law
Technical AI audit
- Full architecture review
- Discovery of undocumented AI components
- Mapping of system intent to Annex III risk
- Clear written classification for investors and regulators
Result: A written AI risk classification based on Article 5 and Annex III of the EU AI Act.
How the audit works:
I inspect your product architecture, models, data flows, and external APIs.
A technical classification of your AI system under the EU AI Act
Triage: Article 5 & Annex III
We review your AI architecture and product roadmap and determine whether your system falls under prohibited practices or high-risk classification.
Documentation: Article 11 & Annex IV
Your system architecture is translated into the technical documentation required by the EU AI Act. Every component is mapped to regulatory obligations.
Readiness: Investor and regulatory review
You receive a board-ready classification memo explaining whether your AI triggers prohibited or high-risk obligations.

Stockholm School of Economics — AGI Keynote (invited by SSE Debate Club)
Matei Cananau
EU AI Act technical auditor
I translate AI systems into EU AI Act compliance.
Most compliance experts come from law or policy.
My background is machine learning and enterprise AI.
I audit your models, pipelines, and data flows against Articles 5 and 11 and Annex III of the EU AI Act.
You receive a technical classification of your system that investors, legal teams, and regulators can understand.
Audience
This engagement is
executive-grade.
I provide technical audits of AI systems, plus Article 4 AI literacy workshops and documentation, for scale-ups and enterprises.
Perfect for
- VC-backed scale-ups with AI inside the core product
- Enterprises deploying high-risk AI systems
- CTOs or Heads of AI needing technical EU AI Act classification
- Legal teams preparing engineering for the 2026 conformity deadline
- Companies needing Article 4 AI literacy workshops and documentation
Not for
- Pre-revenue startups running LLM wrappers
- Hobby projects with no real users or data risk
- Companies that only want a compliance PDF to check a box
- Teams unwilling to open their architecture for technical review
Pricing
Let us talk AI.
Clear pricing, real knowledge, no hidden fees.
Early founding-client pricing is available for the first few companies.
AI Literacy
Standard price:
Early client rate:
Best for: documenting AI literacy
- ✦ Article 4 AI literacy workshop and documentation pack
- ✦ Executive briefing on EU AI Act obligations
- ✦ Staff literacy testing framework
- ✦ 90-minute executive session
- ✦ Written record proving workforce AI literacy
High-risk Audit
Standard price:
Early client rate:
Best for: proving architecture legality to investors.
- ✦ Architecture review
- ✦ Article 5 prohibited system check
- ✦ Annex III classification report
- ✦ Five- to eight-page technical memo
- ✦ You will know exactly what the law says about your product
Fractional AI Officer
Standard price:
Early client rate:
Best for: companies actively shipping AI features.
- ✦ Continuous Annex IV technical documentation
- ✦ Article 17 QMS maintenance
- ✦ Model change impact reviews
- ✦ Regulatory readiness for audits
Common Questions
Everything you need to know.
What are the penalties for non-compliance?
Prohibited AI practices under Article 5 can lead to fines of up to €35,000,000 or 7% of global annual turnover, whichever is higher. Other violations carry lower but still significant penalties. Companies deploying AI in the EU must demonstrate compliance with the Act's requirements.
Does the Act apply if we only use third-party AI APIs?
Yes. The Act applies to companies that develop, deploy, or integrate AI systems into their products. If your product uses external AI APIs such as OpenAI or other providers, you still remain responsible for how that AI is deployed and used in your system.
What counts as a high-risk AI system?
High-risk systems are AI systems used in certain regulated use cases defined in Annex III, including hiring, finance, education, healthcare, biometric identification, and critical infrastructure. These systems must meet strict requirements including documentation, risk management, human oversight, and conformity assessment.
Do we need to train staff on AI?
Yes. Article 4 requires companies to ensure that employees interacting with AI systems have adequate AI literacy. Organizations must be able to demonstrate that staff understand the capabilities, risks, and legal obligations associated with AI use.
Why is compliance software not enough?
Most compliance software relies on questionnaires and self-reported inputs. It cannot analyze real system architecture, data pipelines, or hidden AI usage inside APIs and integrations. A technical audit reviews the actual system design to determine legal classification and risk.
Can our legal team handle this alone?
Legal teams interpret the law. They usually do not analyze model architecture, training data, or system pipelines. EU AI Act compliance depends on how the system is technically built. A technical audit connects the engineering reality with the legal requirements.
Do investors care about EU AI Act compliance?
Increasingly yes. During due diligence, investors often review regulatory exposure related to AI systems. Clear classification and documentation reduce legal risk and prevent compliance issues from delaying funding rounds or acquisitions.

