The EU AI Act entered into force on 1 August 2024, and under its staggered application timeline the first real obligations (the bans on prohibited practices) land just six months later, in February 2025. For enterprises deploying AI into the EU market (whether you're based there, selling there, or have EU users of a product built elsewhere), the practical compliance work has moved from watching the trilogue to writing the policy.
This is not a legal analysis and we’re not your lawyers. What follows is an operator’s reading of where the Act will actually show up in your engineering and product backlog, and what to put on the roadmap now so that the 2025 and 2026 deadlines aren’t a scramble. The biggest mistake we see teams making right now is assuming none of this applies to them because they “don’t build AI” — most of them do, they just call it something else.

Five practical moves to make on the compliance roadmap this quarter:
- Build an internal AI system inventory. You cannot classify what you cannot see. Most enterprises we’ve worked with this year have been surprised by how many embedded vendor AI features live inside their existing SaaS stack. Start with a spreadsheet; upgrade it to a system later.
- Classify each system against the Act’s risk tiers. Most internal productivity systems will fall into the “limited” or “minimal” risk bands — but anything touching hiring and other HR decisions, credit scoring, admission to education, access to essential public services, or biometric identification is almost certainly high-risk and needs a meaningfully different compliance posture.
- Write the transparency disclosures now. Systems that generate or manipulate image, audio, video or text — chatbots, summarisers, content generators — need clear disclosure that a user is interacting with AI. The engineering work is small; the policy and legal review is the long pole.
- Decide your approach to third-party foundation models. If you’re building on a GPAI (general-purpose AI) model from a provider, you inherit part of their compliance story and owe them data for theirs. Your vendor contracts and your model-use telemetry need to catch up to this reality — most don’t yet.
- Appoint an internal owner before regulators do. Your DPO is probably already overloaded and is not the right person by default. An engineering leader with a foot in product and a foot in risk, with a quarterly mandate and budget, is the shape of role that ends up making this work.
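The inventory and triage steps above can start as something very small. The sketch below shows one plausible shape for that spreadsheet-replacement: a record per system plus a first-pass classifier. Everything here is illustrative — the field names, the domain labels, and the high-risk domain list are our assumptions loosely inspired by the Act’s high-risk categories, not the Act’s actual taxonomy, and any system triaged as high-risk still needs proper legal review.

```python
from dataclasses import dataclass, field
from enum import Enum

class RiskTier(Enum):
    HIGH = "high"
    LIMITED = "limited"
    MINIMAL = "minimal"

# Illustrative, non-exhaustive set of domains that tend to land in the
# high-risk band. Your legal team owns the real mapping to the Act.
HIGH_RISK_DOMAINS = {
    "hr_decisions", "credit_scoring", "education_admissions",
    "essential_public_services", "biometric_identification",
}

@dataclass
class AISystem:
    name: str
    vendor: str                       # "internal" for home-built systems
    domains: set = field(default_factory=set)
    user_facing_genai: bool = False   # generates text/image/audio/video for users

def classify(system: AISystem) -> RiskTier:
    """First-pass triage only; every HIGH result needs legal review."""
    if system.domains & HIGH_RISK_DOMAINS:
        return RiskTier.HIGH
    if system.user_facing_genai:
        # Transparency/disclosure obligations apply to user-facing genAI.
        return RiskTier.LIMITED
    return RiskTier.MINIMAL

inventory = [
    AISystem("resume-screener", "VendorX", domains={"hr_decisions"}),
    AISystem("support-chatbot", "internal", user_facing_genai=True),
    AISystem("log-anomaly-detector", "internal"),
]
for system in inventory:
    print(system.name, classify(system).value)
```

The point of starting with a structure like this, rather than free-text spreadsheet cells, is that the vendor field surfaces the embedded-SaaS systems you didn’t know you had, and the `user_facing_genai` flag gives you a ready-made worklist for the transparency disclosures.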

The Act is the first of many: the UK, US and several APAC jurisdictions are all drafting their own frameworks at the time of writing, and the patterns are converging more than diverging. Treat the EU Act as a forcing function to build the AI governance muscle you’ll need for all of them, not as a one-off compliance exercise. The teams that do this work deliberately in 2024 will be the ones still shipping confidently in 2025.
