Lost in legal language: SMEs and the AI Act

28.11.2025 · comment of the week · legislation

A few days have passed since the European Commission published the so-called “Digital Omnibus on AI” as part of the “Digital Simplification Package”, which also includes proposed amendments to the GDPR, the NIS2 Directive and the Data Act. This is a good moment to recall an often overlooked dimension of deregulation: it is not only a question of which obligations should be simplified or eased, but also of the language in which they are described. If the legal text remains written primarily for a narrow group of experts, even the best “facilitations” will remain a dead letter for many SMEs, hidden behind a language barrier they cannot overcome on their own.

If SMEs cannot read the AI Act, who is the real addressee of the law?

Let us imagine an SME entrepreneur who wants to bring a simple AI-based service to market, for example a tool for automatic document classification. On the technology side, the problem is solvable. The real challenge begins the moment they have to answer the question: “Am I really complying with the AI Act?” Instead of focusing on innovation, they suddenly need a team of lawyers and consultants. For many companies, this is the moment when the project ends up in a drawer.

This gap between technology and regulation becomes clearly visible when we look at the law through the lens of a Legal Complexity Scale (LCS) devised for the purposes of this text. It is a notional five-level scale that does not assess the “wisdom” of the legislator, but rather the extent to which a text can be understood, without outside help, by an entrepreneur who is neither a lawyer nor an AI specialist. Level 1 covers regulations based on simple, widely known categories; level 5 is reserved for acts so technical and multi-layered that in practice they require an expert already at first reading. The intermediate levels are successive steps from the language of values to the language of procedures.

Treated as a message to the citizen, the constitutions of CEE Member States such as Poland, Czechia, Bulgaria, Lithuania or Slovenia fall roughly within levels 1-2 of the LCS. A citizen can understand on their own that they have the right to a fair trial, that the state cannot detain them without a legal basis, and that public authority is not unlimited. This is the language of values and principles, not the language of parameters and risk matrices.

The AI Act falls within levels 4-5 of the LCS. The text of the regulation stacks definition upon definition (“AI system”, “high-risk AI system”, “general-purpose AI model with systemic risk”), imposes obligations concerning risk management, data quality, technical documentation, quality management systems and post-market monitoring, and cross-references a number of other acts and technical standards.

This is no longer a law that an entrepreneur can “just read”. It is an ecosystem that can be navigated only with a map in the form of a law firm and a consulting company.

From the perspective of SMEs, the difference between a constitution and the AI Act is not theoretical but very practical. It can be reduced to three simple questions that a business owner asks themselves when they pick up a legal act:

  1. Does this act concern me at all?
    In the case of a constitution, the answer is intuitive: yes, it concerns me as a citizen. In the case of the AI Act, even determining whether a given system falls under the definition of an “AI system” at all, and if so, whether it is “high-risk” or “low-risk”, requires an analysis that most SMEs will not carry out on their own.

  2. If it does concern me, what exactly am I supposed to do?
    From the constitution, we get a general picture: the state has certain limits, and we have certain rights. From the AI Act, instead of a picture, we get a list: policies, registers, procedures, tests, audits, labels, and information obligations. For a large corporation, this is just another compliance package. For a smaller company, it is an entirely parallel organisational project.

  3. What will happen if I fail to do something?
    In the case of a constitution, we intuitively understand that its violation is a serious matter, but we do not plan everyday activity on the basis of detailed liability scenarios. The AI Act introduces specific sanctions and evidentiary obligations that an entrepreneur should factor into their business risk. Except that, to do this, they must first understand the regulation itself, so we are back to square one.

In practice, this means that the AI Act may become an unintended barrier to entry. Companies withdraw not because they are unable to build safe systems, but because they are unable to assess whether they actually comply with the requirements of the regulation. As a result, a regulation that was supposed to level the playing field starts to favour entities with the largest resources.

Therefore, in the debate on the implementation of the AI Act and in discussions on the “Digital Omnibus”, it is not enough to talk about “facilitations for SMEs” in the form of isolated exemptions or soft law. What is needed is a change of approach: recognition that simplicity is an element of the quality of law, not its opposite. Simple does not mean primitive; it means “readable without an intermediary”.

Radically simplifying the AI Act by building clear, linear and proportionate compliance pathways for SMEs is a prerequisite for ensuring that the right to innovate does not become a privilege of the largest players. The constitutions of the countries mentioned above show that it is possible to speak about fundamental matters in comprehensible language. The law on artificial intelligence, like any other EU legal act, should not be a special exception here but a consequence of the same logic: the rules of the game must be accessible to everyone they concern, and the first step towards this is to change not only the substance of the provisions but also their language.

 
