Ways of avoiding falling within the scope of the rules for high-risk AI systems (Part II)

The AI Act contains a comprehensive regulatory programme for providers of high-risk AI systems. Providers of AI systems would therefore be well advised to take measures to avoid falling within the scope of these rules.

According to the exemption codified in Art. 6 (3) AI Act, AI systems that would generally be categorised as high-risk AI systems under Art. 6 (2) AI Act in conjunction with Annex III are exempt from this categorisation if certain criteria are met.

We reported on this first possible way of avoiding the corresponding scope of application in Part I.

For providers of AI systems that have already been placed on the market or put into service, Art. 25 (1) AI Act provides a further way of avoiding the scope of the rules for high-risk AI systems:

Solution 2: change of provider, Art. 25 (1) AI Act

According to Art. 25 (1) AI Act, a provider is released from its provider obligations with regard to a specific AI system it has manufactured, and the deployer, distributor or importer is henceforth deemed to be the provider of that system, with the legal consequence that the latter assumes the obligations under Art. 16 AI Act. The former provider is only required to cooperate by making available the necessary information and the reasonably expected technical access, and by otherwise supporting the deployer, distributor or importer, so that the latter can fulfil the obligations for high-risk AI systems, Art. 25 (2) AI Act.

In all other respects, the former provider is released from its obligations under Art. 16 AI Act. This drastically reduces its risk of sanctions, as breaches of the obligation to cooperate under Art. 25 AI Act are not subject to fines, unlike breaches of the obligations under Art. 16 AI Act (see Art. 99 AI Act).

The change of provider pursuant to Art. 25 (1) AI Act covers the following scenarios:

(i) lit. a: A deployer, distributor or importer subsequently affixes its name or trademark to a high-risk AI system.

According to Art. 3 (3) AI Act, a provider is a party that places an AI system on the market or puts it into service under its own name or trademark. This means that the manufacturer of white-label AI systems can avoid claims being made directly against it as a provider (although the cooperation obligations under Art. 25 (2) AI Act remain).

(ii) lit. b: A deployer, distributor or importer makes a substantial modification to a high-risk AI system.

According to Art. 25 (1) lit. b AI Act, a change of provider occurs if a substantial modification is made to a high-risk AI system without the modification resulting in the system no longer being categorised as a high-risk AI system. This includes, in particular, changes to the intended purpose stated in the provider's instructions for use or to the area of use of the AI system within the meaning of Annex III to the AI Act.

However, changes to the algorithm or changes that occur as a result of the "further learning" of a system during operation are not considered to be substantial modifications if they have been specified in advance by the provider in the technical documentation to be prepared as part of the conformity assessment in accordance with Annex IV No. 2 lit. f AI Act. 

In future, the Commission will publish guidelines specifying which scenarios constitute a substantial modification. Providers can therefore limit the risks of their own provider status by using narrow wording in the instructions for use or the conformity assessment documentation.

(iii) lit. c: A deployer, distributor or importer changes the intended purpose of an AI system, including a general-purpose AI (GPAI) system, in such a way that the system becomes a high-risk AI system for the first time.

Art. 25 (1) lit. c AI Act imposes the provider obligations on the party responsible for the subsequent categorisation of the AI system as a high-risk AI system. 

Art. 25 (2) AI Act contains an exception to the original provider's support obligations for this scenario. The former provider is not obliged to provide support if it has expressly excluded by contract the conversion of the system into a high-risk AI system. The provider of an AI system that has not been categorised as a high-risk AI system can thus avoid suddenly having to disclose confidential information about the system to the deployer, distributor or importer.

Art. 25 AI Act is an interesting provision that releases providers from their provider obligations in certain cases. The future interpretation of the requirements for a substantial modification will be of particular importance.
