Article 20 Corrective actions and duty of information


    1. Providers of high-risk AI systems which consider or have reason to consider that a high-risk AI system that they have placed on the market or put into service is not in conformity with this Regulation shall immediately take the necessary corrective actions to bring that system into conformity, to withdraw it, to disable it, or to recall it, as appropriate. They shall inform the distributors of the high-risk AI system concerned and, where applicable, the deployers, the authorised representative and importers accordingly.

    2. Where the high-risk AI system presents a risk within the meaning of Article 79(1) and the provider becomes aware of that risk, it shall immediately investigate the causes, in collaboration with the reporting deployer, where applicable, and inform the market surveillance authorities competent for the high-risk AI system concerned and, where applicable, the notified body that issued a certificate for that high-risk AI system in accordance with Article 44, in particular, of the nature of the non-compliance and of any relevant corrective action taken.
