GDPR
The GDPR (General Data Protection Regulation 2016/679) regulates the processing of personal data and sets particularly strict limits on the use of data considered sensitive. It provides instruments for the effective monitoring and enforcement of fairness, transparency and individual rights, and it grants individuals control over their personal data. The GDPR does not, however, cover algorithms that do not process personal data, even where such algorithms affect people's rights or broader societal values (e.g. democratic processes). Controllers must ensure, and be able to demonstrate, compliance with the GDPR. In practice, this means that controllers are required to consider any risks that the use or creation of a specific algorithm may pose to the rights and freedoms of natural persons and, where necessary, to take measures to address those risks.
With respect to AI systems, the steps to be taken by controllers include checks on the adequacy and completeness of the data and procedures used to train the software, and on the presence of possible sources of bias and unfairness.
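As a deliberately simplified illustration of the kind of bias check a controller might run on a model's decisions, the following Python sketch computes per-group selection rates and a disparate-impact ratio. The column names, the example data and the 0.8 "four-fifths" threshold are assumptions made for the example; the GDPR itself does not prescribe any particular metric or threshold.

```python
# Illustrative sketch only: a minimal demographic-parity check of the kind a
# controller might run when auditing a model's decisions for bias. Column
# names and the 0.8 threshold are assumptions for the example, not legal
# requirements.
import pandas as pd

def selection_rates(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.Series:
    """Rate of favourable outcomes (outcome == 1) per protected group."""
    return df.groupby(group_col)[outcome_col].mean()

def disparate_impact_ratio(df: pd.DataFrame, group_col: str, outcome_col: str) -> float:
    """Ratio of the lowest to the highest group selection rate."""
    rates = selection_rates(df, group_col, outcome_col)
    return rates.min() / rates.max()

if __name__ == "__main__":
    decisions = pd.DataFrame({
        "group":   ["A", "A", "A", "B", "B", "B"],
        "outcome": [1,   1,   0,   1,   0,   0],
    })
    ratio = disparate_impact_ratio(decisions, "group", "outcome")
    print(f"Disparate impact ratio: {ratio:.2f}")
    if ratio < 0.8:  # commonly used "four-fifths" heuristic, not a legal standard
        print("Potential bias: favourable outcomes differ substantially across groups.")
```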
Furthermore, Article 22 of the GDPR prohibits any decision based solely on automated processing of personal data, including profiling, if such a decision "produces legal effects concerning [the data subject] or similarly significantly affects [the data subject]." In practice, this means that it will often not be possible to rely solely on the output of an algorithm when making sensitive decisions.
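The following minimal sketch, assuming a hypothetical scoring model and a flag marking decisions that carry legal or similarly significant effects, illustrates one way such a constraint might be implemented: those decisions are routed to human review rather than being finalised on the automated output alone. The class and function names are illustrative only.

```python
# Minimal human-in-the-loop gate (illustrative assumption, not a GDPR-mandated
# design): decisions with legal or similarly significant effects are referred
# to a human reviewer instead of being taken solely from the model score.
from dataclasses import dataclass

@dataclass
class Decision:
    subject_id: str
    score: float                 # output of some automated model (assumed)
    legally_significant: bool    # flag set by the controller's own assessment

def decide(decision: Decision, approve_threshold: float = 0.5) -> str:
    if decision.legally_significant:
        # Article 22-style cases: defer to meaningful human review.
        return "refer_to_human_review"
    return "approve" if decision.score >= approve_threshold else "reject"

print(decide(Decision("applicant-1", 0.91, legally_significant=True)))   # refer_to_human_review
print(decide(Decision("applicant-2", 0.91, legally_significant=False)))  # approve
```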