European Regulators Chime In On The Interplay Between Data Protection And AI

William Fry


William Fry is a leading full-service Irish law firm with over 310 legal and tax professionals and 460 staff. The firm's client-focused service combines technical excellence with commercial awareness and a practical, constructive approach to business issues. The firm advises leading domestic and international corporations, financial institutions and government organisations. It regularly acts on complex, multi-jurisdictional transactions and commercial disputes.

On 17 July 2024, the European Data Protection Board (EDPB) released a statement (the Statement) on the important role Data Protection Authorities (DPAs) play within the EU Artificial Intelligence Act (AI Act) enforcement framework.

On 18 July 2024, the Data Protection Commission of Ireland (DPC) also released its first official guidance on the interplay between data protection and AI (the Guidance). The Guidance provides an overview of AI systems, large language models (LLMs) and the associated data protection risks and issues for organisations to consider.

In this article, we provide an overview of each regulatory publication.

The EDPB Statement

The Statement was timely, given that the AI Act enters into force on 1 August 2024. It highlights the interplay between the AI Act and data protection legislation: the two regimes must be treated as complementary and interpreted in light of each other. The Statement stresses that data protection legislation applies to the processing of personal data at all stages of the lifecycle of an AI system and, accordingly, places significant emphasis on the vital role of national DPAs in the supervision and enforcement of the AI Act.

The AI Act requires Member States to appoint Market Surveillance Authorities (MSAs) at the national level before 2 August 2025. Their role will be to supervise and enforce the AI Act.

EDPB Recommendations

" The EDPB recommends that DPAs should be designated as MSAs at a national level as they "have proven and are proving indispensable actors in the chain leading to the safe, rights-oriented and secure deployment of AI systems across several sectors" and have skills in many areas referred to in Article 70 of the AI Act, such as data computing and data security. They should supervise the use of high-risk AI systems mentioned in Article 74(8) of the AI Act, which are concerned with biometrics used for law enforcement, border management, administration of justice, and democratic processes. Member States should consider appointing DPAs as MSAs for the remaining high-risk AI systems as listed in Annex III of the AI Act, particularly where those high-risk AI systems are in sectors likely to impact the rights and freedoms of natural persons with regard to the processing of personal data, leveraging their expertise in data protection to ensure ethical and lawful AI deployment (unless those sectors are covered by a mandatory appointment required by the AI Act).

" Since Article 70(2) of the AI Act requires a single point of contact, each DPA (acting as MSA) should also be designated as the single point of contact for the public at Member State and Union levels.

" Clear procedures should be established under Article 74(10) of the AI Act for cooperation between MSAs and the other regulatory authorities tasked with supervising AI systems, including DPAs. In addition, appropriate cooperation should be established between the EU AI Office and the DPAs/EDPB.

The Statement places DPAs at the centre of data protection enforcement under the AI Act. For Ireland, this means the DPC could see its responsibilities greatly expanded, particularly given its existing prominence, under the GDPR's 'one-stop-shop' mechanism, in regulating Big Tech and major AI companies with EU headquarters in Dublin.

DPC guidance on AI, LLMs and data protection

The Guidance includes explainers on generative AI and LLMs and, importantly, outlines some of the main data protection risks and considerations for individuals and organisations that use, design, develop or provide AI systems.

Overall, the Guidance highlights that where the development or use of an AI system involves the processing of personal data, the GDPR and data protection legislation will apply. Coordinated and consistent enforcement is therefore crucial to give businesses certainty, particularly as the extra-territorial effect of the AI Act mirrors that of the GDPR: if an AI system's output is used in the EU, the AI Act's requirements are triggered irrespective of whether the provider or deployer is established in the EU.

Issues to Consider

To help organisations recognise the risks associated with any processing of personal data by AI systems and determine whether they are infringing the GDPR, the Guidance lists some key issues associated with using AI products, which can be summarised as:

  • Be aware of the risks of using personal data to train AI; unanticipated processing of personal data could infringe GDPR principles.
  • Ensure that your organisation has processes in place to facilitate the exercise of data subject rights, including access to, or deletion of, personal data held in your AI system.
  • Understand the additional data security or data protection risks associated with using third-party AI products.
  • Consider the risk of AI models unintentionally regurgitating (personal) training data.
  • Ensure that AI filters are secure against attacks and mitigate unauthorised data processing.
  • Mitigate risks of inaccurate or biased AI outputs affecting decisions.
  • Adhere to the 'storage limitation' principle and keep a data retention schedule.
  • If you publish personal data on your own website, take steps to protect it from unauthorised use in AI training (a minimal illustration follows this list).
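
By way of illustration only (this example does not appear in the Guidance, and crawler compliance with it is voluntary), a website operator can signal that its pages should not be collected for AI training by disallowing known AI-training crawlers in its robots.txt file:

    # Illustrative robots.txt entries disallowing known AI-training crawlers
    User-agent: GPTBot           # OpenAI's training crawler
    Disallow: /

    User-agent: Google-Extended  # Google's opt-out token for AI training
    Disallow: /

    User-agent: CCBot            # Common Crawl
    Disallow: /

Such directives are not a technical access control; they rely on crawlers choosing to honour them, so they complement rather than replace contractual and technical safeguards.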

Specific Considerations for AI product designers, developers and providers

The Guidance also contains specific advice for organisations that provide an AI product using personal data. These organisations may be deemed controllers or processors under the GDPR and, as such, are subject to the relevant obligations. Below is a summary of the considerations highlighted in the Guidance:

  • Always ensure that the purpose and scope of your processing have a sufficient legal basis. Consider whether non-AI technologies could achieve your goals.
  • Remember, publicly accessible personal data still falls within the scope of the GDPR.
  • When using personal data, ensure the processing aligns with the purpose for which the data subject provided the data. When using special category data or new technologies, you must carry out a data protection impact assessment (DPIA).
  • When processing data from a third-party organisation, ensure you have a sufficient legal basis under the data use agreement.
  • Ensure you are transparent with data subjects about how you are using their data.
  • Ensure that your AI product and any associated personal data are protected from any unauthorised use.
  • Ensure you have appropriate personal data governance to comply with GDPR accountability requirements.

Conclusion

The EDPB's timely Statement ahead of the AI Act's coming into force places significant emphasis on the crucial role of DPAs in supervising the implementation and enforcement of this landmark piece of legislation. The EDPB points to DPAs' proven experience and expertise in AI-related areas as the reason for recommending that they be designated as MSAs for high-risk AI systems. It may be no coincidence that, immediately after this Statement, the DPC issued its first official guidance on the interplay between data protection and AI. The Guidance could be the first indicator of the DPC's key role in the AI Act framework going forward.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
