13 August 2024

Requirements For General-Purpose AI Models And Systems Under The European Artificial Intelligence Act

Logan & Partners

Logan & Partners is a Swiss law firm focusing on Technology law and delivering legal services the way your in-house counsel would. We are experts in Commercial Contracts, Technology Transactions, Intellectual Property, Data Protection, Corporate Law and Legal Training. We are dedicated to understanding your industry and your business needs and to delivering clear and actionable legal advice.

As the European Artificial Intelligence Act (AI Act) comes into force, tech companies need to be aware of the new requirements for general-purpose AI models and systems. These rules aim to balance the benefits and risks of AI. In this article, we'll explain what the requirements are and how they affect your business.

General-Purpose AI Models: The Core Focus

General-purpose AI models, commonly referred to as foundation models, are the building blocks of many AI systems. They can be used for different purposes across various domains and underpin many of the tech products that rely on AI.

The AI Act introduces a two-tiered regulatory framework for general-purpose AI models, distinguishing between standard models and those that pose a systemic risk, and requires providers of general-purpose AI models to comply with a set of obligations.

General Obligations for All Models

  • Technical Documentation. Providers must maintain detailed and up-to-date technical documentation. This must be accessible to the European AI Office and national authorities of the EU Member States upon request.
  • Transparency and Information Sharing. Providers must disclose certain information to downstream providers who integrate these models into their AI systems. This involves providing the technical documentation, including information on the capabilities and limitations of the model.
  • Compliance with Copyright Laws. Providers must ensure their models comply with EU copyright law, adopt a policy to that effect and publicly share a sufficiently detailed summary of the content used to train the model.

Additional Requirements for General-Purpose AI Models with Systemic Risk

Some general-purpose AI models may pose a systemic risk due to their high-impact capabilities and widespread use in the EU. Providers of these models face additional stricter requirements:

  • Risk Assessment and Mitigation. Providers must assess and mitigate the systemic risks associated with their models. This includes conducting model evaluations and adversarial testing.
  • Cybersecurity. Providers are responsible for ensuring robust cybersecurity measures for both the model and its infrastructure.
  • Incident Reporting. Any serious incidents involving the model must be documented and reported without undue delay to the European AI Office and, where appropriate, to national competent authorities.

Exemption for Open Source

Providers of general-purpose AI models that are released as free and open source, with parameters, model architecture and usage information made publicly available, are exempt from some of the usual documentation and information-sharing requirements. However, this exemption does not apply if the model is considered to pose a systemic risk.

General-Purpose AI Systems: Integration and Application

While the AI Act places its emphasis on the models themselves, general-purpose AI systems built on those models also fall within its scope. These systems may still be classified into different risk categories depending on their application. If a system is built on a general-purpose AI model with systemic risk, it may inherit some of the model's compliance requirements. Additionally, if the general-purpose AI system is categorised as high-risk, the requirements for high-risk AI systems apply.

Practical Steps for Compliance

For tech companies, staying compliant with the AI Act involves:

  • Updating Documentation: Ensure all technical documentation is current and ready for inspection.
  • Risk Management: Continuously assess and mitigate risks, following the requirements of the AI Act.
  • Maintaining Transparency: Keep downstream providers informed about the capabilities and limitations of the models you provide.
  • Staying Agile: As regulations evolve, so should your compliance strategies.

How we can help

The AI Act sets clear rules for regulating general-purpose AI models and systems, making transparency and legal compliance a must. For tech companies, understanding these requirements is essential to staying competitive and avoiding costly penalties.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
