ChatGPT and the EU AI Act
General Purpose AI has been a prominent feature of 2023 with the release of large language models like OpenAI’s ChatGPT and Google’s Bard. Brian McElligott, Head of our Artificial Intelligence team, looks at how the draft EU AI Act is likely to designate General Purpose AI and the obligations that providers of such systems will be subject to.
General Purpose AI (GPAI) – often discussed under the banner of generative AI – has been prominent in the news following the release of large language models like OpenAI’s ChatGPT and Google’s Bard. In the field of copyright law, Getty Images recently announced that it is suing Stability AI, the creator of the AI image generator Stable Diffusion, for copyright infringement.
With the draft EU Artificial Intelligence Act (AI Act) currently before the EU Parliament, there has been speculation that lawmakers are considering adding GPAI to the category of “high-risk” AI systems. It may be that GPAI will default to the transparency category, but that its ultimate uses in products could be high-risk. In other words, the determination would be based on final use rather than on the model itself.
GPAI systems include generative models: algorithms that generate new outputs based on the data on which they have been trained. The EU Council’s draft of the AI Act added a new Title IA to address such systems. Although it contains only three articles, Title IA was added to account for situations where AI systems can be used for many different purposes, and for circumstances where GPAI technology is integrated into another system which may itself become high-risk. The EU Council’s text specifies in Article 4b(1) that certain requirements for high-risk AI systems would also apply to GPAI.

The draft AI Act defines a ‘general purpose AI system’ as one that is intended by the provider to perform generally applicable functions such as image and speech recognition, audio and video generation, pattern detection, question answering and translation, among others. This definition applies irrespective of how the system is placed on the market or put into service, including as open source software. A GPAI system may also be one that is used in a plurality of contexts and is integrated into a plurality of other AI systems.
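To see why a single system can count as ‘general purpose’, it helps to note that one underlying model can perform several of the functions listed in the draft definition through the same interface. The short sketch below is illustrative only: it assumes the OpenAI Python client (v1.x) and an API key in the OPENAI_API_KEY environment variable, and it is not drawn from the AI Act or from any provider’s documentation.

```python
import os
from openai import OpenAI  # assumes the `openai` Python package, v1.x

# One model exposed through one interface, performing two of the
# "generally applicable functions" named in the draft definition:
# translation and question answering.
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

def run(prompt: str) -> str:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model choice only
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content or ""

print(run("Translate into French: 'This system is placed on the Union market.'"))
print(run("In one sentence, what is question answering?"))
```

The same deployed model serving both prompts, with no task-specific retraining, is the kind of multi-purpose behaviour the Council’s definition is intended to capture.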
Providers of GPAI systems will be subject to significant compliance documentation requirements, including the production of technical documentation in accordance with Article 11 of the AI Act. This technical documentation must demonstrate that the high-risk AI system complies with the requirements set out in the AI Act. In addition, the documentation must provide national competent authorities and notified bodies with all of the information necessary to assess the system’s compliance with those requirements, in a clear and comprehensive form. This documentation must be kept at the disposal of the national competent authorities for a period ending 10 years after the GPAI system is placed on the Union market or put into service in the Union.
In terms of exemptions, Article 4b of the AI Act will not apply where the provider has explicitly excluded all high-risk uses in the instructions for use or in the information accompanying the GPAI system.
Conclusion
We await the final text of the AI Act to see whether GPAI will be classified in the same category as “high-risk” AI systems. Recent reports suggest that lawmakers may default GPAI to the transparency category, but that its ultimate uses in products could be high-risk. The EU Parliament is expected to finalise its position on the AI Act in the coming weeks. The intensity of the negotiations, and the constant reports of progress and compromise, suggest a willingness on the part of EU lawmakers to meet the projected deadlines for the passage of this law through Parliament and on to the Commission.
For more information and expert advice on the proposed legislation, contact a member of our Artificial Intelligence team.
The content of this article is provided for information purposes only and does not constitute legal or other advice.