April 25, 2024, 7:00

The Daily

Read the World Today

EU Commission issues internal guidelines on ChatGPT, generative AI


The European Commission issued internal guidelines on Tuesday (30 May) for staff on using and interacting with online generative AI models, in particular addressing their limitations and risks.

The document “Guidelines for staff on the use of online available generative Artificial Intelligence tools” and its accompanying note, seen by EURACTIV, were made available in the Commission’s internal information system.

The Commission’s Information Management Steering Board adopted the guidelines on 27 April.

“The guidelines cover third-party tools publicly available online, such as ChatGPT. They aim at assisting European Commission staff in understanding the risks and limitations that online available tools can bring and support in appropriate usage of these tools,” reads the accompanying note.

In particular, the document is meant to guide staff members in managing the risks and limitations of generative AI tools such as ChatGPT, Bard and Stable Diffusion, which generate content based on user prompts.

The Commission noted how these tools have the potential to boost efficiency and improve the quality of office work, as they could help with writing briefings and developing computer code, but that their usage also needs to be guard-railed under a set of conditions.

The guidelines note that the “discussed risks and limitations are not necessarily relevant for internally developed generative AI tools from the Commission. Internal tools developed and/or controlled by the Commission will be assessed case by case under the existing corporate governance for IT systems.”

In addition, the document stresses that it should be considered a ‘living document’, to be kept up to date in line with technological developments and upcoming regulatory interventions, namely the EU’s AI Act.

The first risk outlined is the disclosure of sensitive information or personal data to the public, as the guidelines point out that any input provided to an online generative AI model is then transmitted to the AI provider, meaning it can subsequently feed into future generated outputs.

Thus, EU staff are forbidden from sharing “any information that is not already in the public domain, nor personal data, with an online available generative AI model.”

The second point relates to the potential shortcomings of the AI model, which might lead to wrong or biased answers due to an incomplete data set or the design of the algorithm, on which AI developers are not always transparent.

The rule is, therefore, that EU officials “should always critically assess any response produced by an online available generative AI model for potential biases and factually inaccurate information.”

Moreover, the document adds that this lack of transparency also implies a risk of breaching Intellectual Property rights, particularly copyright, as protected content might be used to train the AI model. As the produced outputs do not credit the used material, it is nearly impossible for users to obtain the necessary authorisation from rightsholders.

As a result, staff members are requested to always critically assess whether the AI-generated output violates IP rights, and copyright in particular, and to “never directly replicate the output of a generative AI model in public documents, such as the creation of Commission texts, notably legally binding ones.”

Finally, the document stresses that generative AI models might have a long response time or not be available at all times. Hence, Commission staff are banned from relying on these tools for critical and time-sensitive tasks.

Read more with EURACTIV

AI Act to impact EU countries asymmetrically, Slovak expert says

To ensure effective implementation of the EU’s AI rules, member states without a rich AI ecosystem have used negotiations to try to mitigate the potential burden. Still, Slovakia sees opportunities for a comparative advantage if it succeeds in implementing them.