
ChatGPT Case: How the Italian Data Protection Authority Is Trying To Address AI Risks

Thanks to Elena Mandarà for collaborating on this article.

With a resolution issued on an urgent basis on March 30, 2023, the Italian Data Protection Authority (Garante per la protezione dei dati personali, “Garante”) imposed an immediate temporary limitation on the processing of Italian users’ personal data carried out by OpenAI L.L.C. (“OpenAI”) through the ChatGPT service, an AI platform capable of emulating and processing human conversations.

Specifically, the Garante pointed out that OpenAI did not provide users and other data subjects whose personal data is collected with clear and complete information about the processing of their personal data for training purposes. Moreover, the Garante noted the lack of an adequate legal basis for collecting and processing personal data to train the algorithm, as well as the absence of an appropriate age verification mechanism for users of the service. Finally, according to the Garante, there is a risk of processing inaccurate personal data, because ChatGPT does not always provide correct information in response to questions about individuals.

Upon receiving the provision, OpenAI requested a meeting with the Garante to address the concerns the Authority had raised.

Following that meeting, and in light of the information provided by OpenAI, on April 11 the Garante issued a second resolution noting that OpenAI had been cooperative and suspending the previous order, provided certain conditions are met.

More specifically, the Garante ordered OpenAI to do the following:

  • publish on its website a privacy policy, available to data subjects (including non-users), explaining the methods and rationale underlying the processing carried out to train the algorithm;
  • make available a tool that data subjects (including non-registered subjects) can use to object to the processing of their personal data collected from third parties and to request that their personal data be corrected or deleted;
  • include a link to the privacy policy during the registration process;
  • rely on consent or legitimate interest as a legal basis for processing, rather than on the need to execute a contract;
  • require users to pass an age gate (including users who have already registered) to avoid providing the service to minors;
  • provide the Authority with a plan to implement age verification mechanisms, no later than May 31;
  • sponsor an information campaign to raise user awareness of possible collection and processing of personal data for the purpose of training the algorithm, no later than May 15.

OpenAI must comply with all the measures except the last two by no later than April 30.

Notwithstanding the above, the Garante will continue its inquiry to establish whether the applicable data protection rules have been infringed and may adopt additional or different measures if necessary.

Interestingly, on April 13 the European Data Protection Board announced the launch of a task force to foster cooperation and the exchange of information among data protection authorities about possible enforcement actions concerning the ChatGPT service.

Tags

chatgpt, data protection, gdpr, ai, artificial intelligence, data privacy, privacy, portolano-cavallo