Dutch Data Protection Authority fines Clearview AI for GDPR violations
The Dutch privacy watchdog, the Dutch Data Protection Authority (Autoriteit Persoonsgegevens, AP), has imposed a fine of €30.5 million on the American company Clearview AI Inc. (Clearview). The company illegally scraped photos of millions of people from the internet and then generated a unique biometric code for each face in those photos. International intelligence and law enforcement agencies can purchase this data for facial recognition purposes, for example to identify a person by comparing the biometric data in the database with images from a surveillance camera.
Can photos found online be used for this purpose?
In its decision of May 16, 2024, the AP ruled that these activities of Clearview violate Article 9(1) of the General Data Protection Regulation (GDPR). The AP now also finds that Clearview has seriously violated the GDPR on multiple points: “…the company should never have created the database and is insufficiently transparent…”. The purpose of the fine is to put an end to the violations. To prevent recurrence, the AP has additionally imposed orders subject to penalty payments of up to €5.1 million.
Possible response from Clearview
Clearview has not responded, other than to state that it is based in the US and therefore falls outside the scope of the GDPR, a position that is legally vulnerable, to say the least. Clearview might instead have invoked Article 9(2)(e) of the GDPR: the argument that the data subjects themselves made the information public. The principle of “you’ve made your bed, now lie in it.” But Clearview will not get away with it that easily. The AP has already substantiated its decision extensively. In this article, I focus on one aspect that stands out: the way the AP assessed whether the collection of public photos falls under the exception of Article 9(2) of the GDPR.
Are photos published on the internet by the data subject thereby fair game?
The AP rules that the exception ground of Article 9(2)(e) GDPR does not apply:
“…The mere fact that the aforementioned personal data can be found online does not mean that the data subjects have expressly, and by means of an unequivocal active action, intended to make all those data public to a broadly accessible audience. This is not the case, for example, when a photo of (the face of) a data subject is posted on the internet by a third party. Nor does the situation where a user has set their social media profile to private, but does not have the option to shield the profile picture (or is not aware of that option), count as making the data publicly accessible as intended here. In that case, there is no question of the user having expressly and by means of an unequivocal active action intended to make their personal data accessible to a broad audience…”
Guidelines regarding scraping
On May 1, 2024, the AP had already published guidelines on scraping, which elaborate extensively on personal data that has apparently been made public by the data subject, including photos. See also our earlier article “Scraping is almost always illegal.”
In those guidelines (p. 21), the AP states that if the default settings of a social media platform keep information non-public, and someone deliberately changes the settings so that the information becomes public, this does count as making the data public. In that case, an exception to the processing prohibition applies, according to the AP.
However, the AP undermines this conclusion a paragraph later, referring to the EDPB Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement (version of April 26, 2023), margin no. 76:
“…For photos publicly posted by a data subject, something else applies. The mere fact that a facial image has been made public by the data subject does not mean that the potential biometric data (which can be extracted from a photo with specific technical means) can also be considered publicly disclosed by the data subject. Biometric data may only be considered publicly disclosed if the data subject has deliberately made the biometric template – not just a facial image – freely accessible in a public source…”
Missed opportunity for clarity in the decision
In this light, the Clearview decision seems somewhat hastily reasoned. It would have been clearer for practice if the decision had been aligned with the previously published guidelines: in the decision, the AP could have referred to its scraping guidelines and, through them, to the EDPB guidelines.
Is your company still GDPR compliant?
Since the GDPR became applicable in 2018, much has been clarified and the AP has started issuing fines. Do you have questions about GDPR privacy legislation? Or would you like to know whether your business operations are still in line with privacy legislation? Contact one of our lawyers by email or phone, or fill out the contact form for a free initial consultation. We are happy to think this through with you!