Meta Begins Using EU Users' Public Data for AI Training: Regulator Response and Citizens' Rights
- Legal Basis
- Timeline of Events
- Oversight and Criticism
- What exactly does Meta use?
- How to opt out of your data being used?
- Current Disputes and Legal Uncertainty
- Context: Standard Practice or Exception?
- Practical Recommendations
- Contact our lawyer for more details
On 27 May 2025, Meta Platforms Ireland Ltd. (operator of Facebook, Instagram and Threads in the EU) began using public data from adult users in the EU to train its artificial intelligence models.
Users retain the right to object and exclude their data from training, in accordance with GDPR provisions.
Legal Basis
Meta relies on "legitimate interests" (Article 6(1)(f) GDPR) as its legal basis: under this framework, the company considers it permissible to process data without explicit consent where it has compelling commercial reasons and the processing does not override the rights and interests of data subjects. This approach has drawn criticism from the European legal community.
Timeline of Events
- December 2024: The European Data Protection Board (EDPB) published Opinion 28/2024, general guidance for supervisory authorities on assessing whether the use of personal data for AI training complies with GDPR, requiring a case-by-case assessment.
- 14 April 2025: Meta announced it was resuming plans to use EU user data for AI training.
- 21 May 2025: The Irish Data Protection Commission (DPC) published a statement indicating it would not prevent Meta from commencing data use, following the company's implementation of a number of improvements, while continuing active compliance monitoring.
- 23 May 2025: The Higher Regional Court of Cologne dismissed an application by the North Rhine-Westphalia Consumer Protection Organisation seeking an injunction against Meta's use of data for AI training.
- 27 May 2025: Meta began using EU users' public data for AI training as planned.
Oversight and Criticism
As Meta's principal EU establishment is located in Ireland, compliance oversight rests with the DPC as lead supervisory authority. However, other European regulators take differing positions: the Hamburg Data Protection Commissioner initiated urgent proceedings against Meta, demanding suspension of AI training on German users' data for at least three months.
The privacy rights organisation NOYB sent a cease-and-desist letter to Meta's Irish headquarters, threatening collective legal action if the company continues using data without explicit user consent (opt-in).
Privacy advocates argue that using data without consent violates transparency and data minimisation principles enshrined in GDPR:
- Public data is not consent to machine learning. Even if a user publishes a post publicly, this does not mean they are aware it will later be used to train AI models; the principle of "available means usable" contradicts the spirit of GDPR.
- Insufficient information. Users receive notifications about new terms, but the objection form itself is hidden within complex navigation, which raises doubts about whether a genuine choice is offered.
- A distorted balance of interests. "Legitimate interest" may only be relied on where it does not outweigh users' fundamental rights; critics believe that training AI on user data fails this test.
What exactly does Meta use?
- Only public data: posts, photos, captions, comments accessible to all;
- User interactions with Meta AI: questions, queries and other forms of communication with the AI assistant;
- Even if you don't have an account, your images or mentions in others' publications may be included in the training dataset. In this case, you also have the right to lodge an objection;
- Content visible only to friends, as well as private messages, is not used by Meta for training purposes;
- Public data from accounts of users under 18 is NOT used for training.
How to opt out of your data being used?
In accordance with GDPR, you have the right to object to the use of your data. To do this, you need to:
- Access Facebook or Instagram settings.
- Navigate to the "Privacy Centre" section.
- Find the "Right to Object" option and complete the relevant form.
Alternative method: use the direct link to the objection form that Meta includes in user notifications.
Please note: if you do not lodge an objection, Meta may use both your past and future public data, and once training is complete it is impossible to remove the influence of that data from the models. Under GDPR Article 21, Meta must cease processing your data upon objection unless it can demonstrate compelling legitimate grounds that override your interests, rights and freedoms; Meta has stated it will assess each objection individually.
Current Disputes and Legal Uncertainty
As of late May 2025, regulatory positions across the EU on the use of personal data for AI training remain divergent. Collective complaints from privacy rights organisations are expected, and legal proceedings may develop in several jurisdictions simultaneously.
NOYB has already sent a letter demanding cessation of violations and threatening legal action. The Hamburg regulator has initiated urgent proceedings demanding suspension of AI training on German users' data.
There is a possibility that the case will reach the Court of Justice of the EU, where questions about the boundaries of permissible use of personal data for AI, and about the legal significance of data being publicly available online, will arise again.
Context: Standard Practice or Exception?
Meta emphasises that using user data for AI training is standard practice, which the company has already applied in other regions since launching its AI models. Meta claims that Google and OpenAI have also used European users' data to train their models, though the specific legal bases and approaches used by different companies may vary.
Practical Recommendations
For users and companies:
- Check what data in your profile is publicly accessible;
- If necessary, restrict content visibility to friends only;
- Use Meta's provided form to lodge an objection;
- Bear in mind that Meta does not guarantee satisfaction of all objections and will consider them individually;
- Monitor your national regulator's position in your country;
- Consider changing privacy settings in advance, as data already used for training cannot be "extracted" from models.
The Irish Data Protection Commission plans to reassess the situation in October 2025, which may lead to further changes in data usage policy.
Our Arbitration & IT Disputes team is ready to assist users and businesses: from correctly lodging objections with Meta to risk assessment and auditing AI practices' compliance with GDPR requirements.
Authors: Kamal Tserakhau, Krystsina Vainilovich
Contact our lawyer for more details