
Platforms are responsible for ads with sensitive data
How much responsibility do online classifieds portals bear when strangers upload intimate pictures without authorization? And will operators have to check in future whether the person advertising is actually the person in the photos?
When a complaint leads to loss of control
What begins as a harmless classifieds platform can quickly become a drastic example of how privacy and data protection collide, as the Russmedia case shows. An unknown person uploads a fake ad on publik24.ro in which a woman is presented as a provider of sexual services, complete with photos and her real telephone number – without her knowledge and without her consent. The ad spreads rapidly because it is automatically copied by other advertising sites. The woman concerned thus loses control of her own data. Russmedia deletes the ad within an hour of being notified, but by this point it has already taken on a life of its own on other portals.
The victim claimed non-material damages. A court in Romania awarded her 7,000 euros, but the court of appeal overturned the ruling, reasoning that the operator was merely a host provider and therefore privileged. The Curtea de Apel Cluj finally referred the case to the Court of Justice of the European Union – and the ECJ took the opportunity to fundamentally clarify the role of platforms in data protection law.
GDPR and e-commerce regime collide – but not at eye level
The case involves two areas of regulation that constantly come into conflict in everyday digital life. The E-Commerce Directive relieves providers who host third-party content of monitoring obligations. The GDPR, on the other hand, relies on comprehensive responsibility, especially when sensitive data is involved.
So while the E-Commerce Directive states: “You don’t have to monitor everything”, the GDPR stipulates: “However, as soon as you process personal data for your own purposes, you bear comprehensive responsibility.” This is precisely the central conflict that the ECJ must resolve.
ECJ: Platforms are responsible parties – with far-reaching obligations
In its judgment of 2 December 2025 – C-492/23, the European Court of Justice clearly states that the operator of an online marketplace is a controller for the personal data contained in the ads it publishes.
Responsibility despite third-party content
The ECJ clarifies that Russmedia is not just a technical repository for content. The company structures, categorizes, monetizes and uses ads itself, for example through the ability to copy, share or modify content. These own purposes are decisive: the platform determines the purposes and essential means of data processing and thus becomes the controller within the meaning of the General Data Protection Regulation.
The clarity of the court is remarkable: the mere fact that a user posts the ad does not exonerate the operator. Both can be controllers at the same time. The GDPR does not want platforms to hide behind their technical role.
Handling sensitive data: A “red light” moment
The case concerns data on sexuality, which is a particularly protected category. The ECJ clarifies that even if the information is untrue, its sensitive nature remains. Sensitive data may not be processed unless an exception applies. The most important exception is explicit consent, which was clearly not given in this case.
This results in an obligation that will shape future platform practice. Operators must check whether an ad contains sensitive data before publication. If so, they may only publish it if the person concerned is identical to the account holder or has given their express consent. The ECJ therefore requires risk management that goes beyond today’s “notice and take down”. For platforms, this means that they must develop processes that recognize and prevent risks in advance instead of only reacting after the fact.
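To make the described workflow concrete, here is a minimal, hypothetical sketch of such a pre-publication check. The keyword list, function names and the binary consent flags are illustrative assumptions, not anything prescribed by the judgment; a real implementation would need far more robust detection (the article itself mentions AI-based recognition and manual review).

```python
# Illustrative sketch of a pre-publication screening step for ads that
# may contain special-category (Art. 9 GDPR) data. Keywords, names and
# logic are hypothetical assumptions for demonstration purposes only.
SENSITIVE_KEYWORDS = {"escort", "sexual services", "health condition"}

def requires_review(ad_text: str) -> bool:
    """Flag an ad for identity/consent checks before publication if it
    appears to contain sensitive data."""
    text = ad_text.lower()
    return any(keyword in text for keyword in SENSITIVE_KEYWORDS)

def may_publish(ad_text: str, advertiser_is_data_subject: bool,
                explicit_consent: bool) -> bool:
    """Publish only if no sensitive data is detected, or if the advertiser
    is the data subject, or an Art. 9(2) exception (here modelled only as
    explicit consent) applies."""
    if not requires_review(ad_text):
        return True
    return advertiser_is_data_subject or explicit_consent
```

A simple keyword filter like this would only be one layer of the "verifiable, risk-oriented procedure" the ruling points towards; flagged ads would still need a human or automated identity check before release.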
Before publishing advertisements, the operator of an online marketplace is obliged to identify those ads that contain sensitive data, to check whether the person advertising is identical to the data subject, and to refuse publication unless explicit consent or another exception under Article 9(2) GDPR applies.
Security measures: Protection also against further distribution
Particularly significant is the ECJ’s statement that platform operators must take technical and organizational measures to make the copying and scraping of sensitive ads more difficult. For the court, the loss of control suffered by the data subject in the original case is a direct expression of insufficient security. Quickly deleting the ad is not enough on its own; the platform must also hinder third parties from automatically copying content.
The operator of an online marketplace must implement technical and organizational security measures to prevent ads containing sensitive data from being copied and unlawfully transferred to other websites.
This does not mean that platforms should control the internet, but they must create an “appropriate level of security” that at least makes it more difficult for bots, scraping mechanisms and other automated processes to take over content.
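One building block of such an "appropriate level of security" could be per-client rate limiting, which slows automated scrapers without affecting ordinary readers. The sketch below is a generic sliding-window rate limiter; the thresholds and the class itself are illustrative assumptions, not measures named in the judgment.

```python
import time
from collections import defaultdict, deque

# Minimal sliding-window rate limiter: one possible technical measure to
# make automated scraping of ad pages harder. Thresholds are illustrative.
class RateLimiter:
    def __init__(self, max_requests: int = 30, window_seconds: float = 60.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)  # client_id -> timestamps of requests

    def allow(self, client_id: str, now: float = None) -> bool:
        """Return False once a client exceeds max_requests within the
        window, which typically indicates a bot rather than a human."""
        now = time.monotonic() if now is None else now
        q = self.hits[client_id]
        # Drop timestamps that have fallen out of the sliding window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False
        q.append(now)
        return True
```

In practice this would sit alongside the other measures the article lists, such as bot detection, API limits and watermarking, rather than replace them.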
Provider privilege? Yes, but not against the GDPR.
The distinction from the E-Commerce Directive is also crucial: the liability privileges do not apply to data protection violations. The ECJ emphasizes that the GDPR stands independently. Anyone who processes personal data is fully subject to data protection law – regardless of whether they are also privileged as a host provider.
For platforms, this means that they cannot escape responsibility by claiming that they are only a “neutral intermediary”.
What platforms need to consider now
This ruling is a wake-up call for operators of marketplaces, classified ad portals and similar services. Many business models are based on the assumption that they are not responsible for the content posted by users. However, this assumption is no longer valid under data protection law. Obligations that now take center stage are:
- Check your own responsibility: Wherever platforms structure, utilize, display or convert ads into reach, they become controllers themselves.
- Recognize sensitive data before publication: This can be done using keyword filters, AI-based recognition or manual checks; what matters is a verifiable, risk-oriented procedure.
- Verify identity for high-risk ads: A "leap of faith" will no longer suffice for sensitive categories. There will have to be models that verify identity or authorization.
- Protect against scraping and unauthorized disclosure: Bot protection, API limits, watermarking and partner controls – measures previously considered "nice to have" are becoming mandatory compliance.
- Document and demonstrate accountability: Platforms must be able to show how they have implemented the obligations under Article 5(2) GDPR in terms of technology and organization.
Overall, the role of the platform operator is thus evolving away from being a passive host and towards being an active player in terms of data protection law.
Conclusion
The ECJ has drawn a clear line here. Anyone who operates a platform is not just a technical service provider, but also a data protection actor – especially when sensitive data is involved.
The Court requires proactive, intelligent risk management before publication and not only after complaints. At the same time, it makes it clear that the provider privileges of the E-Commerce Directive do not help in this case.
We are happy to advise you on data protection law!







