In its ruling of December 2, 2025 (C-492/23 “Russmedia”), the ECJ set a course for online platforms with user-generated content (UGC) that goes far beyond the specific case of a classifieds marketplace. At its core lies an area of friction that many platform operators have been underestimating for years: although the classic host provider privilege (formerly E-Commerce Directive 2000/31/EC, now largely carried forward in the Digital Services Act) limits liability risks for “third-party information”, it does not automatically answer the question of who acts as the “controller” under data protection law as soon as personal data is processed within UGC.
- ECJ classifies platform operators as joint controllers if they co-determine the purposes and means of publishing UGC.
- Host provider privilege from eCommerce Directive/DSA does not neutralize GDPR obligations; no escape from data protection responsibility.
- Triad of obligations under Art. 9 GDPR: identify sensitive content, verify user identity, refuse publication if necessary.
- Risk-based approach: Art. 24, 25, 32 GDPR require advance measures instead of pure notice-and-takedown.
- Terms and conditions granting extensive rights of use/syndication indicate the platform’s own purposes and support a finding of controllership.
- No general monitoring obligation, but technical anti-copy TOMs required, as far as reasonable and state-of-the-art.
- Broad relevance for UGC ecosystems (forums, games, creators): differentiated workflows, identity gates and Art. 26 arrangements are required.
The initial facts of the case are quickly summarized and at the same time prototypical: an ad was posted on a Romanian online marketplace that falsely portrayed a woman as a provider of sexual services and used her photos and telephone number without consent. The platform operator deleted the ad shortly after being notified, but the content continued to circulate on third-party sites. The national courts ruled differently; the Court of Appeal referred questions to the ECJ regarding responsibility under data protection law and the relationship to the exemption from liability under the E-Commerce Directive.
The implications of this are clear from the structure of the ECJ’s answer: it classifies the marketplace operator as a (joint) controller, affirms joint responsibility with the posting user (“joint controllership”) for the publication process and derives upstream obligations from this, which in practice act as a “pre-upload compliance gate” – at least for sensitive data within the meaning of Art. 9 GDPR. At the same time, the ECJ brushes aside the attempt to neutralize these obligations via the host provider privilege.
It is important to note that the ruling is not a general carte blanche for comprehensive upload filtering in all situations. The Court’s reasoning is recognizably risk-based, tied to data categories (Art. 9 GDPR) and in line with the accountability system of the GDPR (Art. 5 para. 2, Art. 24 et seq.). However, it is precisely this system that collides with many platform architectures that rely on rapid publication and downstream moderation as the UGC default.
The central doctrinal shift: “controller” despite third-party content
The classic thinking of many platforms is: “Content comes from users; platform is neutral; therefore, in terms of data protection law, at most a technical service provider.” The ECJ makes it clear that this narrow view does not apply as soon as the platform co-decides on the purposes and means of processing. The decisive factor for the publication of personal data on the internet is not who types the text or uploads the photo, but who enables, parameterizes and commercially uses the publication.
The Court of Justice initially focuses on the well-known definition from Art. 4 No. 7 GDPR: The controller is the person who “alone or jointly with others” decides on purposes and means. It expressly emphasizes that “joint controllership” does not necessarily require a joint, formally coordinated decision; it is sufficient that decisions “converge” and each have a noticeable influence on purposes and means.
Applied to the marketplace: the user typically determines the content and addressees of their ad, i.e. the immediate purpose. The platform in turn determines the framework for publication: which categories exist, how long content is visible, how it is made findable, whether it can be posted anonymously, which technical processes apply on upload, whether content can be passed on to partners, how it is presented. The co-determining control lies in this “publication machine”. The decision also emphasizes that the general terms and conditions of the marketplace provided for extensive rights of use of the ad content (copying, distribution, transmission, transfer to partners, removal at any time). Such clauses are not just “IP housekeeping”; they are read as an indication of the platform’s own purposes under data protection law: the publication is not made merely “for the user”, but also serves the platform’s own commercial purposes.
This reveals a common compliance error: T&C texts that grant generous rights to UGC from a product or marketing perspective (syndication, cross-posting, “partners”, “promotion”) can pave the way for liability under data protection law – at least if personal data is part of the UGC. This applies not only to classified ads, but also to creator platforms, community hubs, comment areas, rating portals, modding portals and in-game marketplaces.
The concrete triad of duties: identify, verify, deny
The real explosive force of the Russmedia ruling lies in the preliminary obligations for sensitive data derived from it. The ECJ formulates them in a remarkably operational manner:
Firstly, the platform must identify “prior to publication” those ads that contain sensitive data within the meaning of Art. 9 para. 1 GDPR. Secondly, it must verify whether the user who wishes to post such an ad is the person whose sensitive data appears in the ad. Thirdly, it must refuse publication if this is not the case – unless the user can provide evidence of explicit consent (Art. 9 para. 2 lit. a GDPR) or another exception under Art. 9 para. 2 GDPR.
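How such a gate could be wired into an upload pipeline is shown in the following minimal sketch (all type and function names are hypothetical; real Art. 9 detection and identity verification are separate, hard problems that are only stubbed out here as injected dependencies):

```typescript
// Hypothetical pre-publication gate implementing the Russmedia triad:
// identify sensitive content, verify identity/consent, otherwise refuse.

type Art9Category =
  | "health" | "sex_life" | "political" | "religious"
  | "trade_union" | "biometric" | "genetic" | "ethnic";

interface Listing {
  authorId: string;
  text: string;
  dataSubjectIds: string[];      // persons identifiably referenced in the listing
  consentProofIds?: string[];    // references to stored Art. 9(2)(a) consent records
}

type GateDecision =
  | { action: "publish" }
  | { action: "hold"; reason: string }      // route to manual review
  | { action: "reject"; reason: string };

function preUploadGate(
  listing: Listing,
  detectArt9: (text: string) => Art9Category[],     // step 1: identify
  isVerifiedAuthor: (authorId: string) => boolean,  // step 2: verify
  hasConsentProof: (subjectId: string, proofs: string[]) => boolean,
): GateDecision {
  const categories = detectArt9(listing.text);
  if (categories.length === 0) {
    return { action: "publish" };           // no Art. 9 risk flagged
  }
  // Sensitive content: the author must be identifiable to the platform at all,
  // otherwise neither an Art. 26 arrangement nor accountability is possible.
  if (!isVerifiedAuthor(listing.authorId)) {
    return { action: "reject", reason: "unverified author posting Art. 9 data" };
  }
  const thirdParties = listing.dataSubjectIds.filter(id => id !== listing.authorId);
  if (thirdParties.length === 0) {
    // Self-referential sensitive data: verified author about themselves.
    return { action: "hold", reason: `self-referential Art. 9 data: ${categories.join(", ")}` };
  }
  // Third-party sensitive data: require documented explicit consent per subject.
  const missing = thirdParties.filter(
    id => !hasConsentProof(id, listing.consentProofIds ?? []),
  );
  if (missing.length > 0) {
    return { action: "reject", reason: "no Art. 9(2)(a) consent proof for third parties" };
  }
  return { action: "hold", reason: "consent proofs present; confirm in manual review" };
}
```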
This triad is not based on an abstract “be careful”, but on the accountability logic: anyone acting as a (joint) controller must not merely hope that processing is lawful and that the data protection principles are complied with, but must be able to demonstrate this in accordance with Art. 5 para. 2 GDPR. The bar is higher for sensitive data: processing is prohibited in principle, and the exceptions are narrow. Therefore, in the ECJ’s model, mere notice-and-takedown is not sufficient if the platform systemically allows sensitive third-party data to be published without consent.
In practical terms, the ruling is both specific and tricky: “sensitive data” is not just health records or party membership files. Art. 9 GDPR covers data on sex life/sexual orientation, health data, political opinions, religious beliefs, trade union membership, biometric data for identification purposes and genetic data. Even a UGC post such as “X is depressed” or “Y is a member of party Z” can slip into this zone – even if it reads colloquially as an “opinion” or “rumor”. This means that the more openly a platform allows UGC (forums, comment columns, in-game chat logs, community boards), the greater the risk that Art. 9 data will surface in it.
The ruling therefore forces a differentiated look at platform interfaces and functional logic: a classified ad form that asks for categories in a structured manner and offers upload fields lends itself to upstream technical controls more readily than a live chat. Nevertheless, the key question remains the same: where is the platform in a position to take “appropriate technical and organizational measures” (Art. 24, 25 GDPR) to control risks before content becomes public?
Caution is warranted here against reflexive conclusions. The ECJ does not say: “Everything must be moderated in advance.” It says: anyone who can be held responsible for the publication of sensitive third-party data must be set up organizationally and technically in such a way that (a) sensitive content can be identified and (b) an identity/consent check is possible for such content. Implementation can take very different forms: from upload workflows with risk flags, to staged publication (“pending review”), to functional isolation of certain UGC areas (e.g. no public visibility without account verification, restriction of image uploads, limitation of certain categories). The decisive factor is proportionality in the light of “nature, scope, context, purposes” and the risks (Art. 24, Art. 32 GDPR).
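One way to make “staged publication” explicit is a small status model in which risk-flagged content can never move directly from upload to public visibility. A sketch under these assumptions (names are illustrative):

```typescript
// Illustrative publication states: risk-flagged content must pass review;
// only unflagged content may go live directly.

type PubStatus = "draft" | "pending_review" | "published" | "rejected";

const transitions: Record<PubStatus, PubStatus[]> = {
  draft: ["pending_review", "published"],   // direct publish only without risk flag
  pending_review: ["published", "rejected"],
  published: ["rejected"],                  // takedown after publication
  rejected: [],
};

function canTransition(from: PubStatus, to: PubStatus): boolean {
  return transitions[from].includes(to);
}

function nextStatus(current: PubStatus, riskFlagged: boolean): PubStatus {
  if (current === "draft") {
    return riskFlagged ? "pending_review" : "published";
  }
  throw new Error(`manual decision required in status "${current}"`);
}
```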
No escape via the host provider privilege: GDPR remains GDPR
The second major block of the decision concerns the relationship with the E-Commerce Directive. Platform operators have always argued that as long as there is no knowledge of specific illegality, the exemption from liability for third-party content applies; there is no general monitoring obligation. These principles are still relevant for many liability regimes (e.g. civil law concepts of fault-based liability, copyright notice-and-takedown processes, DSA obligations). However, the ECJ draws a clear line: for breaches of data protection obligations under the GDPR, the marketplace operator cannot invoke Art. 12-15 of the E-Commerce Directive to “undermine” the GDPR obligations. (European Court of Justice)
The classification of the prohibition of general monitoring obligations is particularly important here. The ECJ expressly states that the GDPR compliance obligations of the platform are not to be classified as “general monitoring obligations” within the meaning of Art. 15 of the E-Commerce Directive. In other words, even if national legal systems may not require general monitoring, the GDPR – based on risk and data categories – may very well require measures to be taken in advance.
This makes compliance design more complex: on the same platform, a post that is “only” offensive can be treated according to the classic notice-and-action mechanisms under liability law, while a post with the same content that also contains sensitive third-party data slips into a stricter regime under data protection law before publication. The “system break” emphasized by commentators is real: parallel duty regimes arise, which must be brought together in the product and moderation.
Another aspect: the Advocate General had – as far as documented – assessed the responsibility of the platform operator much more cautiously, classifying it more as a processor or emphasizing the parallel applicability of the host provider privilege. The ECJ deviated from this. In practice, this is a signal: a “we are only a host” narrative is not reliable under data protection law, even if it can continue to have a protective effect in other areas of law.
“Anti-copy” obligations and the technical reality of UGC ecosystems
The ruling also contains a third component that is often underestimated in operational terms: platforms must implement suitable security measures for sensitive data in order to prevent copying and unlawful republication on other websites, insofar as this is technically possible (risk-based, Art. 32 GDPR). The ECJ does not formulate a guarantee obligation, but an obligation to seriously minimize risks based on the state of the art. The ruling states that the controller must consider technical measures that are “apt to block the copying and reproduction of online content”; at the same time, a subsequent unlawful dissemination should not automatically lead to the conclusion that the measures were unsuitable – the controller must have the opportunity to exonerate itself.
This is tricky because it is technically difficult to completely prevent the “copying” of online content. Screenshots, scraping, re-uploads, mirror sites: the reality of an open network is replication. The obligation should therefore not be understood as “digital DRM for classified ads”, but as a risk-based matrix of measures. Examples could be: hotlink protection, rate limiting and bot mitigation, restrictions on embedding, differentiated robots rules (although these provide no security in the sense of Art. 32), token-based delivery of images, watermarks (with caution: not as additional processing of personal data), shortened cache durations, access hurdles for sensitive categories, technical deindexing/takedown pipelines, partner syndication controls and – centrally – a limitation of the initial publication of sensitive data through pre-gates.
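To pick out one of these measures: “token-based delivery of images” is commonly implemented as short-lived signed URLs, so that an image address scraped from a listing stops working once the link expires. A sketch using Node’s built-in crypto module (paths, parameters and the environment variable are invented):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

const SECRET = process.env.IMAGE_URL_SECRET ?? "dev-only-secret";

// Issue a URL that stops working after ttlSeconds; copies of the URL
// (e.g. scraped from a listing) expire together with it.
function signImageUrl(path: string, ttlSeconds: number): string {
  const expires = Math.floor(Date.now() / 1000) + ttlSeconds;
  const sig = createHmac("sha256", SECRET).update(`${path}:${expires}`).digest("hex");
  return `${path}?expires=${expires}&sig=${sig}`;
}

// Verify on delivery: reject expired or tampered URLs.
function verifyImageUrl(path: string, expires: number, sig: string): boolean {
  if (expires < Math.floor(Date.now() / 1000)) return false;
  const expected = createHmac("sha256", SECRET).update(`${path}:${expires}`).digest("hex");
  const a = Buffer.from(sig, "hex");
  const b = Buffer.from(expected, "hex");
  return a.length === b.length && timingSafeEqual(a, b);
}
```

This does not stop screenshots or re-uploads; it merely shortens the window in which raw image URLs can be harvested at scale, which is exactly the kind of proportionate, state-of-the-art measure the ruling has in mind.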
This is precisely where it becomes clear that “technology” and “law” cannot work separately. A GDPR-compliant platform setup requires technical design decisions (“privacy by design” and “by default”, Art. 25 GDPR) that are anchored in product roadmaps, architecture and moderation. For UGC platforms, this is not a one-shot project, but an ongoing risk management process.
Relevance for games, community platforms and creator ecosystems
A common objection is: “That was a marketplace, not a game.” This is formally correct – but only partially reassuring in terms of the risk logic. Games and gaming-related platforms have long been UGC ecosystems: chat, voice transcripts, profile texts, clan pages, screenshots, replays, user avatars, mod uploads, in-game marketplaces, guild recruitment posts, matchmaking forums, support tickets, community Discord bridges, creator tools. In all these areas, personal data is processed and often made public.
The Russmedia logic can be relevant here on several levels:
- In-game marketplaces and item trading platforms: As soon as users can post ads or offers, the parallel to the marketplace is immediate. If profiles or offers contain sensitive data (e.g. “Looking for clan for therapy breaks”, “Looking for fellow players, am HIV-positive”, “LGBTQ-only clan”), Art. 9 relevance potentially arises – regardless of the assessment in the community context. The ruling means that such content must either be moved to protected, non-public areas or preliminary checks must be implemented.
- Forums, comment areas, group functions: the main problem here is scaling. The ECJ thinks in terms of “identifying sensitive content”. For forums, this can mean risk-based triggers (keywords, categories, reporting functions) and functional differentiation; a trigger sketch follows after this list. For example, a “health/support” area may not be publicly indexable by default, only be activated for verified accounts and have stricter posting rules. Such design decisions are not only community management, but also data protection risk reduction.
- UGC assets (mods, skins, maps, user portraits): as soon as UGC assets contain images, the risk of personality rights and data protection violations increases (third-party photos, deepfakes, biometric identifiability). The Art. 9 dimension can attach via biometric data for unique identification (facial recognition), at least where platform processes steer processing in that direction. In practice, the boundaries are complicated; however, the ruling raises the expectation that platforms design upload processes in a risk-appropriate manner.
- Cross-posting, partner syndication, “featured content”: Many platforms actively spread UGC, for example through “trending” pages, social media embeds, partner networks, in-app feeds. The ruling differentiates: If the platform actively transfers content to partners, this can be a separate processing chain for which it is solely responsible. This is legally and contractually relevant because this is precisely where the purpose and means are particularly clearly in the hands of the platform.
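For the forum scenario above, a “risk-based trigger” can start as simply as a category-mapped pattern list whose hits feed the review queue rather than blocking outright. A deliberately naive sketch (the patterns are illustrative and both under- and over-inclusive, which is exactly why they should route to human review, not automatic rejection):

```typescript
// Naive Art. 9 trigger: map keyword patterns to categories and flag for review.
// Real systems would combine this with classifiers and user reporting signals.

const ART9_TRIGGERS: Record<string, RegExp> = {
  health: /\b(depress(ed|ion)|hiv|cancer|therapy)\b/i,
  political: /\bmember of (the )?\w+ party\b/i,
  religious: /\b(muslim|christian|jewish|hindu|atheist)\b/i,
  sex_life: /\b(lgbtq?|sexual orientation)\b/i,
};

function flagArt9Categories(text: string): string[] {
  return Object.entries(ART9_TRIGGERS)
    .filter(([, pattern]) => pattern.test(text))
    .map(([category]) => category);
}

// Example: such flags could feed the hold/review queue of an upstream gate.
console.log(flagArt9Categories("Looking for fellow players, am HIV-positive"));
// -> ["health"]
```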
The result is not a blanket obligation to impose real-name identification across the board. However, for certain risk zones – especially for potentially sensitive data – platforms will in future have to explain why a certain level of verification, workflow gating or visibility limitation is not required. Heise and other voices read the ruling as a step towards a “cleannet” or as de facto pressure on anonymous use. Whether this socio-political assessment is convincing is a separate debate; in any case, the ruling forces a new architecture of responsibility in technology law. (heise online)
Contract and policy consequences: GTC, moderation, role allocation
For platform operators, Russmedia is less of a “legal opinion” and more of a restructuring mandate for governance and texts. Three areas are typically affected:
(1) Role allocation and documentation of joint controllership (Art. 26 GDPR)
The ECJ assumes that the marketplace and users are joint controllers for the publication process. This immediately raises the Art. 26 question: how are the “respective responsibilities” allocated transparently if, in high-volume business, the user is not an individually negotiable contractual partner? In practice, this boils down to standardized platform terms that map data protection roles, user obligations (in particular: no third-party data without a legal basis), duties to cooperate (e.g. proof of consent) and responsibilities for data subject rights. At the same time, the ECJ emphasizes that such an arrangement would be impossible if the user remains anonymous to the platform – this is the legal lever for identity gating in the case of sensitive content.
(2) UGC clauses on rights of use and syndication (“copy, distribute, transfer to partners”)
The GTC passage on extensive rights to UGC featured prominently in the facts of the case. This does not necessarily mean that such clauses are “prohibited”. But they must be embedded in a coherent data protection concept: if UGC can contain personal data, there must be clear purposes, transparency (Art. 13/14 GDPR), limitation to what is necessary, differentiated consent models (where really needed) and, above all, a product logic that counteracts users posting data about third parties without a legal basis. After Russmedia, the purely declaratory sentence “the user is responsible” will not suffice as a protective shield if the platform simultaneously pursues its own publication and dissemination purposes.
(3) Moderation and notice processes as “TOMs” (Art. 24, 25, 32 GDPR)
According to Russmedia, moderation processes are not just DSA obligations or community management, but can also constitute technical and organizational measures (TOMs) within the meaning of the GDPR. This increases the demand for verifiability: What checking mechanisms exist for sensitive data? Which triggers lead to “Hold” instead of “Publish”? How is identity checked when Art. 9 risks occur? How is it documented that a user has consent or that an exception applies? How are re-uploads recognized? How is the risk of copying reduced? These are governance questions that cannot be answered with a screenshot of a “Report” button in the event of a dispute.
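To make such decisions demonstrable in the sense of Art. 5 para. 2 GDPR, each gate outcome could be persisted as a structured, append-only audit record. A sketch (field names are invented; in production this would go to durable, access-controlled storage):

```typescript
// Sketch of an accountability record per moderation/gate decision (Art. 5(2)).
interface GateAuditRecord {
  contentId: string;
  decidedAt: string;            // ISO 8601 timestamp
  decision: "publish" | "hold" | "reject";
  art9Categories: string[];     // which triggers fired, if any
  identityCheck: "not_required" | "passed" | "failed";
  consentProofIds: string[];    // references only, never the proof documents themselves
  reviewerId?: string;          // set for manual decisions
}

function recordDecision(log: GateAuditRecord[], entry: GateAuditRecord): void {
  log.push(Object.freeze(entry)); // append-only by convention; persist in real systems
}
```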
In practice, a risk-based layered model will gain importance: open UGC areas with purely downstream notice-and-action for “normal” content; stricter pre-controls, account verification and visibility limits in areas where sensitive data typically occurs; and special processes for image content and profile elements, which are particularly prone to personality rights violations.
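The layered model lends itself to being made explicit as a per-zone policy that product, legal and moderation teams maintain together. An illustrative sketch (zones and values are invented):

```typescript
// Illustrative per-zone UGC policy for the layered model described above.
interface ZonePolicy {
  preGate: "none" | "art9_triggers" | "full_review"; // pre-publication control level
  requiresVerifiedAccount: boolean;
  publiclyIndexable: boolean;   // robots/noindex handling
  imageUploads: boolean;
}

const ZONE_POLICIES: Record<string, ZonePolicy> = {
  general_forum:   { preGate: "art9_triggers", requiresVerifiedAccount: false, publiclyIndexable: true,  imageUploads: true },
  health_support:  { preGate: "full_review",   requiresVerifiedAccount: true,  publiclyIndexable: false, imageUploads: false },
  marketplace_ads: { preGate: "art9_triggers", requiresVerifiedAccount: true,  publiclyIndexable: true,  imageUploads: true },
};
```

A declarative table of this kind has the side benefit that it can be versioned and produced as documentation if the proportionality of the chosen measures is ever challenged.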