Brief overview: Upload filters are not an end in themselves but the consequence of a new liability regime for platforms. Art. 17 of Directive (EU) 2019/790 (DSM Directive) and the German Copyright Service Providers Act (UrhDaG) oblige certain service providers to prevent infringements at the upload stage – flanked by exceptions, safeguards against overblocking and complaints procedures. Fundamental rights (Art. 5 GG, Art. 11 CFR) and personality rights apply in parallel. Anyone hosting or providing content in 2025 needs clear processes, reliable technology and robust contractual clauses.
Legal framework: Art. 17 DSM Directive and UrhDaG – how they work, de minimis limits, “presumed permitted”
Target group and basic principle. The regime covers service providers within the meaning of the UrhDaG, i.e. platforms that store user-generated content and make it publicly accessible (see Section 2 (1) UrhDaG). The service provider is in principle responsible for the communication to the public (Section 1 (1) UrhDaG). In essence, it can escape liability only if (1) a license exists, (2) the use is permitted by law or (3) the statutory duties of care are complied with (Section 1 (2) UrhDaG). The EU-law background is Art. 17 of the DSM Directive, which establishes a special responsibility for online content-sharing service providers (OCSSPs).
Presumed permitted uses. To avoid disproportionate blocking, a bundle of presumptions, thresholds and procedural rights has been created (Part 4 UrhDaG). Section 9 UrhDaG stipulates that presumed permitted uses must remain communicated to the public until the conclusion of a complaint procedure. The rebuttable presumption applies if the upload
(1) contains less than half of a third-party work (or several works),
(2) is combined with other content and
(3) uses the third-party works only to a minor extent or has been marked as legally permitted (Section 9 (2) in conjunction with Sections 10, 11 UrhDaG).
De minimis limits. Section 10 UrhDaG qualifies certain micro-uses as “minor” – up to 15 seconds per cinematographic work or moving image, up to 15 seconds per audio track, up to 160 characters per text and up to 125 kilobytes per photograph, photographic work or graphic. These thresholds apply only to non-commercial uses or uses generating no more than insignificant income. This creates a standardized corridor that is technically verifiable and acts as a brake on overblocking.
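The thresholds lend themselves to a simple pre-check in moderation logic. The following is a minimal sketch under stated assumptions: the data structure, field names and the byte interpretation of “125 kilobytes” are illustrative choices, not prescribed by the statute, and a real system would still need the further Section 9 criteria.

```python
from dataclasses import dataclass

# Illustrative constants mapping the Section 10 UrhDaG limits.
MAX_VIDEO_SECONDS = 15
MAX_AUDIO_SECONDS = 15
MAX_TEXT_CHARS = 160
MAX_IMAGE_BYTES = 125 * 1000  # assumption: 1 kilobyte = 1000 bytes

@dataclass
class MatchedUse:
    """One matched third-party work inside an upload (hypothetical fields)."""
    work_type: str    # "video", "audio", "text" or "image"
    extent: float     # seconds, characters or bytes, depending on work_type
    commercial: bool  # commercial purpose or more than insignificant income?

def is_minor_use(use: MatchedUse) -> bool:
    """Rough check whether a single matched use stays within the
    Section 10 UrhDaG de minimis corridor (non-commercial uses only)."""
    if use.commercial:
        return False
    limits = {
        "video": MAX_VIDEO_SECONDS,
        "audio": MAX_AUDIO_SECONDS,
        "text": MAX_TEXT_CHARS,
        "image": MAX_IMAGE_BYTES,
    }
    limit = limits.get(use.work_type)
    return limit is not None and use.extent <= limit
```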
Marking as legally permitted. Where no de minimis use applies, a two-stage procedure kicks in: if an upload would be blocked automatically at the moment of uploading, the service provider must inform the user and allow the upload to be marked as legally permitted (Section 11 (1) UrhDaG). If blocking is requested only after the content has been uploaded, the service provider must inform the user immediately, and for 48 hours after that notification the content is treated as presumed permitted even without a marking (Section 11 (2) UrhDaG). The concept protects legitimate uses (e.g. quotation, parody, pastiche) against premature blocking.
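A sketch of how this two-stage logic could be reflected in platform state, under simplifying assumptions: the state names, the dictionary-based upload record and the helper functions are hypothetical and only illustrate the pre-upload prompt versus the 48-hour post-upload window.

```python
from datetime import datetime, timedelta

PRESUMPTION_WINDOW = timedelta(hours=48)  # Section 11 (2) UrhDaG

def handle_blocking_request(upload: dict, requested_at: datetime, pre_upload: bool) -> str:
    """Decide the provisional state of an upload after a blocking request.

    Illustrative state machine, not a restatement of the statute:
    'prompt_marking'     -> inform the user, offer marking as legally permitted
    'presumed_permitted' -> keep content online pending marking / complaint
    """
    if pre_upload:
        # Section 11 (1): inform the user and allow marking before blocking.
        return "prompt_marking"
    # Section 11 (2): post-upload request – inform the user immediately and
    # treat the content as presumed permitted for 48 hours even without marking.
    upload["presumed_permitted_until"] = requested_at + PRESUMPTION_WINDOW
    return "presumed_permitted"

def presumption_expired(upload: dict, now: datetime) -> bool:
    """True once the 48-hour window has lapsed without marking or clearance."""
    until = upload.get("presumed_permitted_until")
    return until is not None and now > until
```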
Complaints and responsibility. Service providers must offer an effective, free and swift complaints procedure concerning the blocking and the communication to the public of content (Section 14 UrhDaG). Until a decision is reached, the service provider is not responsible under copyright law for the communication to the public of presumed permitted uses (Section 12 (2) UrhDaG); for minor uses, the user is likewise provisionally not responsible (Section 12 (3) UrhDaG). The legislator thus combines preventive filtering obligations with procedural safeguards.
Copyright limitations. Traditional statutory permissions remain applicable, in particular Section 51 UrhG (quotation) and Section 51a UrhG (caricature, parody, pastiche). These provisions often form the substantive basis for marking an upload as legally permitted in accordance with Section 11 UrhDaG.
Start-up and small service provider privileges. Section 2 UrhDaG distinguishes between start-up service providers (EU turnover ≤ EUR 10 million, service available for less than 3 years) and small service providers (turnover ≤ EUR 1 million). Section 7 UrhDaG provides relief from upload filter obligations for these groups, but does not release them from other obligations (e.g. license acquisition, procedures). Anyone integrating platform functions into a product should check early on whether the thresholds are exceeded.
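For ongoing monitoring, the turnover and age thresholds can be tracked automatically. A minimal sketch, assuming hypothetical metric names; it deliberately ignores further criteria (such as visitor numbers, which matter for the scope of the start-up relief) and is a monitoring aid, not a legal assessment.

```python
from dataclasses import dataclass

@dataclass
class ProviderMetrics:
    """Illustrative compliance inputs (field names are assumptions)."""
    eu_annual_turnover_eur: float
    service_age_years: float

def classify_provider(m: ProviderMetrics) -> str:
    """Rough classification along the Section 2 UrhDaG thresholds."""
    if m.eu_annual_turnover_eur <= 1_000_000:
        return "small_service_provider"    # turnover <= EUR 1 million
    if m.eu_annual_turnover_eur <= 10_000_000 and m.service_age_years < 3:
        return "startup_service_provider"  # turnover <= EUR 10 million, service < 3 years
    return "regular_service_provider"
```

A scheduled job could re-run this classification and alert the compliance team when the status changes, so the filter and licensing obligations are adjusted in time.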
Fundamental rights and overblocking: freedom of expression, communication and art in balance
Fundamental rights framework. Upload filters touch core communication rights. At the national level, Art. 5 para. 1 GG (freedom of expression and information) and Art. 5 para. 3 GG (artistic freedom) provide protection; under EU law, Art. 11 CFR is central. The Court of Justice of the European Union has expressly confirmed that Art. 17 of the DSM Directive is compatible with fundamental rights, while at the same time emphasizing the importance of safeguards against overblocking (C-401/19, Poland v Parliament and Council). The German implementing legislation takes this into account through Sections 9-12 UrhDaG.
Proportionality through procedure. The legislator does not rely on unrestricted filtering but requires a structured balancing exercise through ex ante rules: threshold values (de minimis limits), the “presumed permitted” presumption, marking options, and swift complaints with a substantive reassessment. This combination is intended to mitigate erroneous decisions by automated systems while safeguarding the legitimate interests of rights holders.
“Presumed permitted” is not a blanket amnesty. The presumption protects legitimate use until the matter is clarified; it is rebutted if no statutory permission actually applies. Rights holders retain claims for injunctive relief and removal, and the platform’s provisional freedom from responsibility under Section 12 (2) UrhDaG ends with the decision in the complaints procedure. The system thus forces all parties involved – platform, uploader, rights holder – to provide verifiable arguments.
Technical diligence. The “best efforts” obligations (Art. 17 para. 4 DSM Directive) require appropriate measures to prevent unlicensed uses, which makes detection systems de facto unavoidable. What matters is that their detection and error rates are kept under control: a false positive (overblocking) can violate fundamental rights, while a false negative (underblocking) harms the interests of rights holders. Documented parameters, regular re-tuning and human second-level review are therefore an integral part of proportionality.
Personality rights dimension and DSA interfaces: moderation beyond copyright law
Personality rights online. Upload filters primarily address copyright risks. Nevertheless, the civil-law protection of personality rights (general right of personality, Art. 2 para. 1 in conjunction with Art. 1 para. 1 GG, Section 823 BGB) and the right to one’s own image (Sections 22 et seq. KUG) play an important role. Platforms’ due diligence obligations must be designed so that unlawful interferences (e.g. defamation, deepfakes, distortions) are addressed effectively without suppressing permissible criticism, satire or artistic adaptation.
DSA obligations (platform regulation). The Digital Services Act supplements the copyright regime not in terms of liability but procedurally: notice systems, complaint channels, transparency obligations and protection of minors are mandatory. For very large platforms, risk assessments, audits and transparency reports come on top. In practical terms, notice-and-action for content beyond copyright (e.g. infringements of personality rights) must run coherently alongside the UrhDaG workflows – ideally via standardized intake processes with issue-specific routing.
Limitations as a bridge. In borderline cases, quotation (Section 51 UrhG) and parody/pastiche (Section 51a UrhG) mediate between personality and copyright interests. A satirical meme upload can be permissible under copyright law and still violate personality rights, for example where it intrudes on privacy. Moderation guidelines should therefore run a two-stage check: (1) admissibility under copyright law, (2) other legal interests (personality rights, competition law, criminal law, protection of minors).
Evidence and documentation. Documentation is what counts for legal enforcement: notifications, reasons for blocking or unblocking, review steps, human reviews, training and threshold changes. These records matter for internal audits, arbitration board proceedings and litigation.
Implementation 2025: governance, technology, contracts – a practical roadmap
A. Governance & Responsibilities
- Roles and escalations: Clearly assign responsibility for licenses, filter parameters, legal review, complaint handling and reporting. Document deputy arrangements.
- Policies: upload guidelines, permitted content, dealing with remix/parody/quotation, “marking as permitted” (Section 11 UrhDaG), time window (48 h), “presumed permitted” (Section 9 UrhDaG), complaints procedure (Section 14 UrhDaG).
- Protection of minors: DSA protection requirements – age-appropriate default settings, risk mitigation, reporting.
- Start-up status: Monitor turnover, service age and visitor numbers; switch the compliance level automatically when a threshold is exceeded (see §§ 2, 7 UrhDaG).
B. Technology & processes
- Recognition systems: Version the models, test against gold datasets (balanced across music, video, text, images), measure precision/recall and the false positive rate per work class (a minimal evaluation sketch follows after this list).
- Threshold control: Define score thresholds so that § 10 cases are not blocked; route low-confidence matches to human review queues.
- Notification & UI: User-friendly notices when blocking is imminent (with reference to Section 11 UrhDaG) and for post-upload matches, including notice of the 48-hour window. One-click marking for § 51 / § 51a UrhG; option to upload licenses or consents.
- Complaints procedure: Deadline chain, qualified justification, mandatory notifications to rights holders (Section 14 UrhDaG), decision within one week as an internal target, escalation to the legal department.
- Data and IT security: hash databases, fingerprints, evidence storage with integrity protection; logging in accordance with Privacy by Design (Art. 25 GDPR).
- Transparency reports: Key figures on blocking/unblocking, average processing times, complaint outcomes – DSA-compatible reporting.
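As referenced above for the recognition systems, the quality metrics can be computed from a labeled gold dataset. A minimal sketch under assumptions: the sample format and label names are hypothetical; the point is to keep precision, recall and the false positive rate (the overblocking indicator) visible per work class.

```python
from collections import defaultdict

def evaluate_matches(samples: list[dict]) -> dict:
    """Compute precision, recall and false positive rate per work class.

    Each sample is an illustrative dict such as
      {"work_class": "music", "predicted_block": True, "should_block": False}
    where "should_block" is the manually curated gold label.
    """
    counts = defaultdict(lambda: {"tp": 0, "fp": 0, "fn": 0, "tn": 0})
    for s in samples:
        c = counts[s["work_class"]]
        if s["predicted_block"] and s["should_block"]:
            c["tp"] += 1
        elif s["predicted_block"] and not s["should_block"]:
            c["fp"] += 1  # overblocking case
        elif not s["predicted_block"] and s["should_block"]:
            c["fn"] += 1  # underblocking case
        else:
            c["tn"] += 1

    report = {}
    for work_class, c in counts.items():
        precision = c["tp"] / (c["tp"] + c["fp"]) if (c["tp"] + c["fp"]) else 0.0
        recall = c["tp"] / (c["tp"] + c["fn"]) if (c["tp"] + c["fn"]) else 0.0
        fpr = c["fp"] / (c["fp"] + c["tn"]) if (c["fp"] + c["tn"]) else 0.0
        report[work_class] = {
            "precision": precision,
            "recall": recall,
            "false_positive_rate": fpr,
        }
    return report
```

The per-class output also feeds the transparency reporting and the quarterly parameter reviews mentioned below.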
C. Contractual framework & chain of rights
- Licenses: Rights clearance with collecting societies/producers; scope (territory, media, adaptations); observe direct remuneration (Section 12 (1) UrhDaG).
- Uploader T&Cs: Warranties of rights ownership, the obligation to mark content correctly (quotation/parody), cooperation in clarifying disputes, indemnification and recourse in the event of abusive marking.
- Rights holder workflow: Standardized notice templates (work identification, rights chain, license status), rate limit against spam notices, clear escalation to the Section 14 procedure.
- Service provider contracts (filter provider/SaaS): Service levels (hit rates, response times), audit and explainability clauses, data protection and confidentiality, exit rights (model/data portability).
- Evidence preservation: Log retention, timestamps, signatures; legal hold in contentious proceedings (a hash-chain sketch follows below).
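One way to make moderation logs tamper-evident is hash chaining, sketched below under assumptions: function and field names are illustrative, and a production setup would add qualified timestamps, signatures or WORM storage rather than rely on the chain alone.

```python
import hashlib
import json
from datetime import datetime, timezone

def append_evidence(log: list[dict], event: dict) -> dict:
    """Append a moderation event to a hash-chained evidence log.

    Each entry stores a UTC timestamp and the hash of the previous entry,
    so later tampering with individual records becomes detectable.
    """
    prev_hash = log[-1]["entry_hash"] if log else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event": event,  # e.g. block, unblock, complaint decision
        "prev_hash": prev_hash,
    }
    payload = json.dumps(entry, sort_keys=True).encode("utf-8")
    entry["entry_hash"] = hashlib.sha256(payload).hexdigest()
    log.append(entry)
    return entry

def verify_chain(log: list[dict]) -> bool:
    """Recompute every hash and check that the chain links are intact."""
    prev_hash = "0" * 64
    for entry in log:
        if entry["prev_hash"] != prev_hash:
            return False
        payload = {k: v for k, v in entry.items() if k != "entry_hash"}
        digest = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode("utf-8")
        ).hexdigest()
        if digest != entry["entry_hash"]:
            return False
        prev_hash = entry["entry_hash"]
    return True
```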
D. Product and community design
- Remix-friendly defaults: Templates and training modules on § 51/§ 51a UrhG; dispel “fair use” myths (fair use is not part of German law).
- “Narrow Block – Wide Review”: Hard block only for high-confidence matches that exceed the § 10 thresholds; route everything else to manual review (see the routing sketch after this list).
- UI for fundamental rights: Visible legal bases for decisions (quotation, parody, pastiche, license), short justification texts and an objection option.
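The “Narrow Block – Wide Review” principle named above can be expressed as a small routing function. A sketch under assumptions: the 0.95 / 0.60 thresholds are placeholders to be tuned against the gold dataset, and the state names are illustrative, not values taken from the law.

```python
def route_upload(match_score: float, minor_use: bool, marked_permitted: bool) -> str:
    """Illustrative 'Narrow Block - Wide Review' routing.

    match_score       confidence of the recognition system (0.0 - 1.0)
    minor_use         result of the Section 10 UrhDaG de minimis check
    marked_permitted  user marked the upload as legally permitted (Section 11)
    """
    HARD_BLOCK_THRESHOLD = 0.95
    REVIEW_THRESHOLD = 0.60

    if minor_use or marked_permitted:
        # Presumed permitted uses stay online until the complaint procedure ends.
        return "publish_presumed_permitted"
    if match_score >= HARD_BLOCK_THRESHOLD:
        return "block_and_notify"    # narrow block: only high-confidence matches
    if match_score >= REVIEW_THRESHOLD:
        return "human_review_queue"  # wide review: borderline cases go to people
    return "publish"
```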
E. Inspection and audit program
- Quarterly parameter reviews with A/B comparison; documentation for supervisory authorities, courts, arbitration boards.
- Bias checks (e.g. for systematic over-blocking of certain genres or languages).
- Stress tests before major events (sport, festivals, releases).
F. Typical mistakes – and how to avoid them
- Hard blocking below § 10 thresholds → Check threshold logic, expand test data.
- No 48-hour window for post-upload blocking requests → map Section 11 (2) UrhDaG in the system logic.
- Missing or sluggish complaints procedure → put the Section 14 UrhDaG obligations on a solid operational footing (deadlines, resources).
- Purely technical assessment → include legal review layers for borderline cases (quotation/artistic freedom).
- DSA obligations viewed in isolation → standardized workflow with legal routing for non-copyright infringements.
Conclusion: Upload filters are legally required in 2025 – but only as part of a balanced system of licenses, limitations, presumptions and effective legal remedies. Those who implement obligations and rights in an integrated manner reduce liability and reputational risks, protect fundamental rights and create robust procedures for disputes.
On a related note:
In its ruling of July 17, 2025 (case no. I ZB 82/24), the Federal Court of Justice clarified that cloud services do not owe a copyright levy. The private copying remuneration scheme under Sections 54 et seq. UrhG is tied to devices and physical storage media; the Senate rejects an analogous application to pure cloud storage for lack of an unintended regulatory gap. The Karlsruhe judges note that any shift from local storage to the cloud may have to be addressed by the legislator; a judicial extension of the levy’s scope is out of the question. In practice this means that remuneration obligations (private copying) remain strictly separate from the liability and due diligence obligations under Art. 17 DSM Directive/UrhDaG. Pure cloud storage is typically not an OCSSP (no communication to the public), whereas content-sharing platforms must ensure upload filter compliance independently of private copying remuneration.