Rechtsanwalt Marian Härtel - ITMediaLaw



Planned EU “chat control”: child protection versus data protection in messenger regulation

5. May 2025
in Law on the protection of minors
Key Facts
  • The EU draft regulation aims to combat child sexual abuse on the internet more effectively.
  • Child protection is at odds with data protection and fundamental rights.
  • Messenger services would be obliged to search private communications for abuse material.
  • Technical challenge: end-to-end encryption is jeopardized by scanning obligations.
  • Critics warn of mass surveillance and violations of privacy and freedom of expression.
  • The legislative initiative is stuck in tough negotiations; an agreement remains uncertain.
  • Companies should develop compliance strategies now and pay close attention to data protection.

In May 2022, the EU Commission presented a draft regulation (COM(2022) 209 final) to combat child sexual abuse on the internet more effectively. This project – referred to by critics as “chat control” – is a response to alarming figures: at least one in five children is a victim of sexual violence, and over a third of young people surveyed in 2021 said they had been asked to perform sexual acts online. The Commission sees an urgent need for action, as the digital spread of abusive images is increasing rapidly and the existing, purely voluntary measures by companies are not enough. The aim is to better protect children online – a goal that sits squarely in the area of tension between child protection, data protection and the fundamental rights of users.

Content
1. Content of the draft regulation: Scanning of private communications
2. Technical and legal feasibility
3. Fundamental rights and GDPR: Potential conflict of scanning obligations
4. Effects on end-to-end encryption
5. The door remains closed for chat control
6. Current status of the negotiations (2024/2025)
7. Obligations for companies: What would Messenger, Cloud & Co. have to do?
8. Controversy: Arguments from supporters and opponents
9. Checklist: Preparing for the EU chat control
10. FAQ on EU chat control

Content of the draft regulation: Scanning of private communications

The draft provides for far-reaching obligations for providers of digital services. Messenger and hosting services (including chats in apps or games) are to be obliged to search all private communications and files of their users for depictions of child abuse. The plan is not only to scan images and videos, but also to read text messages and even listen to audio communications in order to uncover known CSAM (Child Sexual Abuse Material) as well as new content and grooming attempts (sexual advances towards children). To this end, a competent authority would be able to issue so-called detection orders, forcing service providers to automatically scan all content on their platform.

In order to capture content in end-to-end encrypted communication, the draft provides for two technical options: either breaking the encryption (via server-side access) or client-side scanning (CSS) directly on the user’s end device. In plain language, client-side scanning means that software on the phone or computer checks messages and media before they are encrypted or after they have been decrypted. The EU Commission also proposes establishing a new EU center to support implementation. Among other things, this center is to provide testing technologies, maintain hash databases of known illegal content, and examine suspected cases reported by the services before forwarding them to the law enforcement authorities of the member states. The draft also contains requirements for the removal of reported content (deletion orders) and – most controversially – mandatory age checks: app stores might have to verify the age of users and deny certain age groups access to risky apps, which could amount to a de facto identification requirement.
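The hash-database matching the EU center would support can be sketched in a few lines. This is purely illustrative: real deployments use perceptual hashes (such as Microsoft’s PhotoDNA), which also match re-encoded or slightly altered copies, whereas the cryptographic SHA-256 used here only matches byte-identical files. The sample content and function names are hypothetical.

```python
import hashlib


def sha256_hex(data: bytes) -> str:
    """Hex digest of a file's contents (stand-in for a perceptual hash)."""
    return hashlib.sha256(data).hexdigest()


# Hypothetical sample of known illegal material, used only to seed the
# illustrative hash database a provider would receive from the EU center.
FLAGGED_SAMPLE = b"example flagged content"
KNOWN_HASHES = {sha256_hex(FLAGGED_SAMPLE)}


def matches_known_material(data: bytes) -> bool:
    """Return True if the item's hash appears in the known-material database."""
    return sha256_hex(data) in KNOWN_HASHES
```

The design point: matching against hashes means the provider never ships the illegal material itself, only its fingerprints – but it also means only *known* content is found, which is why the draft additionally envisages AI classifiers for new material and grooming.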

Technical and legal feasibility

The technical implementation of these measures is extremely demanding and controversial. End-to-end encryption (E2EE) in particular poses a challenge: If it is broken or bypassed for the scanning process, the security of all users suffers. Even if client-side scanning is used instead, the effect remains the same – private chats would no longer remain truly confidential. Critics point out that any weakening of encryption creates security loopholes that could potentially be exploited by criminals. Messenger services such as Signal have even announced that they will withdraw from the EU market if such an obligation comes into force. The recognition software is not technically infallible either: the planned AI and hashing technologies for content analysis currently have error rates of up to 12%. In a global service with billions of users like WhatsApp, hundreds of millions of uninvolved people could therefore be mistakenly targeted – with potentially serious consequences if their private photos or messages are mistakenly classified as illegal and reported to the authorities.
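The scale problem behind those error rates can be made concrete with a back-of-the-envelope base-rate calculation. All inputs below are illustrative assumptions, not figures from the draft – the message volume is roughly WhatsApp-scale, the prevalence is a guess, and the false-positive rate is set far *below* the “up to 12%” cited for current tools.

```python
# Illustrative base-rate calculation; every input is an assumption.
daily_messages = 100_000_000_000   # ~100 billion messages/day (WhatsApp scale)
prevalence = 1e-6                  # assume 1 in a million messages is illegal
false_positive_rate = 0.001        # optimistic 0.1% FPR, far below the
                                   # "up to 12%" cited for current tools

true_hits = daily_messages * prevalence                     # assume perfect recall
false_hits = daily_messages * (1 - prevalence) * false_positive_rate

print(f"correct reports per day: {true_hits:,.0f}")
print(f"false reports per day:   {false_hits:,.0f}")
# Under these assumptions, false reports outnumber correct ones ~1000:1.
```

Even with a scanner a hundred times better than today’s, the sheer volume means false reports would dominate – which is the core of the “hundreds of millions of uninvolved people” concern.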

It is also unclear whether companies can even develop powerful CSS scanners that only detect illegal content without invading the privacy of law-abiding users. A similar plan by Apple to scan iCloud photos for CSAM met with massive resistance in 2021 and was finally abandoned in 2022. Law enforcement and supervision are further issues: detection orders should only be issued by courts or independent authorities and the detection technologies used should be “state of the art” and as data-efficient as possible. However, according to the draft, data protection authorities would hardly be able to intervene – their role is limited to non-binding statements prior to the use of a new scanning technology. There are many doubts as to whether these built-in protection mechanisms are sufficient in practice.

Fundamental rights and GDPR: Potential conflict of scanning obligations

The planned measures conflict to a considerable extent with European fundamental rights. In particular, Art. 7 of the EU Charter of Fundamental Rights (respect for private and family life) and Art. 8 (protection of personal data) are considered to be at risk. According to critics, the indiscriminate and comprehensive monitoring of private communications – even in pursuit of the legitimate aim of protecting children – exceeds the limits of proportionality. The Federal Data Protection Commissioner considers the draft to be disproportionate and contrary to fundamental rights, as it effectively amounts to mass surveillance and would even cover protected communications such as those between lawyer and client or doctor and patient. German telecommunications secrecy (Art. 10 para. 1 GG) would also be violated.

From a data protection perspective, questions arise regarding compatibility with the GDPR. Although the regulation is intended to create a separate legal basis for data processing to combat abuse, the basic principles of Art. 5 GDPR – such as data minimization and purpose limitation – are thwarted if every chat conversation is preventively screened. Scanning personal content without concrete suspicion contradicts the previous prohibition on monitoring communication data without cause. Moreover, such full access to messages would inevitably involve the processing of special categories of personal data (Art. 9 GDPR) – such as health data, political opinions or intimate details of private life mentioned in chats. The processing of such sensitive information is, however, subject to strict conditions. There are strong doubts as to whether a general scanning obligation meets the standard of what is “strictly necessary”, as case law requires for interferences with fundamental rights.
In a joint opinion in 2022, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) expressed serious doubts about the draft’s compatibility with EU fundamental rights and data protection law. Overall, the legal conformity of the chat control is in question – it is foreseeable that adoption in this form would result in judicial reviews (up to the ECJ or Federal Constitutional Court).

Effects on end-to-end encryption

The future of end-to-end encryption is a key point of contention. This technology has so far guaranteed that only the sender and recipient can read the content of a message – neither platform operators nor authorities can see it. The chat control regulation would effectively undermine this principle. Although the Commission emphasizes that encryption should not be banned per se, any obligation to scan means that content must be available in plain text – either through client-side reading or through backdoors on servers. For users, this would shatter trust in the confidentiality of their communications. Many see this as opening the floodgates: today CSAM scanning, tomorrow possibly an expansion to terrorism, hate speech or other purposes – once a surveillance infrastructure has been created, it could in principle also be misused for mass surveillance. IT security researchers and even intelligence agencies warn of the consequences: the Dutch domestic intelligence service AIVD, for example, has cautioned against chat control and pointed out the security risks it would create. Companies that build on privacy are sounding the alarm: the messenger service Signal has announced that it will leave Europe if it is forced to weaken its encryption. Other services such as WhatsApp and Threema are taking a similarly critical stance. They argue that strong encryption is essential not only for privacy, but also for protection against criminals (such as hackers and data thieves) and for national security. The regulation presents them with a dilemma: either give up the integrity of their services – or risk legal action and fines for non-compliance. Accordingly, there is a heated public debate as to whether child protection is possible without weakening encryption, or whether a fundamental element of digital security is being sacrificed here.

The door remains closed for chat control

The EU Council has not yet been able to agree on a common line, which means that the proposal remains blocked for the time being.

Current status of the negotiations (2024/2025)

Despite all the calls for urgency, the legislative initiative is currently mired in tough negotiations. The European Parliament and the member states hold, in some cases, diametrically opposed positions. Following intensive discussions, the EU Parliament adopted amendments in November 2023 that would significantly soften the draft. In particular, MEPs call for targeted, occasion-based detection orders (i.e. only in cases of specific suspicion instead of across the board) and, for example, excluded audio messages from the scope of the regulation altogether. In plain language, Parliament positioned itself against indiscriminate mass surveillance and demanded that, at most, the unencrypted content of suspects be scanned. This parliamentary position is incompatible with the Commission’s original line (an obligation to scan across the board).

No agreement has yet been reached in the Council of the EU (the member states). Several planned votes have been postponed or canceled because no sufficient majority emerged. Germany in particular made it clear in 2024 that it would not approve the draft in its current form. Federal Minister of the Interior Nancy Faeser emphasized that chat control was “not compatible with the liberal constitutional state” and meant “nothing more than the mass scanning of private communications without cause”. Instead, the German government called for “targeted and constitutional” solutions to protect children. Germany thus acted as spokesperson for a blocking minority of countries that has so far stalled the proposal in the Council. This camp includes Austria, the Netherlands, Poland, Finland, Ireland, Luxembourg and several others that share considerable data protection concerns. On the opposite side is a majority of around 16 member states, led by Spain, France and Italy, which continue to insist on mandatory chat controls. These governments argue that a voluntary or merely occasion-based regulation would be an unacceptable step backwards and would miss the actual objective. Spain, for example, described the softening towards voluntariness as a “red line”, and Italy warned that services could otherwise “do whatever they want” – chat control would have to be mandatory and violations sanctioned.

The EU Council presidencies tried to find compromises in 2024/25: One idea under the Hungarian Presidency was to initially limit client-side scanning to images/video and links, while text and audio would be excluded. In addition, users of encrypted services would have to give their consent in advance – if they refused, they would no longer be allowed to send images or links. However, such models continued to meet with resistance because they also essentially require private chats to be screened. At the beginning of 2025, the Polish Council Presidency proposed making chat control voluntary rather than mandatory. However, this proposal was also vehemently rejected by 16 countries. So far, no general approach has been agreed in the Council – the necessary prerequisite for entering the final trialogue with Parliament. Observers expect that the negotiations could drag on into the second half of 2025 or even 2026. It is also possible that only a new federal government in Germany (after 2025) will adopt a different position that will break the deadlock – or that a fundamentally revised draft will be relaunched in view of the points of criticism. For the time being, however, the door is still closed for chat control.

Obligations for companies: What would Messenger, Cloud & Co. have to do?

If the regulation is adopted (in whatever form), tech companies – in particular messenger services, social network providers, cloud storage and even online games with a chat function – would have to prepare for extensive compliance obligations. The draft already provides for several levels of obligations:

  • Risk assessment and prevention: All service providers concerned would have to regularly analyze the risk of abuse of their platform. This would involve, for example, the question of the extent to which their own service could be misused for the dissemination of CSAM or for grooming (factors: User base, existing security functions, possibility of anonymous use, etc.). Identified risks should be addressed through mitigation measures – such as stronger content moderation, limiting direct messages from third parties or age verification where appropriate. The results of the risk analysis and the precautions taken should be reported to the authorities.
  • Discovery orders (detection orders): If the supervisory authority sees a high residual risk despite preventive measures, it could issue a detection order via a court order. This would oblige the company concerned to actively search for child pornographic material or grooming within a defined scope. Although such orders should be limited in time and specific in terms of content (e.g. restricted to known image hashes or certain chat conversations), in practice they could mean that a large proportion of data traffic would have to be screened. Companies would have to use suitable scanning technology for this – according to the draft, as “state of the art” as possible and data protection-friendly, with a low error rate. Indicators (hash databases, AI models) would be provided by the EU center so that not every company has to develop its own tools. Important: Encrypted services would not be exempt either – scanning would then have to be implemented via client app updates. Companies would face complex technical tasks in order to meet such requirements without having to completely restructure their IT security.
  • Reporting and removal requirements: If suspicious material is found during the scans, companies would be obliged to report it immediately. These reports would be sent centrally to the EU center, which would check them and forward them to the police. At the same time, hosting providers would have to delete or block the identified illegal material. If deletion is not possible (e.g. in the case of content on foreign servers), internet access providers could be obliged by order to block access to the content in question (similar to what is already the case with terrorist propaganda). The reporting obligation would primarily apply to messenger services, as private chats cannot be “removed” – but accounts used to spread abuse could be blocked.
  • Transparency and supervision: Services would have to publish reports on their measures. National coordination bodies would be set up to monitor compliance. These authorities would have the power to issue prohibitions or impose sanctions in the event of violations. It is conceivable, for example, that a messenger that does not implement an effective scanning method could be ordered to do so by a court or ultimately be blocked in the EU.
  • Penalties for infringements: The regulation is expected to contain a robust sanctions regime. Similar to the GDPR or the Digital Services Act, severe fines are envisaged – up to 4% of a company’s global annual turnover are being discussed, possibly with fixed maximum amounts in the millions. For tech companies such as Meta, Google or Apple, this could potentially mean billions in fines if they refuse to comply. In addition, persistent violations could lead to conditions or withdrawal of the service license. Companies should therefore not take the upcoming rules lightly.
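The reporting leg of these obligations can be pictured as a structured record handed to the EU center for human review. The draft prescribes no concrete technical schema, so the field names and the `submit_to_eu_centre` stub below are hypothetical; the point is that reports carry fingerprints and metadata, not the content itself.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class DetectionReport:
    """Hypothetical report record; the draft specifies no concrete format."""
    service: str           # reporting provider, e.g. a messenger
    content_hash: str      # hash of the flagged item, not the item itself
    detection_method: str  # e.g. "known-hash", "classifier", "grooming-pattern"
    account_ref: str       # pseudonymous reference to the account involved
    detected_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


def submit_to_eu_centre(report: DetectionReport) -> dict:
    # Stub only: under the draft, the EU center would vet reports first
    # and forward confirmed cases to national law enforcement.
    return {"status": "queued-for-human-review", "report": report}
```

Routing every report through the center before it reaches the police is one of the draft’s built-in safeguards against the false positives discussed above.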

It is obvious that smaller app developers and start-ups in particular would face major challenges here. However, the obligations are not just a burden; approached correctly, they can also become a competitive advantage. Those who invest in child protection compliance at an early stage can demonstrate trustworthiness to their users and business partners. For example, messenger services that develop innovative, data-protection-friendly solutions for detecting abuse could set themselves apart from less prepared competitors. The same applies to gaming platforms: a safe chat environment for children (with filtering and moderation functions) can convince parents and thus grow the user base. Watertight compliance – i.e. precise knowledge and implementation of the legal requirements – thus becomes a quality feature that can also be used for marketing purposes. Companies that cooperate proactively with the authorities and present transparent protection concepts might also be treated more leniently in the event of a detection order. To seize this opportunity, companies should review their internal guidelines and technical systems at an early stage and adapt them where necessary. In doing so, it makes sense to bring in expertise: especially at the intersection of data protection, IT security and criminal prosecution, specialist legal advice is recommended in order to remain compliant while still protecting the rights of users.

Controversy: Arguments from supporters and opponents

The debate on EU chat control is polarized. Supporters – above all child protection organizations, home affairs politicians and law enforcement officials – emphasize the urgency of action. Their central argument: no perpetrator should be able to hide behind encryption. In view of the abuse that takes place online on a daily basis, the state must use all available means to protect children. The industry’s voluntary reports to date have not been sufficient, and a legal obligation would send a clear signal that child protection takes priority over absolute confidentiality. Advocates point to the “success figures” of the scans: in 2020, for example, tens of millions of CSAM reports were generated across Europe by US services such as Facebook and Google, enabling thousands of investigations. This level must not be allowed to collapse – on the contrary, the “blind spots” created by encrypted messengers must be closed. Many acts of abuse today take place in secret; this can no longer be tolerated. Advocates of the regulation also insist that targeted filter technology would be used: the point is not to spy on every private detail, but only to find clearly illegal content using technical indicators. The privacy of innocent citizens would remain protected, as no human officials would read chats; automated systems would only search for hashes of known abuse images or typical patterns of grooming. Another pro argument: children also have rights, including to protection and physical integrity – and these should apply in the digital space just as much as the data protection rights of adults. From the supporters’ point of view, chat control is a necessary compromise to strike the right balance between two important legal interests (child protection and data protection). Some even argue that without the regulation, companies would voluntarily do less for fear of liability – the current level of detection must not be allowed to regress.
Finally, it is argued that the regulation contains sufficient legal safeguards (judicial order, transparency, possibility of legal remedies) to prevent abuse. The advocates’ leitmotif can be summarized as follows: “We must not close our eyes – effective controls save children from exploitation.”

In contrast, a broad camp of opponents has formed, ranging from data protection authorities and the IT industry to civil rights organizations and academics. They warn that chat control sets a dangerous precedent for mass surveillance in the digital age. Fundamental rights activists complain that such a far-reaching intrusion into the intimate communication of all citizens is disproportionate and violates the essence of privacy and freedom of expression online. Even a noble goal cannot justify the preventive surveillance of millions of innocent people – a practice more reminiscent of authoritarian regimes than of liberal democracies. The term “general suspicion” comes up frequently: every user is treated as if he or she could be a perpetrator. Data protection advocates emphasize that the current voluntary scanning practice is already legally controversial (the GDPR and the ePrivacy Directive in principle prohibit the reading of private messages). Making it mandatory by law would be a breach of taboo that is unlikely to stand up in court. IT security experts, for their part, see the danger that mandatory client-side scanning will weaken security on the internet as a whole. They ask: who can guarantee that the built-in “scanning backdoors” will not be exploited by hackers? Criminals already have a variety of ways to evade detection – e.g. by moving to darknet forums, encrypted niche apps or disguised communication channels. The truly hardened criminals could therefore slip away, while ordinary citizens would be monitored. False detections and misuse are further counter-arguments: the false positives mentioned above could draw innocent people into criminal investigations, with their data ending up in an EU register even though no offense has been committed. This could destroy livelihoods before the errors are cleared up.
There are also fears that authoritarian regimes could demand the chat control infrastructure for their own purposes – for example to spy on political dissidents (a real risk once the technology is established). The public and experts are also debating whether alternative approaches would be more efficient: for example, increased preventative intelligence, more funding for traditional investigations on the darknet, or mandatory reporting channels on the platforms without scanning all content. Opponents argue that resources would be better spent on combating organized abuse networks instead of building up a surveillance apparatus that is potentially easy to circumvent. Their overall conclusion is that the regulation overshoots the mark and jeopardizes fundamental freedoms – real child protection must be designed differently without sacrificing the principle of privacy. This view is shared by several EU member states and the EU Parliament, as outlined above.

In light of this intense debate, the outcome remains open. What is certain, however, is that tech companies should inform themselves and prepare early. The balance between child protection and data protection will remain a key issue for messenger regulation in Europe – and it is foreseeable that scanning obligations, or at least stricter requirements, will come in some form. A forward-looking compliance strategy that attends to both the legal and the ethical dimensions can help companies position themselves well in this area of tension.

Checklist: Preparing for the EU chat control

  • Monitor the legal situation: Stay informed about the progress of EU negotiations. Adjust your timetable whenever it becomes apparent that new obligations could come into effect.
  • Carry out a risk assessment now: Analyze the extent to which your service could be misused for the distribution of CSAM. Document youth protection and moderation measures that have already been implemented.
  • Strengthen technical protection measures: Check options for content scanning technologies (e.g. hashing of images, AI for grooming detection) that could be used in compliance with data protection regulations. Test these in small environments.
  • Data protection impact assessment (DPIA): Evaluate the impact of any scans on the data protection of your users. Make sure that, if it comes to that, you can point to lawful bases (from the regulation) and observe principles such as data minimization.
  • Age verification and child protection: If your service is used by minors, plan age verification or child protection modes. For example, younger users could be prevented by default from sending messages to strangers, or could receive media only to a limited extent.
  • Set up internal processes: Set up a clear process scheme in the event of official orders – from legal review and technical implementation to reporting to the EU center. Define responsibilities (e.g. data protection officer, security team, legal department).
  • Train employees: Sensitize employees in development and support for the upcoming regulations. Ensure that there is awareness of the topic (e.g. in dealing with user reports of misuse).
  • Obtain legal advice: Consult experts (IT lawyers, data protection officers) at an early stage if you are unsure. This will help you to avoid costly mistakes and, if necessary, to recognize how you can combine compliance with user trust.
  • Prepare communication: Develop a transparent communication plan for your users in case you need to scan in the future. Proactive and honest information can help to maintain trust and dispel misunderstandings (e.g. “is someone reading my chats now?”).
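The age-dependent defaults mentioned in the checklist could look like this in outline. The age thresholds and setting names are illustrative assumptions, not requirements from the draft; any real implementation would need to follow the regulation’s final text and national youth protection law.

```python
def default_settings(age: int) -> dict:
    """Illustrative child-safety defaults by age band; thresholds are assumptions."""
    minor = age < 18
    child = age < 14
    return {
        # Minors do not receive messages from strangers by default.
        "messages_from_strangers": not minor,
        # Minors receive media only from existing contacts.
        "media_from_contacts_only": minor,
        # Children are not listed in user search.
        "discoverable_in_search": not child,
        # Mirrors the draft's mandatory age-check ideas.
        "age_verification_required": minor,
    }
```

Shipping restrictive defaults that an adult can relax (rather than permissive defaults a parent must tighten) is the usual safety-by-design choice and maps onto the draft’s risk-mitigation duties.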

FAQ on EU chat control

What exactly is the “EU chat control”?
This is the colloquial name for the draft EU regulation to prevent child abuse online. The plan behind it is to oblige providers of online services (messengers, cloud storage, social networks, etc.) to automatically search for child pornography and cybergrooming in private messages. Chat control is therefore not a single tool, but an entire regulatory package that prescribes scanning technologies and reporting obligations – comparable to a digital scanner law for child protection.

Who would be affected by the new obligations?
In principle, almost all services that allow users to share content. These include messaging apps (WhatsApp, Signal, Threema, iMessage, Telegram, chats in online games), email services, forums and cloud platforms for sharing photos/videos. Even smaller app providers would be covered if their application offers communication functions. Hosting providers (such as Dropbox, Google Drive) would have to search for illegal files; communication services (messengers, chats) for suspicious messages or media. Pure telecommunications providers (Internet access, telephony) are not the primary target, but they could be targeted by web-blocking orders to block access to foreign abuse websites.

Should encrypted chats also be allowed to be scanned?
Yes – this is one of the most controversial points. The draft explicitly includes end-to-end encrypted services. Providers such as WhatsApp or Signal would have to find a technical solution to access the content for scanning despite encryption. In practice, this would only be possible via client-side scanning (the chat content is checked before encryption on the sender device or after decryption on the recipient device). The Commission claims that “encryption is not broken” because the message is still transmitted in encrypted form – but critics counter that it makes no difference where the state-ordered access takes place. In fact, E2EE messages would no longer just be a private matter for the sender and recipient.
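The order of operations behind client-side scanning, as described above, can be sketched in a few lines: the check runs on the plaintext on the device, before end-to-end encryption, so the transport remains encrypted while the content is nonetheless inspected. All names here (`scan`, `encrypt`, `send_message`) are hypothetical placeholders, not a real messenger API, and the XOR "encryption" is deliberately trivial and not secure.

```python
import hashlib

# Hypothetical blocklist, standing in for a real detection model or hash set.
BLOCKLIST = {hashlib.sha256(b"known-bad-payload").hexdigest()}

def scan(plaintext: bytes) -> bool:
    """Hypothetical detector: flag content whose hash is on the blocklist."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Stand-in for real E2E encryption (illustrative XOR, NOT secure)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def send_message(plaintext: bytes, key: bytes):
    # The crucial point: inspection happens on the device, pre-encryption.
    if scan(plaintext):
        return ("reported", None)
    return ("sent", encrypt(plaintext, key))

status, _ = send_message(b"hello", b"secretkey")
print(status)  # sent
status, _ = send_message(b"known-bad-payload", b"secretkey")
print(status)  # reported
```

The sketch makes the critics' point concrete: the ciphertext is still encrypted in transit, yet the plaintext was read before encryption – which is why they argue it makes no difference where the state-ordered access takes place.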

Doesn’t this violate data protection and fundamental rights?
This accusation is often made. In data protection terms there is a grey area: although the regulation would create a uniform legal basis throughout the EU (thus satisfying Art. 6 GDPR), Art. 8 of the Charter of Fundamental Rights also requires proportionality and respect for the essence of privacy. Many legal experts consider blanket, permanent monitoring of all communications incompatible with the EU Charter. For example, the ECJ has already struck down data retention laws that were far less intrusive than the planned chat scanning. The EU data protection authorities (EDPB/EDPS) stated clearly in 2022 that chat control in its proposed form raises serious fundamental rights concerns. It is therefore likely that legal action would be taken immediately upon adoption – with the possible consequence that parts of the regulation could be declared invalid or severely restricted. In short, the tension between chat control and fundamental rights is considerable, and a final assessment would probably fall to the highest courts.

When would the regulation come into force and how likely is it?
It is currently unclear (as of early 2025) whether and when the regulation will come into force. Originally, the Commission hoped for a conclusion by the end of 2024, but negotiations have stalled. As European elections were held in 2024, the process has been further delayed. Optimistic estimates: An agreement in the course of 2025, entry into force in 2026 after a transitional period. Pessimistic view: The project could also fail completely or be reopened in the next legislature if the resistance remains too great. For companies, however, this does not mean the all-clear – rather, the issue should be taken seriously and developments closely monitored. The political dynamics may change (e.g. new abuse scandals could increase the pressure). It is advisable to be prepared in case the rule does come into force at short notice.

How can companies prepare themselves now?
Companies should first stay informed (e.g. via industry associations, newsletters on EU digital law). It makes sense to take stock internally: What data flows are there? Where could misuse occur? Many companies already use compliance tools (e.g. Microsoft’s PhotoDNA to compare known CSAM images in cloud storage) – check whether such tools are available and compatible with your data protection policy. It may be useful to test a pilot implementation of a detection technology to gain experience. At the same time, you should not rashly break your privacy promises to users: Transparency is the be-all and end-all here. Therefore, prepare communication strategies to explain changes to users. And finally: seeking legal advice is a good idea. The subject matter ranges between criminal law, data protection and technology – specialized law firms or data protection experts can help to create an individual, legally compliant roadmap for implementation (or positioning in the debate). This will put you on the safe side, regardless of how the EU chat control is ultimately structured.

Conclusion: The planned EU chat control exemplifies the tension between child protection and data protection. It poses immense challenges for messenger and online providers – both legally and technically. It is still unclear what form the regulation will take. Companies should use the time to prepare. Those who do their compliance homework and at the same time respect the fundamental rights of users can master this challenge – and ideally help to ensure that both children are effectively protected and their privacy is respected.


Author: Marian Härtel

Marian Härtel is a lawyer and certified specialist in IT law with over 25 years of experience as an entrepreneur and consultant in the fields of games, esports, blockchain, SaaS and artificial intelligence. In addition to IT law, his advisory focus includes copyright, media law and competition law. He primarily advises start-ups, agencies and influencers, supporting them in strategic questions, complex contractual matters and investment projects. His advice is characterized by an interdisciplinary approach that combines legal expertise with many years of entrepreneurial experience. His aim is always to offer clients practice-oriented solutions and legally sound support in implementing innovative business models.
