In its ruling of January 25, 2024 (case no. 16 U 65/22), the Higher Regional Court of Frankfurt am Main set new standards for the legal assessment of the responsibility of social media platform operators, significantly extending the obligation of Meta, the operator of Facebook, to delete illegal content. The decision, which upheld the judgment of the Regional Court of Frankfurt am Main (case no. 2-03 O 188/21), marks a significant development in case law and could shape the future regulation of social media platforms. It reflects a growing awareness of the need to protect digital communication spaces from illegal content and underlines the responsibility of platform operators in this context.
Case background and legal assessment
At the center of the case was a meme that showed a member of the Bundestag for Bündnis 90/Die Grünen alongside a fabricated quote she had never made. The meme was spread on Facebook and led to legal proceedings. The Higher Regional Court of Frankfurt am Main confirmed the view of the Regional Court of Frankfurt am Main that Meta, as the operator of Facebook, is obliged to delete not only identical content but also content with the same meaning. This obligation follows from the need to protect the plaintiff's general right of personality and her right to her own words, enshrined in Article 2(1) in conjunction with Article 1(1) of the Basic Law.
The decision of the Higher Regional Court of Frankfurt am Main rests on the principles of Störerhaftung (interferer liability), under which a platform operator can be held liable for illegal content once it becomes aware of it. This aspect is particularly relevant because it underscores the responsibility of platform operators such as Meta for the content published on their platforms. The court also refers to the E-Commerce Directive (Directive 2000/31/EC), in particular Article 14, which governs the liability of host providers for information stored at the request of users. This directive plays a central role in assessing the responsibility of online service providers.
In its reasoning, the court emphasized that Meta's knowledge of illegal content triggers an active duty to act. This duty is not limited to the removal of identical content but extends to content with the same meaning: Meta must also remove posts that match the original illegal post in message and meaning, even if they vary in wording or presentation. This interpretation of the deletion obligation considerably extends previous case law and underlines the importance of protecting personality rights in the digital space.
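To illustrate what a deletion obligation covering content "with the same meaning" might look like in technical terms, the following is a minimal Python sketch of embedding-based semantic matching. It is purely illustrative: the court does not prescribe any particular technique, and the model choice, threshold value, and example texts are assumptions for demonstration, not a description of Meta's actual systems.

```python
# Illustrative sketch only: model and threshold are assumed, not mandated
# by the court, and do not reflect any real platform implementation.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

# The post already adjudicated as illegal (here: a fabricated quote).
reference_post = "Fabricated quote falsely attributed to the politician."

# New posts that vary in wording but may carry the same message.
candidates = [
    "A reworded version of the same fabricated quote.",
    "An unrelated post about local traffic.",
]

SIMILARITY_THRESHOLD = 0.8  # hypothetical cut-off; tuning is a policy choice

reference_embedding = model.encode(reference_post, convert_to_tensor=True)
candidate_embeddings = model.encode(candidates, convert_to_tensor=True)

# Cosine similarity scores how closely each candidate tracks the meaning
# of the reference post, independent of its exact wording.
scores = util.cos_sim(reference_embedding, candidate_embeddings)[0]
for post, score in zip(candidates, scores):
    if float(score) >= SIMILARITY_THRESHOLD:
        print(f"Flag for review (similarity {float(score):.2f}): {post}")
```

The design point is that the comparison operates on meaning rather than exact wording, which is precisely the dimension the court's extended deletion obligation targets.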
Effects on Meta and social media regulation
This judgment could serve as a blueprint for future cases against Meta and other social media platforms. It shows that effective control and regulation of online content is possible without exceeding the limits of what can reasonably be expected of platform operators. The decision suggests that Meta and similar companies must act not only reactively but also proactively to identify and remove illegal content. The ruling makes clear that Meta bears an extended responsibility for the content published on its platform: it must not only respond to specific notices but also develop its own mechanisms for identifying illegal content. This could require the use of advanced AI technologies for pre-filtering combined with more intensive manual checking.
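As a rough illustration of how AI pre-filtering could be combined with manual checking, the following Python sketch routes only high-confidence matches to automated removal and sends borderline cases to human review. All names, thresholds, and the stub classifier are hypothetical; nothing here describes Meta's actual moderation stack.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ModerationDecision:
    action: str   # "remove", "human_review", or "publish"
    score: float  # classifier confidence that the post is illegal

def moderate(post: str,
             classifier: Callable[[str], float],
             remove_above: float = 0.95,
             review_above: float = 0.60) -> ModerationDecision:
    # Automated removal only for high-confidence matches; borderline
    # scores are routed to manual checking instead of being auto-blocked.
    score = classifier(post)
    if score >= remove_above:
        return ModerationDecision("remove", score)
    if score >= review_above:
        return ModerationDecision("human_review", score)
    return ModerationDecision("publish", score)

# Usage with a stub classifier standing in for a real model.
def stub_classifier(post: str) -> float:
    return 0.72 if "fabricated quote" in post else 0.05

print(moderate("Post repeating the fabricated quote.", stub_classifier))
# -> ModerationDecision(action='human_review', score=0.72)
```

The two-tier structure reflects the ruling's underlying concern: automation can scale the search for equivalent content, but doubtful cases still require human judgment.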
Future case law and conclusion
The ruling by the Higher Regional Court of Frankfurt am Main (case no. 16 U 65/22) represents significant progress in the legal assessment and regulation of internet content and social media platforms. It clearly defines the responsibility of platform operators such as Meta to act against hate speech and illegal content. The decision could be groundbreaking for future case law in this area and underlines the need for digital platform operators to develop effective and responsible methods for monitoring and controlling the content published on their platforms.
The ruling is particularly interesting because it comes at a time when Meta is drawing increasing criticism from legal practitioners, mainly over its intensive use of artificial intelligence (AI) to moderate and block content on its platforms. These AI systems have proven error-prone in some cases, which can lead both to unjustified blocking and to inadequate filtering of illegal content. The ruling of the Higher Regional Court of Frankfurt am Main can therefore be read as a critical signal to Meta and other platform operators to reconsider and improve their current practices. The decision makes clear that mere reliance on automated systems is not sufficient to comply with the legal obligations. The ruling is likely to further stimulate the discussion on the effectiveness and reliability of AI-based moderation systems and may lead to stricter regulation and monitoring of these technologies. For lawyers and other legal professionals, it offers an instructive insight into the evolving legal framework in the digital space and underlines the importance of balanced and effective regulation of online platforms.