Introduction
Nowadays, bots are just as ubiquitous as smartphones and social media. They take on different roles and are indispensable, especially on platforms like Telegram and Discord. Although they are often seen as harmless digital assistants, they harbor a wealth of legal challenges. In my previous articles I have dealt intensively with the legal situation of automation bots, especially in the environment of games like World of Warcraft. In the process, I have accompanied a number of court cases that have made it all the way to the Federal Constitutional Court. I have also written extensively on the legal aspects of artificial intelligence. What is still missing, however, is a close look at the legal issues that arise in the context of bots in chat programs. In Germany, a country with a complex legal system – from regulations in the German Civil Code (BGB) to the German Telemedia Act (TMG) – the question of liability for such bots often remains unclear. In this article, I would like to take a closer look at this gray area and provide users, developers and channel operators with a legal guide.
Legal issues
There are many possible uses for bots: they can generate texts, share content, provide automatic responses or even host games. In each of these cases, legal aspects must be considered: copyright infringements, data protection violations and other legal problems can quickly become a pitfall. German copyright law deserves special mention; pursuant to Section 97 of the German Copyright Act (UrhG), claims for injunctive relief and damages may become relevant. Data protection also plays a major role, with the provisions of the General Data Protection Regulation (GDPR) and the Federal Data Protection Act (BDSG) to be observed. The German Telemedia Act (TMG) is another important regulatory framework, as it sets out the liability rules for internet service providers. Particularly interesting, however, is the question of who is liable for bots used in chat programs such as Telegram and Discord or in other online communities. In these specific contexts, bots can not only be useful, but also harbor risks that go beyond the general legal issues.
Liability: user, channel operator or bot programmer?
Who exactly is responsible for the actions of a bot is a complicated question, and the answer has far-reaching implications. Under criminal law, Section 303a StGB (data alteration) and Section 303b StGB (computer sabotage) could apply to users who deploy bots for illegal activities. For channel operators, the situation is different again: according to Sections 7 and 8 of the German Telemedia Act (TMG), they could be liable as service providers, especially if they have knowledge of illegal activities and fail to act. Bot programmers, in turn, could in extreme cases be prosecuted under Section 202a of the German Criminal Code (spying on data) or Section 202c (preparing to spy on and intercept data) if the bot is programmed to intercept sensitive data. The question of who counts as the “operator” is particularly relevant when it comes to data protection and inappropriate content. For example, is it Discord itself, as the provider of the platform, the channel operator, who must comply with Discord’s TOS, or the bot programmer, who may cache data on their own systems or aggregate external data using AI? For data protection, this question is of crucial importance, as different regulations and obligations may apply to the various parties involved: the GDPR and the BDSG come into play here and may impose different obligations on the respective “operator”, depending on the case. It gets even more complicated when the bot is offered as software-as-a-service (SaaS). Here, the bot provider could cache data on its own servers, which could entail additional responsibilities, or aggregate data from external sources and output it to Discord. In such cases, the question of the bot programmer’s liability could come to the fore, especially if sensitive or personal data is involved.
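To make the data protection point concrete, here is a minimal, hypothetical sketch of how a SaaS bot provider might cache chat messages on its own servers while practicing data minimization: storing only a pseudonymous author ID and discarding entries after a retention period. The class, field names and the 24-hour retention policy are my assumptions for illustration, not a real bot framework.

```python
import time
from collections import deque
from typing import Optional

# Hypothetical sketch: caching messages on the bot provider's own servers is
# exactly the kind of processing that can give the provider obligations under
# the GDPR. This cache keeps only what it needs and purges old entries.

RETENTION_SECONDS = 24 * 60 * 60  # assumed 24-hour retention policy


class MessageCache:
    def __init__(self, retention: int = RETENTION_SECONDS):
        self.retention = retention
        self._entries = deque()  # (timestamp, pseudonymous author id, text)

    def add(self, author_id: str, text: str, now: Optional[float] = None) -> None:
        now = time.time() if now is None else now
        self._purge(now)
        # Store a pseudonymous author ID rather than a display name.
        self._entries.append((now, author_id, text))

    def _purge(self, now: float) -> None:
        # Drop everything older than the retention window.
        while self._entries and now - self._entries[0][0] > self.retention:
            self._entries.popleft()

    def recent(self, now: Optional[float] = None) -> list:
        self._purge(time.time() if now is None else now)
        return [text for _, _, text in self._entries]
```

A defined retention period like this does not settle the legal question of who the "operator" is, but it documents a deliberate decision about what the bot provider processes and for how long.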
Bots with payment functions
If a bot is used that enables payment transactions, the situation becomes considerably more complex. In Germany, the provision of financial services is strictly regulated: as a rule, anyone offering such services must hold a license pursuant to Section 32 of the German Banking Act (KWG). For a bot with payment functions, this provision could therefore become relevant. But it gets even more complicated: provisions of the Payment Services Supervision Act (Zahlungsdiensteaufsichtsgesetz, ZAG) could also apply. Both channel operators and bot programmers should therefore be aware that in such cases they face not only civil liability but also regulatory requirements. In addition to traditional payment options, blockchain technologies are becoming increasingly relevant, for example in the form of tokenization. Anyone who programs a bot that trades tokens or cryptocurrencies is operating in a legal gray area. Depending on the design, there could even be regulatory requirements for the issuance of tokens, for example under the German Securities Trading Act (WpHG) or the German Capital Investment Act (VermAnlG). This in turn may have implications for the liability of channel operators and bot programmers. Tokenization can in particular also be the subject of smart contracts; in this context, provisions such as Sections 705 et seq. BGB (civil-law partnership) may become relevant. The range of possible legal challenges is wide, and compliance with all relevant laws and regulations is therefore a complex task that requires expertise in several areas of law.
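One common design response to the licensing problem is that the bot never holds or forwards funds itself, but only hands the user over to a licensed payment service provider. The following sketch illustrates that pattern; the provider endpoint and query parameters are placeholders I invented for illustration, not a real PSP API, and whether a given setup actually avoids a Section 32 KWG license is a question for legal review, not for code.

```python
from urllib.parse import urlencode

# Hypothetical sketch: the bot only builds a checkout link that delegates the
# transaction to a licensed payment service provider. The URL and parameter
# names below are placeholders, not a real API.

PSP_CHECKOUT_URL = "https://pay.example.com/checkout"  # placeholder endpoint


def build_checkout_link(amount_cents: int, currency: str, reference: str) -> str:
    if amount_cents <= 0:
        raise ValueError("amount must be positive")
    query = urlencode({
        "amount": amount_cents,
        "currency": currency,
        "ref": reference,
    })
    # The bot never touches the funds; it only points the user to the PSP.
    return f"{PSP_CHECKOUT_URL}?{query}"
```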
Bots that output content
The distribution of content by bots can have a variety of legal implications, even for seemingly “simpler” bots. Copyright infringements could give rise to claims for injunctive relief and damages under Section 97 UrhG. The dissemination of fake news or inflammatory content can trigger obligations under the Network Enforcement Act (NetzDG) and, depending on the content, even criminal liability. Furthermore, according to Section 5 of the German Telemedia Act (TMG), there could be information obligations that the bot does not fulfill. In Germany’s complex legal landscape, it is essential that both bot programmers and channel operators are fully aware of the potential legal pitfalls, which can lurk even in seemingly innocuous functions. A bot that shares tweets or articles, for example, could quickly infringe copyrights if the content is not properly cited or licensed. This can have consequences not only under civil law, but also under criminal law. Furthermore, the automatic generation and distribution of content could violate personality rights, for example through the unauthorized publication of photos or personal information. Claims under Section 823 of the German Civil Code (liability in tort) could become relevant here. It is also possible that the AI behind such bots creates problematic content itself, which could give rise to a whole host of legal problems, from discrimination to defamation. It is therefore crucial for developers and operators to fully understand the functionality of their bots and to consider the possible legal consequences.
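The citation point above can be reflected directly in a bot's code: a reposting function can simply refuse to publish an item whose source attribution is incomplete. This is a minimal sketch under my own assumptions; the field names are invented for illustration, and proper attribution alone does not guarantee that a repost is licensed.

```python
# Hypothetical sketch: a bot that reposts external articles refuses to publish
# an item unless title, source URL and author are present, reducing (but not
# eliminating) the risk of copyright claims. The field names are assumptions.

def format_repost(item: dict) -> str:
    """Return a post string with attribution, or raise if attribution is missing."""
    missing = [k for k in ("title", "url", "author") if not item.get(k)]
    if missing:
        raise ValueError(f"refusing to repost, missing attribution fields: {missing}")
    return f'"{item["title"]}" by {item["author"]} ({item["url"]})'
```

The design choice here is deliberate: failing loudly on missing attribution turns a legal requirement into an enforced precondition rather than a convention the bot operator must remember.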
Bots that moderate content and block users
A bot that takes on the role of moderator raises a host of further legal issues. When is blocking or deleting content justified, and when is it censorship? In this context, the general right of personality derived from Article 2(1) in conjunction with Article 1(1) of the Basic Law could be affected. The question of accountability for incorrect or inappropriate decisions made by the bot also remains largely unresolved. The German Telemedia Act (TMG) and the General Data Protection Regulation (GDPR) could also play a role, especially with regard to automated individual decision-making pursuant to Article 22 GDPR. Contractual aspects should not be ignored either. Case law has increasingly drawn a clear line as to when the blocking of a user is to be regarded as an implied termination of a contract. Among other things, the general terms and conditions (GTC) of the respective service are relevant, but so are statutory provisions such as Section 314 of the German Civil Code (termination of continuing obligations for good cause). It is therefore not only a matter of immediate legal responsibility, but also a contractual matter with potentially long-term implications for all parties involved. In the event of a breach of the GTC by the bot, not only the bot operator but also the channel operator could be held liable. The combination of civil-law, data-protection and contract-law aspects makes the legal assessment of moderating bots a complex undertaking that requires careful consideration.
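The Article 22 GDPR concern about fully automated decisions suggests one concrete design pattern: the bot only flags messages, while the actual block requires confirmation by a named human moderator, which is recorded. The following is a minimal sketch under my own assumptions (the word list, field names and workflow are invented for illustration and are not legal advice on Article 22 compliance).

```python
from dataclasses import dataclass

# Hypothetical sketch: the bot never blocks a user fully automatically.
# A word-list match only flags the message; the block itself requires a
# named human reviewer, and that decision is recorded on the object.

BLOCKLIST = {"spamword"}  # placeholder word list


@dataclass
class ModerationDecision:
    user_id: str
    message: str
    flagged: bool
    blocked: bool = False
    reviewed_by: str = ""


def flag_message(user_id: str, message: str) -> ModerationDecision:
    hit = any(word in message.lower() for word in BLOCKLIST)
    return ModerationDecision(user_id=user_id, message=message, flagged=hit)


def confirm_block(decision: ModerationDecision, moderator: str) -> ModerationDecision:
    # Only a flagged message may be blocked, and only with a named reviewer.
    if not decision.flagged:
        raise ValueError("cannot block an unflagged message")
    decision.blocked = True
    decision.reviewed_by = moderator
    return decision
```

Keeping a record of who confirmed each block also helps with the contractual side discussed above, since a documented human decision is easier to defend than an opaque automated one.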
Conclusion
In conclusion, the legal challenges surrounding bots in chat programs and social platforms such as Telegram, Discord or Twitch are far-reaching and complex. We are moving in a gray area here that requires not only sound legal knowledge, but also a certain degree of technical understanding or, to put it casually, “nerddom”. The interfaces between civil law, criminal law, data protection and contract law are manifold and often unclear. Because several statutes are involved, such as the TMG, UrhG and StGB, and the KWG in the case of bots with payment functions, the legal situation is frequently opaque. In addition, there is not just one but several actors who could potentially be held liable: from the user to the channel operator to the bot programmer. Precisely because many legal issues have not been conclusively clarified, a pragmatic approach is needed to assess and minimize potential legal risks. In practice, this can mean thinking outside the legal box and engaging with the technical details of bot functionality. A purely academic or theoretical examination of the legal issues would hardly be sufficient in this dynamic and technically demanding field; both legal expertise and technical know-how are needed to untangle the complex and often interwoven issues. All parties involved – whether lawyers, developers or operators – should therefore be aware of the complexity of this issue and, if necessary, form interdisciplinary teams for the evaluation and implementation of legal requirements.