German Digital Services Act (Digitale-Dienste-Gesetz, DDigG)
Basics and objectives
The German Digital Services Act (DDigG) came into force on May 14, 2024 as the national implementation of the European Digital Services Act (DSA). It creates a uniform legal framework for digital intermediary services in Germany and the European Union, with the aim of ensuring a secure and trustworthy online environment for consumers. The protection of fundamental rights, and consumer protection in particular, is at the heart of the legislation. The Federal Network Agency (Bundesnetzagentur) has been designated as the central coordinating body for digital services. The law replaces the previous Telemedia Act (TMG) and large parts of the Network Enforcement Act (NetzDG). Its rules apply to digital intermediary services that give users access to goods, services or content, and they affect large and small platform operators alike, with obligations graded by the size and reach of the service. The law thereby harmonizes the German rules with European law; enforcement is carried out through a multi-level supervisory system.
Scope of application and obligated parties
The DDigG covers all providers of digital intermediary services active on the German market, regardless of where the provider is established. Particularly strict rules apply to very large online platforms (VLOPs) with at least 45 million monthly active users in the EU; formal designation as a VLOP is carried out by the EU Commission. The rules cover online marketplaces, social networks and search engines, and cloud and hosting providers also fall within the scope of application. Smaller platforms must likewise fulfill basic content moderation obligations. The law distinguishes between different categories of service providers, with obligations graded according to the principle of proportionality: responsibility depends on the role in the digital ecosystem, and the requirements increase with the size and influence of the platform.
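The graduated tiers can be made concrete with a minimal sketch, assuming a simplified classification by service type and user count. The function and tier names below are hypothetical; only the 45-million threshold comes from the regulation:

```python
# Illustrative sketch only: function and tier names are hypothetical;
# the 45-million figure is the DSA's threshold for VLOP designation.

VLOP_THRESHOLD = 45_000_000  # monthly active users in the EU

def classify_service(service_type: str, monthly_active_users: int) -> str:
    """Assign a service to a simplified obligation tier."""
    if service_type == "online_platform" and monthly_active_users >= VLOP_THRESHOLD:
        # Formal VLOP designation is made by the EU Commission,
        # not automatically by the raw numbers alone.
        return "very_large_online_platform"
    # Other tiers pass through unchanged, e.g. "intermediary", "hosting".
    return service_type

print(classify_service("online_platform", 50_000_000))
# -> very_large_online_platform
```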
Duties and requirements
Platform operators must implement effective systems for detecting and removing illegal content. They must provide a transparent notice-and-action procedure for user reports (a simplified sketch follows below), and their content moderation decisions must be documented in a comprehensible manner. Very large platforms must carry out regular risk analyses and take countermeasures. The use of personal data for advertising is severely restricted; in particular, advertising targeted at minors and advertising based on sensitive personal data are prohibited. Online marketplaces must verify their traders before admitting them. The transparency of commercial advertising is significantly increased, and the rights of users in the event of account suspensions are strengthened. Cooperation with trusted flaggers is formalized, protective measures for minors are expanded, and the fight against disinformation is intensified.
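The notice-and-action duty lends itself to a short illustration. The following is a minimal sketch, assuming a platform that records each report and documents its decision with a statement of reasons; the class, field and function names are invented for this example and do not come from the statute:

```python
# Hypothetical notice-and-action record; all names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Notice:
    reporter: str       # who flagged the content
    content_id: str     # the item alleged to be illegal
    reason: str         # legal ground given by the reporter
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    decision: Optional[str] = None
    statement_of_reasons: Optional[str] = None  # must be given to the affected user

def decide(notice: Notice, is_illegal: bool) -> Notice:
    """Record the moderation decision so it can be reviewed later."""
    notice.decision = "removed" if is_illegal else "kept"
    notice.statement_of_reasons = (
        f"Content {notice.content_id} {notice.decision} after review of "
        f"report: {notice.reason}"
    )
    return notice

n = decide(Notice("user42", "post-123", "alleged defamation"), is_illegal=True)
print(n.decision, "-", n.statement_of_reasons)
```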
Supervision and enforcement
The Federal Network Agency acts as the central coordinating body for digital services and monitors compliance by small and medium-sized platforms, while the EU Commission is responsible for supervising very large online platforms. The Federal Agency for Child and Youth Media Protection (BzKJ) oversees the protection of minors, the Federal Commissioner for Data Protection and Freedom of Information monitors the advertising rules, and the Federal Criminal Police Office is responsible for content relevant under criminal law. The authorities can impose severe fines for violations: sanctions can amount to up to 6% of a provider's annual worldwide turnover (a worked example follows below). Cooperation between national and European authorities is being intensified, enforcement follows a risk-based approach, and the supervisory authorities have extensive powers of investigation.
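The 6% ceiling is easy to make concrete. A minimal sketch, assuming a hypothetical provider and an invented turnover figure:

```python
# Worked example of the sanction ceiling; the turnover figure is invented.
def max_fine(annual_worldwide_turnover_eur: float, cap: float = 0.06) -> float:
    """Upper bound of a fine: 6% of annual worldwide turnover."""
    return annual_worldwide_turnover_eur * cap

# A provider with EUR 2 billion in turnover faces a ceiling of EUR 120 million.
print(f"EUR {max_fine(2_000_000_000):,.0f}")  # -> EUR 120,000,000
```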
Rights of users
Consumers have comprehensive options to complain about platform decisions, and reporting illegal content is simplified through standardized procedures. Transparency regarding recommendation systems and content moderation is increased. Users have a right to a statement of reasons for deletions or account suspensions, and options for appealing platform decisions, including out-of-court dispute settlement, are strengthened. Protection against manipulative design techniques (“dark patterns”) is improved, and information rights in the case of commercial advertising are extended. User rights can also be enforced through class actions. Complaint procedures must be user-friendly, deadlines for platform responses are shortened, and enforcement of these rights becomes more effective overall.
Practical effects
The platforms must fundamentally adapt their systems and processes, and implementing the new obligations requires considerable technical investment. Content moderation is being professionalized and standardized, and transparency towards users and authorities increases significantly. The fight against illegal content becomes more effective, and legal certainty for platforms and users grows. Harmonization creates a level playing field, although compliance costs are a particular burden for small providers. Clear rules promote innovation in the digital sector, curb the market power of large platforms, and improve the quality of digital services.
Future prospects
The DDigG marks the beginning of a new era of platform regulation. Harmonization at the European level will continue to progress, and enforcement of the rules must still prove itself in practice. Technological development will require continuous adjustments, and international cooperation in enforcement will be intensified. As the importance of the platform economy continues to grow, the balance between innovation and regulation must be maintained, and the effectiveness of the measures will be evaluated regularly. User rights will be further strengthened, Europe’s digital sovereignty will be promoted, and the development of new business models will be supported by clear rules. The social significance of platform regulation is increasing.