What's behind the new rules for the European digital economy?

The Digital Services Act (DSA) is part of the EU's Digital Strategy. It is intended to modernize the cornerstones of the European digital single market set by the E-Commerce Directive of 2000. The EU aims to achieve more transparency and security, and to strengthen European values on the Internet and on online platforms.

The regulation published by the EU Commission in December 2020 is a draft, so changes are possible during the legislative process. On this page, we highlight individual provisions of the DSA and their practical impact on companies' digital business, and provide information on the ongoing legislative process and any resulting changes.

Digital Services Act in a Nutshell

Direct applicability

The DSA is a regulation. Once it enters into force, it will, unlike a directive, apply directly, without the need for national implementing legislation.

Application also to providers based outside the EU

For the rules to apply, it does not matter whether the provider is established in the EU; it is sufficient that the users have their place of business or residence in the EU.

Illegal Content

One purpose of the regulation is to combat illegal content more effectively. Illegal content includes all content that is illegal under EU law or the law of a member state.

Liability rules / Intermediary Liability

The fundamental liability rules and privileges for intermediary services from the E-Commerce Directive are maintained. This applies in particular to the existing principle that no general monitoring obligation may be imposed on the services. In addition, liability requires knowledge of the illegal content. The liability rules for the mere conduit of information (“mere conduit”), which apply primarily to telecommunications and access providers, as well as those for caching services (“caching”), have been adopted unchanged. The fundamental rules for hosting services, e.g. social media platforms or marketplaces, also remain unchanged. New, however, is the introduction of a “Good Samaritan” principle: voluntary measures taken by a service on its own initiative to detect and remove illegal content should not automatically lead to the loss of the liability privilege.

Graduated model of obligations

The focus of the regulation is on hosting services, e.g. cloud and web hosting providers that store information. Depending on the business model and company size, the regulation further differentiates:

  • Online platforms: Many of the specific obligations only apply to hosting services that also publicly disseminate the information; typical examples are social media platforms, online marketplaces, and app stores.
  • Very large online platforms (also referred to as “VLOPs”): This includes online platforms that reach more than 10 % of consumers in the EU (> 45 million) with their services on a monthly average. Comprehensive compliance and due diligence obligations apply to these hosting services.

Content Moderation

Content moderation means a platform's entire handling of illegal content: from detecting and combating it, to influencing its visibility (e.g. by “downgrading” content), to closing a user account over the distribution of illegal content. The regulation contains rules intended to make these processes transparent and verifiable.


Notice and Action

For some services, the regulation imposes a general obligation to implement a notice-and-action system for illegal content. In addition, the regulation specifies many procedural rules, e.g. the obligation to confirm receipt of a notice and to give reasons for the decision taken on it.

Data access

Very large online platforms can be obliged to grant access to data under certain conditions. This is intended to be possible, for example, for the purpose of assessing compliance with the regulation or for researching systemic risks.


Know Your Business Customer

For distance selling, the know-your-business-customer (“KYBC”) principle is introduced for online platforms. Platforms must require their merchants to provide extensive identification, in a verifiable manner, so that the merchant can be assessed in terms of domicile, business structure, and creditworthiness.

Transparency in behavioural advertising

For behavioural advertising placed online, online platforms must take comprehensive transparency measures. The provider must clearly label advertising as such, and it must be clear to the user who the advertiser is and on the basis of which parameters the specific advertisement is displayed to them.

Digital Services Coordinator

So-called “Digital Services Coordinators” are to be appointed within the competent national authorities of the member states. They are responsible for the implementation of and compliance with the regulation, and serve as contact points for the European institutions.

Trusted Flagger

The DSA creates rules to establish trusted and independent flaggers whose notices are to be given priority by the platforms within the notice-and-action system.


Sanctions

Similar to the GDPR, violations of the DSA can incur significant fines: up to 6% of annual turnover, as well as recurring periodic penalty payments of up to 5% of average daily turnover as a means of enforcement.

Take a look at our video (in German):

Our 7 experts for the Digital Services Act

Marwah Kamal

Dr. Anna Kellner

Christina Kirichenko, Senior Associate

Dr. Christoph Krück

Sandra Sophia Redeker

Johannes Schäufele

Corinna Schneiderbauer