SaveTheInternet

Digital Services Act & Digital Markets Act

What:
EU Regulations

Name:
Digital Services Act (DSA) and Digital Markets Act (DMA)

Alias:
Regulations and laws for digital markets and services

Targets:
  • Renewal of the outdated European e-Commerce Directive from 2000
  • Reducing the power of large online platforms and creating a uniform European regulation for platforms
  • Creation of a safe, trustworthy online environment in which fundamental rights are effectively protected
  • Standardization of the Digital Single Market in the EU: setting up 'traffic lights' in the chaotic 'online traffic network'

Current status

On 15 December 2020, the Commission published its first proposal for the two planned regulations. The European Parliament and the member states will now discuss this proposal as part of the ordinary legislative procedure, which is expected to take until around 2022 or 2023. Once the final text is adopted, it will be directly applicable throughout the European Union.

Dangers

In general, many good and reasonable ideas have been developed as to which EU-wide rules should apply to platforms and gatekeepers. Compared to the Copyright Directive and the TERREG regulation, these acts put users front and center and define their fundamental rights. Nevertheless: the devil is in the details. We are aware of the following risks:

The DSA does not resolve the conflict with Article 17 of the Copyright Directive: a platform can still be obliged to use upload filters because of Article 17, even though the DSA explicitly rules out any obligation to use upload filters.

This affects, for example, the non-commercial Wikipedia. Because of its size, Wikipedia would even count as a gatekeeper, for which there are additional, especially strict requirements that it could not afford to comply with.

Unfortunately, the DSA so far distinguishes only insufficiently between different types of illegal content. This creates obstacles for its practical implementation.

If allegedly illegal content is reported with a sufficiently precise justification, the platform automatically becomes liable if it does not remove the content immediately. This favors automated blocking, which is explicitly permitted; it merely has to be made transparent.

There are already quite a few cases of abuse in which notice and takedown was used to remove, for example, critical reporting or other unwanted content from Google search results. In addition, mandatory account suspensions are to be imposed in the event of repeated violations of the law (compare the three-strike rules on YouTube and Twitch). This is problematic when filtering systems make mistakes, but also when, for example, a platform suddenly receives many complaints about content that is several years old.

Quotes

"The draft for the Digital Services Act shows that the protests against upload filters have led to a real change of heart in Brussels. For the first time, the users of platforms (...) are considered as mature participants in a democratic discourse space." – Julia Reda, https://netzpolitik.org/2021/edit-policy-der-digital-services-act-steht-fuer-einen-sinneswandel-in-bruessel

Further Information

Digital Services Act (DSA)

The Digital Services Act defines how platforms are to be held responsible for the content they host.
They are required to remove illegal content and to involve their users in their decisions, for example by explaining why content has to be deleted. They also have to give users the possibility to object to such decisions.
 
Roughly speaking, platforms are obliged to act more decisively against illegal content and at the same time to protect the fundamental rights of their users. Many of the measures aim at improving transparency.
Particularly strict rules are to apply to operators of especially large platforms with at least 45 million monthly users in the EU (so-called "gatekeepers"); these rules are defined in more detail in the DMA. Illegal content in this sense includes, for example:
  • hate speech
  • terrorist content
  • discrimination
  • depiction of child sexual abuse
  • illegal distribution of private images
  • online stalking
  • sale of counterfeit products
  • copyright infringements

Planned rules include, among others:
  • Platforms must not be forced to implement upload filters
  • If upload filters are deployed, this must be communicated publicly, and the filters must not infringe freedom of speech
  • When platforms block content, they must explain their decision to users
  • Blocking decisions, including information about the underlying report, must be stored in a central database (excluding personal information). A similar database already exists on a voluntary basis in the U.S. and has made numerous abuses of the DMCA public, because journalists and researchers were able to identify patterns of false blocking
  • Platforms must inform their users why they are shown certain advertisements
  • Online marketplaces must verify the identity of all traders and disclose it to consumers. It must be clearly recognizable when a product is offered or sold by a third party

Digital Markets Act (DMA)

The Digital Markets Act defines competition rules for big platforms (so-called 'gatekeepers'). These rules apply from the outset and must not be violated, with the purpose of creating fairer competition in the EU. They include, among others:
  • Prohibition on pre-installing exclusively the gatekeeper's own applications on a device (e.g. Google Search)
  • Prohibition on urging other operating system developers or hardware manufacturers to pre-install solely the gatekeeper's own applications (keyword: interoperability)
  • Obligation to allow users to uninstall pre-installed applications
  • Ban on "self-preferencing", e.g. ranking the gatekeeper's own services/offers higher in its own search results
  • Prohibition on using generated metadata solely for the gatekeeper's own commercial benefit/activities; such data has to be made available to other commercial users
  • Users have to explicitly agree before their data is shared with other services of the same provider (e.g. Facebook, Instagram, WhatsApp)
  • Algorithms used to create consumer profiles have to be audited yearly, and the results have to be published
  • Big platforms have to inform competition authorities early about planned company mergers, company acquisitions, and technical partnerships
  • Obligation to maintain a publicly viewable database of all placed advertisements, including their reach and target audience
  • Obligation to conduct regular risk analyses, e.g. to prevent election manipulation or threats to public health on the platform
  • In some cases, obligations to give researchers access to data in order to study these risks
  • Obligation to let users deactivate all profiling (i.e. all personalized content recommendations)