
The European Parliament has voted in favour of the draft Digital Services Act (DSA). The European Commission's package includes measures to combat illegal products, services and content online, including clearly defined procedures for their removal. Recipients of services will have the right to claim damages. Unveiled in December 2020, the DSA aims to curb excesses on the internet, including hate speech and other criminal acts.
Online platforms must also offer more options for tracking-free advertising, and the use of minors' data for targeted advertising will be banned. Each country will have its own digital services regulator, which will oversee risk assessments and enforce greater transparency around algorithms in order to combat harmful content and disinformation.
The text was adopted by Parliament on 20 January with 530 votes in favour, 78 against and 80 abstentions. Following a series of amendments, it now deviates from the European Commission's original proposal. The text will serve as Parliament's negotiating mandate in talks with the Council, which represents the member states and is currently chaired by the French Presidency. The member states reached an accord on the proposed Digital Markets Act (DMA) and the DSA in November 2021.
Successor to eCommerce Directive
The DSA is the successor to the eCommerce Directive from 2000. Online services have since become indispensable for almost all offline services. The DSA proposal defines clear responsibilities and accountability for providers of intermediary services, especially online platforms such as social media and marketplaces.
Platforms remain exempt from direct liability, but must act faster and more thoroughly against illegal products and services and monitor their business users more closely. The European Parliament has also adopted additional measures to ensure that platforms handle complaints and related procedures with due care.
Additional obligations for very large platforms
Very large online platforms (VLOPs) will be subject to specific obligations due to the special risks posed by the dissemination of both illegal and harmful content. This has generated much discussion, because not all harmful content is illegal, and not all illegal content is harmful.
The DSA should help tackle harmful content (which may not be illegal) and the spread of disinformation through provisions on mandatory risk assessments, mitigation measures, independent audits and the transparency of so-called recommender systems, the algorithms that determine what content users see.