Published on: 8 July 2024

Digital Services Act: online platforms and search engines

This is the third installment in our series on the European Data Strategy, covering the Digital Services Act (DSA).

The first part discussed the background of the DSA, the conditions for exemption from liability for online platforms, and enforcement. The second part covered the adjustments online platforms must make to their operations.

In this third part, we discuss the application of the DSA to Very Large Online Platforms and Very Large Online Search Engines. These rules have applied since 25 August 2023.

Very large online platforms and very large online search engines

The DSA includes an additional set of rules for Very Large Online Platforms and Very Large Online Search Engines in Section 5. These are online platforms and online search engines that have an average number of monthly active recipients of the service in the Union equal to or higher than 45 million and that are designated as very large online platforms or very large online search engines under Article 33(4) DSA. Because of the significant impact of their services, these providers have an additional obligation to assess the systemic risks associated with them.

These are the following systemic risks:

  (a) the distribution of illegal content through their services;
  (b) any actual or foreseeable negative impact on the exercise of fundamental rights, in particular the fundamental right to human dignity enshrined in Article 1 of the Charter, the fundamental right to respect for private and family life enshrined in Article 7 of the Charter, the fundamental right to the protection of personal data enshrined in Article 8 of the Charter, the fundamental right to freedom of expression and information, including freedom and pluralism of the media, enshrined in Article 11 of the Charter, the fundamental right to non-discrimination enshrined in Article 21 of the Charter, the fundamental right to respect for the rights of the child enshrined in Article 24 of the Charter, and the fundamental right to a high level of consumer protection enshrined in Article 38 of the Charter;
  (c) any actual or foreseeable negative effects on civic discourse and electoral processes, and on public security;
  (d) any actual or foreseeable negative effects with respect to gender-based violence, the protection of public health and minors, and serious negative effects on a person's physical and mental well-being.

The following factors should be considered in this regard:

  (a) the design of their recommendation systems and other relevant algorithmic systems;
  (b) their content moderation systems;
  (c) the applicable terms and conditions and their enforcement;
  (d) advertising selection and display systems;
  (e) the provider’s data-related practices.

The assessments shall also analyze whether and how risks are affected by intentional manipulation of their service, including through inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and widespread dissemination of illegal content and information incompatible with their terms and conditions.

The assessment shall also take into account regional or linguistic specificities, including those specific to a Member State.

Accountability

Providers of very large online platforms and of very large online search engines shall preserve the supporting documents of their risk assessments for at least three years after the assessments have been carried out and shall, upon request, communicate them to the Commission and to the Digital Services Coordinator of the place of establishment.

Operational adjustments and measures to be taken

Providers of very large online platforms and of very large online search engines shall take reasonable, proportionate and effective risk mitigation measures tailored to the specific systemic risks identified under Article 34 DSA, with particular attention to the effects of such measures on fundamental rights. Such measures shall include, where applicable:

  (a) adapting the design, features or operation of their services, including their online interfaces;
  (b) adapting their general terms and conditions and their enforcement;
  (c) adapting content moderation procedures, including the speed and quality of processing of reports related to specific types of illegal content and, where appropriate, the prompt removal of, or prompt blocking of access to, the reported content, in particular regarding illegal hate speech or cyber violence, as well as adapting all relevant decision-making procedures and specific content moderation tools;
  (d) testing and adapting their algorithmic systems, including their recommendation systems;
  (e) adapting their advertising systems and adopting targeted measures aimed at limiting or adapting the display of advertising related to the service they offer;
  (f) strengthening internal processes, resources, testing, documentation, or monitoring of their activities, in particular regarding the detection of systemic risks;
  (g) establishing or adapting cooperation with trusted flaggers in accordance with Article 22, and implementing the decisions of out-of-court dispute resolution bodies under Article 21;
  (h) establishing or adapting cooperation with other providers of online platforms or online search engines using the codes of conduct and crisis protocols referred to in Articles 45 and 48, respectively;
  (i) taking awareness-raising measures and adapting their online interface to provide more information to service users;
  (j) taking targeted measures to protect children’s rights, including age verification and parental control tools, or tools to help minors report abuse or receive support, as appropriate;
  (k) ensuring that an item of information, whether it constitutes generated or manipulated image, audio or video material that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful, is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing a user-friendly functionality that enables recipients of the service to flag such information.

Preparing for crisis action

A crisis is deemed to have occurred when extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it. In such a case, the Commission, acting upon a recommendation of the European Board for Digital Services, may adopt a decision requiring one or more providers of very large online platforms or very large online search engines to take one or more of the following actions:

  (a) assess whether, and if so to what extent and in what manner, the operation and use of their services contribute significantly, or are likely to contribute significantly, to the serious threat;
  (b) identify and implement specific, effective and proportionate measures to prevent, eliminate or mitigate such a contribution to the serious threat;
  (c) report to the Commission by a specified date, or at regular intervals specified in the decision, on the assessments referred to in point (a), on the precise content, implementation and qualitative and quantitative impact of the specific measures taken, and on any other matters relating to those assessments or measures, as specified in the decision.

In adopting and applying these measures, the service provider or providers shall take due account of the gravity of the serious threat, the urgency of the measures and the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter.

Supervision and reimbursement of supervision costs

Providers of very large online platforms and of very large online search engines shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with:

  (a) the obligations discussed above;
  (b) commitments undertaken under codes of conduct and crisis protocols.

In addition to the aforementioned requirements, providers of very large online platforms and of very large online search engines using recommendation systems shall offer at least one option for each of their recommendation systems that is not based on profiling as defined in Article 4(4) of Regulation (EU) 2016/679.

Providers of very large online platforms or of very large online search engines that display advertising on their online interfaces shall compile and make publicly available, in a specific section of their online interface, a register containing detailed information on the origin and content of the advertising. The register must be accessible through a searchable and reliable tool that allows multi-criteria queries and through application programming interfaces, and must cover the entire period during which they display advertising and up to one year after the advertising was last displayed on their online interface. Providers shall ensure that the register does not contain personal data of the recipients of the service to whom the advertising was or could have been shown, and shall make all reasonable efforts to ensure that the information is accurate and complete.

Providers of very large online platforms or of very large online search engines shall provide the Digital Services Coordinator of the place of establishment or the Commission, upon their reasoned request and within a reasonable period specified in that request, access to the data necessary to monitor and assess compliance with the DSA.

Providers of very large online platforms or of very large online search engines shall establish a compliance function that is independent of their operational functions and consists of one or more compliance officers, including the head of the compliance function. That compliance function shall have sufficient authority, standing and resources, as well as access to the management body of the provider of the very large online platform or search engine, to monitor that provider’s compliance with the DSA.

Providers of very large online platforms or very large online search engines shall publish transparency reports no later than two months after they are designated as a Very Large Online Platform or Very Large Online Search Engine, and at least every six months thereafter.

In addition to the transparency and dispute information referred to in Article 15 and Article 24(1) DSA, the reports published by providers of very large online platforms shall state the following:

  (a) the human resources deployed by the very large online platform provider for content moderation in relation to the service offered in the Union, broken down for each applicable official language of the Member States;
  (b) the qualifications and language skills of the persons performing the activities referred to in point (a), as well as the training and support of such personnel;
  (c) the accuracy indicators referred to in Article 15(1)(e) and related information, broken down for each official language of the Member States.

The European Commission shall charge providers of very large online platforms and of very large online search engines an annual supervisory fee upon their designation.

Conclusion

Very large online platforms and very large online search engines have the potential to exert great influence on the rights and freedoms of individuals, as well as on the functioning of the rule of law and the internal market. These very large providers should correspondingly contribute to curbing the negative impacts of their own services and to the costs of supervising and enforcing these obligations.

Questions:

Do you have any questions? Contact one of our lawyers by email or telephone, or fill in the contact form for a free initial consultation. We will be happy to think along with you.

