March 03, 2023

Countdown to the Digital Services Act


1.  Executive summary

  • The new European Digital Services Act (DSA) imposes a number of new and varied obligations on online platforms and search engines. Many providers of online services must implement significant operational changes in light of these rules. These include, in particular for providers of very large online platforms and very large online search engines, the obligations to (i) diligently identify, analyse and assess any systemic risks in the European Union stemming from the design and functioning of their services (risk assessment) and (ii) put in place mitigation measures that comply with fundamental rights.
  • Going forward, careful consideration will need to be given to platform design, especially in respect of how advertisements are shown and traders and consumers are dealt with.
  • Providers of online services would be well advised to carry out a risk assessment, and in the light of this, adopt a streamlined set of processes that allow for continuous compliance, notably with obligations such as transparency reporting and independent audits.
  • Given the layered and proportionate design of the DSA, specific advice should be sought in each case. Since non-compliance may lead to fines of up to 6% of annual worldwide turnover, all online actors should consider now the extent to which their services fall within the scope of the DSA and other digital rules, and what actions are needed to avoid penalties.

2.  Introduction

This month, we are beginning to see the practical impact of some of the EU's new rules on regulating digital markets, with the first significant deadline in the DSA for the biggest online platforms and search engines having passed on 17 February 2023.

Several large digital service providers have now publicly stated that they should be classed as the biggest players under the new rules ("very large online platforms" - VLOPs - or "very large online search engines" - VLOSEs). This means that they could face Commission action under the DSA from September 2023. Others have a little longer - though now less than a year - to comply with the relevant DSA rules.

In the meantime, the Commission has opened a consultation on its DSA enforcement procedures.  This covers a draft measure laying out its investigatory and enforcement powers, platforms' right of reply to enforcement actions, and their right to access Commission files in disputes.  It is open until mid-March 2023. The decisions made over practical implementation are likely to be of paramount importance.1

3.  Overview of the new regime

The new rules governing digital markets in the EU are contained primarily in the Digital Markets Act (DMA) and its sister, the Digital Services Act (DSA). Together, these aim to make the digital sphere a fair, innovative and safe playing field for both businesses and consumers. In particular, the DSA seeks to protect the digital space against the spread of illegal content including counterfeit goods, and to ensure the protection of users’ fundamental rights such as privacy and freedom of speech, putting an end to the "wild west which dominates our digital space". It will do this by introducing standardised rules for digital content focusing on moderation and use of content including advertising. If online businesses do not comply with the DSA on their own initiative, they may face a "take down order" – see more below.

To this end, the DSA:

  • creates new obligations and accountability for providers of “intermediary services”.  The definition of intermediary services in the DSA is wide, catching a vast array of digital service providers including internet access providers, cloud and hosting services, online marketplaces, app stores and social media platforms; and
  • applies to providers of intermediary services, regardless of their place of establishment, if they operate in the EU and there is a "substantial connection"2 to the EU.

The precise timing and obligations which apply to a digital player will depend on the nature of the services provided and the number of users, for which specific advice should be sought. At a high level though, the key focus of the DSA is the process by which platforms are notified and must take subsequent action on illegal content such as counterfeit and dangerous products, as well as manipulative and abusive activities. Note that the UK government has proposed an Online Safety Bill to tackle issues relating to online content which picks up several of these themes.3

Depending on the type and size of service, the DSA sets various requirements aimed at making processes and content moderation clear to users, including requirements in relation to content moderation practices, transparency of online advertising, protection of children from personalised advertising, provision of information by recommender services, traceability of traders, as well as a special crisis mechanism applicable in circumstances of war, terror or a pandemic.

The VLOSEs and VLOPs will also have to, amongst other obligations:

  1. Conduct at least yearly independent audits (at their own cost) to assess compliance with the due diligence obligations and certain other obligations under the DSA;
  2. Allow the Commission to access data to allow it to monitor the platform's compliance with the DSA;
  3. Give access to data to vetted researchers carrying out research to identify and understand systemic risks;
  4. Provide additional transparency around advertising including API access to a repository containing, for at least one year from the date an ad was displayed, certain information about every ad displayed to users (redacting any personal data of users to whom ads were displayed); and
  5. Submit additional transparency information such as six-monthly reports, risk assessments and risk mitigation measures, audit reports and audit implementation reports.
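The advertising-repository obligation in point 4 can be illustrated with a short sketch. This is purely illustrative Python: the record fields and the one-year retention check reflect our assumptions about what such a repository might hold, not a schema prescribed by the DSA.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical record for an ad-repository entry. Field names are
# illustrative; the key points from the DSA are that the entry holds
# only aggregate reach figures (personal data of the users shown the
# ad must be redacted) and stays available for at least one year.
@dataclass
class AdRecord:
    ad_id: str
    advertiser: str            # who presented the ad
    sponsor: str               # who paid for it
    first_shown: date
    last_shown: date
    targeting_criteria: dict   # main parameters used to target the ad
    total_reach: int           # aggregate numbers only, no per-user data

def must_retain(record: AdRecord, today: date) -> bool:
    """Repository entries must remain available for at least one year
    after the date the ad was last displayed (illustrative check)."""
    return today <= record.last_shown + timedelta(days=365)

ad = AdRecord("ad-001", "ACME GmbH", "ACME GmbH",
              date(2023, 1, 10), date(2023, 2, 1),
              {"country": "DE", "age_band": "25-34"}, 120_000)

print(must_retain(ad, date(2023, 6, 1)))   # within the retention window
print(must_retain(ad, date(2024, 3, 1)))   # more than a year after last display
```

In practice a VLOP would expose such records through a searchable API rather than an in-memory object, but the retention logic would be of this shape.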

4.  Complaints

Businesses will be required to remove illegal content such as counterfeit goods, hate speech, terror propaganda and discrimination without delay. Users will be able to report this kind of content via a "notice and action mechanism". Removal decisions should also be open to challenge.

Businesses will therefore need to put in place a system to deal with these complaints, ensuring that notices and complaints are processed in a systematic and fair way, which also respects fundamental rights such as privacy. To bolster the pipeline of "notices" to the online businesses, a new system of "trusted flaggers" will be used to report illegal content (and whose reports must be handled with priority).

As illegal content will be defined by EU laws as well as national laws, it will be crucial to monitor legal developments in all Member States, which will potentially be a big challenge for companies. This may also be difficult from a global operations perspective, since content that is illegal only in a given Member State should, as a general rule, be removed only in the territory where it is illegal.
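The territorial-scope rule described above can be reduced to a simple decision, sketched here. The function and its inputs are our own illustrative framing, not terminology from the DSA.

```python
# Illustrative only: deciding the territorial scope of a removal.
# Content illegal under EU-wide law comes down everywhere in the EU;
# content illegal only under one Member State's national law is, as a
# general rule, geo-restricted to that territory.
EU_WIDE = "EU"

def removal_scope(illegal_under_eu_law, member_states):
    if illegal_under_eu_law:
        return {EU_WIDE}           # remove across the whole EU
    return set(member_states)      # restrict only in those Member States

print(removal_scope(True, []))             # EU-wide removal
print(removal_scope(False, ["DE", "FR"]))  # removal limited to DE and FR
```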

5.  Clarity and caution

The melodic line of the DSA is increased transparency. This plays out in a number of the obligations, e.g. the way in which illegal content is explained and reported, the terms and conditions for the platforms, how algorithms are used to push content, etc. As such, businesses must prioritise compliance with the DSA. In the most serious cases, providers of hosting services are obliged to notify the relevant authorities if they suspect that a criminal offence involving a threat to the life or safety of a person or persons has taken place or is likely to occur. More generally, online marketplaces can be required to trace their traders under the "know your business customer" principle in order to better track down sellers of illegal goods, and "dark patterns" designed to manipulate users are banned. From an advertising perspective, greater transparency on advertisement targeting (such as who sponsored the advertisement, and how and why it targets a user) as well as a ban on certain targeted adverts are required. Moreover, clear information on why content is recommended to users will need to be provided.

6.  Who's in charge?

Much like the EU rules regulating anticompetitive behaviour, enforcement of the DSA is a joint effort between national regulators and the European Commission.

If online players do not themselves take down illicit content under the DSA, they may be issued with a "take down notice" from EU member state courts or other competent authorities.

  1. At the EU level, the Directorate-General for Communications Networks, Content and Technology will have exclusive power to supervise the obligations of VLOPs and VLOSEs to manage systemic risks under the DSA.
  2. At a national level, Member States have some discretion as to exactly how the implementation will occur, with each one assigning a national authority to the role of Digital Services Coordinator (DSC). As of next year, these bodies will be responsible for all matters related to the enforcement of the DSA at the national level, including the power to issue take down notices (see above) and/or order the cessation of alleged infringements of the DSA, adopt interim measures to avoid the risk of serious harm, and impose remedies, fines and periodic penalty payments. It is worth noting that some significant players in the online world do not qualify as VLOSEs / VLOPs, and so will, in principle, fall under the supervision of national DSCs.
  3. An independent European Board for Digital Services will help the Commission coordinate the actions of the DSCs and contribute to the drafting of templates and codes of conduct as envisaged under the DSA – this may well become akin to the European Competition Network in time.

7.  Next steps

The immediate practical effects of the DSA are likely to include increased compliance costs for certain online businesses, possible organisational or personnel changes at a compliance level and increased accountability to relevant authorities. However, in the longer term, it should make Europe’s platform economy stronger, more trusted and more competitive. The DSA also provides new opportunities for businesses, which should be able to be more confident, for example, that illegal content will be taken down quickly from websites. Compliance with, or even going beyond, the requirements of these new rules gives online actors the chance to differentiate themselves from their competitors by proactively ensuring the safety of their users.

  • Companies need to be proactively planning and speaking to their advisors about concrete steps needed to comply with both the DSA and DMA, as both these regimes are running in parallel, albeit with different obligations and different time frames.
  • Several major digital companies have asserted that they do not cross the user threshold to be considered as a VLOP / VLOSE. Others no doubt still have some numbers to crunch.
  • EU Member States will have to appoint Digital Services Coordinators by 17 February 2024, when platforms with fewer than 45 million active users have to comply with the DSA rules.
  • Companies should be taking steps now to understand the requirements of the new regime in detail. Particular attention should be given to:

— platform policies and how these are followed in practice – now might be a good time to roll out staff training in this area; and
— use of automation vs. people.

The Mayer Brown team offers a distinctive combination of in-depth experience in fields that inspired the DSA regime and procedures, namely involvement in complex risk assessment and compliance audits at EU and national levels.  With the benefit of our established relationships with key stakeholders at EU and Member State levels, we are well-placed to assist multinational corporations in navigating the field of ever-increasing EU tech regulations.

 


1 Note also the ongoing consultation into the implementing regulation for the Foreign Subsidies Regime: European Commission Publishes Draft Implementing Rules for EU Foreign Subsidies Regulation: another layer of complexity | Perspectives & Events | Mayer Brown.

2 i.e. have an establishment or a significant number of users in the EU or target their activities towards one or more EU member states.

3 Service providers who come within the Online Safety Bill’s scope are subject to “duties of care” to (i) conduct risk assessments; (ii) take proportionate steps to mitigate risks of harms to users on their services; and (iii) to take steps to prevent, mitigate and/or minimize the presence of content on their services. Service providers also have a duty to balance these obligations with a duty to protect freedom of expression, privacy and content of journalistic and democratic importance. Companies who do not comply with the Online Safety Bill can be subject to fines of up to 10% of a provider’s worldwide qualifying revenue, and senior business managers can also be subject to criminal actions. Note that this legislation is still going through the UK Parliament and is not yet law.
