The Digital Services Act ("DSA"), which entered into force on November 16, 2022, will start applying to most providers of online platforms—such as social media, video-sharing, and e-commerce platforms—on February 17, 2024. The DSA introduces due diligence and transparency obligations regarding algorithmic decision-making by online platforms. For providers of Very Large Online Platforms ("VLOPs") and Very Large Online Search Engines ("VLOSEs")—providers reaching an average of 45 million monthly active users in the EU—the DSA may apply earlier: four months after the Commission notifies the provider concerned of its designation, which can be a date before February 17, 2024.
In this Legal Update, we provide further detail on these obligations and note the steps businesses can take to comply with the DSA.
A Reflection of the Current AI Regulatory Environment
Along with the Digital Markets Act, the DSA is part of the EU Digital Services package, which proposes important changes for online platforms. For example, the DSA includes specific requirements for online marketplaces (such as e-commerce platforms), search engines, and the protection of minors.
The DSA’s algorithmic transparency and accountability requirements complement other EU AI regulatory efforts, which have produced a growing number of legislative proposals such as the AI Act and the AI Liability Directive. The AI Act, for example, will regulate the use of AI according to the level of risk it poses, including a ban on AI systems posing “unacceptable risk.” Similar legislative efforts are ongoing in the United States—in July 2022, the House Energy and Commerce Committee approved the proposed American Data Privacy and Protection Act (“ADPPA”), which would, among other things, impose evaluation and reporting requirements for artificial intelligence tools and establish a Bureau of Privacy at the Federal Trade Commission to enforce its provisions. See our legal update.
A Focus on Intermediary Services
The DSA regulates the obligations of digital services that act as intermediaries connecting consumers with goods, services, and content (e.g., social networks, online marketplaces, app stores). A core part of an intermediary service provider’s operation is the manner in which information is prioritized and presented to the recipients of the service, such as by algorithmically suggesting, ranking and prioritizing information (cf. Recital 70). The obligations under the DSA are tailored to specific categories of intermediary services and are intended to match each category’s role, size and impact in the online ecosystem.
Categories of services and platforms are nested within other categories, each bound to its own specific obligations as well as the obligations of every category above it. So, for example, Online Platforms are bound to rules for Online Platforms plus the obligations for Hosting Services and Intermediary Services.
The DSA contains several obligations aimed at increasing algorithmic transparency and accountability. These are addressed to intermediary and hosting services generally, with more specific obligations for online platforms and yet further duties for VLOPs.
The DSA introduces measures to enhance transparency around recommender systems and advertising.
- Under Art. 14(1), providers of intermediary services must include information on measures and tools used for content moderation, including algorithmic decision-making, in their terms and conditions. While the level of detail is not specified, recital 70 indicates that platforms must ensure that recipients of their service are appropriately informed about how algorithms impact and influence the way information is displayed.
- The DSA includes additional transparency obligations for providers of online platforms that use recommender systems. Under Art. 27(1), these providers must set out in their terms and conditions "the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters." Among other things, the main parameters must include the criteria that are most significant in determining the information suggested to the recipient of the service, Art. 27(2). Providers also must make available a way for the recipient of the service to select and modify their preferred option.
Action Item for Businesses
In addition to monitoring and updating their privacy notices, affected businesses will need to make sure that these additional transparency obligations are reflected in their terms and conditions and on their online interfaces. As with privacy notices and cookie banners, guidance from authorities and case law should be continuously monitored and reflected, especially in the early enforcement phase.
Due Diligence Obligations
Providers of VLOPs are also subject to due diligence obligations in relation to risks stemming from their algorithmic systems.
- Under Art. 34(1), providers of VLOPs must, on an annual basis, diligently identify, analyze and assess any systemic risks stemming from their algorithmic systems. The documents supporting the risk assessment must be preserved for at least three years and must be shared with the competent supervisory authorities upon request. The systemic risks to be assessed include actual or foreseeable negative effects on electoral processes and public security, Art. 34(2).
Action Item for Businesses: While the level of detail is not specified, other EU legislation, such as the General Data Protection Regulation (“GDPR”), includes somewhat comparable obligations (e.g., the "data protection impact assessment"). Businesses falling under the category of VLOPs should therefore prepare appropriate risk assessments and be able to provide them to the competent authorities upon request.
Generally, the member state in which the main establishment of the provider of intermediary services is located is responsible for supervising and enforcing the DSA. However, the Commission has exclusive powers to supervise and enforce some algorithmic accountability obligations specific to VLOPs and VLOSEs:
- Providers of VLOPs and VLOSEs must, upon request, provide access to data that are necessary to monitor and assess compliance with the DSA. According to Art. 40(3), this includes explaining the design, logic and functioning of their algorithmic systems, including their recommender systems.
- For the purposes of monitoring compliance with the obligations under the DSA, the Commission may order providers of VLOPs and VLOSEs to provide access to, and explanations relating to, their databases and algorithms, Art. 72(1). Additionally, the Commission has the power to conduct inspections at the premises of the providers of VLOPs and VLOSEs and ask them on-site to provide explanations of their IT system, algorithms, and data handling, among other things, Art. 69(5).
Action Item for Businesses: The DSA introduces a type of inspection that may require businesses to disclose information surrounding their algorithmic systems, or perhaps even the algorithms themselves. Businesses should prepare in advance for how to respond to inquiries from the Commission regarding their algorithms. Since an inspection may require the disclosure of highly sensitive trade secrets, businesses should set up a mechanism to reveal only as much information as legally required.
Failure to comply with any obligation under the DSA could result in a fine of up to 6 percent of the annual worldwide turnover of the provider of intermediary services concerned in the preceding financial year (Art. 52). The maximum fine for non-compliance under the DSA is therefore even higher than what is available under other EU legislation such as the GDPR. The provision of incorrect, incomplete or misleading information to the competent authorities may lead to fines of up to 1 percent of the annual worldwide turnover.
For VLOPs and VLOSEs
A specific sanctions regime exists for providers of VLOPs and VLOSEs, which are monitored and sanctioned by the Commission. If a provision relevant to VLOPs and VLOSEs is found to be infringed, the Commission may issue a non-compliance decision (Art. 73) and, where it finds that the provider intentionally or negligently infringed the relevant provisions, impose fines of up to 6 percent of the provider's total worldwide annual turnover, Art. 74(1).
A New Authority
To help monitor compliance with the DSA, the European Centre for Algorithmic Transparency ("ECAT") was launched to support the DSA's enforcement. The ECAT will contribute scientific and technical expertise to the Commission's exclusive supervisory and enforcement role regarding the systemic obligations on VLOPs and VLOSEs provided for under the DSA.
Additional Author: Anna Lohoff