Einar Sigurdsson

EU Digital Services Act (DSA): Important Requirements to Meet Compliance

Updated: Apr 21, 2023

The EU Digital Services Act (DSA) covers a vast scope of social, technical, and business change for online platforms operating in the EU. The DSA puts in place requirements aimed at protecting users’ rights, reducing exposure to illegal content, providing legal certainty to service providers, giving access to EU-wide markets for business users, instilling democratic control for large platforms, and much more. In this article, we’ve covered some of the most important requirements online services will need to adhere to in order to comply.

 




The DSA goes into force: Four layers of compliance


The EU Digital Services Act (DSA) came into force on November 16, 2022 and will soon affect all intermediary services, including hosting services, online platforms, very large online platforms (VLOPs), and very large online search engines (VLOSEs).


As illustrated in the graphic, the requirements that apply to each service type are nested: each layer inherits the obligations of the layers beneath it. For example, the requirements described under intermediary services also apply to every service type nested within that category, while the additional requirements described under, say, VLOPs and VLOSEs do not apply to the layers that come before them.


Intermediary services: Services providing network infrastructure, such as internet access providers and domain name registrars. These also include:


Hosting services: Cloud and web hosting services. These also include:


Online platforms: Online marketplaces, app stores, collaborative economy platforms and social media platforms.


Very large online platforms and search engines: Platforms and search engines whose monthly user base exceeds 45 million and which pose particular risks of spreading illegal content through their vast reach.


DSA compliance dates


Very large online platforms and search engines (VLOPs and VLOSEs) are required to comply with the new regulations just four months after the European Commission designates them as large enough to fall into those categories, which places their compliance deadlines some time in 2023. All other online service types not categorized as a VLOP or VLOSE are required to comply by February 17, 2024.


Important dates:


November 16, 2022: The DSA goes into force. Platforms have three months to report their active monthly user base numbers to the European Commission to determine whether they should be designated as a VLOP or VLOSE.


February 17, 2023: Deadline for all platforms to report active monthly user base numbers.


+4 months: After being designated as a VLOP or VLOSE, platforms have four months to comply with DSA regulations.


February 17, 2024: Deadline for all online service types (intermediary services, hosting services, and online platforms) to comply with DSA regulations.


Understanding the EU Digital Services Act


Because failure to comply with the new requirements can result in penalties of up to 6% of annual turnover, or even a court ordering the temporary suspension of a platform's service, it is important for all online service types to understand the requirements they'll need to meet in order to comply. However, depending on the type and size of the online service, the requirements put into effect by the DSA vary in scope and complexity, and so does the difficulty of making the necessary changes for compliance.


To help cut through some of that complexity, we've created a short, easy-to-read list of the core requirements services need to adhere to in order to stay in business, categorized below by service type.


DSA requirements: All intermediary service providers


Broadly speaking, intermediary services are those offering network infrastructure, such as internet access providers and domain name registrars, but the category also encompasses the online players whose obligations are described throughout this article: hosting services, online platforms, and very large online platforms (VLOPs) and search engines (VLOSEs).


All intermediary services are required to:


Provide transparency reporting


All online services are required to publish an annual report that lists the number of notices submitted by designated authorities and third parties, and that describes how these notices were addressed.
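As a rough illustration only: the DSA prescribes what a transparency report must cover, not any data format, and every field name below is hypothetical. A report could aggregate notices by source and outcome along these lines:

```python
# Hypothetical sketch of how a provider might aggregate notice statistics
# for an annual transparency report. Field names are illustrative, not
# prescribed by the DSA.
from dataclasses import dataclass

@dataclass
class TransparencyReportEntry:
    source: str                   # e.g. "national_authority" or "third_party"
    notices_received: int         # notices submitted during the reporting period
    notices_actioned: int         # notices that led to removal or restriction
    median_response_days: float   # how quickly notices were handled

annual_report = [
    TransparencyReportEntry("national_authority", 120, 118, 1.5),
    TransparencyReportEntry("third_party", 4300, 2900, 3.2),
]
```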


Remove illegal content in compliance with orders from national authorities


Related to the requirement above, all online services are required to remove content deemed illegal by the relevant authority and, as part of the annual report, inform the authority whether and when such action was taken.


Describe terms of service


All online services must provide information explaining the specific restrictions that may be imposed when using their service, such as content moderation policies. For services commonly used by children, these terms should be written in a manner children can easily understand.


Establish a single point of contact (POC) and legal representation


All online services are required to establish a single point of contact for communication with the European Commission, Member State authorities, and the Board. If a platform is not based in the EU but still provides services to its Member States, it will need to establish legal representation in at least one of the Member States where the service operates.


Clearly explain content moderation methods


In their terms and conditions, all online services are required to provide an in-depth, clear description of their content moderation methods. This includes a description of the measures and tools (i.e., technologies, algorithms, and manual human review) that the platform uses to moderate content.


DSA requirements: Hosting services


Along with the requirements described above, hosting services such as cloud and web hosting services must comply with these additional obligations:


Set up notice and action mechanisms


All hosting providers are required to give third parties (e.g., service users) an easily accessible mechanism for reporting potentially illegal content. In the case of content removal, demotion, or demonetization, the reasons for the action must be communicated to the parties involved, including the individual who filed the notice.
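To make the mechanism concrete, here is a minimal sketch of what a notice submission and its handling could look like. The DSA requires notices to be precise and substantiated, but it does not define an API, so every name and field below is an assumption for illustration:

```python
# Hypothetical sketch of a notice-and-action submission. The exact shape of
# the mechanism is up to each hosting provider; this only illustrates the
# kind of information a notice typically carries.
from dataclasses import dataclass
from typing import Optional

@dataclass
class IllegalContentNotice:
    content_url: str               # exact location of the allegedly illegal content
    explanation: str               # why the notifier considers the content illegal
    notifier_name: Optional[str]   # contact details of the person filing the notice
    notifier_email: Optional[str]  # (may be omitted for certain sensitive reports)
    good_faith_statement: bool     # confirmation the notice is accurate and made in good faith

def handle_notice(notice: IllegalContentNotice) -> str:
    """Acknowledge receipt and queue the notice for review.

    Whatever decision follows (removal, demotion, demonetization, or no
    action) must later be communicated, with reasons, to the parties involved.
    """
    # ... store the notice, send a confirmation of receipt, start review ...
    return "notice received"
```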


Report criminal offenses


Hosting providers are also required to notify law enforcement or judicial authorities of any activity giving rise to suspicion of a serious criminal offense, such as indications that a person's life or safety is under threat.


DSA requirements: Online platforms


In addition to the requirements above, online platforms have their own set of special requirements. Online platforms are services that bring together sellers and consumers, such as online marketplaces, app stores, collaborative economy platforms, and social media platforms.


Note: Small and micro enterprises are excluded from the following requirements


Small enterprises (employing fewer than 50 individuals, with an annual turnover of less than 10 million EUR) and micro enterprises (employing fewer than 10 individuals, with an annual turnover of less than 2 million EUR) are exempt from the following requirements.


Treat trusted flaggers with priority


Trusted flaggers are entities, such as Europol or organizations connected to the INHOPE hotlines for reporting child sexual abuse material (CSAM), whose notices platforms will need to treat with priority. Organizations are granted trusted flagger status by meeting the guidelines established by the Digital Services Coordinators, and that status can be revoked if a Digital Services Coordinator finds that a trusted flagger is abusing or misusing its position. Additionally, to satisfy DSA requirements, trusted flaggers must have particular expertise in detecting illegal content and keep the tools and methods they use to do so up to date.


Set up an internal complaint-handling mechanism for content moderation and arrange out-of-court mediation


Online platforms are required to establish a free-to-use, internal complaint-handling mechanism through which users can contest content moderation decisions. The mechanism should be available to all parties who interact with the platform.


Decisions that can be disputed through this mechanism include:

  • An event where content is taken down

  • An event that leads to suspension or termination of service

  • An event that leads to suspension or termination of a user’s account

  • An event where a platform decides against taking action

  • An event where a platform doesn’t act on reported illegal content

In the case of any of the events listed above, online platforms are required to work with the user's appointed out-of-court dispute settlement body and must comply with the decision(s) that body reaches.


Ensure transparency of advertisements and protect children from abusive advertisements


Online platforms will need to ensure:

  1. That users can immediately confirm that the information they are engaging with is indeed an advertisement;

  2. That the advertiser or person responsible for the advertisement is immediately apparent;

  3. That users can access a clear explanation of why the advertisement in question is being displayed to them.

Additionally, online platforms are required to help protect minors through enhanced privacy and security measures, including a ban on advertisements targeted at minors on the basis of profiling.


DSA Requirements: Very large online platforms (VLOPs) and very large online search engines (VLOSEs)


An online platform or search engine will likely be designated as a VLOP or VLOSE if its active monthly user base exceeds 45 million (roughly 10% of EU consumers). All online platforms are required to report their number of active users by February 17, 2023, and the European Commission will use this data to decide whether the platform or search engine in question qualifies as 'very large.' After being designated as such, VLOPs and VLOSEs have four months to comply with DSA regulations, including carrying out the first annual risk assessment described below.


Carry out risk assessments related to use of service


VLOPs are required to analyze systemic risks and put mechanisms in place to mitigate them. Forms of systemic risk include, but are not limited to, the spread of illegal content, hate speech, violations of privacy, and political manipulation.


Undergo internal and external audits


VLOPs are subject to internal and external audits. The resulting recommendations must be implemented within one month of receiving the audit report, or the VLOP must provide the auditing body with justification for why it has decided against following them.


Provide a public repository of advertisements and give users a choice against profiling


VLOPs are required to provide easily accessible information about the advertisements used on their platform each year, including the following (a rough sketch of such a record appears after the list):

  1. The advertisement's content

  2. The person or entity on whose behalf the advertisement was presented

  3. The period during which the advertisement was run

  4. Whether the advertisement was displayed to individuals based on profiling, and why that group was chosen

  5. The number of users the advertisement reached via the VLOP
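As a loose illustration of the kind of record such a repository could hold: the DSA defines the information VLOPs must disclose, not a schema, so every field name below is a hypothetical example.

```python
# Hypothetical example of a single ad-repository record. The DSA lists the
# information to make available; it does not mandate this structure.
from dataclasses import dataclass
from datetime import date

@dataclass
class AdRepositoryRecord:
    ad_content: str               # the advertisement's content (or a reference to it)
    presented_on_behalf_of: str   # the person or entity behind the advertisement
    run_from: date                # first day the advertisement was displayed
    run_to: date                  # last day the advertisement was displayed
    used_profiling: bool          # whether targeting relied on profiling
    targeting_rationale: str      # why this group of users was selected
    users_reached: int            # number of users who saw the ad on the platform

record = AdRepositoryRecord(
    ad_content="Spring sale banner",
    presented_on_behalf_of="Example Retail GmbH",
    run_from=date(2023, 3, 1),
    run_to=date(2023, 3, 31),
    used_profiling=False,
    targeting_rationale="Shown to all users in the EU",
    users_reached=1_200_000,
)
```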

Additionally, VLOPs are required to provide access to at least one recommender system that is not based on profiling.


Government-provided resources

If you wish to learn more about the EU Digital Services Act, see Questions and Answers: Digital Services Act, provided by the European Commission.
