The EU Digital Services Act (DSA) came into force on 16 November 2022, introducing a multitude of changes for the safety and security of social and business interactions online. Alongside the self-accountability obligations that online service providers must apply to their own companies, the DSA has established the official status of trusted flagger. Organisations that exist to identify illegal content online may apply for this official title, and online platforms are legally required to treat trusted flaggers' notices with priority and without undue delay.
In this article, we break down the role of trusted flaggers, their responsibilities, and which organisations can apply.
What is a trusted flagger?
Under the Digital Services Act (DSA), trusted flaggers are entities designated by a Member State's Digital Services Coordinator, explicitly not individuals, devoted to identifying potentially illegal content and notifying platforms when it appears on their sites. Specified in Article 22 of the DSA, the ‘trusted flagger’ status is defined by the following requirements:
Possessing the competence to effectively identify illegal content
Independence from any online platform
Submitting flagged content (i.e., notices) accurately and objectively
Publishing an annual report describing the actions taken and notices submitted during the previous year.
(Article 22, Digital Services Act)
However, trusted flagger is not a new concept introduced by the DSA; for most of its short history, the term has described both individuals and organisations that work with certain platforms to ‘flag’ content that violates their respective community guidelines. Under the new legislation, the term carries an official badge of approval indicating that an entity (again, explicitly not an individual) has been authorised to aid in identifying illegal content. Such entities may include investigative hotlines and law enforcement agencies.
Trusted flagger: A brief history
Before the DSA made it an official title, positions such as “trusted flagger”, “trusted partner” and “safety partner” originated within platforms themselves, which appointed third parties to flag content for review by their content moderation teams. Due to the overwhelming amount of content posted to platforms every day, these programmes emerged to keep platforms safe while lessening the workload of content moderators and limiting their exposure to disturbing content.
YouTube, for example, first implemented its own “trusted flagger” position in 2012, offering it to individuals, governmental agencies, and non-governmental organisations (NGOs), though it has since stopped admitting individuals to the program.
The position required participants to understand what constitutes a community guideline violation, to report violating content regularly at their own discretion, and to stay in contact with YouTube.
An official turn: Trusted flagger status according to the DSA
“Trusted flagger” is now an official status under the Digital Services Act (DSA), granted by the Digital Services Coordinator of the Member State in which the applicant is established, to organisations that meet the following criteria:
Application requirements for trusted flagger status
it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content;
it is independent from any provider of online platforms;
it carries out its activities for the purposes of submitting notices diligently, accurately and objectively.
(Article 22, Digital Services Act)
Purpose of a trusted flagger
Because the DSA is structured to protect freedom of expression, it targets only illegal content, an area in which trusted flaggers must have deep expertise. Designated trusted flaggers uphold the DSA by identifying illegal content and notifying platforms when such material is found on their site(s). Granting willing organisations the official trusted flagger title puts more eyes and ears on the issue of illegal content and adds another layer of security to online safety.
Trusted flaggers answer to the Digital Services Coordinator of their Member State. As detailed below, if an organisation fails to meet its responsibilities, or if it abuses or misuses its power, the Digital Services Coordinator may revoke its trusted flagger status.
Who qualifies as a trusted flagger?
Unlike platforms that give individuals the opportunity to flag content within their own site(s) under their community guidelines, the DSA explicitly prohibits individuals from becoming official trusted flaggers: the title is granted only to governmental and non-governmental organisations, consumer organisations, and semi-public bodies.
To be a trusted flagger, the organisation must:
Have demonstrable expertise in addressing illegal content and already be established as an organisation
Work accurately, without bias or subjective views
Be a private, public, or semi-public body, or a non-governmental organisation (NGO)
Examples of potential trusted flagger applicant types:
Members of the INHOPE network of hotlines, such as the National Center for Missing and Exploited Children (NCMEC)
Law enforcement authorities, such as the European Union Agency for Law Enforcement Cooperation (Europol)
Responsibilities as a trusted flagger
1. Apply enhanced technology for flagging content
Trusted flaggers are required to use highly accurate technology when auditing a platform for illegal content. Leveraging such technology helps automate the process where possible, ensures transparency when answering to DSA enforcers, and limits exposure to harmful material for both content moderators and trusted flagger staff.
The DSA has set operational requirements for trusted flaggers to uphold when carrying out investigations of illegal content on these platforms.
When becoming a trusted flagger, organisations will need to enhance their current processes and ensure their technology can keep up with the ever-present (and often recycled) illegal content posted to platforms. Transparency and organised audit trails will also be essential to all parties involved in the event of an investigation by the Digital Services Coordinator.
Since the trusted flagger endeavour requires expertise and many staff working to identify illegal content online, implementing technological tools will greatly improve this effort. Many organisations and law enforcement agencies have small teams or lack the systems to keep up with the quickened pace of identifying content as a trusted flagger. Below are features that streamline illegal-content investigations and maintain compliance through a traceable trail of information for each notice and report.
Key features of a trusted flagger tool:
Hash-matching technology to automatically identify known content against databases of previously classified material (see the sketch after this list)
Integration with other organisations' databases
Connections to platforms to track the action taken after a notice is received
A ticketing system for each case of illegal content, recording date, time, and electronic location
Interconnected modes of contact: email and REST APIs
Content review screens for human review
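To make the hash-matching feature concrete, here is a minimal Python sketch. It checks exact SHA-256 digests against a hypothetical hash list; the list contents and function names are illustrative assumptions, and real deployments typically use perceptual hashes (such as Microsoft's PhotoDNA) so that re-encoded or slightly altered copies of known material still match.

```python
import hashlib

# Hypothetical hash list: hex digests of content previously classified as
# illegal. In practice such lists are shared across hotline networks.
KNOWN_ILLEGAL_HASHES = {
    # SHA-256 of the placeholder bytes b"foo", used purely for this demo.
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def is_known_illegal(content: bytes) -> bool:
    """Return True if this exact content matches a previously classified item."""
    return hashlib.sha256(content).hexdigest() in KNOWN_ILLEGAL_HASHES

# Example: screen incoming material before queueing it for human review.
if is_known_illegal(b"foo"):
    print("Known match: escalate to a notice without exposing a reviewer.")
```

Note that exact cryptographic hashes only catch byte-identical copies, which is why production tools favour perceptual hashing; the trade-off is a small false-positive rate that the human review screens listed above are there to catch.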
2. Deliver notices of found illegal content to online service providers
Upon identifying illegal content on a platform, trusted flaggers must submit electronic notices to the designated in-house team that reviews the platform's content. Once received, providers must process the notices with priority and take action without undue delay.
DSA-compliant notices should follow this general outline (a structured sketch follows the list):
Name and e-mail address of the organisation giving the notice
A clear explanation of why the trusted flagger deems the content illegal
A precise indication of the exact location of the illegal content online
A good-faith statement confirming the notice's accuracy and completeness
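As an illustration only, the outline above could be serialised into a structured notice like the following Python sketch; the field names and the JSON format are our own assumptions, not a schema prescribed by the DSA.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class Notice:
    """One notice of allegedly illegal content (illustrative field names)."""
    organisation_name: str
    contact_email: str
    explanation: str        # why the trusted flagger deems the content illegal
    content_url: str        # exact electronic location of the content
    good_faith_statement: str

notice = Notice(
    organisation_name="Example Hotline",
    contact_email="notices@example.org",
    explanation="The image matches a known CSAM hash in our database.",
    content_url="https://platform.example/post/12345",
    good_faith_statement="We confirm this notice is accurate and complete.",
)

# Serialise for submission to the platform's electronic contact point.
print(json.dumps(asdict(notice), indent=2))
```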
3. Publish reports once a year (at least)
Organisations granted the responsibility of flagging illegal content are required to publish, at least once a year, a comprehensive report of the notices they have submitted to online platforms, including very large online platforms (VLOPs) and very large online search engines (VLOSEs).
These reports must be detailed, listing the number of notices submitted, categorised by the following (a small aggregation sketch follows the list):
The identity of the online service provider and its site(s)/service(s)
The type of allegedly illegal content covered by the notice(s)
The action taken by the online service provider against the flagged content
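Here is a minimal sketch of how such a report might be aggregated from notices logged during the year. The record shape and category labels are assumptions for illustration; the DSA prescribes the categories, not the storage format.

```python
from collections import Counter

# Each logged notice records (provider, content type, action taken); the
# shape of these records is an illustrative assumption.
logged_notices = [
    ("platform-a.example", "CSAM", "removed"),
    ("platform-a.example", "TVEC", "removed"),
    ("platform-b.example", "CSAM", "no action taken"),
]

# Count notices per (provider, content type, action) category.
report = Counter(logged_notices)
for (provider, content_type, action), count in sorted(report.items()):
    print(f"{provider}: {count} notice(s), type: {content_type}, action: {action}")
```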
4. Make reports public and send to Digital Services Coordinator
Trusted flaggers must send these reports to the Digital Services Coordinator and make them publicly available, excluding any personal data. The Commission will then collect these reports in a publicly accessible database, providing full transparency about the illegal content found on platforms and search engines and the actions taken in response.
5. Flagging content in accordance with specific expertise
The DSA states that trusted flaggers must act within their field of expertise: an organisation specialising in identifying child sexual abuse material (CSAM) or terrorist and violent extremist content (TVEC), for example, will flag illegal content only in its respective area.
Enforcement: Penalties for trusted flaggers
In cases where a trusted flagger abuses its power or shows signs of incompetent, inaccurate, or inadequate flagging, online service providers are permitted to report it. If a provider's reasons for reporting the trusted flagger are found legitimate, the Digital Services Coordinator (DSC) can open an investigation and suspend the trusted flagger's status for its duration.
Additionally, trusted flagger status can be denied or revoked if the entity in question no longer meets the necessary requirements. In such a case, the trusted flagger will be given the opportunity to respond to the Digital Services Coordinator's findings before the status is officially lost.
Online service providers cooperating with trusted flaggers
The DSA requires online services, specifically platforms and search engines, to maintain an ongoing relationship with trusted flaggers and to take action when given a notice of allegedly illegal content.
In particular, online platforms, very large online platforms, and very large online search engines are legally required to process reports and notices regarding their services with priority and without undue delay.
Special requirements for very large online platforms (VLOPs) and very large online search engines (VLOSEs)
In spring 2023, platforms will be designated according to their monthly user base. If an online service exceeds 45 million active users per month in the EU, it will be designated either a Very Large Online Platform (VLOP) or a Very Large Online Search Engine (VLOSE). The DSA places particular focus on these two groups because the high volume of user-generated content they host makes the spread of illegal content far more likely.
Steps for VLOPs and VLOSEs
Here are the ways VLOPs and VLOSEs will need to work with trusted flaggers under the DSA:
Maintain a single point of electronic contact between the platform and trusted flagger organisations
Hold training sessions with trusted flagger organisations
Adapt content moderation systems based on suggestions from trusted flaggers
Integrate trusted flaggers into the platform for easy and prioritised reporting
To learn more about the Digital Services Act, read this article on the necessary steps to compliance for all types of online services.