The free, open, and secure Internet has been an incredible tool for humanity. Yet, as child sexual abuse material (CSAM), hate speech, disinformation, violence, fraud, and counterfeits pervade the Internet we use every day, online trust & safety has become a growing concern for businesses, users, civil society, and governments alike.
An increasing number of actors have emerged within this ecosystem, including human content moderators, community moderators, and trusted flagger organisations. The latter, often rooted in civil society and organised as “hotlines”, play a significant role in making the internet – and the world – safer.
Who are trusted flaggers?
‘Trusted flagger’ (TF) refers to an individual or entity that is considered by an online service provider (OSP) to have expertise in flagging and reporting illegal and/or harmful content online.
Tech companies have taken many such TFs on board to counter hate speech and abusive content on their platforms, prioritising their notices over user reports. However, the new Digital Services Act (DSA), expected to be published in September 2022, will transform the role of TFs within the EU.
TFs will no longer be chosen by the companies themselves, but will be selected by the Digital Services Coordinator (DSC) designated in the relevant EU Member State. For organisations that work as TFs in the Union, this means that they will now need to apply for official trusted flagger status, and OSPs will be obliged to treat notices from designated TFs with priority and without delay.
How do you become a trusted flagger?
The regulation lays out that applicants may be public bodies, NGOs, or private or semi-public entities, and that they must meet the following conditions:
- They have proven expertise and competence in detecting, identifying, and notifying illegal or hateful content;
- They represent collective interests and are independent from online platforms;
- They carry out their activities for the purpose of submitting notices in a timely, diligent, and objective manner.
New Duties and Obligations
The DSC of the relevant Member State will award TF status to a limited number of entities. This status will then be recognised by all online service providers impacted by the DSA. The new obligations for TFs are outlined below:
Transparency reports
- TFs must publish comprehensible and detailed reports that include:
  1. Notices categorised by the identity of the provider
  2. The type of content notified
  3. The specific legal provisions allegedly breached by the notified content
  4. The action taken by the provider
  5. Any potential conflicts of interest and sources of funding
  6. An explanation of the procedures in place to ensure that the TF retains its independence
- TFs must also publish annual reports on their transparency and funding structure, including the sources and amount of revenue

Notices/Reports
- Notices must be submitted through a standardised electronic mechanism
- TFs can issue correction notices for incorrect removals, restrictions, or blocking of content, and for suspensions or terminations of accounts
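To make the notice-related obligations more concrete, here is a minimal sketch of what a standardised electronic notice record could look like. It is illustrative only: the DSA does not prescribe this schema, and all field names (`provider`, `content_url`, `legal_provision`, etc.) are assumptions chosen to mirror the reporting categories listed above.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical notice record mirroring the DSA reporting categories above;
# the DSA does not prescribe this schema.
@dataclass
class TrustedFlaggerNotice:
    provider: str              # identity of the notified online service provider
    content_url: str           # location of the flagged content
    content_type: str          # e.g. "CSAM", "hate speech", "counterfeit"
    legal_provision: str       # legal provision allegedly breached
    submitted_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    action_taken: str | None = None  # recorded once the provider responds

# A record like this can be submitted electronically, then aggregated per
# provider for the transparency report.
notice = TrustedFlaggerNotice(
    provider="example-platform.eu",
    content_url="https://example-platform.eu/post/123",
    content_type="hate speech",
    legal_provision="(placeholder for the specific provision invoked)",
)
```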
Penalties and Loss of Status
The Digital Services Coordinator of a Member State can renew a TF’s status if it continues to meet the requirements of the regulation. Online service providers can report a TF if they have information indicating that it has submitted a significant number of insufficiently precise or inadequately substantiated notices, which can trigger an investigation. A TF’s status can also be suspended and ultimately revoked if the DSC determines that the TF no longer meets the required conditions.
Implications for Trusted Flaggers
The DSA significantly reinforces the role of TFs as key actors in the trust & safety ecosystem by requiring that all OSPs operating in the EU respond to their notices with priority and without delay.
However, the new regulation also imposes new responsibilities on these entities: organisations wishing to gain TF status will have to enhance their existing tools and processes to meet the new transparency and auditability obligations. Such tools and processes should ensure that notices are sufficiently detailed and adequately substantiated to avoid suspension of status, and should integrate the audit trails needed in case of a DSC investigation.
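As one illustration of what such an audit trail could look like, the sketch below chains each logged event to the hash of the previous one, so that later tampering is detectable during an investigation. This is a hypothetical design choice, not a format mandated by the DSA.

```python
import hashlib
import json

# Hypothetical tamper-evident audit trail: each entry embeds the hash of the
# previous entry, so modifying any past entry breaks the chain.
GENESIS = "0" * 64

def append_entry(trail: list[dict], event: dict) -> None:
    prev_hash = trail[-1]["entry_hash"] if trail else GENESIS
    body = {"event": event, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    trail.append({**body, "entry_hash": digest})

def verify(trail: list[dict]) -> bool:
    prev_hash = GENESIS
    for entry in trail:
        body = {"event": entry["event"], "prev_hash": entry["prev_hash"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev_hash"] != prev_hash or entry["entry_hash"] != digest:
            return False
        prev_hash = entry["entry_hash"]
    return True

trail: list[dict] = []
append_entry(trail, {"action": "notice_submitted", "notice_id": "N-001"})
append_entry(trail, {"action": "provider_response", "notice_id": "N-001"})
assert verify(trail)
```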
To optimise the time of TF operators, the tools used should automate the workflow where possible and increase connectivity across the trust & safety ecosystem. Along with increasing efficiency, such measures would ultimately protect TF operators through well-being technology and by automatically detecting whether reported content is already present in other databases (NCMEC, GIFCT, INTERPOL, etc.).
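A minimal sketch of that duplicate check, assuming a locally mirrored set of known hashes: compute a digest of the reported file and look it up before queuing the item for human review. Production systems typically rely on perceptual hashes (such as PDQ for images) and the hash-sharing programmes’ own APIs; the SHA-256 lookup here is a simplified stand-in.

```python
import hashlib
from pathlib import Path

# Simplified stand-in for matching reported content against hash lists
# mirrored from external databases (NCMEC, GIFCT, INTERPOL, etc.).
# Real deployments generally use perceptual hashes such as PDQ, which
# tolerate re-encoding, rather than exact SHA-256 matches.
KNOWN_HASHES: set[str] = set()  # synchronised from hash-sharing programmes

def is_known_content(path: Path) -> bool:
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest in KNOWN_HASHES

# On a match, the tool can auto-classify and route the notice, sparing the
# operator from re-viewing content that is already catalogued elsewhere.
```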
In addition, the rapidly evolving regulatory landscape will require these tools to be easily adapted to meet ever-changing obligations.
Want to apply to become a trusted flagger and learn more about the impact of the DSA? Contact us at info@tremau.com
Tremau Policy Team