Building Trust in the Digital Services Act’s Trusted Flaggers

Ramsha Jahangir / Feb 23, 2024

Ramsha Jahangir is a fellow at Tech Policy Press.

A "flagger ahead" sign superimposed on a photo of the European Commission headquarters in Brussels, Belgium.

The European Commission’s Rita Wezenbeek declared the arrival of the “company cavalry” at the DSA and Platform Regulation Conference last week as the Digital Services Act (DSA) entered full force. She was referring to the newly established Digital Services Coordinators (DSCs), the national authorities that join the Commission’s regulatory ranks to ensure that online platforms serving EU users comply with the DSA and its content moderation obligations.

As these national enforcers (or perhaps digital sheriffs) assume their duties under the DSA, they take on a wide range of responsibilities: fielding user complaints against platforms, overseeing data access requests from researchers, and wielding enforcement tools such as fines and inspections. And it doesn’t stop there. DSCs have another crucial task: assessing and certifying “trusted flaggers.”

Trusted flaggers are public and private actors with privileged access to internet platforms through which they can report illegal and harmful content. Before the DSA, trusted flagging was a voluntary system: technology companies chose to work with organizations such as the Internet Referral Units at Europol, national ministries, and non-governmental networks like INHOPE to identify violent and harmful content on their platforms. With the DSA now in effect for tech companies, collaboration with designated trusted flaggers is mandatory.

New Enforcers, New Obligations

Under the DSA, trusted flaggers will no longer be chosen by online platforms but will be assessed and selected by national authorities. Organizations wishing to become trusted flaggers under the DSA will now need to apply to their respective national DSC, meet specific criteria, and demonstrate their expertise and independence. Furthermore, Article 22 of the DSA requires online platforms to take the necessary technical and organizational measures to ensure that notices submitted by state-designated trusted flaggers are treated “with priority and without delay.”

In terms of obligations, the DSA requires trusted flaggers to publish – at least once a year – easily comprehensible and detailed reports on notices submitted in accordance with Article 16. The report should list at least the number of notices categorized by: (a) the identity of the provider of hosting services; (b) the type of allegedly illegal content notified; and (c) the action taken by the provider.

According to the law, those reports should include an explanation of the procedures in place to ensure that the trusted flagger retains its independence. Trusted flaggers should send those reports to the awarding DSC and make them publicly available.

To maintain their accreditation, trusted flaggers must demonstrably continue to meet the requirements established by the regulation. If a trusted flagger ceases to meet them, the DSC has the power to revoke its status. In cases where a service provider reports a significant number of insufficiently precise or inadequate notices from a trusted flagger, the relevant DSC may open an investigation. Depending on the findings, the trusted flagger’s status could be suspended or even revoked.

Who can be a trusted flagger?

To qualify, trusted flaggers must demonstrate expertise in their designated area of content, operate independently from online platforms, and maintain transparent funding. Additionally, they must commit to accurate and objective reporting and publish annual reports outlining their activities and reported content. Examples of potential trusted flaggers include law enforcement agencies, non-governmental organizations, consumer groups, and even industry associations – as long as they meet the stringent criteria and respect relevant legal limitations. There may be some interesting new organizations joining the fray; at the DSA conference in Amsterdam last week, the Commission’s Wezenbeek shared that the Central Bank of Ireland had expressed interest in becoming a flagger.

According to guidance published by Ireland’s DSC, Coimisiún na Meán (CNaM), trusted flagger status should only be awarded to entities, not individuals. The extensive guide divides the conditions into four main sections: (1) General Information; (2) Expertise; (3) Independence; (4) Diligence, Accuracy, and Objectivity.

Designated flaggers may be:

  • Industry federations and trade associations, e.g., intellectual property owners’ organizations;
  • NGOs, e.g., consumer rights organizations, child-protection organizations, human rights organizations, environmental organizations, animal-rights organizations, etc.;
  • Members of established fact-checking networks (e.g., the IFCN);
  • Trade unions;
  • Non-regulatory public entities, such as Internet Referral Units (e.g., Europol’s), or regulatory bodies (with the exception of DSCs themselves);
  • Private or semi-public bodies (e.g., organizations that are part of the INHOPE network of hotlines);
  • Networks (the definition of an entity under Article 22 would not preclude networks or alliances of entities, at the national or European level, from applying).

Building trust in trusted flaggers

While the DSA outlines certain criteria for obtaining trusted flagger status, concerns remain that governments could use trusted flagger mechanisms to seek content moderation inappropriately and disproportionately.

Ongoing research on trusted flaggers by Utrecht University’s Jacob van de Kerkhof seeks to evaluate the implications of the EU shifting the power to designate trusted flaggers from companies to national authorities.

“So far, trusted flaggers have predominantly identified content that violates platforms’ Terms of Service, which made it easier to moderate. The DSA, however, only creates the opportunity for trusted flaggers to flag content based on illegality under national or EU law,” said Van de Kerkhof. “The pressure of DSCs standing behind these flags could have consequences for freedom of expression,” he added.

Globally, said Van de Kerkhof, cases are emerging that show disproportionate use of trusted flagging channels by government authorities. For example, Meta’s Oversight Board overturned a decision in which Facebook removed drill rap videos based on a referral by London’s Metropolitan Police, and asked Meta to be more transparent about the privileged channels it maintains with law enforcement, as such communications could harm freedom of expression. In Missouri v Biden (now before the Supreme Court as Murthy v Missouri), the US Court of Appeals for the Fifth Circuit partially upheld an injunction prohibiting officials from the Department of Health and Human Services from communicating with social media platforms via privileged channels. On the other hand, in Adalah v Cyber Department, the Israeli Supreme Court held that the State Attorney’s Cyber Unit’s use of privileged channels to refer content to internet intermediaries was constitutional and did not violate freedom of expression.

Sharing examples of previous flags from Dutch authorities to online platforms, Van de Kerkhof said that, so far, authorities had mostly requested takedowns of content related to election misinformation. “We don’t have access to requests from all trusted flaggers. For instance, we don’t know how many requests the Dutch police have submitted – the DSA will improve transparency around such requests,” he said. Interestingly, while the transparency reporting templates require platforms to reveal the number of requests from trusted flaggers, they do not allow separate reporting of content moderation requests that come from trusted flaggers affiliated with a government, such as Internet Referral Units.

To address accountability and transparency concerns, Van de Kerkhof suggested adding proportionality assessments to trusted flagger portals and indicating remedy options. “Proportionality is a standard in the European Convention on Human Rights and other content moderation practices. There are more remedies than content removal – labeling, for example. We have seen that trusted flagger requests are not high-volume, so you can build proportionality and remedies into these processes,” he said.

Among other concerns, Van de Kerkhof’s research considers the need for impartiality in the selection of trusted flaggers, as well as how to manage potential biases under the DSA’s framework:

  • Eligibility: Should all entities be eligible for trusted flagger status, or can some be disqualified?
  • Balancing interests: How does the DSC balance the public interest with specific organizations' interests (e.g., a climate-skeptic organization applying to flag climate misinformation)?
  • Redress mechanisms: What recourse exists for organizations denied trusted flagger status or concerned about the selection of non-reputable organizations?
  • Potential biases: Could a state-appointed DSC’s political leanings influence its decisions?

Disproportionate requirements for NGOs

Concerns about potential state interference in the trusted flagger system are not limited to experts like Van de Kerkhof. Civil society organizations, especially those with experience working as trusted flaggers with online platforms, share these anxieties. They fear that close collaboration between trusted flaggers and DSCs across various national and international contexts could be perceived as “state interference,” potentially compromising the entire process.

More recently, at the DSA conference in Amsterdam, civil society participants argued that the proposed reporting obligations for trusted flaggers were disproportionate and could hinder participation. The current reporting requirements, they said, demand significant additional resources (staff, technology) that many NGOs lack, potentially excluding them from becoming trusted flaggers. Additionally, civil society flaggers, especially those representing under-represented groups, often rely on funding from platforms. Under the DSA, such flaggers might face a difficult choice: forgo platform funding or risk losing their status due to potential conflicts of interest. At the conference, a panel of DSCs addressed concerns about these funding challenges. While they clarified that their intention was not to restrict funding, the DSCs stated that trusted flaggers receiving platform funding would need to adopt independent governance structures.

The DSA’s trusted flagger system remains a work in progress. Naomi Appelman and Paddy Leerssen, in their essay on the system, note that the concept of flaggers serves as a “site of contestation between competing interests and legitimacy claims in platform governance.” To ensure trusted flaggers genuinely contribute to DSA compliance without enabling overreach, additional safeguards and sustained investment in the capacity and visibility of flagging entities are needed. Hopefully, future EU guidelines on the trusted flagger program will address these concerns.
