
EU Releases Guidance for Strengthening Code of Practice on Disinformation

Justin Hendrix / May 26, 2021

The European Commission today published its Guidance on Strengthening the Code of Practice on Disinformation. The Code of Practice, launched in 2018, called on tech platforms to voluntarily commit to participate in a self-regulatory process to combat disinformation in a variety of ways. Under the Code, Facebook, Twitter, Google, Microsoft, Mozilla and TikTok submit regular reports to the European Commission detailing their efforts to monitor and combat disinformation.

The European Commission is the executive branch of the EU.

The new guidance, developed in response to the "significant shortcomings" the Commission identified last year, particularly with regard to COVID-19 disinformation, proposes a number of changes that expand the Code and create new entities and enforcement measures. For instance, it seeks to expand the pool of signatories, with a focus on the advertising industry; require companies to make specific commitments tailored to their platforms; increase the amount of fact-checking activity; and make more data available to researchers. It also calls for stronger efforts to demonetize disinformation on the platforms.

"It's a step in the right direction when considered alongside EDAP (European Democracy Action Plan) and DSA (Digital Services Act)," said James Pamment, a scholar in the Technology and International Affairs Program at the Carnegie Endowment for International Peace who wrote a significant brief for the EU Commission on the code of practice last year. "I'd argue that it is mainly about putting a functional framework in place, and the COP should be viewed for its medium-to-long term potential rather than judged on how it works right now." But, he emphasized, these are "statements of intent rather than solutions," and it will take time to flesh out the details.

Industry will have a voice in those details. According to the Commission's announcement, "the signatories of the Code of Practice should convene to strengthen the Code in line with the Commission’s guidance and present a first draft in autumn."

Select recommendations include:

  • Broadened participation. The guidance suggests new signatories could include "smaller social media or search services," "private messaging services," "the advertising ecosystem," and other organizations relevant to the disinformation space.
  • Demonetizing disinformation. The guidance encourages the adoption of "brand safety tools" and other measures to ensure advertising expenditures do not support the production and distribution of disinformation.
  • Transparency and rules on political advertising. The guidance includes a variety of ideas around political advertising, including concerns on micro-targeting, dealing with disinformation propagated through advertising, and the provision of APIs and tools to allow for the review of political advertising.
  • Common vocabulary on manipulative behavior. The guidance encourages signatory companies to adopt common language around the "manipulative tactics, techniques and procedures" that constitute inauthentic behavior on the platforms, and introduces concerns about AI systems that generate content, referencing the EU's proposed Artificial Intelligence Act.
  • Empowering users. There is a range of proposals to "foster more responsible behavior online" and to provide citizens with tools to combat disinformation. These include media literacy measures; 'safe design' recommendations; a focus on increasing the visibility of reliable information of public interest, such as COVID-19 public health information; and proactive warnings to users who have engaged with disinformation. Perhaps most interesting is a call for companies to "commit to make their recommender systems transparent regarding the criteria used for prioritizing or de-prioritizing information, with the option for users to customize the ranking algorithms."
  • Empowering the fact check community and access to data. The guidance references mechanisms proposed in the Digital Services Act to provide access to data to researchers concerned with "disinformation phenomena," and describes proposed mechanisms for such access. It also seeks to establish a "framework for transparent, open, non-discriminatory cooperation" between companies and researchers in the EU, including a provision to allow the research community to govern how research funds provided by signatory companies are spent.
  • Accountability and enforcement. The guidance imagines new key performance indicators, reporting requirements and more granular assessments of impact. It stipulates a variety of measures to enhance transparency and recommends "a permanent task-force aimed at evolving and adapting the Code in view of technological, societal, market, and legislative developments."
Věra Jourová, Vice-President of the European Commission, and Thierry Breton, European Commissioner, announce the Guidance on Strengthening the Code of Practice on Disinformation
© European Union, 2021

At a press conference, EU officials gave assurances the code is congruent with free expression.

"There should be no one authorized to be the arbiter of the truth, not the platform or its managers nor some state officials, the Commission or some ministries," said Věra Jourová, the Commission's Vice President for Values and Transparency, in a press conference. "I lived in Communist Czechoslavokia and I remember well the functioning and the very bad impact on society of the so-called Ministry of Information. This is not what we want to introduce in Europe."

Jourová said that when the EU Digital Services Act, an omnibus package of rules that creates new accountability, transparency and competition requirements for online platforms, comes into force, it may provide a mechanism for the EU to introduce sanctions against companies "that do not behave responsibly" according to the Code.

