Pentagon PSYOP Scandal Demands an Urgent Debate on Propaganda Ethics

Emma Briant / Oct 27, 2022

It’s time for a public debate about clandestine PSYOPs, argues Dr. Emma L. Briant, a political communication scholar who researches contemporary propaganda and information warfare, and its governance and ethics in an age of mass surveillance.

A collage of social media posts and profile photos removed from Facebook and Twitter.

Last month, the Pentagon launched a sweeping review of its clandestine psychological operations (PSYOP) after reports of pro-Western influence operations on major social media networks. Twitter and Facebook removed two sets of suspected US military accounts that the companies found were violating their policies on “platform manipulation and spam.” Working with ‘portions’ of the campaign data provided by the platforms, Stanford’s Internet Observatory and Graphika then published a report, ‘Unheard Voice: Evaluating five years of pro-Western covert influence operations’. Subsequent media coverage and mounting concern in the White House and some federal agencies have now prompted the Department of Defense (DOD) to review its tactics.

Blatant falsehoods found in the newly released cache – such as claims that the organs of dead Afghan refugees are being harvested in Iran – are being blamed on poor ‘training’ or ‘inadequate oversight of contractors’ in a lucrative industry in which information operations contracts can be worth up to $1 billion for a single company. The Cambridge Analytica scandal, which exposed how inadequate controls over a US government contractor’s PSYOP ‘innovations’ failed to stop their unethical redeployment in politics, prompted calls for a reckoning on the governance of privatized influence firms. The current revelations illustrate an apparent failure to strengthen oversight since then, making the problem all the more urgent.

Renewing Debate

In some ways, these revelations resurrect an old debate over the ethics of emerging military propaganda tools and technology. When policies intended to better coordinate activities and define the use of such tactics were pushed through in 2007-8, they met with resistance: some military critics worried that misleading or covert PSYOP campaigns on the internet could damage the credibility that DOD’s public affairs officers and the State Department carefully nurture through truthful communications. The critics lost the argument, and the new policies went ahead.

The National Defense Authorization Act (NDAA) introduced in 2019 sought to further clarify the chain of command and oversight processes in order to more swiftly enable clandestine U.S. military cyber operations, but it also raised concerns from some legal experts, who suggested that more immediate congressional oversight is needed, particularly for the most sensitive activities. Given the new concern about disinformation and the surveillance technologies that have rapidly emerged over the past decade, it’s beyond time for a more public debate about how democracies govern PSYOP online.

There are signs some public dialogue is underway. DOD’s Strategic Multilayer Assessment (SMA) program, the forum that hosts speakers and assesses challenging problems associated with Pentagon and military planning, announced a relevant session on “Ethics and success in the mind-tech nexus.” (One wonders why, when Russian propaganda systematically leverages gender and targets women, only male experts sit on a key panel leading this vital discussion of ethical response.)

Yet the Pentagon cannot be allowed to check its own homework. A ‘new ethics’ of influence operations for tomorrow’s challenges is badly needed, and its shape is as yet unclear. If a new way forward is to be found and gain public confidence, a diverse range of civil society experts and independent researchers must be able to interrogate the full facts and meaningfully engage in an informed debate over how democratic governments respond to today’s rapidly evolving challenges.

The Platform Releases

The takedown of these pro-Western accounts also raises questions about the even-handedness and transparency of the process by which the tech platforms remove inauthentic campaigns, not least because some of the campaigns removed dated back to 2012 (Twitter) and 2017 (Facebook/Instagram). Why did it take so long to identify, remove and release them? Did the platforms negotiate their removal with the US government? In the Washington Post, Meta’s director for global threat disruption, David Agranovich, is reported to have warned the US military two years ago that it was a problem how easily it ‘got caught’ by the platform – a revealing exchange given what we know about how Meta prioritizes US concerns.

There are also legitimate questions to ask about how tech platforms and the government engage with propaganda and disinformation researchers, including those at universities, and the impact this has on investigations into influence operations. Here, it is important to ask why only a select group of researchers was given access to exclusive portions of the data removed by the social media platforms and described in the ‘Unheard Voice’ report. The granting or refusal of access or data by governments, private firms and platforms is, in essence, the ability to exercise control over the academic field, including its diversity and its debates. Why wasn’t this data made available by the platforms to other experts, particularly those who have previously written on US military influence operations? What might we have gained if a more diverse set of researchers – including from the affected regions – had the same access as Stanford and Graphika? At a time when we need to ensure more researchers receive access to relevant resources and bring diverse new ideas, Graphika is critical of ‘disparate efforts’ and is seeking funds to become a powerful hub, reproducing its own methods and potentially playing a gatekeeping role in distributing data and resources to chosen collaborators.

The exercise of control over access by the platforms may signal an intent to influence how that data is handled, and certainly who gets to handle it. Limited access in this case gives a small group of experts selected by the platforms an outsize role in shaping public understanding of US propaganda, as well as ideas in the scholarly field and future policies. Social media platforms must enforce rules impartially, and public trust is damaged when they fail to act and communicate openly, evenly and clearly. The platforms must, for example, provide clearer explanatory information to researchers when they release a dataset, communicating how complete the ‘portions’ of data provided are. An opaque or uneven process could reduce trust not only in the platforms themselves, but in the Pentagon review and potentially its outcomes.

So What Can We Learn from the Releases?

Many of the fake accounts in the portion of the campaigns released in the ‘Unheard Voice’ report actually echoed the truthful messaging of US-branded public diplomacy. Readers may wonder why democracies bother faking the source behind such accounts. Ironically, democracies use covert tactics widely to try to win the trust of audiences who would doubt the message if it came from government-branded outlets.

Covert interventions may well be necessary in some narrow circumstances. Yet deployed broadly, shady tactics of concealment can actually undermine an otherwise truthful message and reinforce a perception, pushed by the Kremlin, that nothing can be trusted. Finding a better way to engage is not “ceding an entire domain to an adversary,” as one defense official put it. It’s easy to feel that the extreme disregard for ethics of, say, the Russian government must be met with a similar loosening of our own moral limits. But the costs of this type of activity are greater for democracies.

Justifications like this one, which pretend our only options are ‘all or nothing’ in order to argue for the complete abandonment of restraint, may dispel short-term inquiries from journalists, but they are both misleading and lazy. They stop short of answering key questions about what is most ethical and effective, in which circumstances, with which audiences, and why. These are questions DOD should help the academic community and civil society to consider by permitting greater access, information and dialogue. Participating fully in this debate would reassure the public.

It’s also unacceptable for democratic governments simply to hope, arrogantly, that their own militaries don’t get caught. Not only do these newly released accounts display average skillfulness at best; the world won’t be won over by ‘democratic’ AI-generated personas lurking undetected and pumping out obvious foreign policy narratives. While there is a place – particularly during wartime – for highly targeted covert operations focused on an enemy, mass fakery is both ineffective and ethically questionable.

Wherever it is hard for the Kremlin to defeat truth with lies, we see efforts to erode trust in those delivering a message that runs counter to its interests. This includes the use of tools and tactics such as doxing, spyware or hacking against civil society, industry and governments alike. Democracies should respond by building trust in their actions, prioritizing transparency and truthful messaging. Instead, this year the US Army shifted toward flaunting – not explaining – its use of deception, a marked departure from its previous hush around the use of PSYOP.

Democracies engaging in such activities will get exposed and must understand the long-term harm they risk. The point is to build resilience with tactics that reinforce democracies’ strengths. For the sake of public trust, and to protect us from the world’s worst despots, it is essential to work towards an information environment that’s not easily weaponized by any state, including our own.

Militaries do face real challenges in accessing the right networks and generating a sufficient volume of output to sway opinion against those spreading lies or enabling human rights abuses. While it may be politically difficult for democracies in the short term to engage with the uncomfortable truths of what they do in their PSYOP, a mismatch between rhetoric and action is extremely damaging. Explanations after a disclosure like this one often have the appearance of spin and can prompt cries of hypocrisy, which enemies may exploit to stoke divisions. As a result, the US system appears dysfunctional, its values arbitrary, and its military incompetent, contributing to the erosion of trust.

Tomorrow’s Information War

The US is not the only democracy where public outcry has prompted reassessment of policies governing influence activities. The Canadian Forces launched several investigations after reporting in the Ottawa Citizen and academic research revealed a series of scandals, including military “influence” activities aimed at Canadians. Policies have not always kept pace with evolving threats and technologies. In the UK, calls for “innovation and experimentation” accompany statements that “our legal, ethical and moral framework needs updating to deny our adversaries the opportunity to undermine our values.” Yet there is little debate, and little access to clear information, to reassure citizens about how the Ministry of Defence intends to ‘update’ its moral framework for the future.

With the current review underway, the Pentagon has an opportunity to set an example for how democratic militaries around the world should develop ethical, transparent standards for tomorrow’s information wars.

To do this, the Pentagon’s audit must also look beyond yesterday’s challenges; in the future we risk much more. There are important issues such as what can be done about disinformation or incitement within closed WhatsApp groups, for example, particularly as these are the main communication channels in some parts of the world. Social media has also enabled tactics that increasingly involve civilians in the fight, prompting some scholars to argue this amounts to ‘participatory disinformation.’ This type of phenomenon risks blurring the lines that protect non-combatants from being regarded as military targets; civilian cyberwarriors are ill-equipped for the potential consequences.

David R. Shedd, former acting director of the Defense Intelligence Agency, and Ivana Stradner, an advisor to the Foundation for Defense of Democracies, a think tank and lobbying organization, argue the West should adopt “Russia-style” tactics – they suggest leveraging nationalism, at a time of already dangerous resurgent fascism, and manipulating ethnic tensions. Any review should make especially clear the acceptable limits on creating and encouraging social movements that can risk civilian lives, particularly where incentives and coercion are applied, or where it is known that local people will risk persecution for protests and may have limited ability to achieve their goals.

Add to this the prospect of automated cyberwars, with human effects we cannot currently predict. Exponential advances in computing will help AI identify deceptions. They will also increase the speed of, and reduce human control over, influence operations as machines make decisions based on real-time data from the battlefield, from air and space, and from individuals’ online lives, all before a commander has risen in the morning. All this prompts many questions, not least over human accountability when ‘decisions’ result in errors or atrocities; we know AI itself has its flaws and biases.

This debate can’t happen without more scholarship examining the bad and the good of what democratic militaries really do, and without discussion of efforts to further integrate influence capabilities. What, if anything, does the military do if posts from covert proxy media and fake personas are picked up as authentic by Western outlets or by ‘disinformation’ researchers? When would this constitute domestic interference? Fact-checkers are working hard to rebuild public trust in reporting on international conflicts. Militaries dislike revealing what they do or how they do it, but today’s information environment is one where disclosures like these platform takedowns must be reckoned with in the full light of day. While uncomfortable decisions may be made in war, the public needs more candor about why the US military makes the choices it does in the information space, particularly where no war has been declared.

This is a problem faced by all democracies, whose militaries must learn that adapting to tomorrow’s information war is not only about acquiring the latest technologies. It is also about understanding how today’s unique information environment transforms their relationships with citizens and foreign publics, and about recognizing that this raises the stakes if they fail in their commitment to truth, transparency and trust.

Authors

Emma Briant
Dr. Emma L. Briant is a political communication scholar who researches contemporary propaganda and information warfare, and its governance and ethics in an age of mass surveillance. She is Associate Professor of News and Political Communication at Monash University, and is a Fellow at Bard College an...
