From Cambridge Analytica to Tenet Media: What Will it Take for the US to Regulate Influence Firms?
Emma Briant / Sep 24, 2024

On September 4, the US Department of Justice (DOJ) announced the indictment of two employees of RT, the Russian state-controlled media outlet, who allegedly engaged in a “$10 million scheme to create and distribute content to U.S. audiences with hidden Russian government messaging,” according to Attorney General Merrick B. Garland. DOJ alleged that RT secretly used a firm based in Tennessee called Tenet Media to pay far-right influencers to produce and distribute videos “in furtherance of Russian interests.”
From 2016 to 2021, the Kremlin disclosed spending more than $182 million on lobbying, foreign influence operations, and propaganda to advance Russian goals in the US, according to Open Secrets. Now that spending has gone underground. Funneling money and narratives to unethical distributors may be an old-school intelligence technique, but it has flourished as a digital industry because governments failed to fully investigate or quickly regulate the digital influence industry when they had the opportunity. That pressing opportunity came, and was lost, in the wake of the Cambridge Analytica scandal.
Digital Influence Mercenaries
The digital influence mercenaries Russia is using to attack elections aren’t your usual marketing firms. As explained in my recent book–Routledge Handbook of the Influence Industry, co-edited with Vian Bakir–Russia and other malign actors have incentivized a network of connected influence industries, wielding techniques as diverse as they are shady. These digital influence mercenaries sprang up to offer coercive and covert methods that meet the demand of unethical clients, and to take advantage of the economics and affordances of social media.
Obscurity and obfuscation may often be built into the business model. The Tenet Media example represents a kind of Russian operation designed to give anyone receiving the cash deniability and protection. For example, there were efforts to mask the true source of funding behind a fake European investor, “Eduard Grigoriann.” The indictment does not accuse the influencers themselves of wrongdoing. Some of them, including Tim Pool, who has been called a January 6th election disinformation ‘super-spreader,’ put out statements claiming they were ‘victims’ duped by Russian deceptions. But it appears clear they did little to scrutinize the source of the funds.
In 2018, I played a role in helping expose the Cambridge Analytica scandal, a series of revelations pertaining to the now defunct firm employed by former President Donald Trump’s campaign in 2016. I made recommendations to the US Senate Intelligence Committee and British Parliament that stressed the importance of regulation of influence firms to give stronger oversight as “networks of companies cannot obscure unethical practices, flows of data, financial interests or possible conflicts of interest with foreign powers – all concerns raised in the Cambridge Analytica scandal.” And indeed, the 2020 final report of the US Senate Intelligence Committee’s investigation into Russian Active Measures Campaigns and Interference in the 2016 US Election found that Cambridge Analytica “had a degree of intersection with and proximity to Russia, and specifically Russia's intelligence services.”
To me, tactics described in this month’s DOJ indictment were all too familiar, and the lack of action to curb the influence industry’s bad actors all the more disappointing.
What Should be Done?
Lawmakers and regulators have been slow to take the action necessary to prevent future Tenet Media or Cambridge Analytica-like threats.
More must be done. Regulating influence firms to curb their worst excesses is complex, but very achievable. It rests in part on industry-specific regulations like industry licensing, privacy and transparency, and in part on anti-corruption measures.
1. Industry Licensing
There are already recognized codes of ethical conduct in some promotional industries–PR for example–yet little to ensure such standards across the increasingly diverse and networked influence industries that proliferate in our digital age. Policies regulating particular techniques or practices fall short of responding to the complex, multi-layered, adaptive deception that can be built into business models, business practices or project designs involving, sometimes, chains of suppliers of unique services. As I have argued elsewhere, regulation or professional licensing, of the kind seen in many other professions, is also necessary for influence industry firms–whether those like Tenet Media that aim to finance and coordinate a strategically aligned army of political influencers, or like Cambridge Analytica, which provided data analytics or messaging for campaigns. This would apply to companies coordinating or funding deliberate influence operations, across multiple outlets and/or data sources. Professional industry licensing could be revoked on evidence of serious violations of ethical codes of conduct.
2. Transparency and Anti-Corruption Measures
In many ways, influencer marketing remains a lightly regulated space within a wider industry. Foreign influencers from countries like China have proliferated by lacing propaganda into content such as food and lifestyle videos. The issue of illicit talking points or funding raises particular problems where influencers present themselves as domestic journalists or independent voices of ‘free speech.’ The Federal Trade Commission (FTC) requires influencers to declare sponsorships for products. Yet there remains little transparency in politics, enabling a kickback culture that now threatens, arguably, the most consequential election in US history. As others have argued, the FEC should require influencer transparency. But in reality, dark money is the old problem driving the murky economy behind our influencer age. Efforts to shed sunlight on money flows in US politics have never been more vital, yet proposed transparency legislation, such as the DISCLOSE Act, is bitterly resisted.
Of course, new US regulations do require ‘Beneficial Ownership Information’ for most companies, designed to disrupt money laundering and other illicit activities. But it remains to be seen how effective this will be for transparency and accountability. As has been observed in the context of lobbying, the piecemeal nature of regulation can mean "interest groups employ strategies selectively from an integrated toolkit, to take advantage of policy blind spots." This could be improved with greater cross-organizational government coordination (or integration). The two indicted Russian nationals are accused of violating the Foreign Agents Registration Act, in addition to money laundering. But the reality is that penalties for foreign influence operations remain an inadequate deterrent–especially when one considers the high stakes of swaying the US election to Trump, which could halt aid to Ukraine and change the course of Russia’s war. The maximum penalty for the most egregious FARA violation is five years in prison or a financial penalty of up to $250,000–not exactly a dealbreaker for the Russians. Penalties like these should be raised to a point where they disincentivize foreign influence.
3. Privacy, Algorithms and Social Media Power
Of course, Russia would have been unlikely to target these influencers had they not already amassed millions of followers by spreading highly engaging conspiratorial disinformation narratives across social media. The algorithms of such platforms distort public debate. They promote the content that makes the platform the most money from engagement–likes, shares, eyeballs, data harvested for AI–while the ‘free speech’ of less profitable users ebbs into digital obscurity. New regulations should aim to completely decouple the economy of social media from “likes” and user engagement, require AI to be opt-in, and prevent or disrupt the use of audience engagement data for recommendation algorithms or targeted advertising.
Unfortunately the progress of US privacy legislation remains sluggish and cowed by Big Tech elites lobbying on AI. The reality is that the consolidation of data and financial power in the hands of tech oligarchs, is the greatest risk we face in terms of our vulnerability to mass social engineering, undisclosed interests and manipulation. Digital despots reign in totalitarian ways. Making no disclosures of their own financial interests or allegiances (foreign or otherwise), and with no incentive to explain their decisions, these elites craft our digital landscape like all-seeing data gods. For example, it was only through reluctantly unsealed court documents that we finally learned that the investors behind X's acquisition included Sean 'Diddy' Combs and Saudi Prince Alwaleed bin Talal al Saud - a level of transparency we should have around the ownership of such a significant communications platform by right.
Conclusion
It’s too late to put these measures into place ahead of the 2024 US election. But given the history of the past eight years, it’s clear that future elections will invite the same kinds of exploits by foreign governments, likely including Russia, Iran, China, and perhaps others. Lawmakers and regulators should connect the dots from Cambridge Analytica to Tenet Media and see where the trendline is heading. The time to act is now.
And ultimately there is a limit to what regulation, whether industry- or government-administered, can or should do with respect to misleading voices. Other schemes–such as one considered in India, for example–which require individual influencers themselves to register as broadcasters for government regulation, raise issues of freedom of expression. There is a huge responsibility for educators, journalists, and other civil society organizations to help address these problems. While there has long been a debate over what journalism is, we are not sufficiently educating the public about a far clearer concept–what journalism is not. There is no speech less ‘free’ than that which is secretly paid for by an authoritarian state to meddle in a democracy. That doesn’t mean all such speech should or can be banned and removed, but we certainly must be better prepared to discern it from journalism.