Substack Cofounder Defends Commercial Relationships with Nazis
Justin Hendrix / Dec 22, 2023
Justin Hendrix is CEO and Editor of Tech Policy Press. Views expressed here are his own.
Even as antisemitic and anti-Muslim violence is dramatically on the rise, one Silicon Valley company is doubling down on an argument that doing business with Nazis is a good thing. In the view of its founders, helping Nazis build subscription newsletter businesses ultimately helps build trust in society.
Substack, a venture-backed newsletter platform founded in 2017 by Chris Best, Jairaj Sethi, and Hamish McKenzie, styles itself as a bastion of “free speech,” seeking to differentiate its approach from platforms with more substantial content moderation policies. In keeping with its commitment to provide services to those with even the most extreme views, Substack hosts a number of newsletters that enthusiastically advance white supremacist and Nazi ideologies, including some that use “overt Nazi symbols, including the swastika and the sonnenrad, in their logos or in prominent graphics,” according to the journalist Jonathan Katz writing in The Atlantic.
Now, in response to a protest letter signed by nearly 250 Substack writers under the banner “Substackers Against Nazis,” cofounder McKenzie says the company will continue to host and monetize Nazi newsletters.
Chris, Jairaj, and I wanted to let you know that we’ve heard and have been listening to all the views being expressed about how Substack should think about the presence of fringe voices on the platform (and particularly, in this case, Nazi views). ... I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don't think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.
McKenzie’s argument is a straw man. No one thinks banishing Nazis from Substack will make fascism “go away.” But some do think that it is ethically wrong for Substack to do business with Nazis, and for it to distribute and promote Nazi content.
This is not the first time the Substack founders have thought themselves into this particular ethical cul-de-sac. In fact, they have been consistently building this self-serving argument for at least three years.
In 2020, the three founders issued a manifesto of sorts on content moderation, embracing a hands-off approach.
Ultimately, we think the best content moderators are the people who control the communities on Substack: the writers themselves. On our platform, each publication is its own dominion, with readers and commenters who have gathered there through common interests. And readers, in turn, choose which writers to subscribe to and which communities to participate in. As the meta platform, we cannot presume to understand the particularities of any given community or to know what’s best for it.
Those particularities, presumably, include curiosity about “the Jewish question” and promotion of the racist conspiracy known as Great Replacement theory, topics which are presently explored in newsletters published by Substack, according to Katz.
In 2022, the cofounders published another post titled “Society has a trust problem. More censorship will only make it worse.” In it, they wrote:
The more that powerful institutions attempt to control what can and cannot be said in public, the more people there will be who are ready to create alternative narratives about what’s ‘true,’ spurred by a belief that there’s a conspiracy to suppress important information.
Others have pointed out that the company does, in fact, control what can and cannot be said on its platform, since it does not permit pornography. And in his latest missive, McKenzie does allow that Substack’s “content guidelines do have narrowly defined proscriptions, including a clause that prohibits incitements to violence.”
Why Nazism isn’t regarded as fundamentally violent, given that it denies entire groups of people their right to exist, is one point of contention with those leading the protest. As Marisa Kabas, an independent journalist who writes the Substack newsletter The Handbasket and helped draft the “Substackers Against Nazis” letter, pointed out, “If being a literal Nazi who supports Nazi policies and encouragingly posts Nazi imagery isn’t an incitement to violence, then what is?”
McKenzie says a better plan to deal with fascism is to continue to publish and pay the Nazis. Giving them the ability to earn money and distribute their ideas, he reasons, will ultimately help to “strip bad ideas of their power” by exposing them to scrutiny.
This is, of course, patently absurd, as the lawyer Ken White points out today in his own Substack newsletter.
Substack is engaging in transparent puffery when it brands itself as permitting offensive speech because the best way to handle offensive speech is to put it all out there to discuss. It’s simply not true. Substack has made a series of value judgments about which speech to permit and which speech not to permit. Substack would like you to believe that making judgments about content “for the sole purpose of sexual gratification,” or content promoting anorexia, is different than making judgment about Nazi content. In fact, that’s not a neutral, value-free choice. It’s a value judgment by a platform that brands itself as not making value judgments. Substack has decided that Nazis are okay and porn and doxxing isn’t. The fact that Substack is engaging in a common form of free-speech puffery offered by platforms doesn’t make it true.
This echoes a point that Micah Sifry, who publishes the excellent newsletter The Connector on Substack (and who endorsed the Substackers Against Nazis protest letter), made to me last year.
And as much as we bemoan the decline of trust and the rise of polarization, platform owners all discover that they can't be perfectly neutral, and that it's actually a healthy thing to declare your values and moderate content accordingly.
I published Sifry’s comment alongside useful critiques of Substack’s approach to content moderation from the writers Elizabeth Spiers and Bridget Todd. At the time, I think Sifry thought the Substack founders would have to eventually take more of a stand against dangerous ideologues, such as Nazis. But today’s statement suggests that’s not where their logic led them.
Indeed, it appears that McKenzie, Best, and Sethi have again affirmed their values, and are moderating (or choosing not to moderate) accordingly. It’s just that those values justify doing business with Nazis, an obviously “feckless and morally bereft position,” as Cinder cofounder and former Facebook director of Counterterrorism, Dangerous Organizations, and Content Policy Brian Fishman put it.
Whether Substack’s business will suffer remains to be seen. There is nothing technically all that interesting about the platform. Many alternatives exist that permit writers to host, publish, and monetize their newsletters. I’m not sure what its investors see in it. If the various circus acts it has managed to cobble together decide to decamp, there isn’t very much value in the empty tents they leave behind.
But even if the “Substackers Against Nazis” campaign does not itself prompt the company to change its practices, perhaps a catalyzing event of a different sort will lead to change.
"Substack's position on Nazis only makes sense if you ignore the last eight years of world events and tech policy debates," Melissa Ryan, the CEO of Card Strategies, an expert on far-right extremism, and one of the protest letter's signatories told me. "We know from experience that these choices have dangerous, sometimes deadly consequences."
She's not the only one imagining dark scenarios.
“There’s gonna be a moment in 2024 where someone starts some violent shit, and uses their Substack newsletter to do it,” noted Dave Karpf, a George Washington University professor, Substack writer, and recent Tech Policy Press contributor, in a post on the social media platform Bluesky. “That’s when this position will dissolve.”