Spotify, Substack, Misinformation and Grift: A Conversation with Bridget Todd & Elizabeth Spiers

Justin Hendrix / Mar 17, 2022

Audio of this conversation is available via your favorite podcast service.

In January of this year, hundreds of doctors, nurses, scientists, and other health professionals wrote an open letter to Spotify calling on the streaming media platform to “implement a misinformation policy” in the wake of controversy over commentary on podcaster Joe Rogan’s December 31st show, including promotion of an anti-vaccine and anti-mandate rally by discredited scientist Robert Malone. The rally, which took place on January 23rd in Washington DC, featured anti-vaccine personalities including Malone, and was attended by conspiracy theorists and far-right extremists.

As a result of mounting criticism, Spotify was forced to issue a set of announcements about what it would do to combat misinformation, including adding content advisories to podcast episodes about Covid and publishing more specific platform rules. To many observers, it was all too familiar: a tech CEO forced to admit the company was woefully unprepared to deal with the fallout from content moderation issues.

Daniel Ek, the company’s cofounder and chief executive, eventually issued a statement saying “We know we have a critical role to play in supporting creator expression while balancing it with the safety of our users.” He went on to say, “in that role, it is important to me that we don’t take on the position of being content censor while also making sure that there are rules in place and consequences for those who violate them.”

But the controversy didn’t stop there. As Rogan drew more scrutiny, his history of making racist statements and hosting far-right personalities came to the fore. Rogan was forced to apologize for dozens and dozens of instances where he used racial slurs, in addition to a range of other racist comments, such as comparing a movie theater in a Black neighborhood in Philadelphia to “Planet of the Apes,” and wondering about the differences between the brains of Black people and white people.

Right around the same time that the Rogan controversy was commanding headlines, the newsletter platform Substack issued a manifesto of sorts on its views on content moderation. Its cofounders used the essay to differentiate themselves from the social media platforms, while pointing out that the company seeks to take a hands-off approach to moderation.

“Ultimately,” the Substack founders wrote, “we think the best content moderators are the people who control the communities on Substack: the writers themselves. On our platform, each publication is its own dominion, with readers and commenters who have gathered there through common interests. And readers, in turn, choose which writers to subscribe to and which communities to participate in. As the meta platform, we cannot presume to understand the particularities of any given community or to know what’s best for it.”

A month later, The Washington Post’s Elizabeth Dwoskin published an article with the headline “Conspiracy theorists, banned on major social networks, connect with audiences on newsletters and podcasts: Newsletter company Substack is making millions off anti-vaccine content, according to estimates.” The article cites research from the nonprofit Center for Countering Digital Hate.

Just before the Post published its story, the Substack cofounders published another post, titled “Society has a trust problem. More censorship will only make it worse.” In it they wrote that “The more that powerful institutions attempt to control what can and cannot be said in public, the more people there will be who are ready to create alternative narratives about what’s ‘true,’ spurred by a belief that there’s a conspiracy to suppress important information.”

Micah Sifry, who publishes the newsletter The Connector on Substack, pointed out to me that

...what's interesting about Substack's statement is that they note, in passing, that they do have rules allowing them to censor content ‘at the extremes,’ which is good. But they don't explain how decisions about such content are made, how users or readers might flag it for attention, the processes by which take-down decisions are made, or how often any of these things (flags, warnings, take-downs) happen. Like it or not, these are all necessary features and all mature platforms have them in place. And as much as we bemoan the decline of trust and the rise of polarization, platform owners all discover that they can't be perfectly neutral, and that it's actually a healthy thing to declare your values and moderate content accordingly. So, while it's good to see Substack's founders wrestling openly with their approach to the issue of platform moderation, I don't think their response is sufficient. If they are indeed making millions from publishers who spread misinformation about vaccines, as the Center for Countering Digital Hate has reported, that only sharpens the question of their own moral responsibilities.

To talk more about these issues of speech, editorial intervention, content moderation, and democracy, I invited two expert commentators, Bridget Todd and Elizabeth Spiers, to the Tech Policy Press podcast.

  • Bridget Todd is the creator and host of the award-winning technology and culture podcast There Are No Girls on the Internet and communications director for UltraViolet, a gender justice advocacy organization working to build a feminist Internet. Bridget wrote a great piece for The Nation, titled “It’s Not Just Joe Rogan. The Entire Digital Space Is Rotten.”
  • Elizabeth Spiers is a writer, NYU journalism school professor, political commentator, and digital strategist. She is also the former editor in chief of The New York Observer and was the founding editor of Gawker. Elizabeth wrote a trio of columns on Medium addressing the issues raised by the Rogan controversy and the Substack statements.

What follows is a lightly edited transcript of our discussion, which took place at the end of February.

Justin Hendrix:

So, we're now more than two months out from the beginnings of this controversy around Joe Rogan and his podcast. His December 31st podcast, I believe, was the one that sparked the latest round of controversy. And both of you have written on the topic about what it tells us about the information ecosystem more broadly. I want to start by getting you to reflect on this debate that's been playing out about Rogan, Spotify, and speech on the internet over the last eight weeks. Bridget, we'll start with you.

Bridget Todd:

My first take is that it is really funny to watch this play out in real time, where these big-deal tech executives at Spotify seem to be really surprised to find out the actual content of the podcast creators they paid millions and millions of dollars to host on their platform. I think for me, my biggest takeaway is that we have unfortunately created a digital media ecosystem where liars, scammers, and people who are trafficking in the most extreme ideologies and lies are the ones who are handsomely rewarded, amplified, and incentivized, and we need to change our media landscape so those are not the folks who are taking up the most oxygen in the room.

Justin Hendrix:

Go ahead, Elizabeth. Do you want to jump in?

Elizabeth Spiers:

So, there are a few reasons why Rogan is in hot water with the general public and not so much with Spotify. The first element is that I think it came to light in December that he had been either wittingly or unwittingly spreading COVID misinformation, when he had a specific guest on who’s a well-known anti-vaxxer. And I think that was the first point at which Spotify got any major blowback for licensing the rights to Rogan’s show. But of course, when you put Rogan’s show under a microscope, there are so many other problems with it. And to echo Bridget, Spotify doesn’t seem to have done very much due diligence on what they were buying. They’ve tried to remedy that a little bit by deleting some of the more problematic episodes where Rogan has said things that were racist and transphobic, but that sort of meager effort has just led to more people really digging into prior episodes.

And, of course, the right has been embracing all of this as a free speech issue, which they've come to sort of define as the ability to say whatever you want, whenever you want, on any platform, which is not what free speech principles are about. They're about whether the government can intervene to censor what you're saying. But nonetheless it's turned Rogan into a kind of cause célèbre for right-wingers who want to be able to frankly say offensive things with no real repercussions.

Justin Hendrix:

And on some level, not just right wingers. I mean, it's been interesting to see who's come out of the woodwork to support Rogan. I don't know if you've kind of noticed that either of you. There have been some significant voices on the left that have defended him or kind of use this as an opportunity to critique the general concern in the media about misinformation.

Elizabeth Spiers:

Yeah. And I sort of understand why there are people who view this sort of after-the-fact editing of Rogan as maybe a slippery slope. They sort of insist that this could be applied to things that are actually useful for society, but I just find that argument disingenuous, because Spotify is a private company; they have terms of service just like everybody else. In the sense that private companies are allowed to choose what they’re going to publish, every private company in existence is a censor in that case. I wrote a thing on Medium just suggesting that Spotify edit Rogan, which is exactly what would happen with any normal corporate podcast, because they’re acting as a publisher when they buy exclusive rights to distribute him. And even that was met with some Rogan fans in my comments yelling at me about free speech, not realizing, for their part, that since I wrote this on Medium, they were in fact publishing their comments on a platform that also has a terms of service agreement and restricts what they can publish.

So, there are people who are seemingly not aware of the ways in which speech is restricted all the time. You can’t say you’re going to kill the president. The government will restrict that. But private companies will restrict things like hate speech. So, it’s not unreasonable for Spotify to say, "We’re acting as a publisher, so we’re responsible on some level for whatever Rogan’s putting out there, and maybe we should edit him." I mean, editing generally makes these shows better anyway. It would have the added benefit of increasing the quality, I think. So, when they throw their hands up and say, "We can’t really do anything about it," that’s a choice. And if they decided that they were going to edit Rogan, there’s plenty of precedent for that. That’s how the most professionalized podcasts work. So, I think the real issue is that Spotify doesn’t want to alienate Rogan’s audience, who sometimes like it when he says racist and transphobic things, or are on board with the COVID misinformation because they believe it.

Justin Hendrix:

Bridget, you've connected this back to the broader information ecosystem and to the incentives in it.

Bridget Todd:

Yeah. I mean, I think Elizabeth is exactly right. First of all, Spotify and Joe Rogan are in hot water. They got in hot water initially over the COVID misinformation, and rightly so, but people have been pointing out the ways that his show has been trafficking in lies about trans folks, people of color, and women for a very long time. And so, it’s interesting to me that COVID misinformation was the thing that really stuck. As someone who does a lot of work in the misinformation space, I think it really goes back to the fact that when you lie, whether it’s lies about COVID or vaccines or lies about trans people and their bodies or women and their bodies, all of it is a threat to public health.

All of it is a form of health misinformation or medical misinformation. And so, I think it really demonstrates that trafficking in lies about people and about our bodies and about science are all kind of connected. They’re sort of a thorny ball of yarn: if you traffic in one, odds are you’re trafficking in another. And so for me, it just demonstrates that we have a lot more work to do to really illustrate to people that, yes, lying about vaccines and COVID is wrong, lying about trans people is wrong, lying about people of color is wrong, lying about women is wrong. Using your platform to amplify those lies is bad business.

And especially for Spotify, incentivizing it, creating a system where you say, "Listen, we have so many interesting, thoughtful creators on our platform. Those people are not going to be amplified as much as somebody like Joe Rogan who traffics in lies. They’re not going to be paid nearly as much as somebody who traffics in lies.” I think that Spotify really needs to take a step back and think about what they are incentivizing and what kind of culture they’re contributing to when they set that kind of precedent. I don’t feel like that’s something that happens in a vacuum, and Spotify is such a big, powerful company that the way that they do business really can have ramifications for the entire audio landscape. And so, I just think that Spotify is trying to play big and play small at the same time. They’re trying to say, "Oh, well, we’re just a little podcast company. We can’t be responsible for editing Joe Rogan. We don’t even know what he publishes. He doesn’t even know what he says half the time," and also try to have this big footprint in the audio landscape. For me, it really can’t be both.

Justin Hendrix:

So, I want to just point out a couple of similarities between the approach that Spotify CEO Daniel Ek has taken and another company that you’ve written about recently as well, Elizabeth: Substack. This idea that silencing Joe is not the answer (which to some extent is a straw man; I don’t think anybody was suggesting that Joe Rogan should be summarily taken off of Spotify or off the internet or otherwise completely silenced) is a vein that runs through this recent statement from Substack executives as well, this idea that censorship is the danger that we have to contend with out there.

Elizabeth Spiers:

I thought the Substack statement was even more absurd than what Spotify said. Substack’s PR person did a long Twitter thread defending their decision to leave controversial stuff on the platform. And they made an argument that just kind of defies credulity: that by having misinformation on the platform, it somehow increases reader trust, because you’re giving the readers some kind of option to determine whether something’s true. Having worked in media for two decades now, I mean, we have seen a big erosion in trust of the media, but it’s very much tracked with distrust of institutions generally, and we see that as a big problem. People don’t trust the government, they don’t trust scientists. And part of what the right has done strategically is to erode confidence in these institutions. So, when you look at the problem of people not trusting the media, it’s not because people don’t have access to a variety of types of information or a variety of viewpoints, it’s because they just don’t trust institutional media categorically.

So, in my experience, people distrust the media in a very sincere way if they think the media is lying to them or getting stuff wrong. So, the idea that letting people get stuff wrong on Substack somehow enhances trust seems not just disingenuous to me, but backwards.

Justin Hendrix:

So, both Ek and the Substack folks seem to have this idea around the nature and the quality of debate. The Substack folks write, “We prefer a contest of ideas. We believe dissent and debate is important. We celebrate nonconformity.” Ek wrote, "Looking at this issue more broadly, it's critical thinking and open debate that powers real and necessary progress." I'm sure there are a lot of folks listening to this who would think that sounds very, very reasonable. What specifically is wrong with that line of reasoning, in your point of view?

Elizabeth Spiers:

I mean, I think when we talk about productive debate, there's a sort of assumption embedded in that, that everybody's using facts that are true. And the whole problem with misinformation or disinformation is there's no guarantee that people really are capable of always triangulating on what's true and what isn't. So, if you're operating in a sort of dialogue where one side is just misrepresenting reality, it's not really a debate. You're litigating something, but it's certainly not constructive. And if the falsehoods become kinds of memes and you end up spreading more lies than you began with, I think it's even worse: you become a kind of vector for more disinformation. And that just has nothing to do with constructive debate.

Bridget Todd:

I'd love to add something to that. One clip that really just sticks with me from Rogan is this kind of casual way where he has a mixed race guest on, and he's like, "Oh, one of your parents is Black, one of your parents is white. So, you have the brain of a Black man in the body of a white man." And he's like, "Black brain is a different kind of brain." And he says it really casually and there's absolutely no pushback. And when people say you should be responding to this with open debate in a marketplace of ideas, I'm a Black woman. If we're starting off a conversation about whether or not I have a different brain in my head, that my physical makeup of my ability to understand and reason is at a different level, what kind of debate can we actually have?

Exactly as Elizabeth said, we're not having a conversation where we can have a debate, because we're just not starting in the same place. And so, with that example, how is one supposed to counter that? If that's our starting place, how can we even get to common ground, or to a place where a debate is going to be fruitful or effective, or even a good use of my time, if we're starting from a position where you don't even think I have the same brain as you?

Elizabeth Spiers:

Yeah. There are certain things that should not be up for debate. And the essential, full humanity of all people is one of them. There shouldn't be a debate about whether trans people are allowed to exist. The way people define what constitutes acceptable discourse really varies, and it varies heavily politically. The right sort of wants a scenario where anything is acceptable discourse, especially when it comes to potentially disenfranchising women and minorities. It's a kind of permissiveness that just has no real constructive point. It doesn't lead to a healthier marketplace of ideas. All it does is generally harm people. And I do think that has to be taken into consideration when you're thinking about where to draw lines. And there are some areas where there's not a hard line, and that's why companies like Spotify opt first for just saying, "We're not going to moderate. We're not going to make these decisions," because eventually they'll bump up against one where there's some gray area and people will disagree all over the place.

But if you're going to act as a publisher, then you know what, that's part of the responsibility that comes with it.

Justin Hendrix:

So, some folks who have waded into this debate have talked about the kind of changing nature of veracity itself: the fact that one idea can go from fringe conspiracy theory to accepted fact, or at least entertained possibility, over a period of time. I noticed that in Matt Yglesias' essay about the problem of misinformation, “The misinformation problem seems like misinformation,” he addresses Rogan. He talks about how some of what seemed like fringe ideas about COVID ended up turning into things that people would consider more closely. So, for instance, about severe side effects of vaccines and things of that nature.

So, what he's saying is that there's some value to what may seem like misinformation, that when you try to stamp that out, that to some extent you might be setting yourself up for ultimately failure. I don't know. How do you think of that line of argument when it comes to someone like Rogan?

Elizabeth Spiers:

I think it's a little bit of a red herring because the things that people are objecting to are not claims he made over what we all know and understand to be evolving science. But when he hosted Robert Malone and Malone said something like, "I believe our elected officials are hypnotizing everyone and what we're looking at is just a kind of mass delusion." A reasonable person would hear that and say that's nonsense and Rogan kind of entertains the possibility that it's true. And so, it's one thing to kind of be reasonably skeptical that we don't know everything yet and that science is always evolving, but when you hear something that's so insane that any reasonable person understands that it's not true and Rogan doesn't push back, that's a problem because he's sort of implicitly endorsing it when he nods along or suggests that it's plausible. And that's very different from having a guest on who says something crazy and you actually push back and say, "No, that's not true."

Bridget Todd:

I've seen that argument so often in the Rogan conversation: that he doesn't say these things, that he'll have a guest on and the guest will say it, and that he's not acting as the expert, not saying to 'trust me.' But as a podcaster, you decide who you want to have on your platform. When someone says something on the show, you are making a choice whether to keep it in or cut it out. That's just how the medium works. And so, this idea that it's an interview show, he's not the expert, he's not positioning himself as the expert... by putting somebody on your platform, you are kind of endorsing what they have to say, especially if they say something that is completely off the wall and you just nod and move on instead of challenging it, or even pushing back in the slightest, which he almost never does. I think that is a real problem.

And I also think about this idea of the evolving truth... he talks about things that maybe at some point would've been considered a conspiracy theory, but science changes. It's like the issue where even a stopped clock is right twice a day, right? If I ranted about a topic that I have no idea about, I might stumble on a few things that were accurate, right? But that doesn't mean it's valuable when the other 98% of what I said is complete BS, just because I stumbled on a couple things that maybe have some truth. That doesn't mean the entire thing is valuable or productive or a healthy way to think about discourse.

Justin Hendrix:

So, I want to switch gears in the conversation a little bit to look out into the future. We've got another election cycle that's about to heat up in the US around the midterms. I've been looking a lot at the polls around January 6th and the 2020 election and the extent to which folks in the Republican Party in particular have embraced the idea that the election was stolen. Those views seem to be very persistent. They've held up more than a year since the insurrection. And to some extent that's because political elites and media on the right have continued to promulgate those ideas. Not always exactly the idea that the election was stolen, but perhaps ideas that are adjacent to it, that we should seriously consider the possibility that there was widespread fraud or that Democrats are attempting to introduce reforms because they intend to perpetuate or to execute another fraud in the upcoming cycle. Do you see a relationship between this Rogan saga and the upcoming election cycle and how we'll have to think about how to handle things like the Big Lie?

Elizabeth Spiers:

Well, I think a lot of Democrats in leadership really underestimate the extent to which right-wing media shapes perception of Democrats at large. There's an assumption that you can sort of persuade a very partisan Republican that Democrats can do good things for them, but you sort of can't if people are already in a media ecosystem that relentlessly tells them that Democrats are out to get them, or that they can't be trusted. You're just not in a position to have any kind of constructive debate where you come to some common ground in that scenario. So, we have a problem where leadership acknowledges that the Democratic brand is in trouble, but they kind of underestimate the extent to which that's not a good faith thing that happened. It's an intentional strategy on the part of right-wing media, and it's very difficult to deprogram.

Bridget Todd:

Yeah, I completely agree. And I would also add that I think that is the point of this kind of disingenuousness and these bad faith attacks. It gums up the works of legitimate discourse, so you can't even have a substantive conversation about the issues. I'm a Democrat, but I have plenty of issues with the way the last few years have gone. And I can't even have a substantive conversation about that, because the gears of discourse have become so gummed up that we can't have legitimate conversations anymore about actual issues and get to common ground and move forward. And I think bad actors and hucksters and liars and people who profit off of those kinds of things, that's what they're counting on. That benefits them. And so, it's really sad that we've gotten to the point where even people who would say, "I'm a reasonable person who's interested in a debate about the issues," don't see the ways that that is no longer really possible, and that bad actors are profiting from its being impossible.

Elizabeth Spiers:

I mean, this is maybe a minor point, but the whole dialogue about whether Rogan has his own free speech rights is really depressing to me because it's a symptom of how little civics education we have in this country. The fact that otherwise intelligent people don't seem to understand that free speech applies to how the government intervenes and not how people make everyday decisions to edit things or curate content is depressing. But things like this really bring that to the forefront. And yeah, that could be said of a million other civic issues, but it points to a deficiency we have in not just our educational system, but the way we talk to the electorate, I think.

Bridget Todd:

That was such an interesting thing to watch play out, especially when those episodes of his show were removed and Spotify put out a statement basically saying that Joe decided to remove them. And yet you still had people, even his own guest whose episode had been removed, saying, "Oh, I'm being censored." And it's like, well, if Joe Rogan decides he doesn't want to have this episode up anymore and takes it down, is that censorship? I wish I could sit down with these people and really get to the bottom of what they think censorship is, what they think is an attack on their free speech rights. I think that you're absolutely right that it points to a real deep failure somewhere along the way in our media literacy and our civic education.

Justin Hendrix:

Haven't we kind of gotten to the point, though, where being "censored or canceled" is a kind of badge for people? I mean, you have people like Dan Bongino who seem to really court getting removed from YouTube or having Google take action against his site. It seemed he really wanted that, because it was actually part of a promotional campaign for his other activities and his efforts on other platforms. Is all of this just part of the business, part of the theater?

Elizabeth Spiers:

I think for the right it is, because part of what's going on there, in terms of internal logic, is: if I can say whatever I want, no matter how offensive, then I'm exercising a type of power. When I harm other people, I'm exercising a type of power. They'll never come right out and say that, but that's sort of what they're doing. They say, "Well, I want to be able to say trans people aren't human," or whatever, and then wait to get the blowback, and then they sort of hold it up as a badge of honor, just to note that, "Well, I did it and I got away with it."

But how many times has anybody actually been canceled, especially on the right? A lot of their audience is really built around the idea that they can say offensive things. That was a common thing that right-wingers said about Trump. They would say, "Oh, well, he speaks his mind." What they really mean is that he's willing to say things that are outside of the realm of acceptable discourse and that's part of their brand.

Bridget Todd:

I would kill to be canceled in this way, to get a Netflix first-look deal, a cushy speaking tour, hundreds of millions of dollars to speak my mind. If that's what being censored is, sign me up; I would love that. Yeah. I mean, I think at this point it's just blatant marketing. And people who cry about having been canceled are, I think, no longer even pretending that it's anything other than a marketing strategy. So, I completely agree. And it is interesting that I don't think we have that same kind of culture on the left. I don't think that you could make this grand claim of being silenced or suppressed or censored and have that boost your sales or boost your platform.

But I think on the right, it's clear that it's an actual strategy. And I guess that's what I mean when I say that we currently have a media landscape that incentivizes nonsense; it incentivizes grifting and lying and being the loudest voice in the room, and we all lose when that's the case. When someone who is a grifter or a bad actor knows they can get eyeballs and ear holes and money by grifting, that's a problem, and I think that we have that. All Dan Bongino has to do is continue to say that he is being silenced, and that will increase his platform and build his brand, and that's a real problem.

Justin Hendrix:

To some extent, that might be the through line among all those guests that Rogan has had to strike from his past catalog. It's not just the racism; typically, it's the fact that those individuals are to some extent monetizing their various grievances.

Bridget Todd:

Absolutely. I did an interview with the technologist Ifeoma Ozoma, who used to work at Pinterest. She's the woman who really spearheaded getting medical misinformation pulled from Pinterest, way before COVID, early on. And one of the things that she told me is that nine times out of ten, somebody who is pushing medical misinformation on social media platforms is doing it because they're selling supplements or some other thing, where they say, "Oh, the vaccine is not safe. But coincidentally, you know what is safe? This nonsense supplement that I am selling on my website." And so, these people are scammers and hucksters and grifters. We've gotten to this place where we are legitimizing and mainstreaming them, and having these conversations about the marketplace of ideas gets us away from the reality that nine times out of ten, these people are just scammers.

Justin Hendrix:

And maybe that's a good place to stop. Thank you very much.

Elizabeth Spiers:

Thank you for having me.

Bridget Todd:

Yeah, this is great. Thanks so much.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Inno...