
Unpacking New Mexico's Complaint Against Snap Inc.

Justin Hendrix / Oct 6, 2024

Audio of this conversation is available via your favorite podcast service.

Last week, Wall Street Journal technology reporter Jeff Horwitz first reported on details of an unredacted version of a complaint against Snap brought by New Mexico Attorney General Raúl Torrez. I spoke to Horwitz about its details and the questions it leaves unanswered.

What follows is a lightly edited transcript of the discussion.

Evan Spiegel (Senate testimony):

Chairman Durbin, Ranking Member Graham and members of the committee, thank you for convening this hearing and for moving forward important legislation to protect children online. I'm Evan Spiegel, the co-founder and CEO of Snap. We created Snapchat, an online service that is used by more than 800 million people worldwide to communicate with their friends and family. I know that many of you have been working to protect children online since before Snapchat was created and we are grateful for your long-term dedication to this cause and your willingness to work together to help keep our community safe. I want to acknowledge the survivors of online harms and the families who are here today who have suffered the loss of a loved one. Words cannot begin to express the profound sorrow I feel that a service we designed to bring people happiness and joy has been abused to cause harm. I want to be clear that we understand our responsibility to keep our community safe…

Justin Hendrix:

That was Snap founder and CEO Evan Spiegel during a hearing on "Big Tech and the Online Child Exploitation Crisis" held Wednesday, January 31st, by the Senate Judiciary Committee. This week, the Wall Street Journal first reported details of an unredacted version of a lawsuit filed by New Mexico Attorney General Raúl Torrez. The suit argues that the design and implementation of features on Snapchat make it one of the most pernicious purveyors of child sexual abuse material and harm-inducing features on children's electronic devices.

As Tech Policy Press staff writer Gabby Miller reported when the redacted lawsuit was first published in September, New Mexico says Snapchat was designed to attract and addict young people, that it openly fosters and promotes illicit sexual material involving children, and that it facilitates sextortion and the trafficking of children, drugs, and guns. These phenomena, combined with the company allegedly misleading the public on the safety and design of its product, amount to a public nuisance in violation of the New Mexico Unfair Practices Act, according to the 164-page filing.

New Mexico Attorney General Raúl Torrez appeared on the Tech Policy Press podcast just a couple of weeks before the Snap lawsuit was made public to discuss a similar complaint his office is pursuing against Meta. Here's what he said when I asked him if we could expect more activity from his office on social media and child online safety matters.

Attorney General Raúl Torrez:

The boundary between an online encounter and a real encounter with a predator is paper thin, and I think we've gotten into a place where we somehow treat technology differently, right? It's a technology company, so somehow we're going to apply a different set of rules, a different set of standards, and suddenly we're going to start focusing in on, and using as a stalking horse, questions about the First Amendment and all these other things. The way I like to describe it is: pretend for a moment that rather than talking about an application on your phone, right here in DC, across the street, there was a warehouse, and in that warehouse there were a series of stalls where there were children and there were images of child pornography, and that was a known location for pedophiles to show up in person every single day and sort through the children that they wanted to target and try to identify the people that they wanted to target. You and I would not be having a conversation about whether or not I should file a civil action that takes five years, what its dimensions are, and what the First Amendment implications are. As a human being, you would not ask a person in my position to treat it in that way.

You would demand that I call the police department, that we walk into that warehouse, that we put people in handcuffs, that we deal with it in a criminal manner. And you would also demand more information about who owns that building: what did they know, were they aware, are they leasing this space, creating this space, fully aware that this activity is going on? And if that were happening and I brought a criminal indictment against that person, you wouldn't have a second thought about it, and you wouldn't come to me and say, "Maybe he has a First Amendment right to hang a sign on the wall and it should be protected speech or protected activity." But somehow, because somebody has a computer science degree and wears sneakers to work and goes to work in Menlo Park, I'm supposed to treat them differently. I don't think so.

Justin Hendrix:

To learn more about the new claims revealed in the unredacted filing, I spoke to Wall Street Journal reporter Jeff Horwitz, who helped lead the team that first reported on the Facebook Files and who has been covering child online safety issues, particularly at Meta, for some time now.

Jeff Horwitz:

So I'm Jeff Horwitz. I am a technology reporter for the Wall Street Journal out of San Francisco.

Justin Hendrix:

Jeff, this week you brought us news of an unredacted version of a lawsuit brought against Snap, the operator of Snapchat, by the New Mexico Attorney General. We had already seen the lawsuit, we had reported on it at Tech Policy Press, and it was already pretty bad, but this version is "extra," as I think the kids are saying these days. You reported in the Wall Street Journal that the state of New Mexico accused Snap of giving priority to growth over safety and failing to effectively address or disclose design features that make its platform a haven for various types of predators. Giving priority to growth over safety: this is becoming a pattern in terms of what you tend to look at when it comes to social media.

Jeff Horwitz:

It's, look, it's a familiar tension. Obviously the same could generally be said of some of the work we did related to the Facebook Files and the internal documents there. Look, I think New Mexico has taken a fairly aggressive stance. They previously sued Meta on child safety issues, looking at how specific product mechanics might be harmful to kids, which I think is interesting. They're trying to figure out a way around Section 230, which an entire wave of litigation from states is trying to do, and they seem to be focusing very specifically on safety, and safety in the physical slash protection-from-predation sense.

Justin Hendrix:

In your reporting, you focused on a handful of the claims that are in this lawsuit, or in the unredacted version of this lawsuit. Which stand out to you as the most concerning?

Jeff Horwitz:

There's a little bit of everything. There's stuff about guns, there's stuff about drugs, there's stuff about recommended content from strangers. Those generally seemed like they were just echoes of things that had come up in previous investigations; I've spent a good part of the last year doing child safety stuff related to Meta. So the thing that I thought was really notable about this lawsuit was that it seemed to basically allege that the failure mode of Snapchat is peer-to-peer abuse, usually involving predation and maybe sometimes involving drug dealing. While Snapchat has some of the problems of other platforms in terms of connecting you to people who are not great, slash serving up content that isn't great, there's really something different that makes the platform almost uniquely vulnerable to predation.

Justin Hendrix:

Do we get any insight into what that something different is?

Jeff Horwitz:

Yeah, I think there's… to some degree there's an argument to be made, if you're just reading between the lines of the suit, that Snap is a victim of its own success at convincing young users that it is in fact a more informal, casual, less permanent form of social media. They have highlighted from the very beginning that messages don't have to be forever, and that this is for communicating peer to peer more than it is for broadcasting life updates to massive audiences of strangers, and I think there is a certain intuitive logic there. Let's put it this way: I have never been much for sending nudes of myself, but I can understand why just the mechanics of a more private platform might encourage that sensibility. And so, look, everyone's trying to figure out how to get around Section 230, and I think part of the way that New Mexico is trying to do that is to make the case that Snap understood that its product was higher risk than it publicly let on, and that therefore this is a disclosure and product safety liability case, not a Section 230-related issue.

Justin Hendrix:

And so that puts it squarely in the same category as New Mexico's prior lawsuit against Meta, and there are a lot of claims in here that match up to that conclusion. We've got this idea that Snap's own internal analysis found thousands and thousands of reports of sextortion, but that the tally likely represents a small fraction of this abuse, according to the internal team looking at the question. You've got this Snap employee talking about how they've twiddled their thumbs and wrung their hands all year with regard to their response to sextortion. You've got various other kinds of back and forths over the extent to which the company has, both technically and from a design perspective, addressed harms like child sexual abuse material. This doesn't look great for Snap.

Jeff Horwitz:

No, I mean, and I wouldn't necessarily expect it to. I think, look, this is something that maybe I'm a little more sensitive to just given that I spent a lot of time with internal Meta documents: you would expect, and Snap made this point to me, that in the process of trying to deal with something like child predation, there would be self-criticism and concerns raised, and in those sorts of discussions there's always going to be something that looks pretty bad if you're looking, even if it's "holy hell, I can't believe we're facilitating something at this scale." That's a pretty damning quote, but at the same time it's what you'd want to see. Again, there's definitely some heavier criticism levied by internal safety staff, in terms of "we've been twiddling our thumbs," and there were a lot of expletives associated with that in the direct quote.

But I think the thing that really sticks out to me is the question of product design and disclosure, and public disclosure. I think there were a few things about, say, Snap Map. Snap publicly said this can't be used by strangers to find kids, which is a pretty obvious concern. It turns out, per New Mexico, there might have been a few exceptions to that, that in fact there were ways you could use it for that. Likewise, New Mexico alleges that Snap misled users as to how private the ephemeral messages were and how successful and comprehensive its in-app alert when someone took a screenshot would be. Obviously, if someone takes the screenshot and then you get the alert after the fact, they've still got the screenshot, you can't pull it back then. But there are obviously also other ways to record a supposedly ephemeral message that don't trigger that alert, whether it's third-party apps or just snapping a picture of the other phone's screen that has it on it. And the allegation by New Mexico is that Snap understood this stuff was being used for sexting and that there was a high risk here, but just didn't want to scare people.

Justin Hendrix:

One of the things that I found most concerning in your report and in the unredacted complaint was this idea of employees discussing that "by design over 90% of account-level reports are ignored today." That's a quote from Snap employees. And then this idea that their own internal investigation of confirmed sextortion cases concluded that 70% of victims didn't report their victimization, and of the 30% that did report, "there was no enforcement action by our team." It seems like a number of very significant failures.

Jeff Horwitz:

Oh yeah, no arguing with that. Again, I'm a little jaded. I spent a lot of time with some of Meta's enforcement systems, and that's a company that has functionally infinite cash as opposed to Snap, and many of the same issues have arisen in the past with them. We ran investigations related to them not hitting the bottom of their CSAM reporting queue; they didn't even look at some child abuse reports. And the outcomes of large-scale moderation systems very rarely look acceptable for any platform; it just seems to be something the economics don't support doing well, almost for anyone. That said, yeah, there are definitely some large-scale moderation failures. There were the 279 reports of abuse that they studied, where, of the 30% of victims that had reported, Snap functionally hadn't dealt with any of them. There was also one account cited that had 75 apparently child safety and child predation related reports filed against it by users, and it was still live.

This is obviously very bad, and I think, again, there's a recognition cited by some of the staffers internally that this isn't going well, and a sense that some of Snap's protections are not what they should be. I think one that really stuck out, and this is something New Mexico tested, was the recommended-accounts-you-should-follow feature, whose particular Snap name I've forgotten, but basically "here are all the other people who you might know." In 2022 they released a statement saying, hey, we've made it so that kids aren't going to get recommended to stranger adults and vice versa; unless there are serious ties to their social circle, we won't make those recommendations. New Mexico's tests certainly seem to suggest that is not ironclad, and I think, also very notably, Snap's own tests in 2023 indicated it was not working right, that it was still recommending kids to stranger adults and vice versa in ways that were pretty high risk.

I think a lot of this comes down to the user base as well: Snap's core user base is very young and unusually vulnerable. There's obviously going to be extra incentive for people who are interested in sextortion to try to move kids onto this app, and that's something that we've seen. Thorn's analysis of NCMEC reports about sextortion, a few months back, found that Snap was the number one platform to go to from other platforms for sextortion purposes, right? It's familiar, it feels private, and to some degree that almost works against it in this particular problem set.

Justin Hendrix:

So another thing that I think you're probably better placed than most to talk about is this kind of tension between the trust and safety operation and the CEO or the executive level there at Snap. My read on this document is that there's perhaps not the same level of egregious behavior by the CEO with regard to these problems as there might have been revealed in the Facebook Files, but still there seems to be a mismatch, of course, between the incentives of safety and whatever forces are motivating Evan Spiegel.

Jeff Horwitz:

The Meta suit that was filed previously by New Mexico did name Mark Zuckerberg directly when it was first filed. This one doesn't [name Spiegel]. That said, there are plenty of references to Evan in the suit, most of them about Evan saying, look, we haven't made the same extremely public product feature and stranger content recommendation decisions that other platforms have. We're not TikTok, we're not Instagram, and we feel really good about that. So it's a lot of him hyping up Snapchat's relative virtues for safety, privacy, et cetera. And there are a few references to some past comments from him, but some of them seem really old, so I'll be curious to see if more comes out on that front. I don't know that we really get into Evan's approach, but I do think that the elephant in the room on some of these things is Snap's financial picture and their business model, which to some degree is the thing at the product's core, which is peer-to-peer communication.

There's not a great history of monetizing that, so when you see things that seem like they may be carve-outs to the "we really want you just to be connecting with people" front, such as the creation by Snap of QR codes that you could leave on other social networks to strike up a direct connection on Snap, these have a bit of growth mindset baked into them. And I think there's a question about whether that is higher risk for Snap, given its consumer demographics and emphasis on private stuff, whether those cross-platform links are uniquely dangerous in a way that a link from TikTok to Instagram or vice versa might not be.

Justin Hendrix:

I'll ask you about one specific part of the unredacted suit that really concerned me, and I was trying to get my head around it, which is point number 106 in the suit, which is about CSAM images and whether Snap had synced its database of CSAM images utilizing PhotoDNA in order to be able to identify illegal content. There's this suggestion in here that the company had in fact not done so, that they had not implemented a PhotoDNA update, that they had essentially for two years failed to put in place the technical capacity to make these matches. What do you make of that and the conversation around it?

Jeff Horwitz:

I'd love to know more about the timeframe of that. Obviously the way you would like documents to read, if you are the company, is that you always were doing everything you could at all possible times. That said, on questions of photo matching and whether or not you would store your own database of content that had been detected on your platform, I don't know that the world was as clear on how to deal with those things a decade ago as it became later on. In other words, if that was happening in 2022, if there had been an effort to not produce matches or to do anything less than build the most comprehensive collection of hashed content to work with PhotoDNA, that would be deeply concerning. That said, there was a period of time when hosting this content was viewed as extremely high risk, because nobody outside of NCMEC was authorized to do so. So, yeah, I'd be curious about what that looked like. I think something that for Snap runs throughout this stuff is: if this is Snap in the process of dealing with a basically unprecedented explosion of sextortion and the commercialization of this problem internationally and so forth, like an offshoot of pig butchering and what have you, if that's the case, then that is one thing, and you're looking at past struggles. If, however, the company isn't on top of this stuff in fairly short order, I think at that point New Mexico's claims start seeming really fundamental.

Justin Hendrix:

It is a good flag that while other points in the lawsuit that are adjacent to this concern over PhotoDNA and CSAM appear to be contemporary or roughly contemporary, around 2021, 2022, 2023, the specific timing of those quotes and those events is not revealed in the lawsuit. A good point to maybe dig into a little bit in the future. Are there any other phenomena that you looked at in this lawsuit that you think of as particularly of interest, or things that you would hope to find out more about going forward?

Jeff Horwitz:

I will say that for a while now, I have personally been really interested in how people are finding kids on Snap. There have been a number of reports, both out of the UK and the US, suggesting that Snap is, on a per-user basis, really yielding more sextortion cases than other platforms are, and some of this is just inherent risk, right? YouTube is not going to yield very many sextortion cases because, guess what, there's no DM feature; it's pretty hard to commit that crime there. But the question of how adults are finding kids on a platform that, as the lawsuit notes, Evan Spiegel has said very specifically was intended to make such a thing pretty difficult, and whether maybe some business imperatives might have infringed on that goal of keeping things really private, that I think is the really big underlying question here, and where the balance is between growth and safety work.

One of the things that really caught my eye in there, and this is actually from the marketing department at Snap, was that they were mulling over a program intended to warn kids and their parents about sextortion, and as is often the case in internal conversations, they were a hell of a lot more forthright about it than I think any company would ever be in public. The issue was whether they basically tell people not to send nudes, which the marketing document acknowledged had become pretty commonplace among Gen Z in particular, but honestly older users as well; it's almost expected that a 14-year-old is probably going to be sending or receiving nudes from peers. So they can't just say don't do it, because that candidly isn't very helpful. But at the same time, the really helpful message, which is: for the love of God, do not include your face, identifying backgrounds, or truly identifying physical details when you do take self nudes, which you inevitably will, Snap can't say that. And I have a lot of sympathy for them not being able to say it, because Snap cannot be running a guide for kids on how to best produce self-generated CSAM, right?

That's obviously a no go, but at the same time they can't just say don't do it, because that's not really going to work. That's more of a "platform meets cultural norms" issue, and it's a difficult one. Snap actually, in this instance, didn't move forward with the program, and while the documents don't say why, I can understand why it was difficult for them to do. Again, it's something I would wish parents and schools to be very aware of: all right, maybe the messaging is, okay, if you're going to do this, please do it in a way that isn't dumb and that heads off some catastrophe if it blows up on you. But that's a hard message to get out there from the platform itself.

Justin Hendrix:

And we know the polls suggest that the numbers are through the roof in terms of the amount of sharing of self-produced, as you say, intimate photography and video.

Jeff Horwitz:

Yeah. Yeah. So anyhow, I don't know exactly how you train people in the hygiene of plausible deniability if this stuff leaks, but again, it seems like a thing that Snap really was grappling with to some degree.

Justin Hendrix:

On the date of your report, and I suppose the date that the unredacted lawsuit was made public, Snap issued a statement on its website. It says, “we designed Snapchat as a place to communicate with a close circle of friends with built-in safety guardrails and have made deliberate design choices to make it difficult for strangers to discover minors on our service,” and goes on to say “we care deeply about our work here and it pains us when bad actors abuse our service.”

I found myself thinking about Snapchat and thinking about the arc of time on some level, and whether in 10 or 20 years we'll look back and say that was just a bad idea: giving a lot of kids an application that wasn't completely secure, that did permit them to create ephemeral messages, that did allow for strangers to communicate with underage people. Is there just something fundamentally difficult to fix here? Is this one of those things that eventually we'll look back on and just say, yeah, not like that?

Jeff Horwitz:

It seems hard to take issue inherently with the concept of ephemeral messaging as a feature, or with peer-to-peer small group communication as a feature; those exist on WhatsApp, they exist on Signal. So I think, in other words, that feature set feels honestly more natural than the idea of, say, a TikTok feed; it's closer to free internet communication in my mind, almost. So on that level of mechanics, it feels hard for me to object, and again, I'm above my pay grade here. I do not design product, I just criticize other people who do. But I think the question is what sets of features are attached to that. Is there something perverse in the fact that it feels silly and safe, that there are noses and lots of silly features? Is there something about that that kind of lulls young users into a false sense of confidence in their safety and all that?

I think those strike me as being questions we might look back on and be like, okay, what was that? And I think there's always the question of whether it's an enforcement problem or a fundamental design problem. Everyone obviously always wants it to be an enforcement problem, because if you can just enforce a little more, then you don't have to change the fundamental product. I think that's TBD here. The thing that, again, even before this seemed pretty clear was that Snap seemed to produce more of these sextortion cases than other platforms did on a per-user basis, and I think New Mexico's suit lays out some reasons why that might be. I will say I'm still unclear as to what the fix is, if that makes sense. There's obviously a whole bunch of different tactics being deployed here: there's New Mexico, there's the other state coalition, which is more focused on mental health.

There's private litigation. And we still get into the question of, okay, so what then? All of these suits seem like they're premised on the idea that surely you can't run a platform that facilitates these sorts of things and not be responsible; that's the undertone for all of this stuff. So I'll be really curious to see how some of this goes, and I think it's interesting to see a state take issue with some of the more granular features and product defects, and actually test whether or not the recommendations for people to follow are working as they're supposed to, and so forth.

Justin Hendrix:

Well, that has been the other thing that the New Mexico Attorney General has done that is novel, I suppose, in the case against Facebook: they followed that up with a sting operation, I think they called it Operation MetaPhile, where they actually caught people using the platform in order to engage with young people.

Jeff Horwitz:

This is something that I felt like just doesn't happen, that couldn't possibly happen enough: people, whether they are inside or outside the platforms, and obviously if you're outside the platform it's almost the only way, setting up a test account and seeing what happens. What are the recommendations like? What gets surfaced when you connect with a node that's problematic? Does that just open a door to a torrent of awful options? So yeah, I think it's been really interesting to read through some of these things, and yeah, it sounds like they have arrested a few people along the way.

Justin Hendrix:

Jeff Horwitz, thank you very much for sharing your reporting with us.

Jeff Horwitz:

Thank you.
