New Mexico Attorney General Raúl Torrez on His Lawsuit Against Meta

Justin Hendrix / Aug 11, 2024

Audio of this conversation is available via your favorite podcast service.

Raúl Torrez was sworn in as New Mexico’s 32nd Attorney General in January 2023. A former state and federal prosecutor, he served as a senior advisor in President Obama’s Department of Justice, and led the Albuquerque metro area's District Attorney’s Office. Attorney General Torrez is a graduate of Harvard University, the London School of Economics, and Stanford Law School. He is a Democrat.

Last December, Attorney General Torrez filed a lawsuit against Meta for allegedly failing to protect children from sexual abuse, online solicitation, and human trafficking. The suit also argues that the flawed design of Instagram and Facebook led to the recommendation of child sexual abuse material and child predator accounts to minor users. The suit references a substantial amount of internal material from Meta, such as a 2021 internal Meta presentation that estimated 100,000 children per day received online sexual harassment, including pictures of adult genitalia, via its platforms.

The Attorney General claims that Meta’s practices violate New Mexico’s Unfair Practices Act and have created a “public nuisance” by “creating and maintaining a condition that is harmful to the health and safety of thousands of New Mexico residents and interfered with the enjoyment of life in violation of New Mexico law.”

In a January statement reported by the Associated Press, Meta said it has spent “a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online,” and that “[t]he complaint mischaracterizes our work using selective quotes and cherry-picked documents.”

In May, the Attorney General announced the arrests of three individuals suspected of being online predators, following a months-long undercover investigation named "Operation MetaPhile." The suspects were caught after engaging with decoy accounts that were set up by the New Mexico Department of Justice, posing as underage users.

Later that month, a New Mexico judge denied Meta's motion to dismiss the suit over the question of jurisdiction and the company’s claim of immunity under Section 230 of the Communications Decency Act. The judge did, however, grant Meta founder and CEO Mark Zuckerberg’s request to be dropped from the lawsuit.

The denial of Meta's motion to dismiss allows the case to proceed, and its outcome could have broader implications for how online platforms are regulated and held accountable for user safety in the future, including through litigation.

I spoke to Attorney General Torrez in advance of a panel discussion he participated in alongside the Attorney General of Virginia at the 2024 Coalition to End Sexual Exploitation Global Summit on Wednesday, August 7, in Washington, DC. The Summit was hosted by the National Center on Sexual Exploitation and the Phase Alliance, a group that includes two foundations that campaign against sexual assault and exploitation, and one charity that campaigns against pornography.

Below is a transcript of the discussion, lightly edited for clarity.

Justin Hendrix:

Attorney General Torrez, thank you for joining us today. We're going to talk a little bit about the topic that you are speaking about on a panel today at The Coalition to End Sexual Exploitation Global Summit there in Washington, DC. You are joining the Attorney General of Virginia, I understand, to talk about ‘how to protect citizens from tech giants.’ I want to delve into your now multi-year effort to take on tech firms in this regard and get into the lawsuit in particular against Meta. Let's just start with this: what specific harms to New Mexico children and teens did your investigation uncover regarding Meta's platforms?

Raúl Torrez:

Well, I think like a lot of people who have been engaged in this space for some time, attorneys general, policymakers here in Washington and across the country, I have been increasingly concerned about the psychological harm, and specifically the increasing evidence of the linkage that I believe exists between rising levels of anxiety and depression and excessive social media use. I know there is a lot more work and research that needs to be done to really drill into the nature of that dynamic, but there was, I think, another aspect of the rise of social media platforms that frankly went largely unnoticed or simply didn't receive as much attention as I thought it should. And that was the way in which these platforms had become channels, in fact breeding grounds, for predators to identify, target, groom, and potentially exploit children sexually. My background is as a frontline prosecutor, as a child abuse prosecutor, and I spent the early part of my career working first on cases that involved hands-on abuse.

I then worked as an internet crimes prosecutor, an ICAC prosecutor, in the attorney general's office many years ago. And when I started doing that work, the technology that facilitated this kind of behavior was very different, and it was really before the advent of smartphones. And the spaces online where predators used to go were off the beaten path, right? They were in the darker corners of the internet. This was more in the Napster days, and so there was a lot of peer-to-peer file sharing that used to go on. And that's how we used to identify a lot of CSAM and child pornography. But the technology landscape had changed so much that I don't think that there was a corresponding awareness in the minds of the public and in the minds of policymakers that what had previously existed in the far reaches of the internet now actually existed on the most prominent, the most frequently visited social media platforms around the world.

And it was in sort of reconnecting with frontline investigators and lawyers who worked in that space that it was brought to my attention. And I will admit it was a bit of a revelation. As somebody who did this work so long ago, I simply couldn't fathom that the same kind of applications that I had downloaded and used, that my family had used, that we, like most Americans and most people in the world, used to communicate with one another or to receive advertising for specific goods and services that match our interests, were also the same places where predators were going to target children. And in trying to determine which of these spaces and applications deserved our immediate attention, it became very clear to me that Meta was at the top of the list because of the reach and the volume that was apparent, and frankly because of the policy decisions and the design features of the platform itself.

Meta, unlike other platforms, TikTok for example, still allows unknown adults to direct message minor users. (Editor's note: in an email to Tech Policy Press after this transcript was published, a Meta representative said the company announced new settings to restrict unwanted communications with minors in a January 2024 blog post on the company's website, which was subsequently updated on August 9.)

And they're also at the same time aware that there are underage users on the platform despite what they say in their terms of service. They're well aware that those kids are there. In fact, there is a profit motive and a business motive in getting young users onto these platforms because there is a long-term benefit to the company for being able to market goods and services. And so it was the sort of revelation that the company was itself aware of that problem and the fact that I didn't think that they had done enough to address it that prompted our action.

Justin Hendrix:

What evidence do you point to around the fact that Meta was aware that these issues existed but failed to take action? What do you think is most prominent, what stands out in your mind?

Raúl Torrez:

Well, what jumps out at me are the revelations that we have, not from external third-party reporting, but from people on the trust and safety team inside the company itself. Many of them have now left the company because they were so frustrated that the warnings they were making, the red flags they were raising, had gone ignored by senior executives there. But for years they had raised very specific elements of the design of the platform that had facilitated this kind of connection.

There were stories of executives at other companies, for example, I think it was an Apple executive who complained to somebody at Meta about the fact that their teenager was being solicited on the platform. And that specific report made its way up to the highest levels and prompted, I think, a deeper investigation into some of these issues. And those warnings were just disregarded, or certainly were not given the kinds of resources that are necessary to address the volume and the scope of what they found. And frankly, those revelations were, I think, confirmed by our own independent undercover investigation, which tracked pretty closely to the kind of criminal investigations that I used to organize 20 years ago.

Justin Hendrix:

I'm going to ask you about the law. The suit claims that Meta has violated New Mexico's Unfair Practices Act. Can you explain how that law applies in this case and what are the specific aspects of it that you are pursuing?

Raúl Torrez:

Right. So in general terms, our Unfair Practices Act is very similar to most consumer protection acts around the country. It prohibits any company that is engaged in selling goods or services inside the state from doing so in a way that is deceptive, in a way that is dishonest with intended consumers and potential customers about any aspect that has a material impact on the decision of a consumer to use or purchase a good or service. And so it's applied in lots of different contexts. The most common application has to do with car dealerships, for example, about whether or not a local car dealer has been deceptive about the quality of the car, about the fact that it's been in an accident, things of that nature.

But it actually has a very broad application, and that law gives us the ability to actually compare the sort of public statements, the assurances, the general marketing approach that a given good or service presents to the marketplace, and then to match and test whether or not the public statements and the assurances that are provided by the company are actually backed up by practices and design features and things of that nature.

And so it is certainly a law that came into being and has been used in many different contexts, long before the advent of social media platforms. But I think it fits very well, in part given the disparity that I see between what these companies tell the public, what these companies tell families, what they tell policymakers about the safety of the platform, about the fact that it is a place for people to come together and form social connections, and the reality, which is that while those things exist, and in most circumstances in a way that is safe for a lot of users, particularly adult users, it is not safe for children, and it is something that the company just hasn't taken sufficient measures to address.

Justin Hendrix:

So what specific remedies would be available to you if this suit is successful?

Raúl Torrez:

Well, there are financial penalties associated with every proven instance of a violation of the UPA. But to be honest with you, I am not as focused on the monetary relief, the damages component, although that will be something we look into. I am fundamentally looking at the injunctive relief, the ways in which a court can order the company to modify business practices, to modify design features in a way that elevates consumer safety, and specifically child safety, to a much higher degree. I know at this conference there have been conversations about age verification, there have been conversations around encryption, things of that nature. I think all of those things will be elements of a proposed set of solutions that our team and the technical advisors that we'll be working with will develop. But I think separate and apart from specific technological solutions, part of what we're really looking for is a culture shift, right?

Because we can come up with technological solutions that fit the landscape, the technical landscape as it exists today, and pretty soon those solutions will become outdated because the technology will have moved forward into a different place. And so what we're really looking for is a much firmer commitment on the part of the executives in these teams to take these issues more seriously. To frankly stop doing what they've done in the past when they've had whistleblowers inside their company come forward and say, "Hey, this is a problem. We need to do something about it." Rather than brushing those people aside or pushing those people aside, they need to take those concerns to heart and not wait for a financial impact in the form of a tumbling stock price for them to change their behavior.

Justin Hendrix:

So I do want to just ask one question around encryption, since you mentioned it. You have had some pushback on efforts you've taken around encryption. Has your thinking evolved based on some of the developments over the past couple of months on this issue? Of course, this is a sort of third-rail issue in tech policy.

Raúl Torrez:

Yeah. Look, I take a different perspective. I just have a fundamentally different perspective on it. I think oftentimes technology companies are sophisticated actors in the market, and they're sophisticated players both in the policy arena and in the area of litigation. And so I think it is much easier for them to oftentimes come forward and say, "Hey, we're taking this step because we are so concerned about your privacy. We are so concerned about all of the great potential downside that could come from cooperating with law enforcement or having access to these sorts of issues. That's why we're doing it." It's much harder to say, "You know what? It costs us a lot of money to process search warrant requests." And so the classic example, which is outside of the sort of focus of today's conference, is the shooting that occurred in San Bernardino, California, and the shooter's phone was a locked Apple device. And the company basically looked at the FBI and said, "Sorry, we can't help you. We won't help you." And so we had to turn to a third party vendor to crack into that.

I understand and respect the need for privacy protections, but people have to understand also that that's why we have systems in place. We have warrant requirements, we have independent judicial review. We have to take affidavits to a judge to have them independently reviewed. In other words, there are legal mechanisms for addressing and dealing with those issues. I think it is irresponsible for companies to create these technologies knowing full well how they're going to be used and in what context they're going to be used. And so I think they ought to just be forthcoming about it. I find it difficult to believe it when companies like Facebook, or Meta, whose entire business model is predicated on consuming, monitoring, and monetizing every single bit of data that they have about you, turn around and say, "Oh, but we care so much about the security and privacy of your data."

No, they don't. They care about it and have no problem accessing it and utilizing it in ways that maximize their algorithm's ability to target you for a new pair of shoes or a new pair of jeans or something like that. But suddenly they're concerned about privacy when we're worried about a predator grooming a child online? I don't buy it, and I don't think most other people buy it.

Justin Hendrix:

I want to ask you about some of the other actions you've taken with regard to child online safety and exploitation, including Operation MetaPhile. Can you explain a little bit about that? And also, does your office continue to engage in that type of investigative work? Can we expect more such indictments to be brought?

Raúl Torrez:

Well, I can tell you without getting into the specifics that yes, you can. And the industry and other companies can expect those kinds of operations to be ongoing, those kinds of criminal investigations to continue in part because I think it's imperative that we demonstrate to policymakers, to educators, and to parents frankly, that the harm that we are talking about is not some ephemeral harm that exists in the virtual world, right? Oh, someone sent a picture or someone solicited my child, but maybe that's not as bad as other types of harms. What I'm trying to demonstrate to people is that without real directed urgent action in this space, the boundary between an online encounter and a real encounter with a predator is paper thin.

And I think we've gotten into a place where we somehow treat technology differently. It's a technology company, so somehow we're going to apply a different set of rules, a different set of standards. We're suddenly going to start focusing in on and using as a stalking horse questions about the First Amendment and all these other things. And the way I like to describe it is pretend for a moment that rather than talking about an application on your phone, pretend for a moment that right here in DC across the street, there was a warehouse. And in that warehouse, there were a series of stalls where there were children and there were images of child pornography. And that was a known location for pedophiles to show up in person every single day and sort through the children that they wanted to target and try and identify the people that they wanted to target.

You and I would not be having a conversation about whether or not I should file a civil action that takes five years, and what are the dimensions, and what are the First Amendment implications. As a human being, you would not ask a person in my position to treat it in that way. You would demand that I call the police department, that we walk into that warehouse, that we put people in handcuffs, that we deal with it in a criminal manner. And you would also demand more information about who owns that building. What did they know? Were they aware? Are they leasing this space, creating this space, fully aware that these activities are going on? And if that were happening and I brought a criminal indictment against that person, you wouldn't have a second thought about it, and you wouldn't come to me and say, "Maybe he has a First Amendment right to hang a sign on the wall," and it should be protected speech or protected activity.

But somehow, because somebody has a computer science degree and wears sneakers to work and goes to work in Menlo Park, I'm supposed to treat them differently? I don't think so. And I have a real problem with people who fall into the trap of constructing a whole new narrative, of constructing a whole new legal framework for addressing that behavior. Because as bad as it is in terms of the hypothetical that I just described, Meta is a million times worse than that because it isn't just one building. It is everywhere. It is every country in the world, and it is not a handful of kids in a single building in one city.

It is a global problem, and it's a problem that they're not only aware of, but they're willing to tolerate because they make money off of it. They make money off making sure that those kids stay in that building because they know even though there's a chance that a predator might find them, maybe I could sell them a few pairs of shoes for the next 10 years. I don't think that's where most people in this world are comfortable in terms of a business model or a business strategy.

Justin Hendrix:

You're in DC today. There is a broader conversation happening on Capitol Hill about child online safety legislation. There is a bill, I believe still before the Senate in New Mexico, an age-appropriate design code. What do you make of these legislative efforts? Do you think they will sufficiently address the types of harms that you are pursuing in this lawsuit?

Raúl Torrez:

Yeah, I mean, I think it's going to make a difference, the legislation in DC, and I know it's gone through a lot of different iterations. As I understand it, they sort of combined COPPA 2.0 and KOSA into a new bill with some modifications. I was impressed to see that some of the critiques that people had, from the Human Rights Campaign and other places, were taken to heart by the drafters. I think that's important to make sure that we have the kind of protections that communities that gather online and are looking for resources are concerned about. I think it frankly makes sense to allow the FTC to be the enforcement mechanism there, to sort of remove it from local political considerations. I don't have any problem with that. I quite frankly think that the larger issue is Congress's inability to address, deal with, and effectively amend Section 230.

I don't think that if we had gone down that path twenty-some years ago, nearly 30 years ago, we would be in this place now. If these companies had had to bear the cost and the liability associated with facilitating these kinds of connections and interactions online, from the posting of abusive material, violent material, material associated with terrorist activity, all sorts of things; if they had had to contend with the financial liability of all those things, I don't think these algorithms would work the same way. I don't think whistleblowers would be disregarded in the same way, because there would be financial consequences attached to it. So ultimately, I think that needs to be addressed, and it needs to be done in a way that's careful to make sure that we still have the ability for people to come together online, and to communicate privately online if they need to. But just because we have those concerns doesn't mean that we can't also see the larger issues at play and address those at the same time.

As for the local legislation, we haven't been directly involved with the latest draft of the kids code. We are planning, and my team is currently working with some technical advisors from around the country, to unveil some new proposed AI regulation, because again, I think Congress is moving too slowly in this space. I think we are going to see a rapid transformation of the technological landscape with the rise of artificial intelligence in the next few years. And I think there are some basic things that we can do there to try and keep people safe. But the reality is, and the thing that I always try to get policymakers to think about, is that the pace and speed of the law needs to approximate the speed and pace of technological change.

Because what happens is we create a legal framework, the technology rapidly outpaces it, and harms that we never could have conceived of emerge. It takes way too long for us to modify the framework, and by the time we get around to modifying it, the harm has changed, the technology has changed, and everything has moved into a new phase. We have to try and compress the amount of time between the evolution of technology and the evolution of the law so that the potential harm to vulnerable populations is reduced and mitigated in a way that we frankly haven't been able to do effectively.

But I think that's the challenge. And I think that's the real shame in the kind of polarization that we continue to see and the lack of action that we continue to see specifically at the federal level. Because ultimately, I can bring an action in New Mexico. We can try to push legislation in the state of New Mexico, but we are a small state. We're not driving global policy change. We need national action. And I hope that by participating in events like this, that we raise awareness for everyone about the urgent need to engage in these issues more constructively.

Justin Hendrix:

Meta had sought to see this lawsuit dismissed, but it will move forward, though not with its CEO and founder named. When do you expect your day in court? What's next?

Raúl Torrez:

Well, I will admit, Justin, that I'm a criminal prosecutor by training. The pace of civil litigation is one of the things about which, in my new role, I have had to learn to really temper my expectations. But we are going to be moving into the discovery phase. We're going to be gathering all the information, deposing people, trying to understand more about what the company knew and when it knew it, and what actions it took. I would hope to see some sort of resolution of this within the next couple of years, but you never really know, especially when you're contending with a global behemoth the size of Meta, with the resources that they have.

I can tell you this. If we navigate all of the legal traps, all of the motions and the army of lawyers that they're going to be throwing at this thing, if we get our day in court and I have a chance to stand in front of a group of citizens in the state of New Mexico and make this case to them, Meta should be very, very concerned, not about what I have to say, but about what that group of citizens is likely to do when they hear about how this company has conducted itself and what it's failed to do to protect children.

Justin Hendrix:

Well, Attorney General Torrez, I hope we have an opportunity to speak about this again down the line when perhaps some of those developments are behind us. Appreciate you taking the time to speak to us today.

Raúl Torrez:

You bet.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...