Podcast: How Predators Use Facebook Groups to Target Children in Latin America
Justin Hendrix / Mar 2, 2025

Audio of this conversation is available via your favorite podcast service.
Last week on Tech Policy Press, we published the results of an inquiry into how adult Facebook users take advantage of Spanish-language groups that form around celebrities and influencers to engage with users who indicate they are children, often below the age of 13. They share sexually explicit material, sometimes including links to adult porn and apparent CSAM videos, and encourage accounts that indicate they belong to children to share information and engage in direct communication.
To investigate this phenomenon, Tech Policy Press joined the Latin American Center for Investigative Journalism (EL CLIP) in publishing a report and series of articles documenting how adult users use public Facebook groups to identify and target accounts that indicate they are children for sexual exploitation.
The “Innocence at Risk (Inocencia en Juego)” project, coordinated by EL CLIP with participation from Chequeado, includes a report from Lara Putnam, a professor of Latin American history and Director of the Civic Resilience Initiative of the Institute for Cyber Law, Policy, and Security at the University of Pittsburgh, and independent reports from journalists across Latin America investigating a pattern of behavior on the platform’s public groups.
The reporting collaboration stems from Putnam's research. In 2022, Putnam published an analysis in Wired detailing public, Spanish-language Facebook groups where children were being targeted for sexual exploitation. Although many of the groups she identified were subsequently removed, in 2023 Putnam discovered a new trove of groups engaging in similar practices, which she documented in a report published by Tech Policy Press in January 2024.
In 2024, Putnam approached Tech Policy Press with additional research documenting how popular Facebook groups centered around Latin American and K-Pop celebrity fandoms, YouTube stars, and influencers become venues for child predation. Tech Policy Press and Putnam brought the report to Chequeado, which engaged with EL CLIP to coordinate reporting with journalists in Colombia, Venezuela, and Argentina. This week, they published their reports in EL CLIP, Chequeado, Crónica Uno, El Espectador, and Factchequeado.
I’ll note that we presented a set of questions to Meta about these phenomena. A Meta spokesperson responded with a statement and a link to information about the efforts the company takes to proactively address these and similar phenomena:
“Child exploitation is a horrific crime. We work aggressively to fight it on and off our platforms and to support law enforcement in its efforts to arrest and prosecute the criminals behind it. Our policies prohibit child exploitation, inappropriate interactions with children, and the sexualization of minors; these rules apply globally, in different languages, including English and Spanish, and across each of our platforms. While predators constantly change their tactics to evade detection, our global teams and tools work to identify and quickly remove violating content.”
In this episode, you’ll hear my discussion with two of the principals on the project, the historian Lara Putnam and the investigative journalist Pablo Medina Uribe.
What follows is a lightly edited transcript of the discussion.

Lara Putnam:
I'm Lara Putnam. I'm a professor of history at the University of Pittsburgh, where I'm also the director of the Global Studies Center, and I'm the faculty lead of the Civic Resilience Initiative of the University of Pittsburgh's Institute for Cyber Law, Policy, and Security.
Pablo Medina Uribe:
I'm Pablo Medina, a journalist and an editor at the Latin American Center for Investigative Journalism, also known as El Clip, and I specialize in investigating disinformation.
Justin Hendrix:
Lara, I'm going to start with you. I want to ask you a little bit about how your now years-long effort to look at this particular set of phenomena began. How did you first find these groups, and why did you start to track them?
Lara Putnam:
So, just as you say, I'm a historian, a historian of Latin American gender and migration, and I've gotten involved in thinking about technological changes as well, looking at how social media is impacting broader cultural processes. But I never intended to become some kind of specialist focused on either child harm or Facebook. I happened to be looking in Facebook groups. One of the things I was studying was grassroots organizing of different kinds. I'm in the 11th Ward of the city of Pittsburgh, and I know that sometimes people form political groups on Facebook with the name of the ward that they're in. So I was like, oh, I should look and see if there are political groups for the 11th, 12th, 13th Wards of Pittsburgh.
So I went into Facebook groups, and I just looked for 11, 12, 13, and what came up was groups whose name was “Buscando novia de diez, once, doce, trece años…” So, in Spanish: I'm looking for a boyfriend or a girlfriend who's 10, 11, 12, 13. And it wasn't just one group; multiple groups appeared that had children's ages in the title and were explicitly looking for children of these ages. And then, when I clicked through, there was not sort of childlike romantic content. There was a little of that, but it was super explicitly sexual content. There was, like, if you see the initial of your name, I'll send you a picture of my… and then an eggplant emoji or a pickle emoji. There were anime-style cartoons of naked people engaged in sexual acts, there were gamified things. Anyway, it was all content that struck me as a mother of four as super appalling, and it was just shocking to me that this was available in a public Facebook group. But I thought, okay, well, now I'm going to do the thing that I should do and just report it online.
And surely Facebook has a very sensitive reporting system, right? So, if someone says there's sexual harm to children going on in a Facebook group, this will be taken down immediately. This content must have somehow snuck in through what I was assuming were really strong filters. I mean, after all, Facebook knows everyone; it can magically suggest to you every person you went to high school with as a potential friend. So in my mind at that point, I was assuming that there were really sophisticated systems flagging the kind of thing that I was seeing. And then I found that I couldn't get it taken down, and I spent the next three months trying through different means to report items. I also found that this was not just one group; there were multiple similar groups. So, I kept reporting them. I started tweeting about this. I contacted people I thought might know people at Facebook, and I couldn't get anything taken down. And I ended up writing an article for Wired magazine, basically just because I thought, well, at least if I publish, Facebook will have to take action against these kinds of groups. And in the short term, they did. But as the article that we've just published together shows, what I've just described for you took place in 2021 and 2022; the Wired article came out in 2022. Here we are three years later, and close to identical content is still available.
Justin Hendrix:
So, for the listener that hasn't quite read your full report yet—I hope they will click through to Tech Policy Press and look at it, as well as the links to the reports from Pablo and the team from El Clip and across various news organizations in Latin America—can you just give them a brief, high-level sense of what we're talking about here, this kind of game mechanic, this mechanism of essentially luring children to talk about their age, to engage on sexual matters on a false premise?
Lara Putnam:
Absolutely, and it's especially clear in the group that I profiled in the article that has just come out with you. I profiled multiple dozens of groups encompassing over 2 million group members. All of these are public groups, especially commonly focused around different kinds of celebrity fandom for child stars. And what you see in these groups is accounts, sometimes anonymous, very frequently impersonating these celebrities, trying to establish relationships with children. And you can see exactly how they're trying to connect to children who are online, taking advantage of the fact that in the current technological moment, there are lots of children with access to a phone or another device that has a camera and is connected to the internet, and they're trying to convince those children to send photos of themselves.
And you can see, even just from the posts, they're saying, oh, send a photo, or, I need girls aged 6, 7, 8, 9, 10 to do dares, girls who are daring or who are clever or who are hot or who are willing to do special things. And then one of the things that I found, and that is mentioned in the articles, is that in some cases, in replies to these posts, people who've interacted with these accounts post warnings: girls, don't trust him. It's not the star. It's an old guy who's going to send you pictures of himself naked, or it's someone who's going to ask you for pictures of yourself naked. So it's very clear what's happening, and this is a pattern that, I discovered, has been well established by specialist researchers in this field: bad actors, child predators internationally, meet children in open online spaces and engage them through luring, through what is sometimes called grooming, to create a relationship.
And then they move that relationship out of the public space and into private, usually encrypted channels. And they increasingly pressure, manipulate, and persuade children to send them photos of themselves, sometimes videos of themselves, sometimes by pretending to be another child themselves. There's been some publicity in recent years about cases of this where young teenage or preteen boys in the United States are deceived by accounts presenting themselves as young girls or teenage girls, accounts that turn out to actually be men who are part of a criminal ring based in a different part of the world, who then take those videos that a young man or young boy might have provided and use them as the basis for extortion, threatening to expose them. Tragically, there have been some suicides in the US linked to this kind of international criminal activity. So again, it's taking advantage of the fact that you can create an account, pretend to be another child, or pretend to be a star, and pull a child into private digital spaces where there aren't guardrails and manipulate that child into performing acts that they obviously otherwise wouldn't be doing.
And that creates a permanent video or photographic record, which can then sometimes be used to blackmail them and continue the process of manipulation and pressure.
Justin Hendrix:
Pablo, I want to bring you in now. Lara just mentioned the scale of what she found, just by using Facebook's own search bar and entering keywords and terms there, but you all have a little more investigative capability at El Clip. At the time we started talking, you had access to CrowdTangle. Of course, now that's gone away. But I wanted to ask you a compound question, which is, first, what is El Clip's approach to investigative journalism, and why did this particular set of phenomena appeal to you to investigate?
Pablo Medina Uribe:
Yeah, so our philosophy at El Clip is to do cross-border collaborative investigations. So, every investigation we do, we get partners, usually from more than one country and usually from Latin America, though we've partnered with people in the United States, like in this case, and now some people in Europe, and also we're working right now with some people in Australia and Indonesia, for example. But the main thing is to pool resources so we can all together do a better job than if we were all doing it by ourselves. And so the idea is to collaborate not only on the content that we produce, but also with the resources that we have, investigative resources: maybe one partnering outlet has a very good data investigator who helps the others in the investigation, or maybe one of them is really good at graphic design.
And for this particular investigation, we were contacted by one of the partners, Chequeado, a fact-checking organization in Argentina, which I think had worked with Tech Policy Press before. They were aware of Lara's previous research on this, and they were aware that you guys wanted to maybe try to expand this investigation a bit more. So they reached out to us because we have coordinated these sorts of collaborative investigations. And I was particularly interested because the first thing that I saw is that there were a lot of Latin American users replying in the comments, those comments that Lara was describing a little while ago. One of the questions they were supposed to answer was to say their country and their age, or something like that, and in a lot of them, users mentioned they were from a country in Latin America, and it was very spread out.
So I thought it was very interesting that it was not a localized phenomenon; it was happening throughout the region, and I'm sure it's happening in other countries. We only did this investigation in Spanish, but I'm sure it probably happens in other languages. Actually, Lara mentioned in one of her pieces that it happened in Tagalog as well. So I'm sure that happens. But also, I was interested because in many of the investigations we have done, we have encountered somewhat of a disinterest from the big tech social media platforms toward Latin America, or in general toward places that are not the United States or Europe. And the fact that Lara had reported this three years ago now and nothing had really happened just made me think this is just another one of those instances. So this is another thing we can try to investigate so users in Latin America are not on their own.
So there's someone who's trying to do something for the safety of users of social media in Latin America. We had Chequeado from Argentina, who I mentioned already; we also had El Espectador in Colombia, Crónica Uno in Venezuela, and Factchequeado, which is an organization in the United States, but they work with Spanish content. They do fact-checking in Spanish. What we found using CrowdTangle, as you mentioned, was that the same patterns that Lara was just describing just kept repeating themselves. There's no stopping them. And it was a lot of new groups, a lot of new users, a lot of new content. It was being posted all the time. I think I did that research the week that CrowdTangle closed, which was in the middle of August last year. But when I did it, the most recent post that fit the criteria was one minute old or something like that.
So it was just continuously happening, and we found more fan groups of more bands, a lot of K-pop bands and influencers. We also found a lot more groups that were dealing with romance or teenage relationships and stuff like that, which is some kind of way to disguise the real intent of having more sexual content in there. But basically, what we found is that there was no mechanism being activated inside the Facebook platform that was stopping this; it just kept continuously growing.
Justin Hendrix:
Lara, in our diligence on this, we also talked to multiple experts who've been studying these issues across multiple other geographies as well, in the UK, Europe, and the United States, in addition to the experts that Pablo's partners talked to. I guess I wanted to ask you what you feel we learned from that process. I think one of the things that really sticks out to me is just the degree of frustration people feel, the sense of constantly being told that Meta is best in class, that it's done all it can do to address these things, while witnessing the scale of what seem to be mass vulnerabilities, real vectors for harm.
Lara Putnam:
Absolutely. I mean, no one who we talked to was shocked, right? No one we talked to said, oh, I've never seen this material. They looked through it and they were like, yep, that's it. And the kind of child sexual solicitation that we were finding, I should add, we were not only finding through super basic keyword searches, for instance, for posts including children's ages or asking for children's ages. Once I would find any one of these groups, if I simply clicked on "groups like this," the recommended groups, the algorithm itself would suggest, oh, here are some other groups that are just like that. And those would quite routinely themselves also have been flooded by predators. So it's very clear that the same structures that drive engagement on the platform are easily available and can be used for these predatorial ends by bad actors. And it's not that I'm more diligent than your average predator.
And one of the things that was pointed out to us is that the bad actors internationally are very aware of where kids are. They're aware of where platform controls or oversight are weaker. In part, this is similar to what Pablo was saying about Latin America not getting attention from social media platforms as a space meriting attention to protections for users. And at the same time, Spanish-speaking users are almost certainly the largest language group of users of, for instance, Facebook internationally. And so you have people being targeted, you've got an attractive target for bad actors, and you've got platforms whose tools for engagement are immediately and easily put to ill ends. And part of what came across when we talked to folks, especially folks who have had experience working in trust and safety on the inside of platforms, is that I began to understand the difference between things that looked to the platform like tractable problems and things that were just off the table.
So, design changes that would make an impact, that would put in place fundamental ways of, for instance, generating knowledge about what kinds of harm people are doing. One of the things we talked about with one of these folks was the reporting tools, and the fact that the reporting tools are so clunky, in addition to not giving results. It's like a seven-stage click-through process, and then you have to wait and see whether you get a response or not. So, as opposed to other parts of using a social media platform, where you've got the smoothest possible user experience, on reporting tools you've got the clunkiest possible experience. And one of the questions someone suggested to us, which Facebook, for instance, never really answers: of the people who begin reporting violating content or disturbing content or content that they feel should be removed, how many actually finish that seven-step cycle?
So the kinds of changes that would make it easier to report, or that would put a breaker on, for instance, accounts that are super rapidly friending dozens, hundreds of underage users: those are the kinds of design changes that would be protective for children on the platforms, but they would also slow down engagement overall. They would hit the engagement numbers on the platform. And so, rather than having a safety conversation focused on these design features, one that would acknowledge that there are kids using Facebook and do more to keep kids safe, or provide real routes for response when kids aren't safe, Facebook really has focused on, and keeps the focus on, individual items of content and whether they fall over the line of being violating or not, and taking them down. So, as you said, Facebook insists that it's best in class with regard to, for instance, takedowns of explicit video and photographic child sexual abuse materials.
And I recognize it's true: the one kind of content you can get taken down, if you see it on Facebook and report it, is videos that explicitly show naked people doing things, if they seem like they could be children. If you report those, they are routinely taken down. And Facebook is a mandated reporter that reports millions of items of content to the National Center for Missing & Exploited Children. But the danger of the space of Facebook within the child endangerment and child exploitation ecosystem is less as a place wherein digital items of child pornography or child sexual abuse materials are exchanged. You don't need that; the dark web exists for bad actors to exchange CSAM. Facebook is a space where children are vulnerable to being reached and where these relationships can be created, and taking steps against that would slow engagement on the platform.
Justin Hendrix:
Pablo, would you be willing to characterize any of the observations from experts that the journalists spoke to in the Latin American outlets?
Pablo Medina Uribe:
So, one of the things that most of the pieces reported was that there is little legal recourse in Latin American countries when this sort of thing happens, and I don't mean with Facebook, but if a minor is sexually exploited or sexually abused online, most countries don't really have laws in place that treat grooming as a specific type of cybercrime. And so, a lot of times, the cases are really hard to investigate because there's no legal framework to do it. The only exception we found was in Argentina, where there is a cybercrime law specifically about grooming, and it's still difficult there because it's somewhat new and the cases that have been investigated haven't been that many, while a lot of these things are still happening. And also, the experts that the journalists consulted said that people in Latin America are particularly vulnerable for many reasons, but especially because of the socioeconomic context; there are a lot of people living in poverty here, unfortunately.
So that leaves children even more vulnerable. And a lot of people, both kids and their parents, don't really know that this can happen online. So a lot of people don't take the necessary steps to prevent it, and that also leaves kids more vulnerable to this. And we found out about a lot of cases that ended up with kids, unfortunately, being trafficked into international sex trafficking networks run from other countries; not from the cases we found on Facebook, but from the experts we consulted. That said, it is unfortunately not that uncommon for people to meet someone online and then be abused or trafficked. It's also really difficult, from what the experts said, to create a framework in these investigations to help these kids, the victims of these crimes: sometimes because there's not enough government support, sometimes because it's hard to find these victims, and sometimes because people are scared to report what happened because they don't really know where to go.
Justin Hendrix:
I want to ask you both, what's next? Lara, this has been a volunteer activity for you outside of the course of your normal work. Do you think you'll continue to work on this particular issue? And if so, do you have any thoughts about how?
Lara Putnam:
One thing I have started doing is working with a friend who's a law professor, and she's a specialist in business and human rights law and an amazing, brilliant person. Her name is Jena Martin, if she's out there listening, and she and I were just talking through the stuff I was finding, the articles I was writing, and we started realizing that there were points where her expertise as a business and human rights lawyer helped me understand what was or wasn't possible within the current existing jurisprudence in the US and then beyond its borders. So she and I wrote an article together thinking through liability regimes in the metaverse, where we put, as one of the cases, the kinds of harm that exist online, the kind of wholly online child sexual exploitation that we've been describing here.
I mean, Pablo points to some cases in which children are met online by adults who are physically nearby enough that they can then be physically trafficked, or in-person meetings can happen. Obviously, that's horrible in its own unique way. But the fact that children with cameras, with smartphones, can be blackmailed by people halfway across the world, and that this can happen at scale: humanity has never been in a place where that's the case before, where you've brought so many children into a space where they can be contacted by a nearly infinite number of potential predators. And that's just being scaled up with, for instance, AI that allows people to impersonate actors of different ages, using translation, using video fakes, and so on. And so one thing that Jena and I developed as an idea, or a thing to worry about, is what we think of as the veil of scale.
The way that actors can commit crimes or commit torts acting behind the veil of scale, which is created by social media, where it's possible to create an account, commit great harm, and then disappear, in a way that scales super rapidly because of the affordances of a platform. And law enforcement cannot scale at that speed. All the more so in a developing country: law enforcement in a country like Venezuela or Peru or Honduras is not going to be able to put 50 trained professionals on a plane to fan out across the world to try to track down folks doing harm online to kids from that country. And so we're thinking through the really vexed issues raised by the emergence of new kinds of crime and new kinds of harm that can be committed in these purely online settings.
And so that's something that she and I are working on. As I said, we have one article coming out in the Yale Journal of Law and Technology, and we're writing another article that asks the question: what kinds of redress could Mexican children who've been harmed in the kinds of groups that we're describing have if they can't, for instance, bring a lawsuit against Meta because Meta is not based in their country? Can they bring a lawsuit against their own government for failure to protect them? Given that the government of Mexico is a signatory to the Convention on the Rights of the Child and has a whole series of laws that protect children's dignity and integrity, their right to live free from sexual harm, could a child or a class of children in Mexico hypothetically bring a lawsuit against the Mexican government before the Inter-American Court of Human Rights, saying we had a right to be protected and our government didn't take those steps? I really want to get out of the business of looking at awful things on Facebook because it's kind of soul-killing. But on the other hand, when there are kids being harmed and I know what's going on, I feel like I have an obligation to do something about it. So, I'm trying to reach out through multiple different channels and do different kinds of things to try to move our collective dialogue forward on these issues.
Justin Hendrix:
Pablo, what about you? I mean, El Clip is in the business of accountability journalism. Will you continue to pursue this story? I know you certainly plan to pursue tech accountability stories going forward.
Pablo Medina Uribe:
Yeah, definitely. We'll keep pursuing those types of stories. Also, I think we're going to keep an eye on what happens to these groups. After we contacted Meta, they deleted some of the groups that we mentioned to them… actually, I don't know if they deleted them. They're unavailable now, so I don't know if they went private or were suspended or what, but they're unavailable, so they're not public anymore. So we're going to keep on that to see what happens to the other groups and the other kinds of posts that we're tracking. And we're definitely going to keep working on more stories about the possible harms to users from, let's say, the big tech platforms not being very active in trying to moderate their content. This is something that really interests us in our investigations, and we're not just going to do it with Meta. There's a bunch of others that we're going to look into.
Justin Hendrix:
I just want to thank you two for the opportunity to work with you on this. As ad hoc teams that come together without having worked together before go, this was a great experience, and I'm really grateful for all the hard work that both of you put into this, as well as to the journalists across the network that El Clip was able to pull together. Thank you so much.
Lara Putnam:
I mean, thank you, Justin. This dossier wouldn't have come together without you being there to support pushing this conversation forward, even when sometimes it feels a little bit like we're pushing a boulder up a hill. So I can't thank you enough for the leadership you've shown on this.
Pablo Medina Uribe:
Yeah. Justin, likewise, thank you very much for bringing us together, and it was really a pleasure to work with all of you and all of our colleagues that are not on this podcast right now.
Justin Hendrix:
For more, see the show notes for links to these pieces on Tech Policy Press and El Clip.