Transcript: Senate Hearing on Social Media and Teen Mental Health with Former Facebook Engineer Arturo Bejar

Justin Hendrix / Nov 8, 2023

Justin Hendrix is CEO and Editor of Tech Policy Press.

Arturo Bejar, former Director of Engineering for Protect and Care at Facebook, testifies before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law, November 7, 2023.

On Tuesday, November 7, former Facebook engineer Arturo Bejar testified to the Senate Judiciary Subcommittee on Privacy, Technology, and the Law in a hearing chaired by Sen. Richard Blumenthal (D-CT). Bejar testified about Meta's knowledge of the harm its platforms cause to children and teens, and its failure to take action. Bejar discussed evidence he presented to top executives, including Mark Zuckerberg, Adam Mosseri, and Sheryl Sandberg, regarding the prevalence of harmful experiences on Facebook and Instagram. He criticized the company for engaging in a strategy of distraction, denial, and deception, and for disregarding recommendations to make the platforms safer. Bejar also highlighted the need for legislative reform and transparency from social media companies.

Senators expressed concerns about the addictive nature of social media, the spread of harmful content, and the need for legal liability for these platforms. Sen. Josh Hawley (R-MO) pledged to seek a vote on the bills passed by the committee before the end of the year, and emphasized his view of the importance of holding social media companies accountable through a private right of action. Sen. Marsha Blackburn (R-TN) discussed Facebook's failure to address the harm caused to kids. Sen. Blumenthal expressed his support for the Kids Online Safety Act and the need for transparency and accountability from social media companies. The hearing concluded with a commitment to vote on the bills before the end of the year.

See Bejar's written testimony here. What follows is a lightly edited transcript of the hearing.

Sen. Richard Blumenthal (D-CT):

This hearing of the Judiciary Subcommittee on Technology, Privacy, and the Law will come to order. Thank you everyone for attending. My thanks to Ranking Member Hawley and particularly to the Chairman of the Judiciary Committee, Dick Durbin, for giving us this opportunity. He is vitally interested in this topic, and I'm going to call on him after Senator Hawley for his remarks. We are gathered today to hear testimony from a whistleblower, an engineer widely respected and admired in the industry, and not just any expert but an engineer hired specifically by Facebook to help protect against harms to children and make recommendations for making Facebook safer. We've known for more than a decade that rates of teens suffering from suicides, hospitalizations for self-harm, and depression have skyrocketed. As he knows, these numbers are more than statistics. They're real people, and his daughter is one of them.

Arturo Bejar is the former director of engineering for Protect and Care at Facebook, and he will tell us about the evidence he brought directly to the attention of the top management of Facebook and Meta, Mark Zuckerberg, Sheryl Sandberg, and others, in meetings and memos. He resoundingly raised an alarm about statistics showing Facebook's prevalent and pernicious harms to teens, telling Mark Zuckerberg, for example, in a memo that more than half of Facebook users had a bad or harmful experience just within the last week. Instead of real reform, he will testify that Facebook engaged in a purposeful public strategy of distraction, denial, and deception. They hid from this committee and all of Congress evidence of the harms that they knew was credible, they ignored and disregarded recommendations for making the site safer, and they even rolled back some of the existing protections. Now, Mr. Bejar is not the first or the only whistleblower to come forward. We heard from Frances Haugen, who showed that Facebook's own researchers described Instagram itself as a perfect storm and that it exacerbates downward spirals of addiction, eating disorders, and depression.

Mr. Bejar is the first to show in documents, not just in his recollection but in documents, how he warned the top management of Facebook and Instagram of the ongoing harms their products were causing. We're going to present those documents for the record, and they show, for example, that over a quarter of young teens, 13 to 15 years old, report receiving sexual advances on Instagram. Nearly a third of young teens have seen discrimination based on gender, religion, race, and sexual orientation. A quarter of young teens report having been bullied or threatened, and nearly a quarter of young teens report experiencing feeling worse about themselves, about their bodies and their social relationships, the type of experience that leads to serious depression and eating disorders. And when users reported harmful content to Facebook, it took action, it remedied the problem, only 2% of the time. There's a history here. In August of 2021, Senator Blackburn and I wrote to Facebook about the impact of their products on kids.

We asked, quote, has Facebook research ever found that its platforms and products can have a negative effect on children's and teens' mental health or wellbeing? Facebook refused to answer. In October 2021, Senator Blackburn and I held a hearing. We heard from Ms. Haugen about Instagram's harms, and on that same day, Mr. Bejar sent an email to Mark Zuckerberg, Sheryl Sandberg, Adam Mosseri, and other executives validating Ms. Haugen's testimony. That email actually demonstrated even greater harms than were then public, a chilling and searing indictment of Instagram and Facebook, and I'm going to ask that it be made part of the record without objection. In December of 2021, Mr. Mosseri then testified to our subcommittee, after he met with Mr. Bejar discussing these numbers and statistics relating to suicide.

And during that hearing, a number of us asked him about Facebook promoting suicide. Mr. Mosseri knew, but he didn't disclose, that on a weekly basis around 7% of Facebook users overall encounter content promoting suicide and self-harm, with 13 to 15 year olds seeing it more often than others. There's a pattern here with Facebook. It hides risks by saying things like bullying and harassment is only 0.08% of content, when in reality Meta executives know that 11% of those 13 to 15 year olds face bullying every single week, every single week on Instagram. And just to be absolutely clear, that's millions of children and teenagers. It's not just a number. Behind every one of those numbers is a real person, a teenager, a child whose life is changed, maybe forever, by that searing experience of bullying, eating disorder content, suicide promotion. We can no longer rely on social media's mantra.

Trust us. We can no longer depend on its putting the blame or responsibility on parents. What's needed now is legislative reform, the Kids Online Safety Act. Senator Blackburn and I have enlisted more than 45 of our colleagues, almost half the United States Senate, in favor of the Kids Online Safety Act. And the final point I would make is that social media, in particular Facebook, still fails to take these threats seriously. This June, the Wall Street Journal found that Instagram was hosting open markets for child abuse material, even recommending pedophiles to each other. Young teens were being extorted and coerced into sexual acts. Instagram was complicit. Mr. Bejar, you provided Mark Zuckerberg, Adam Mosseri, and others in management with specific recommendations to prevent teens from experiencing this unwanted sexual contact and harassment. Those recommendations were never adopted. You have put your career on the line to come forward, an experienced and trusted industry expert whose job was to make Facebook safer, and your recommendations were purposefully ignored or disregarded or rejected. I'm just going to remind my colleagues that we've heard from young people as well as parents about these harms, and one of them asked me how many more children have to die before Congress will do something. That's why we're here today. I want to thank all of my colleagues who are present, truly a bipartisan group on behalf of this cause, and turn to the ranking member, Senator Hawley.

Sen. Josh Hawley (R-MO):

Thank you very much, Mr. Chairman. Thank you for convening this hearing. This is such a vital hearing on a vital topic, and to be honest with you, this hearing concerns, I think, every parent's nightmare. And I see you're nodding, Mr. Bejar, you're a father. That subject, that reality, composes some of your testimony. I'm also a father of three, and what you have brought to this committee today is something that every parent in America needs to hear. The numbers are really stunning: that one in four teenagers, minor children, will experience sexual solicitation on Meta's platforms at some point. One in eight say that they have experienced unwanted sexual advances, and we're talking about children now, these are not adults, children have experienced unwanted sexual advances just in the last week, within the last seven days. And of course we know from Meta's own internal research that they knew the extent of this problem even as they were ignoring you.

And I want to turn to some of that research that Senator Blumenthal just referenced. Here's what Meta, these are Meta's own words from their own internal research on the effect of their own product on children, particularly young women: We make body image issues worse for one in three teen girls. Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups. Teens told us they don't like the amount of time they spend on the app, but they feel they have to be present. They often feel addicted and know that what they're seeing is bad for their mental health, but feel unable to stop themselves. This is the reality. Meta and Instagram, Facebook, they knew these things were happening. These quotes I just read are years old. You pointed this out to them too, Mr. Bejar, and still they did nothing.

In fact, they did worse than nothing. What your testimony shows is when you brought these concerns to them, when you exposed this reality, rather than respond, they cooked the books. If I understand your testimony correctly, they started telling the public, including Congress and of course every parent in America, that, oh, we get 90% of unwanted sexual material, child sexual abuse material, pornography, terrorism threats, we take it down, our AI systems find it and take it down. But what you exposed is that in fact those AI systems are catching only a small percentage of that kind of abusive material online. So when Facebook is out there promoting to the world, oh, we're taking down the vast majority, it's simply not true. And in fact they know it's not true, and that statistic is designed to mislead. They're deliberately misleading parents about what's on their platform. They're deliberately misleading parents about the safety of their children online. And I just want to echo something that Senator Blumenthal has said. It is time for Congress to take action. It was time years ago for Congress to take action. It is an indictment of this body, to be honest with you, that we have not acted, and we all know the reason why. If I could just start with a little plain talk here this morning: big tech is the biggest, most powerful lobby in the United States Congress. They spend millions upon millions upon millions of dollars every year to lobby this body. And the truth is, as every reporter in this room knows, and I hope you'll report it after this hearing, they do it successfully. They successfully shut down every meaningful piece of legislation every year. I've only been here for four years and I have seen it repeatedly in the short time I have been here. We'll get all kinds of speeches in committees, we'll get speeches on the floor about how we have to act, and then this body will do nothing. Why? Money.
That's why. Gobs of it, gobs of it, influencing votes, a hammer hold on this process. It is time for it to be broken, and the only way I know to break it is to bring the truth forward, and that's why we are so glad, Mr. Bejar, that you are here today to do it. Thank you, Mr. Chairman.

Sen. Richard Blumenthal (D-CT):

Thanks Senator Hawley. The only footnote I would add is this time must be different. They have armies of lawyers and lobbyists. They spend tons of money, but this time must be different. Senator Durbin.

Sen. Dick Durbin (D-IL):

Thank you, Chairman Blumenthal and Senator Hawley, and let me follow up on Senator Hawley's comments. I couldn't agree more. I could not agree more. And in the Senate Judiciary Committee, after some graphic hearings where parents and victims came forward and told us what had happened to them online, we decided to take action. We passed six bills related to this issue, child sexual abuse and similar issues, six bills, and something happened that was miraculous: all six passed unanimously, every Democrat and every Republican. Take a look at the folks who are up at the table. It's across the political spectrum. We all agreed on this. What has happened since? Nothing. Six bills waiting for a day on the calendar, six bills waiting for a national debate, six bills passed unanimously on a bipartisan basis. And they put real teeth in enforcement too, and I think that's why they've gone nowhere.

Big tech is the big kid on the block when it comes to this issue and many other issues before us. That's the reality. I want to thank Chairman Blumenthal and Senator Hawley for bringing together so many members at this hearing. Our philosophy in putting together the subcommittees was to say to each of the senators in charge of them, do your best. Take your issue that means something to you, and do your best to bring it to the American people and legislation to the floor of the United States Senate. This committee is one that I'm counting on to be successful in this regard. Mr. Bejar, thank you for the courage of stepping up and speaking up. The only amendment I would make to the chairman's remarks and Senator Hawley's is that it's not only a parent's issue, it's a grandparent's issue too. We see this and it scares the hell out of us.

So thank you for what you brought us today. I'm particularly intrigued by your idea of a survey, so that we find out from the source what's really happening. My experience on Capitol Hill goes back several years. I took on the tobacco issue. We were hitting our head against the wall trying to penetrate this vast lobby at the time. The one way we managed to penetrate it was to make it a children's issue, protecting kids from addiction to tobacco, and then a lot of good things started happening. Why is it that this issue, which relates to our kids so much more and is so much more dangerous even than tobacco, in my estimation, why is it so difficult? Senator Hawley's correct. We're really fighting the biggest kid on the block when it comes to this issue. Thank you, Mr. Chairman.

Sen. Richard Blumenthal (D-CT):

Thanks very much, Senator Durbin, and thank you for your leadership on this issue. I'm going to turn to Senator Graham if he has some opening remarks, and then, quickly, to Senator Blackburn.

Sen. Lindsey Graham (R-SC):

Maybe number seven is the magic number of bills. The next bill, I hope, and thank you, Senators Blumenthal and Hawley, for doing this, is to sunset Section 230. The other bills are going nowhere until they believe they can be sued in court. The day they know the courtroom is open to their business practices, they will flood us with all kinds of good ideas. Until that day comes, nothing's going to happen. And I said, as we passed them, they're going to go to the floor to die, and let it be known. Senator Schumer, Senator McConnell, what's the House doing? Not much. So the bottom line is, a society that cannot take care of its children, or refuses to, has a bleak future. So thank you for doing this.

Sen. Richard Blumenthal (D-CT):

Thanks, Senator Graham. Senator Blackburn.

Sen. Marsha Blackburn (R-TN):

Thank you, Mr. Chairman, and Mr. Bejar, thank you so much. Thank you for the time that you've given Senator Blumenthal's staff and my staff as you've met with us, and for being so open when you met with Senator Blumenthal and me last week. I really appreciate this. As Senator Blumenthal said, we have worked on this for years, and he built the timeline out going to 2021. But the work we were doing looking at big tech, looking at some of the problems, the lack of privacy, the frustration of people not being able to control who had access to their virtual you, is what led us to this point, to begin to look at what was happening to our children. And as I told you in our meeting, the day we had that first hearing looking at what was happening online with children, it was like the floodgates opened, and we started hearing from moms and dads not only in Tennessee and not only in Connecticut, but across the country, who were saying, can I please tell you my story?

The reason they did this is because their hearts were breaking. Their children had committed suicide. Their children had met a drug dealer, their children had met a pedophile, their children had met a sex trafficker. They had been exposed to cyberbullying and had committed suicide. They were looking up ways to commit suicide. See, there are laws in the physical world that protect children from all of this, but online it has been the Wild West, and as my colleagues have said, we have fought this army of lobbyists for years. Big tech has proven they are completely incapable of governing themselves, of setting up rules, of having guidelines, of designing for safety, and it is so important that we move forward with this. Now, one thing I'll add, and I think it is so important for your being here and for our colleagues that weren't a part of what we were doing, pardon me.

In 2021, Mr. Mosseri, when he came before us as the CEO of Instagram, indicated, pardon me, indicated they were taking steps. But we find out they were not. We find out from the advice and the awareness that you provided Mark Zuckerberg and Mr. Mosseri. What did they do with that? They made a conscious decision to ignore your advice and guidance and use our kids as the product. The longer they're online, the richer that data is; the richer the data is, the more money they make. So they have monetized what comes from our children being addicted to social media. Thank you so much for being here today. Thank you, Mr. Chairman.

Sen. Richard Blumenthal (D-CT):

Thanks, Senator Blackburn. Let me formally introduce the witness. Arturo Bejar is a former security engineer with very significant experience working on user safety and wellbeing at Facebook. He served as director of engineering for Protect and Care, a specific team at Facebook, from 2009 to 2015, and he reported to the CTO. He then came back as a consultant to help Instagram's wellbeing team from 2019 to 2021. He's also a parent to a courageous young girl, a young woman, who spoke up about her experiences online. Mr. Bejar, as is our custom, I'm going to administer the oath to you now. If you would stand, please. Do you swear that the testimony that you will give to this committee is the truth, the whole truth, and nothing but the truth, so help you God? Thank you. Thank you. Please go ahead.

Arturo Bejar:

Chairman Durbin, Ranking Member Graham, Chairman Blumenthal, Ranking Member Hawley, and members of the subcommittee, thank you for the opportunity to appear before you and for your interest in addressing one of the most urgent threats to our children today, to American children and children everywhere. My name is Arturo Bejar, and I appear before you today as a dad with firsthand experience of a child who received unwanted sexual advances on Instagram, and as an expert with over 20 years of experience working as a senior leader, including leading online security, safety, and protection at Facebook. It is unacceptable that a 13 year old girl gets propositioned on social media.

Unfortunately, it happens all too frequently today. In a carefully designed survey by Instagram in 2021, we found that one in eight kids aged 13 to 15 years old experienced unwanted sexual advances in the last seven days. This is unacceptable, and my work has shown that it doesn't need to be this way. Starting in 2009, I was the engineering and product leader for Facebook's efforts to reduce online threats to both children and adults. I met regularly with senior executives, including Mark Zuckerberg, and they were supportive of this work. As a parent, I took the work personally, and I worked hard to help create a safer environment. By the time I left in 2015, I felt the work was going in the right direction.

A few years later, my 14 year old daughter joined Instagram. She and her friends began having awful experiences, including repeated unwanted sexual advances and harassment. She reported these incidents to the company, and it did nothing. In large part because of what I learned as her father, in October of 2019 I returned to Facebook, this time as a consultant with Instagram's wellbeing team. We tried to set goals based on the experiences of teens themselves. Instead, the company wanted to focus on enforcing its own narrowly defined policies, regardless of whether that approach reduced the harm that teens were experiencing. I discovered that most of the tools for kids that we had put in place during my earlier time at Facebook had been removed. I observed new features being developed in response to public outcry which were, in reality, kind of a placebo, a safety feature in name only, to placate the press and regulators.

I say this because, rather than being based on user experience data, they were based on very deliberately narrow definitions of harm. The company was grading its own homework. For example, Instagram knows when a kid spends a significant amount of time looking at harmful content, content that they are recommending. Meta must be held accountable for their recommendations and for the unwanted sexual advances that Instagram enables. As soon as I understood this gap, I did what I had always done: I researched the problem, vetted the numbers, and informed Mark Zuckerberg, Sheryl Sandberg, and other executives. I did this because for six years that was my job, to let them know of critical issues that affected the company.

It's been two years since I left, and these are the conclusions I have come to. One, Meta knows the harm that kids experience on their platform, and executives know that their measures fail to address it. Two, there are actionable steps that Meta could take to address the problem. And three, they're deciding time and time again not to tackle these issues. Instagram is the largest public directory of teenagers with pictures in the history of the world. Meta, which owns Instagram, is a company where all work is driven by data, but it has been unwilling to be transparent about data regarding the harms that kids experience, and unwilling to reduce them. Social media companies must be required to become more transparent so that parents and the public can hold them accountable. Many have come to accept the false proposition that sexualized content, unwanted advances, bullying, misogyny, and other harms are an unavoidable evil.

This is just not true. We don't tolerate unwanted sexual advances against children in any other public context, and they can similarly be prevented on Facebook, Instagram, and other social media products. What is the acceptable frequency for kids to receive unwanted sexual advances? This is an urgent crisis. When asked, has anyone threatened you, damaged your reputation, insulted you, disrespected you, excluded you or left you out, 11% of kids said yes in the last week, and one in four witnessed it happening, and the company does nothing about that. When asked if they saw a post that made them feel bad about themselves, one in five kids said yes in the last week. Meta executives know this. The public now knows this. When I left Facebook in 2021, I thought the company would take my concerns and recommendations seriously, take them to heart, and act. Yet years have gone by, and millions of teens are having their mental health compromised and are still being traumatized by unwanted sexual advances and harmful content on Instagram and other social media platforms. There was a time when, at home on the weekend at least, a kid could escape these things, these harms. But today just about every parent and grandparent has seen their kids' faces change from happiness to grief, to distress, the moment that they check social media. Where can a child seek refuge? It's time the public and parents understand the true level of harm enabled by these products, and it's time for Congress to act. Thank you for your time.

Sen. Richard Blumenthal (D-CT):

Thanks, Mr. Bejar. We're going to now begin with the questions, and each of us will ask five minutes of questions. Because of the turnout, I'm going to limit it to five minutes, and then we'll have a second round if folks want to do that. We put in the record your memo to Mark Zuckerberg of October 5th, where you recommend that there be, in effect, not only a change in the business practice of the company but a culture shift, as you call it. And then you wrote to Mr. Mosseri separately on October 14th. I'm going to ask that that document be made part of the record as well, where you presented more of these statistics and very powerful evidence of harm. And it seems to me that the reaction was to pat you on the head and, in effect, tell you to go away, be a good boy, and pull the curtain. Senator Hawley's referred to cooking the books. I think what they did was bury this evidence, conceal it, hide it, and deny it, in effect, to Congress and to the public. And then in the past year, they've actually cut around 21,000 jobs, or a quarter of the global workforce, in what Mark Zuckerberg has called the year of efficiency, hundreds of jobs involving content moderators and safety jobs, including from Instagram's wellbeing team. What is the impact of cutting those resources devoted to online safety?

Arturo Bejar:

Thank you for the question. If you start from the point that the work was already heavily under-resourced when I was there, that we were dealing with 20%, 10% of people experiencing this, and that there was a small fraction of people dedicated to addressing that harm, and then they take more resources away from that, including the people who are doing the work to understand the harm that kids are experiencing, then it seems to me that the company culture is one of see no evil, hear no evil: we don't want to understand what people are experiencing, and we are not willing to invest in that and the tools that will help.

Sen. Richard Blumenthal (D-CT):

Thank you. We spoke in advance of the hearing and you told me a story about meeting with another senior executive, Chris Cox, Facebook's chief product officer, and it was just so striking to me that he already knew a lot of the numbers and statistics and evidence of harm that you were bringing to Mark Zuckerberg's attention. Why was this meeting so memorable to you?

Arturo Bejar:

When I returned in 2019, I thought they didn't know. When I began seeing a culture that was consistently ignoring what teens were experiencing, I thought that executives did not know. And I did spend a year researching, vetting, validating with people across the organization, and I would ask people, do you know what percentage of people are experiencing this? And nobody was able to answer off the top of their head. The first person to do that was Chris Cox, and I found it heartbreaking, because it meant that they knew and that they were not acting on it.

Sen. Richard Blumenthal (D-CT):

In effect, their expressed caring about teens and safety and protecting children was all a charade, a mockery. They already had the evidence that you were bringing to their attention. They knew about it and they disregarded it, correct?

Arturo Bejar:

Yes, that's correct.

Sen. Richard Blumenthal (D-CT):

And then they rejected your recommendations for making Facebook and Instagram safer, correct?

Arturo Bejar:

That's correct.

Sen. Richard Blumenthal (D-CT):

And let me ask you before we go to our next witness, do you think that the Congress of the United States should now act? Don't you think action is long overdue in this area given the total lack of credibility on the part of social media?

Arturo Bejar:

Yeah. My experience, after sending that email and seeing what happened afterwards, is that they knew, there are things they could do about it, and they chose not to do them. We cannot trust them with our children, and it's time for Congress to act. The evidence, I believe, is overwhelming.

Sen. Richard Blumenthal (D-CT):

I'm very hopeful that your testimony, added to the lawsuit that's been brought by state attorneys general across the country, I'm a former state attorney general, and I believe strongly in enforcement by them, added to the interest that I think is evidenced by the turnout of our subcommittee today, will enable us to get the Kids Online Safety Act across the finish line, along with measures like Senator Durbin's proposals and others that can finally break the straitjacket that big tech has imposed on us. Big tech is the next big tobacco. I fought big tobacco in the 1990s. I sued big tobacco. I urged Congress to act. The same kind of addictive product that big tobacco peddled to kids is now advanced to them, promoted and pitched, by big tech, and we need to break the straitjacket they've imposed through their armies of lobbyists and lawyers. Thank you. Senator Hawley.

Sen. Josh Hawley (R-MO):

Thank you, Mr. Chairman. Mr. Bejar, thank you again for being here. I just want to first establish a fact or two, just to make sure everybody understands. So on October the 5th, 2021, you composed an email, which is now, I think, in the record, to Mark Zuckerberg, Sheryl Sandberg, and a group of other executives at Meta. Am I right so far?

Arturo Bejar:

That's correct.

Sen. Josh Hawley (R-MO):

In that memo, you disclosed to them that, according to your own research, one in eight children, children now, had experienced unwanted sexual advances within the last seven days. Is that correct?

Arturo Bejar:

That's correct.

Sen. Josh Hawley (R-MO):

And about one in three, I think it was 27%, had experienced unwanted sexual advances outside of the seven day window. So that is more than seven days, is that correct?

Arturo Bejar:

That is correct.

Sen. Josh Hawley (R-MO):

Those numbers are astounding. I just want to let that sink in. One in eight within seven days, a third of children outside of that window. Mark Zuckerberg, did he reply to you?

Arturo Bejar:

He did not reply.

Sen. Josh Hawley (R-MO):

Did he meet with you?

Arturo Bejar:

He did not meet with me.

Sen. Josh Hawley (R-MO):

Sheryl Sandberg. Did she meet with you?

Arturo Bejar:

She did not meet with me.

Sen. Josh Hawley (R-MO):

In other words, the people who had recruited you to come back to Facebook, Meta, whatever, it's hard to keep up, they ignored your findings. When you presented data to them, they didn't want to see it. They turned a blind eye. Let me ask you about something else. This is from the Wall Street Journal's report earlier this year, this is June of this year. They found the following, and I'm going to quote: Instagram helps connect and promote a vast network of accounts openly devoted to the commission and purchase of underage sex content. Pedophiles have long used the internet, but unlike the forums and file transfer services that cater to people who have an interest in illicit content, Instagram doesn't merely host these activities. Instagram's algorithms promote them. Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share these interests, the Journal and academic researchers found. This is a stunning, stunning report, Mr. Bejar, that more than buttresses, that bears out, what you were trying to tell the executives who ignored you. Just give us a sense, in your own view, why do you think this is happening? Why has Instagram become, in the words of the Wall Street Journal, a vast pedophile network? Why are people like your daughter, every time they get on Instagram, being bombarded with unwanted sexual advances, sexual content? Why is this happening?

Arturo Bejar:

My experience of that is that most of the resources, if not close to all, that they invest in this go towards this very narrow definition of harm. And so I would encourage anybody here, when you're looking at this issue, if you find an account that seems to be a pedophile account selling things, try and act on it, try and raise it, see what the company does with that. But also see what happens if you like it or follow it, what you start getting recommended. And of all of the things that get surfaced by the systems, how many of them are they acting on? It's a fraction of a percent.

Sen. Josh Hawley (R-MO):

One of the things that you said changed from the time you left Facebook in 2015, I think it was, and came back in 2019 was that Facebook had shifted to an automation-driven process of safety, standard safety inspection, monitoring for things like this, which they boast about. They say that their AI is great, it's doing great work. That doesn't appear, however, to be the actual fact. It appears that these harms are proliferating. Tell us about the shift towards automated safety monitoring and what that has meant in your experience.

Arturo Bejar:

I was not there for the shift, but what I can say is that algorithms are only as good as their inputs. So if you don't allow a child to say, oh, that is gross, it makes me uncomfortable, which is something that you can do for an ad today, you can take an ad and say that it is sexually inappropriate, but there's no way for a child to do that when they get a message or in other areas, how do these systems even have a hope of addressing these issues? How can they as a company have a hope of addressing these issues if they're not willing to listen when a teen is trying to tell them that they're experiencing gross content or unwanted sexual advances? I mean, that's how you find predators. That's how you find the bad things.

Sen. Josh Hawley (R-MO):

So what your research found, and what you elevated to leadership, was at least in part that these automated systems were not catching the vast majority of this unwanted content out there, the sexual advances, this pedophile material. It simply doesn't begin to capture it. Yet Facebook didn't shift more resources, didn't change their process. And here's the thing that really gets me, and I'll end with this, Mr. Chairman, I know there's others who want to question. I have been reading over and over and over again this case filed by my home state, Missouri versus Biden, a landmark First Amendment case in which two federal courts, a federal district court and a federal court of appeals, have found that Facebook among others actively coordinated with the present administration to censor First Amendment protected speech, not this garbage that is not protected by anything in our Constitution, but First Amendment protected speech.

Here's what gets me. What the courts found, this is in the record, these are factual findings, is that Facebook devoted all kinds of resources and people, actual human people, to doing things like monitoring posts on COVID-19 vaccine efficacy. There's one example of a parent in my home state of Missouri who wanted to post something about a school board meeting. Facebook used human moderators to go and take down that post. That was important. That has to come down. We can't have them posting about school board meetings, for heaven's sake. But the things that your daughter experienced, this ring of pedophiles, rings plural, that Facebook just can't find the time for. They just don't have the resources for it. That we just have to leave to the market, let the market have its effect, let AI do its job. We just don't have the resources for it. They had plenty of resources to censor First Amendment speech, no resources to protect our children. Absolutely unconscionable.

Sen. Richard Blumenthal (D-CT):

Thanks Senator Hawley. Senator Durbin.

Sen. Dick Durbin (D-IL):

Thanks for being here. Mr. Bejar, you said earlier in your opening statement that when you work for these companies, they were data-driven. What do you mean by that?

Arturo Bejar:

Everything at Meta, there's goals based on numbers. There's a very ingrained understanding of what is happening. People set their jobs on that: in the next six months, I'm going to make this number go from this to that.

Sen. Dick Durbin (D-IL):

But the ultimate answer is they were dollar driven too.

Arturo Bejar:

Correct. What I can speak to very directly is that my question to Adam was, what percentage of teens should be experiencing unwanted sexual advances? If there's not a team whose goal that is, if they can't answer your questions about how many teens are impacted by this, and if they cannot give you detailed data as to who's initiating those contacts, then it's just not a priority.

Sen. Dick Durbin (D-IL):

But the bottom line is this: they've made a decision that it's not a priority to them because of profit motive, have they not, in terms of what it's going to cost them in their business model if they have to interrupt it and monitor the content?

Arturo Bejar:

I think that would be a wonderful question to ask Mark and Sheryl, well, Sheryl's no longer there, and Adam, because they can speak to why they made these choices. I can only speak to the fact that they keep making these choices over and over again.

Sen. Dick Durbin (D-IL):

Well, I would just back up what Senator Graham said. If this becomes expensive to them to continue this outrageous conduct, then they may pay closer attention. That's for sure. But you have suggested here as well that we need a survey of young people as to their experience. You want to explain that?

Arturo Bejar:

Yeah. The way that harm should be tracked on these products is you go up to teens and ask them, did you receive an unwanted sexual advance in the last seven days? And they are going to know, it doesn't matter what the message is. And then what you can do to help that teen is give them a chance to tell you. And the measures that I talk about are not even expensive to implement.

Sen. Dick Durbin (D-IL):

We were also briefed by the DEA in terms of narcotics transactions and the use of platforms for that purpose. Did you ever look into that issue?

Arturo Bejar:

I did not directly, but what you can do is, if you look at the numbers that I provided the committee, there is a category for that class of issues, and you should ask the company how much of that content, which teens experience as that, they take down.

Sen. Dick Durbin (D-IL):

It's interesting to me that if one of my kids when they were kids, or our grandkids now, came home and said there was somebody lurking outside the playground at school that made the kids feel uncomfortable, we would know what to do, and we'd move on it quickly. We would find it unacceptable. And yet what we know for a reality is that there is danger lurking in the iPhones that they're opening up every single day, and we seem to feel that we are unable to respond to this. I hope we can change that. Senator Graham's suggestion about 230, I don't know, do you have any thoughts on that, Section 230?

Arturo Bejar:

No, I'm not qualified to talk about 230, but I can say that these companies should be held accountable for the content they recommend.

Sen. Dick Durbin (D-IL):

Well, I certainly agree with that. I think that is the bottom line. Thank you for your testimony.

Arturo Bejar:

Thank you.

Sen. Richard Blumenthal (D-CT):

Thanks Senator Durbin. Senator Graham.

Sen. Lindsey Graham (R-SC):

Thank you. You're doing the country a great service here. Did they contest your memo? Did anybody call you up and say, you don't know what you're talking about, you're wrong?

Arturo Bejar:

No. I must have spoken to 20 or 30 people, including Adam Mosseri, saying, do you have any feedback, anything that's inaccurate in my data? And nobody did.

Sen. Lindsey Graham (R-SC):

Okay. To sum up your testimony, is it fair to say that in its current form what you're describing is a dangerous product?

Arturo Bejar:

Correct.

Sen. Lindsey Graham (R-SC):

And that millions of families are affected by this dangerous product.

Arturo Bejar:

Correct.

Sen. Lindsey Graham (R-SC):

As a father who had a 13 year old affected by this product, did you feel helpless?

Arturo Bejar:

I did. And if there was one person that could have helped, it would've been me.

Sen. Lindsey Graham (R-SC):

If you could have sued them, would you?

Arturo Bejar:

I apologize, could you repeat the question?

Sen. Lindsey Graham (R-SC):

If you could sue on behalf of your daughter, would you?

Arturo Bejar:

I believe they just have to be held to account and be transparent about it.

Sen. Lindsey Graham (R-SC):

Well, one way of doing that is to sue them. Do you know you can't sue them under the current law?

Arturo Bejar:

I did not know that.

Sen. Lindsey Graham (R-SC):

Okay. Alright, so your daughter felt harmed. Your testimony is millions of people are in the same situation as your daughter. They know what they're doing and they keep doing it anyway. Is that all correct?

Arturo Bejar:

That is correct.

Sen. Lindsey Graham (R-SC):

I can't think of a company in the world that can do this crap and not get sued, except these people. Now, if you had to give sovereign immunity, which is basically what we've done here, to a group of people, this would be on the bottom of my list, not the top of my list. So I've just asked my office to find out how much money I've received from Facebook, Instagram, and other companies. I'm going to give it back. I think we ought to all boycott the giving, because if Senator Hawley is right, and I think you are, their leverage here is just power over the political system. So I'm calling on every member of Congress today, don't take their money until they change.

Don't accept what they're offering you until they change, because the money you're receiving is coming from people who have created a dangerous product for children, and they seem not to be willing to change. They'd be on the bottom of my list, now that I know what you've told me, of people I want to associate myself with. Have you ever heard them talk about being afraid of anything or anybody? I have not. That's amazing, isn't it? A company this big, he's telling them what they're doing is hurting people, and they're indifferent to it. They feel like they're immune from action, because they pretty much are, bottom line. If we did create a system where parents like you could sue and hold them liable in court, do you think that may change their behavior?

Arturo Bejar:

That is not for me to say. I just want my daughter, our kids, to have the tools that they need when they're experiencing these things.

Sen. Lindsey Graham (R-SC):

Yeah. Well, what I will tell you is that I believe it would. We'll never know unless we try. I think we should dedicate ourselves on this committee, which it has been a pleasure to serve on, Senator Durbin, all of you have been great on this issue, to not just pass bills but insist on change. The ultimate change, I say to my colleagues, comes when they can be held liable in a court of law. Until you open up the courthouse, nothing's going to change. The day you do, you'll be amazed how many good ideas they knew about that they didn't tell us. So I'm going to dedicate what time I have left in this business to opening up the courtroom. I don't think anything else will do, and until that day comes, I'm not going to take any of their money. If every member of Congress would say, your money is not welcome till you change, that might be the first step toward change. Thank you for your bravery. I'm sorry for what happened to your daughter. We owe you and everybody in your situation better. Thank you.

Sen. Richard Blumenthal (D-CT):

Thanks Senator Graham. And that's why many of us have joined you in a call for abolishing Section 230, including the ranking member and myself, Senator Klobuchar.

Sen. Amy Klobuchar (D-MN):

Thank you very much, Senator Blumenthal. And thank you for those words, Senator Graham. I'm a strong believer in, we can talk about this and have hearings and keep reminding people that we need to get things done, but until we change the law, nothing will happen. These are no longer companies that started in a garage with two guys tinkering around with platforms or computers, or in their college dorm room. Okay? This is real lives that are getting lost, and I really appreciate, Mr. Bejar, that you are willing to come forward and testify. I'm going to focus on one area that I don't think we've talked about enough, and that is the platforms' inability and refusal to take down sites that are selling dangerous drugs. Recently the DEA found that one third of drug cases had direct ties to social media. I was just in Minnesota with a mom, Bridget, who lost her kid, and the kid literally ordered one pill.

As we say, one pill kills. The kid ordered it on the internet thinking it was something else; it was laced with fentanyl. As the mom said, as Bridget said, all of the hopes and dreams we as parents had were erased in the blink of an eye, and no mom should have to bury their kid. That's why Senator Durbin and I and others on this committee have been working with Senators Shaheen and Marshall, as well as Senator Grassley, on this bill that has come through this committee already and needs to go to the floor, along with a number of other bills we've talked about, which requires social media companies to report fentanyl and other dangerous drug sales on their platforms. In the words of our DEA administrator, the cartels in Mexico and China, who don't really care if people die or not because no one knows it, have basically harnessed these platforms. Do social media companies have the correct incentives to identify and eliminate drug sales to kids?

Arturo Bejar:

Thank you for the question and the concern on all of those important things. Until there is disclosure of what kids experience as drug content, sexual content, until there is transparency about these things, I don't know what the incentive is, which is why I think transparency is so essential. As parents and grandparents, we see it, we understand it, we know how frequent it is. Those are the numbers they have to share. And if I would want one thing for everybody here to know, it is that when we talk about any category that you care about, for example, drugs, and then when the company talks about that category, they're likely talking about a fraction of a percent of what we as a society are experiencing.

Sen. Amy Klobuchar (D-MN):

And we all know there's a lot of other things to do with fentanyl, including at the border, but this would be a major game changer for the ability to take these cases on. Prosecutors have also reported an emerging trend where offenders collect photos of children that may fall just shy of the definition of child pornography and distribute them on websites with the intent to harass or abuse the child victims. There was a major story on this in the Washington Post. Senator Cornyn and I have a bill, the SHIELD Act, to fill in gaps in federal law so that prosecutors can hold those who abuse kids in this way accountable. In your role as a person at Facebook who was responsible for efforts to keep users safe, can you talk about the deficiencies in current policies?

Arturo Bejar:

Thank you for the question. If you look at content that sexualizes minors, again, the question is, is that something that actually violates company policy and would be removed, and is that what the company is acting on? Or does it end up being something that, because it is content that the company does not act on, they actually end up recommending and distributing? And as a parent, we see this. If you were to open the app and look for it, you can find it, and then if you like it, you get recommended it. And these are all things that the company is, I believe, aware of in terms of reach, and can do things about, and has chosen not to do so.

Sen. Amy Klobuchar (D-MN):

Okay, very good. I was listening to Senator Graham, and he's right that there's one big thing we can do, which is to allow these cases to go forward in court. But I also think some of these things I'm discussing actually make it easier for people to proceed with these cases and create incentives. And the one thing I'll add, I'll ask it on the record, is that Senator Coons, Senator Cassidy, and I think Senator Blumenthal is involved, have a bill to require these digital platforms to give independent researchers access to data, to allow independent researchers to look at the algorithms that you know are designed in a way that manipulates these kids and can lead to their deaths. And yes or no, I figure you think this would be helpful? Yes. Okay, very good. Just again, thank you. We can talk about this all we want, and we'll remember you and your story, but until we get these bills allowed floor time by both sides, and can maybe put them together into one package, we're just not going to get the solutions that we need, because just getting mad at these platforms hasn't changed their conduct.

Sen. Richard Blumenthal (D-CT):

Thanks. Thanks, Senator Klobuchar. Senator Kennedy.

Sen. John Kennedy (R-LA):

Mr. Bejar, social media can make people less lonely, can it not?

Arturo Bejar:

It can do that.

Sen. John Kennedy (R-LA):

Social media can deliver insight, can it not?

Arturo Bejar:

It can.

Sen. John Kennedy (R-LA):

Social media, when used properly, can give voice to the timid, can it not?

Arturo Bejar:

It can.

Sen. John Kennedy (R-LA):

Social media can also spread hate, can it not?

Arturo Bejar:

It can.

Sen. John Kennedy (R-LA):

And isn't it a fact that much of social media, not all, but much of social media has become a cesspool of snark?

Arturo Bejar:

What I can speak to, Senator, is that it happens all too often, and it doesn't need to be that way.

Sen. John Kennedy (R-LA):

But is it that way for much of, not all but much of social media?

Arturo Bejar:

One of the numbers that I talk about is this: 20% of kids witnessed bullying in the last seven days, and this is content that does not get taken down. If people comment on it, it gets promoted.

Sen. John Kennedy (R-LA):

Yeah. Well, let me put it another way. I'm trying to sum this up for us. Isn't it a fact that social media has lowered the cost of being an a-hole?

Arturo Bejar:

Yes.

Sen. John Kennedy (R-LA):

And isn't it true that social media removes any geographical border to the harassment of others?

Arturo Bejar:

Yes.

Sen. John Kennedy (R-LA):

And isn't it true that some forms of social media optimize for engagement?

Arturo Bejar:

Yes. I think using your term, they reward being an a-hole.

Sen. John Kennedy (R-LA):

Yeah. And isn't it true that some forms of social media use surveillance to identify our and our kids' hot buttons?

Arturo Bejar:

I cannot speak to that.

Sen. John Kennedy (R-LA):

Isn't it true that some forms of social media use algorithms to show us and our kids stuff that pushes those hot buttons?

Arturo Bejar:

I will say this, recently my daughter had somebody going to one of her posts about cars and said, you'd like to drive and you like cars because you saw a man doing it. And she said, I'm studying automotive restoration. I've been doing this for years. I know a lot about cars. I am more than qualified. And the person shot back, no women just belong in the passenger seat to every point that you just made. I will say that when I asked her about that post, if she would delete it because she knows reporting would do nothing, she said, I will not delete it because I'm worried that that will mean that less people will see my posts.

Sen. John Kennedy (R-LA):

I'm not going to ask you this question, I'm going to make a statement, because you're probably not familiar with Louisiana. But in my state, social media has impacted the news media, particularly print media. Thank God for our TV news and our radio news. But when it comes to our print media, and that's print media on paper and print media which is on the internet, Louisiana is a news desert. We've only got about two real print media journalists left who are fair and aren't opinion. Most of our print media members are now sports journalists, which is fine. I love sports, but there's a lot else going on in the world. Let me wrap up this way. What you do is what you believe, isn't it? And everything else is just cottage cheese, isn't it?

Arturo Bejar:

It is, yeah.

Sen. John Kennedy (R-LA):

I look forward to the day when members of the United States Senate will come together and establish a new rule not used every day or every week or every month or even every year. But that rule would say when there is a consensus and when you as a senator can demonstrate that you have 60 votes to pass a bill, that you have the right to bring that bill to the floor of the United States Senate no matter who doesn't want it. Thank you, Mr. Chairman.

Sen. Richard Blumenthal (D-CT):

Thanks. Senator Kennedy. Senator Hirono.

Sen. Mazie Hirono (D-HI):

Thank you, Mr. Chairman. Thank you, Mr. Bejar, for testifying. I was curious about the fact that so many of the young people on these platforms are exposed to cyberbullying, and that can be anything. Your daughter experienced some of that because of some of the things that she posted online. But there is also an addictive quality to keeping these kids online for the platforms. Keeping these kids online means money for them. Is there anything we can do to address the addictive aspect of what is happening to our young people, where they continue to go on to these platforms and expose themselves to this kind of harmful content?

Arturo Bejar:

Thank you for the question. I think that it is essential to have good data about the impact that this product has. And it's not that difficult. You could take a teenager after half an hour and ask, how are you doing? Are you feeling better or worse? And then use that information.

Sen. Mazie Hirono (D-HI):

Let's say that we have this kind of data as to the harmful impact. What do you think we should do?

Arturo Bejar:

I think that products should adopt measures, and where appropriate be compelled, to figure out a good way to help teens have a use of the product that serves them. I think what happens right now is it distresses them. And what I experienced as a parent, and I think every parent here has experienced, is that sense of urgency of needing to be on there and the impact that it has on their emotions.

Sen. Mazie Hirono (D-HI):

Do the young people understand the harmful impact themselves? Part of your testimony is about how all of us should understand what is happening. Would it help if the kids themselves also understood the harmful impacts? This is also an aspect of education of the young people, for example. Would that help?

Arturo Bejar:

I think it helps to educate young people. I think what helps the most, in my experience, is changes to the product so that it's less harmful, and it's those changes, and the refusal to make those changes...

Sen. Mazie Hirono (D-HI):

Right now, there's not much incentive for these platforms to change their product because they face no consequences for the content. So meanwhile, dozens of states, including the state of Hawaii, where I come from, have sued these companies, including Meta, alleging that they designed their products to harm users. And I think most of these cases have been consolidated in California. The defendants are saying that they are limited in their liability exposure because of Section 230. You're not a lawyer, but if these companies were exposed to legal liability, and these lawsuits are still pending, by the way, if the companies are found liable and forced to pay money as a result of these lawsuits, do you think that would change their behavior as far as paying attention to the harmful impact of the content on their platforms?

Arturo Bejar:

I'm not a lawyer. I'm not qualified to weigh in on that.

Sen. Mazie Hirono (D-HI):

No. But you did testify that it's all about money for these companies. That's why they keep doing what they do. And if they were exposed to liability, if they had to pay money as a result of their content, do you think that would change their behavior?

Arturo Bejar:

My hope is, and what I believe will change their behavior, is the moment that Mark Zuckerberg, when he declares earnings, has to say, last quarter we made $34 billion. And the next thing he has to say is, and on Instagram this is the percentage of teens that experienced unwanted sexual advances. That number would go down very quickly.

Sen. Mazie Hirono (D-HI):

How would it go down?

Arturo Bejar:

Because they would be incentivized to work on it. Because right now there are no goals to reduce unwanted sexual advances, as far as I am aware.

Sen. Mazie Hirono (D-HI):

Except that if there's no law that prevents them from having this kind of content, and there are no court cases, they're not held responsible for content, then there's no incentive, even though they have exposed these kids. And this is why there's so much attention being paid to Section 230 and the limited liability, in fact, no liability; they're protected from content. They do try, I suppose, and you say that they have a very limited understanding of what is harmful content. But on the other hand, I'm all for doing more than we are currently doing. But one of the things that can also happen, I led a letter asking the FTC to investigate Meta's alleged practice of censoring advertisements for health products related to menstruation. And there, Meta decided that this kind of advertisement was harmful. I hardly call that a very narrow definition of harm. So all these companies, left to their own devices, get to choose what they deem to be harmful. In the examples that you cite, it's a very limited definition. But in the example I cite, they've decided that women's health products are harmful and they're going to censor those kinds of products. So it's a lot more complicated than at first glance, I guess. But I know we're going to try and do something. So thank you, Mr. Chairman.

Sen. Richard Blumenthal (D-CT):

Thanks, Senator Hirono. Senator Blackburn.

Sen. Marsha Blackburn (R-TN):

Thank you, Mr. Chairman. And thank you for your testimony and your frankness. I know that we appreciate it. In my state of Tennessee, we have attorneys here from our attorney general's office today, and they are pushing to also get something done about the overreach of Facebook. And we're grateful that so many states have stepped up to hold Facebook and Meta to task. So we appreciate this. I want to return to December 2021, and Chairman Blumenthal mentioned this. He and I, at the Senate Commerce Consumer Protection Subcommittee that we led, had Mr. Mosseri in front of us. And you were consulting for Instagram at that time, correct?

Arturo Bejar:

Sorry, what is the date again?

Sen. Marsha Blackburn (R-TN):

December 2021.

Arturo Bejar:

I had left at that point.

Sen. Marsha Blackburn (R-TN):

You had left at that point. Okay. And a few months earlier, you sent him two emails that talked about youth harms on the platform, correct?

Arturo Bejar:

Correct.

Sen. Marsha Blackburn (R-TN):

Okay. Now I'm going to quote you some things from his testimony. He said, and I quote, we care deeply about the teens on Instagram, which is in part why we research complex issues like bullying and social comparison and make changes. Do you agree with that characterization?

Arturo Bejar:

I agree that they do research. I don't agree that they make changes.

Sen. Marsha Blackburn (R-TN):

Okay. So they have the info, they take no action. He also said, we don't allow people to bully or harass other people on Instagram, and have rules in place that prohibit this type of conduct. We've also built tools that prevent it from happening in the first place and empower people to manage their accounts so they never have to see it. Do you agree with that?

Arturo Bejar:

I think it's profoundly misleading, because at a time at which their public statistic was a fraction of a percent, one in five teens had watched it happening and something like 10% had experienced it. And you have to bear in mind they're standing right there. If this was a school, that would be completely unacceptable.

Sen. Marsha Blackburn (R-TN):

I agree. Let me give you one more, talking about the executives, and I'm interested to see how they reacted to the information that came out in 2021 about their disregard for harms to minors. Do you think that Meta executives were motivated to do more to address the problem, or were they interested in covering up what was going on at Meta at the time?

Arturo Bejar:

I think you will need to ask them about their intentions, but I also deeply believe that actions speak louder than words.

Sen. Marsha Blackburn (R-TN):

Did any of the members of Meta's team, whether it was Zuckerberg, Sandberg, Cox, did any of them respond to your email in a way that suggested that they were going to take an action to correct the wrongs?

Arturo Bejar:

No. For six years, when I sent that kind of message, I would get a meeting within 24 hours to spend a meaningful amount of time talking with them about what needed to be dealt with. And in this case, the lack of response, the meeting sometime later, and then the lack of action again speaks to the fact that they...

Sen. Marsha Blackburn (R-TN):

So they sloughed it off.

Arturo Bejar:

That was my experience.

Sen. Marsha Blackburn (R-TN):

Money was more important than protecting children.

Arturo Bejar:

I think you should ask them that question.

Sen. Marsha Blackburn (R-TN):

Okay. I would be interested to know who took responsibility for making policy determinations about youth safety. In one conversation you had with my staff, you suggested that Mark Zuckerberg had a hand in such decisions during your first stint at the company, but that when you returned, he would tell employees not to raise youth safety issues to him. Is that accurate?

Arturo Bejar:

In my first stint, he, Chris Cox, and Sheryl would be who you raised these issues to, and they would engage very proactively. Having done that for six years, I felt I was probably one of the most qualified people in the world to bring it to their attention. I was not aware when I sent my email that it was hard to talk to Mark about this, but I can say that my experience of how the entire company was behaving when it came to the harms that teens were experiencing was a cultural issue, and that direction, which in my experience places prevalence over harm, is something that Mark sets for that whole executive team. And that's why I realized it was necessary to appeal directly to them.

Sen. Marsha Blackburn (R-TN):

So they were aware.

Arturo Bejar:

Correct.

Sen. Marsha Blackburn (R-TN):

They knew that harms were taking place.

Arturo Bejar:

Correct.

Sen. Marsha Blackburn (R-TN):

They had the research that pointed this out, their own research.

Arturo Bejar:

Correct.

Sen. Marsha Blackburn (R-TN):

And they made a conscious decision to do nothing about it.

Arturo Bejar:

Correct.

Sen. Marsha Blackburn (R-TN):

Did they ever talk about profits as opposed to enacting these protections?

Arturo Bejar:

Not in my presence.

Sen. Marsha Blackburn (R-TN):

Not in your presence, okay. So other than Mark Zuckerberg, who would've claimed responsibility for dealing with youth safety and youth harms? Anyone?

Arturo Bejar:

For Instagram?

Sen. Marsha Blackburn (R-TN):

So Adam. Okay. Thank you for that. My time's expired. Thank you, Mr. Chairman.

Sen. Richard Blumenthal (D-CT):

Thanks. Senator Blackburn. Senator Welch.

Sen. Peter Welch (D-VT):

Thank you very much. I just want to start by acknowledging my gratitude to my colleagues on this committee for the work that you have been doing on a bipartisan basis. Senator Blackburn and I began working together when we were both in the House together and introduced, I think, the first privacy bill. So I haven't been with you in this effort, but I was with Senator Blackburn, and I can't elaborate on the excellent opening statements, Senator Blumenthal and Senator Hawley, that you made. I guess, in our phrase, I'd like to associate myself with your remarks. But I do want to, on my own behalf, express my shock at what's happening to our kids, and that it's all because there's a lot of money to be made. And your questions, Senator Blackburn, revealed just the disregard for the mental health of our kids, which is truly shocking.

So I'm all in with you on your efforts here. I'm also delighted that in Vermont, our attorney general has joined the lawsuit. And I want to thank you for stepping forward and providing such clarity, and also for the concern embedded in it, not just for your daughter, but for all of our kids. A couple of issues have come up in letters that I've received and in comments, and I know you're getting the same questions as well. I want to make sure we can do this legislation in a way that doesn't do any harm. I've been receiving a number of letters from folks in the LGBT community who are concerned that some of this legislation, including the KOSA Act, would compromise their ability to get together online and be mutually supportive. And I support that. So I just want you to talk a little bit about how, if we proceed with the legislation, which I hope we do, we're not in any way going to interfere with the capacity of kids who legitimately are getting together and mutually supporting, none of the exploitative stuff. Can we accomplish that?

Arturo Bejar:

Thank you for the question. I cannot speak to the legislation. I trust that you are extraordinarily qualified for that part. I think that my job here is to help bring light to the harms that these teens are experiencing, and the fact that the way the company talks about them, in my experience, is misleading.

Sen. Peter Welch (D-VT):

Okay. And that's based on all your years, really, at the forefront of Facebook.

Arturo Bejar:

Correct. And then the other thing I really would want you to know, and for any kid that again ends up having these awful experiences, it does not need to be this way, right? Instagram is standing right next to them as these things are happening and they should be able, and I know because I built these kinds of things for six years, they should be able to turn and say, can you please help me with this? And then get help with whatever's happening for them. And today that is not the case.

Sen. Peter Welch (D-VT):

So it's the exploitative content in the algorithms that you're focusing on, and I think all of us are.

Arturo Bejar:

Thank you. No, actually, it's when somebody says it in front of you. Take a school: you're in the hallway, and somebody comes to you and says, I'm going to make sure that you don't get invited to any party ever again, right? Only the people around you hear that. But if that happens online, that is a post that implies a person without naming them, it never gets removed, and it is incredibly distressing to the teen. And the kind of stuff I am talking about, because I deeply care about every child, in every context, is that that child who gets left out or insulted, for the reasons that the chairman outlined, that child should be able to get help independent of what the content is. And I believe that's important for all children no matter what their gender or…

Sen. Peter Welch (D-VT):

Right. No, I share that. By the way, another question that's come up is about encryption, and there are real privacy benefits to maintaining encryption. So I would hope any legislation that we have wouldn't compromise the privacy rights of individuals who are on the internet.

Arturo Bejar:

I deeply believe in privacy. And in everything that I'm talking about, if a child gets a direct message that makes them uncomfortable or hurts them, it doesn't matter what the content is; it ought to be my house, my rules, right? It only matters that that child feels uncomfortable and is able to say so, which is what I asked Adam: can we please add a button when a child receives this message that says, please help me, what's going on, somebody's being really mean to me. And it doesn't matter what the content is, that child deserves help. And if somebody's initiating those messages, sending those, going into those kids' houses and telling them these things, then step number one, they should know that's not appropriate. And if they keep doing it, then other things can be brought to bear.

Sen. Peter Welch (D-VT):

Thank you. So kids first.

Arturo Bejar:

Absolutely.

Sen. Peter Welch (D-VT):

Thank you. I yield back.

Sen. Richard Blumenthal (D-CT):

Thanks, Senator Welch. Senator Cornyn.

Sen. John Cornyn (R-TX):

Thank you, Mr. Bejar, for being here, and for your courage and your testimony. I think we've met the enemy, and the enemy is us. We actually have six bills that Senator Durbin referred to that were voted out of the Judiciary Committee. But in the Senate, the only person who can actually schedule those bills for a vote is the majority leader, Senator Schumer. And so I would suggest that we focus our attention on trying to get Senator Schumer to schedule a vote on those six pieces of legislation. That would be a good start, but we can talk about it. Without that happening, nothing is going to happen in the Senate. One wise person said one time, when trying to figure out a complex topic like this, follow the money. You've mentioned the data a number of times. Do social media applications like Instagram and Facebook collect huge volumes of data about their users?

Arturo Bejar:

They do.

Sen. John Cornyn (R-TX):

And that data is then used mainly for advertising products, for example. It's amazing to me: when I go to a website and I look at something, let's say a piece of hunting gear, the next thing I know, an advertisement from that same company shows up on my Instagram. And the way that happens is that Instagram, Facebook, X, formerly known as Twitter, sells that data to companies who then use that information to promote their products. Isn't that correct?

Arturo Bejar:

I'm not an expert in that domain.

Sen. John Cornyn (R-TX):

Well, that's how they make money, right?

Arturo Bejar:

They make money through advertising. Yeah.

Sen. John Cornyn (R-TX):

Well, I was shocked to read an article here in the MIT Technology Review, which talks about how it's shockingly easy to buy sensitive data about US military personnel. Duke University did a study at the request of West Point and others and determined that for as little as 12 cents per record, data brokers would sell sensitive information on US military members and veterans. Would that surprise you?

Arturo Bejar:

Again, this is not an area where I have any expertise. I mean, I have expertise from the perspective of being a security professional and ensuring that the systems do what they're set to do, but I don't have expertise on how the data gets brokered.

Sen. John Cornyn (R-TX):

Well, I think it's pretty much common knowledge that that's the case, that this data accumulated by social media companies is then sold. And that's the reason why, when you go on Instagram or Facebook, you don't actually have to pay a subscription or a fee. They've talked about how, if they couldn't recover that revenue from selling that data about me, Chairman Blumenthal, the ranking member, Mr. Hawley, and others, or your daughter, then they would have to charge a fee in order to make this economical. But they don't do that. They can sell your data. And as shocking as what you have discovered and shared with us today about this one social media company is, the truth is this is not unique to Instagram or Facebook, correct?

Arturo Bejar:

Correct. It's the entire social media sector that serves teens.

Sen. John Cornyn (R-TX):

And here in the Congress we've talked a lot about our concern about China's increasing belligerency and militancy, and its buildup of not only its economy but its military, threatening peace in Asia and elsewhere. But we also have talked a lot about apps like TikTok, for example, Chinese applications that do much as Instagram does: vacuum up all this data and addict our children by using algorithms, or code, to figure out what to recommend to them. And again, this is all about the data and all about the money. And of course, Senator Durbin mentioned the use of social media applications when it comes to selling drugs. Fentanyl, a synthetic opioid, is the single leading cause of death for 18- to 45-year-olds in America today, and much of it is transacted through the use of social media. And then there are other scary things like deepfakes. Do you know what a deepfake is?

Arturo Bejar:

I do.

Sen. John Cornyn (R-TX):

What is it?

Arturo Bejar:

It is when you use technology to create an image that appears to be a person, but it's not an actual video or a photograph of that person.

Sen. John Cornyn (R-TX):

And I've read in the last couple of days that deepfakes are now being used to portray young girls for sexual gratification, using these deepfake false images, due to this incredible technology, which, as Senator Kennedy pointed out, could be used for a lot of good but can also be used for ill. I know our time is short here today. I just want to thank you for answering some of these questions. We have a lot of work to do here in the Senate and in the Congress, and as parents and grandparents, to try to protect our children. Thank goodness my daughters are adults now; they aren't of the age of Senator Hawley's kids or others. But the first thing we need to do, Mr. Chairman, is ask the one person who can actually schedule a floor vote on some of the bills that passed unanimously out of the Senate Judiciary Committee to schedule a vote. We could do that next week, but he's got to make it happen. Thank you.

Sen. Richard Blumenthal (D-CT):

Thanks, Senator Cornyn, and I can't speak for Senator Schumer, but I know he is vitally interested in reform in this area, and I'm sure that he will make that interest real on the floor of the Senate at the right time. Senator Butler.

Sen. Laphonza Butler (D-CA):

Thank you, Chair Blumenthal and Ranking Member Hawley. As a mom, this is a topic that I could not fail to show up and engage in, and I want to express my appreciation for your leadership, Mr. Bejar, in fighting for and leading on behalf of not just my daughter, and I know not just your own, but America's children. I appreciate very much also your comments to my colleague here, Senator Welch, specifically when you were talking about taking an all-children approach. I want to direct my comments to a space where maybe the all-children approach hasn't necessarily been taken, and I'd love to get your thoughts on some gaps that we could try to fill now. We know that the internet can be a hateful place. We've talked about that today. I understand that in your research into Meta's user experience, you looked into instances of identity-based hostilities on the platform, and you found that over a quarter of Instagram users under the age of 16 said they witnessed hostility against someone based on their race, religion, or identity.

Within the last week, one study published in the Journal of the American Academy of Child and Adolescent Psychiatry looked at the issue of online racial discrimination between March and November of 2020. It found that Black youth experienced increases in online racial discrimination that their white counterparts did not, and those instances of discrimination predicted worsened same-day and next-day mental health among Black youth. Mr. Bejar, can you talk with us a little bit about what more you think the company should be doing to protect against these kinds of racial and ethnic harassment and hostility online?

Arturo Bejar:

Sorry. The fact is that a child today, Black or of any identity, gets called out in front of the entire shared audience. Again, there is a difference between when this happens in a school and when it happens online, right? Go home and ask your child: what would you do? What can you do? And there's no way for that child to say, this is what's happening to me, somebody's being really mean to me. And I use that language because 10 years ago, Facebook knew this. We knew that in order to help a child dealing with an issue, you have to hear the words that they use. A 13-year-old does not like to report things, because they're worried they're going to get in trouble and get other people in trouble. So you tell them, would you like some help? And if you look at the work that I submitted from 10 years ago, you should be able to say, this is awful for me because of my identity, any form of that. And the company should be able to take that into account to help that child be protected, then give them resources, and then also make sure that that is not acceptable behavior in the community. Because the most tragic thing about that 20% number, witnessing these kinds of attacks, is that the lack of action on the part of the company, and the very narrow definition of the content that they would take down, means that they're normalizing that behavior. Children watch, and children learn from the way other children are behaving.

Sen. Laphonza Butler (D-CA):

And just to follow up a little bit, what would it look like to create a good experience? Is it simply the ability to exercise some agency, the button that you're making reference to?

Arturo Bejar:

It's a process. So if it's in direct messages, you have a button, and you record that somebody initiated that message. And one of the questions for the platforms is, how many hateful or harassing messages should somebody be able to send before you tap them on the shoulder and tell them that it's not appropriate behavior, right? So it creates information that you can then act on. If somebody keeps doing it, then you know that they're up to no good, and then you can take further measures. And without this data, these systems do not have a hope of making a safer environment for youth.

Sen. Laphonza Butler (D-CA):

And what do you think has been the barrier for companies? We're talking about the company that you have the most experience with. What do you think is the barrier to change, and what do you think could help to overcome that barrier?

Arturo Bejar:

I think they're just not incentivized to make this change. That's why nothing has changed. It's been two years, and our kids do not have that button in their direct messaging, where the content doesn't matter, to say, this makes me uncomfortable. But you can say it about an ad, for example; you can go into an ad and say, oh, that's sexually inappropriate, or it's not for me.

The thing about this is that until the information is transparent, and I would strongly encourage that it include identity-based harms to youth, because if it turns out that the overall number is 10%, but 80% or 90% of the youth who experience these things do so because of an identity issue, the data is there to be had if the company makes it a priority and collects it. And that is at the heart of why I am here today.

Sen. Laphonza Butler (D-CA):

Thank you so much, Mr. Bejar, again for your leadership and advocacy on behalf of America's children, Mr. Chair.

Sen. Richard Blumenthal (D-CT):

Thanks, Senator Butler. A number of our colleagues may be joining us returning in the next few minutes, but why don't we begin a second round of questions now, speaking of which Senator Coons is arriving, and I can give you a couple of minutes to go ahead, get comfortable, or you can begin right now. Senator Coons.

Sen. Chris Coons (D-DE):

Thank you very much, Mr. Chairman and Ranking Member, for convening this important and timely hearing. And Mr. Bejar, Arturo, thank you so much for sharing your own personal experience as the senior engineer responsible for the wellbeing section within this unbelievable platform. Quick surveys suggest that something like two thirds of all American teens are currently on Meta's platforms, in particular Instagram. I am very concerned about the likely impact on our children and our future, and I wanted to make sure that I had a chance to question you for just a few moments about a possible path forward. As you testified to this committee today, your own research was hidden, was ignored, was marginalized by the very team that had recruited you to return to a leadership role at Meta. Your testimony highlights the dangerous lack of transparency at social media companies and the dangerous consequences of this ongoing global experiment with our children, and it documents ways in which they are on the receiving end of both images that make them feel worse about themselves and unwanted sexual advances. Our own US Surgeon General has issued a clarion call for Congress to act, to recognize we are experiencing a crisis in mental health, in particular amongst our children, and to find ways to restrain these platforms and their impact.

A bipartisan bill, which I suspect Senator Klobuchar or Senator Blumenthal may have referred to before, called the Platform Accountability and Transparency Act, co-sponsored by Senator Cornyn, Senator Cassidy, and Senator Graham, would make critical advances in transparency and require platforms to disclose some of the public safety information that they currently hide. Can you give just two or three examples of the kinds of data, and the kinds of insights into algorithms and how they work, that would be critical for our public to know and that companies like Meta refuse to report? And do you expect that companies will ever voluntarily fully disclose what it is about their algorithms that makes these platforms addictive or even dangerous for our children?

Arturo Bejar:

Thank you for the question. I think that for as long as these companies get to make up their own definitions of what is harmful, of what is, for example, addiction... I looked into that issue when I was in the company, asking around about the understanding of it, and what I found is that there was an internal term called problematic usage, and the definition of that was so narrow that it doesn't really capture what we as parents all see. And so I think that without transparency about the harms that teens are experiencing, by their own word, without instruments that help us understand the role that social media plays in their lives, and without ensuring that, for example, there's something that, when they need help, actually helps them. This was something that we proposed, saying, let's measure our help by whether it helped, and that was not adopted. And so without these things, I don't think anything is going to change, and that's why I'm here today.

Sen. Chris Coons (D-DE):

Could you explain for us how empowering independent researchers would provide a much more balanced understanding of how safe or dangerous social media platforms really are and say something about what kinds of safety research could be done in order to facilitate a better mental health and better safety outcomes for our teenagers?

Arturo Bejar:

I can speak well to that, because that's what I and my team did for six years. Ten years ago, we brought in experts from different universities in the United States, including Yale, who understood, for example, that a 13-year-old is more liable to take risks because of where they are developmentally, and who knew that the most important thing you can do for a child that's having a distressing experience is to make sure that they feel supported at that moment. We as product engineers and designers are not qualified to give teens tools, and that's why independent research, and the data that enables it, is absolutely necessary to help our understanding of what people are experiencing online.

Sen. Chris Coons (D-DE):

Thank you. My colleague Senator Hawley said earlier that Instagram's algorithm doesn't just promote but accelerates the connections between pedophiles and our kids. For anyone who is a caring and concerned parent, for anyone who cares about our community, that should be a chilling sentence. And the fact that you dedicated years to conducting research on safety, did everything you could to get it to the attention of the leadership of the company, and are only here before us as a last-gasp attempt should motivate all of us to advance legislation that will unlink what I think is a corrosive, harmful, malign connection between algorithms and self-harm and assaults on our children. Thank you for your testimony today.

Sen. Richard Blumenthal (D-CT):

Thanks, Senator Coons. We're going to have a second round of questions, limited in length, I want to assure you, but thank you for your patience and your perseverance here today. Let me just begin with the lawsuit filed by the Commonwealth of Massachusetts yesterday, which is one of nine individual lawsuits filed around the country by states, and it is complementary to the federal lawsuit filed by 33 states in district court; Connecticut joined that lawsuit. I am going to ask that the complaint be made part of the record without objection. It says that 90% of young people in the United States use Instagram. So we're talking about millions of young people, are we not?

Arturo Bejar:

Yes, we are.

Sen. Richard Blumenthal (D-CT):

And it cites Mark Zuckerberg saying in October 2021, in response to Frances Haugen's whistleblower testimony before our committee: at the heart of these accusations is this idea that we prioritize profit over safety and wellbeing; that's just not true. He said further: it is very important to me that everything we build is safe and good for kids. Taking your admonition that actions speak louder than words, his actions certainly demonstrate the falsehood of those claims, do they not?

Arturo Bejar:

They do. And if I have a moment, there's something from that same note that I would like to bring to the committee's attention.

Sen. Richard Blumenthal (D-CT):

Sure.

Arturo Bejar:

In the same note Mark Zuckerberg wrote, but when it comes to young people's health or wellbeing, every negative experience matters. It is incredibly sad to think of a young person in a moment of distress who instead of being comforted has their experience made worse. And I believe that is what Instagram does today.

Sen. Richard Blumenthal (D-CT):

The reference was made earlier to the policies of Facebook and social media in general being data driven. In fact, they are dollar driven. Correct?

Arturo Bejar:

My experience is of an extensively data-driven culture.

Sen. Richard Blumenthal (D-CT):

Or in this case Facebook and Meta doctored the data to drive the dollars.

Arturo Bejar:

In my experience, what happened is that this data about harm should be public, right? We shouldn't need to be here to talk about it.

Sen. Richard Blumenthal (D-CT):

I was struck by the memo that you wrote to Adam Mosseri dated October 14th, which is now part of the record. You made the point, first of all, and I'm quoting, everyone in the industry has the same problems right now.

Arturo Bejar:

Correct.

Sen. Richard Blumenthal (D-CT):

You made that point to Mr. Mosseri and in effect urged Meta to be a leader. Instagram and Facebook could be a leader. And you said quote, there is a great product opportunity in figuring out the features that make a community feel safe and supportive, a great product opportunity. In fact, you were inviting them to design a better product that consumers would prefer because it was safer, correct?

Arturo Bejar:

Correct.

Sen. Richard Blumenthal (D-CT):

And the history of capitalism, I don't want to be too philosophical here, is that consumers go to products that are more efficient, more effective, but also safer as in safer cars, safer ovens, safer washing machines, safer everything. And you were appealing to the better instincts of Sheryl and Zuckerberg and the whole team, correct?

Arturo Bejar:

That is correct. I mean, Instagram is a product, like ice cream or a toy or a car. I ask you, how many kids need to get sick from a batch of ice cream, or be hurt by a car, before there's all manner of investigations? And there was an opportunity, because they're standing right next to the teen. They're the company that's delivering the unwanted sexual advance; they're the company that's delivering the content that is upsetting to them; and they're standing right there. There's the opportunity for them to be told, hey, there's something really awful happening here, will you help me? And to say, yes, I can. And then use that to make the community one that's safer.

Sen. Richard Blumenthal (D-CT):

And the Kids Online Safety Act is also about the product. It's about product design. If you have consumers give them some choices about what they want to see and hear so as to be able to disconnect the algorithms that drive something people don't want to see or hear. It's not censorship, it's not content blocking. Do you favor that approach to protecting young people and others on the internet?

Arturo Bejar:

Completely. In a world where, as in the third paragraph of my email to Mark, and in my experience from 10 years earlier, that's five years of looking at this, 90% of the content that teens experience as harassment might not be discernible by policies. The only way to address this is through the kind of measures that you're describing. It's a product. It needs to be different. It has to change.

Sen. Richard Blumenthal (D-CT):

And the Kids Online Safety Act is also about holding social media and big tech accountable when they harm people. Right now, as you've heard, they feel no sense of accountability in terms that really affect their bottom line. When Mark Zuckerberg gives his quarterly report or his discussion to analysts, would you favor that kind of accountability so that they are held responsible?

Arturo Bejar:

Absolutely. I was, again, within the wellbeing team, and I want to take a moment to say that, in my experience, the integrity and wellbeing professionals who are working on these issues firsthand are incredibly good people with wonderful ideas, and management couldn't be letting them down more. But sorry,

Sen. Richard Blumenthal (D-CT):

And go ahead. Sorry.

Arturo Bejar:

Oh, I was going to say, during that time, one of the issues that's in one of the materials is that we talk about a kind of content that we know is bad for body image issues. It has a name, thinspiration. It's being recommended. They know it is being recommended. They know teens are spending a meaningful amount of time looking at it, and they're unwilling, as a product, to address that. So without them being held to account for what they're recommending, I can't imagine that ever changing.

Sen. Richard Blumenthal (D-CT):

And another part of our Kids' Online Safety Act provides for more transparency about the algorithm so that there can be more public knowledge and also expert knowledge. Would you favor that approach?

Arturo Bejar:

Yes. I believe transparency is essential. And I will say that algorithms are as good as their inputs and can be measured by their outputs. So you can take an algorithm, and if the algorithm doesn't know that a kid experiences something as obscene, then why wouldn't it recommend it? And if you look at what it's recommending, and it's recommending obscene things, it should be held to account; the only way there is with transparency about these aspects.

Sen. Richard Blumenthal (D-CT):

And before I go to Senator Hawley for his second round of questions: you mentioned that the people who worked on your team, the people who work in these companies, to quote you, are generally good people who want to do the right thing. And I noticed in your memo to Mr. Mosseri, you said, and I'm quoting, a point which might be good for you to know, which I did not put in the document reviewed by the team, is that many employees I've spoken to who are doing this work, and are of different levels, are distraught about how the last few weeks have unfolded. These people who love FB slash IG, Facebook and Instagram, I assume, and are heart slash mission driven in their work, were distraught by the public exhibition of Facebook's knowing that it was profiting from toxic content driven at kids, and by the company in effect concealing and hiding the truth, rejecting recommendations for improvement, and rolling back safety measures. Correct?

Arturo Bejar:

Correct. They were distraught. They were afraid that, because the company was externally disavowing things like body image issues while at the same time there were studies and data that were saying otherwise, and features getting proposed that were saying otherwise, the work would be stopped, that they wouldn't get the support they needed, or they wouldn't be able to build what they needed to build. And I say that the amount of investment that this company ought to make for those people should be commensurate with that table of harms that you now have.

Sen. Richard Blumenthal (D-CT):

Thank you. Senator Hawley.

Sen. Josh Hawley (R-MO):

Thank you again, Mr. Chairman. Mr. Bejar, thank you for being here. You have been extraordinarily patient but also incredibly forthcoming in your responses, and it's been tremendously helpful. So thank you so much. I just want to come back to something that you said over and over, because you've been asked about it over and over. To quote you, in response to an earlier question, you said that changes to the product, and you were just explaining that Instagram is a product like ice cream or opioids, maybe, changes to the product would be most helpful, but there is no incentive. And by no incentive, that really just means there's no money in it for the company, right? I mean, isn't that what it gets down to? If they could make money on it, they'd do it.

Arturo Bejar:

You're going to have to ask them. I really am very excited for the day that Mark or Adam are sitting here.

Sen. Josh Hawley (R-MO):

Me too.

Arturo Bejar:

And then you can ask them: so why did you not invest? Because the things that are in each recommendation you see there, understanding what data is causing these things, the button that you can build, the systems, those are not a matter of significant investment. It would not cost them that much. It is a matter of how much they prioritize the work, and whether they're willing to set their goals based on what teens are experiencing.

Sen. Josh Hawley (R-MO):

I think that's very well said. And I would just add this: you commented earlier that it would be great to hear Mark Zuckerberg say, we made 34 billion this quarter. That was a hypothetical you threw out. And then, here I also have to report, here's the amount of harm that teenagers suffered. I'll tell you what else I'd love to hear him say: we made 34 billion this quarter, and we have 34 billion in jury judgments pending against us. That would get their attention. And I just have to say, at the end of the day, if you want to incentivize changes at these companies, you have got to allow people to sue them. You've got to open up the courtroom doors. The FTC fined Facebook, what was it, a billion dollars or something a couple of years ago? It made no discernible difference to their business practices. None. They changed nothing.

They don't care. But I'll tell you what they fear: they fear parents going into court and holding them accountable. That's the hammer. That's what happened with big tobacco. That's what happened with opioids. That's the hammer, and that's what we have got to do. And so I'll just say this. We've talked about the bills that have passed this committee. One of 'em is Senator Durbin's bill, along with me. It's our bill together on child sexual exploitation and abuse material, CSAM, exploitative material. And for my money, the best part about that bill is it contains a private right of action. So I'll just say this: it's November, I think the seventh today. Is that right? I'll make you a pledge. We're going to vote before the end of the year. Before the end of this calendar year, I will go to the floor of the United States Senate and I will demand a vote on the bills that we have passed in this committee.

And we'll just find out. We're going to put people on record, because I'm tired of waiting. I've waited four years. Many folks on this committee have waited far longer. So we're going to vote. Any senator can go to the floor and call up a piece of legislation and ask for a vote on it, and I'm going to do it before the end of this year. I'm going to do it. So we're going to find out. We have all this talk about, oh, we love it, we need to do stuff. Okay, fine. Let's do something. The other thing I'll just say is on the money. The money that is flowing into this Capitol from big tech is obscene. It's totally obscene. And if we really wanted to change something, we'd get the corporate money out of politics. We would stop these mega corporations from making political contributions. That would change things.

But either way, we're going to vote before the end of 2023 and we'll just put people on record, and we'll see where we go from there. Mr. Bejar, thank you. I hope your testimony today will really motivate people. I know it will motivate parents. I think every parent listening to this will say, you know what? That's been my experience too. And to have someone who is an engineer as you are, has your level of expertise, and has been inside the company, I think so often parents feel isolated and they feel like, maybe I just don't understand this technology. Maybe I'm the only one. And I just say, listening to you today, I think parents are going to say, I'm not the only one. My kid is not the only one. Yeah, go ahead.

Arturo Bejar:

If I may say something about that. Parents know; they see this every day. And the other thing that's been my experience in all my years doing this is that parents know how to parent. And sometimes when I've had a parent of a child that's been groomed come and talk to me about their experience, they're like, well, I don't understand this technology. The best way I've found for people to think about these things is just to take social media out of the conversation. As a parent of a young kid, you know who your kids are spending time with. You keep an eye out on that, right? You get a sense of that. You want to make it very safe for your kid to come up to you and say, hey dad, there's this thing that's happening. That's what happened with me at home. You want to make it safe for a kid to bring up an issue to you. And then when you see that these things are happening on these devices, if these things were happening at a school, and you knew that one in five kids were witnessing, or one in 10 were experiencing, unwanted sexual advances, and the kid turns to somebody in the school for help and they're like, oh, I'm sorry, I cannot help you with that. As a parent, what would you do? You would hold the school's administration to account, and that's one of the reasons that I am here today.

Sen. Richard Blumenthal (D-CT):

Thanks, Senator Hawley. I would just again make the point that the Kids Online Safety bill imposes accountability, and I want to join the pledge to seek a vote before the end of the year. I'm very hopeful we'll have not only a vote, but an overwhelmingly positive bipartisan vote in favor of the Kids Online Safety bill. And I challenge social media and big tech to come forward and put your money where your mouth is, put your actions where your rhetoric is, support this bill. For years, in fact, before our committees, they have said, oh, well, we want regulation, but just not that regulation. And that has been their mantra: trust us. No longer will kids or parents trust social media to impose the right safeguards. We want to give them the tools that these products need so the kids can take back their lives online. Senator Blackburn?

Sen. Marsha Blackburn (R-TN):

Thank you, Mr. Chairman. And thank you again for your patience today. I wish that my colleague from Vermont was still here. It was 2012 when he and I started on privacy and filed the first privacy bill in the House. And as Senator Welch was saying, we've been at this for a long time, and we've been fought by big tech every single step of the way, in every way. And it's been really quite amazing to see, because they are, and sometimes people will say, how did tech companies grow this big this fast? They didn't have the guidelines, the rules, and the restraints that the physical world has. It's kind of been the Wild West, and we've seen that in how they choose to gather data and data mine and use that to make the dollar. It's the eyeballs. They've got to keep these eyeballs on the page.

The longer they keep 'em, the more money they make. Now I want to go back to the hearing we had with Mr. Mosseri in December 2021. And for the record, I want us to build out a little bit more of this framework, because I think it's important to the states that have joined the lawsuit. I think it's important to us as we work to get the Kids Online Safety Act passed. Now, when you were with Facebook, you built a structure that would allow for some online governance, and you put in place what you thought was a pretty good process for keeping people safe online, correct?

Arturo Bejar:

That's correct.

Sen. Marsha Blackburn (R-TN):

And basically, you had embarked on safety by design, is that correct?

Arturo Bejar:

That is correct.

Sen. Marsha Blackburn (R-TN):

Okay. And you were putting in place a duty of care for the social media company to be responsive to the users that were on those platforms?

Arturo Bejar:

That is correct. As I was going through one of these materials, I remember talking about bullying and teenagers, and said that we as a company had the responsibility not only to the teens within the product, but also to improve the world's understanding of these issues so that the field could be moved forward. And that is the spirit with which we engaged the work.

Sen. Marsha Blackburn (R-TN):

And then in 2013, Facebook decided they were going to change the rules and allow kids ages 13 to 17 to post content on Instagram, correct?

Arturo Bejar:

I don't know the exact date that change happened.

Sen. Marsha Blackburn (R-TN):

Okay. I think that that is accurate. What do you think changed? What motivated them to drop that age and allow 13-year-olds?

Arturo Bejar:

I cannot speak to their motivation, but what I can say is that if you look at those 2012 and 2013 presentations, one of the things that is written about there is the fact that a 13-year-old will engage in riskier behavior and feels things more intensely, because that's where they are developmentally. And so making a change that potentially increases their audience, I think, would be inconsistent with that understanding.

Sen. Marsha Blackburn (R-TN):

I find it so interesting that whether it was Zuckerberg or Sandberg or Cox, when you highlighted with them how users were responding to the survey, you kept trying to direct this toward the experience, not the perception, but the experience. And that is noted several times in your emails to them. Even though 51% of the users may say they've had a negative experience, they chose not to address that issue. In most corporations, allowing issues like that to just slide would never be tolerated. So, as it was left, you laid out an agenda and an opportunity, items for discussion, so that you would make good use of your time. And you explicitly and specifically went through the numbers on kids that had received different negative interactions. Then you broke out the data by age and you created a chart so that he could look at it in a Google Doc.

Arturo Bejar:

Correct.

Sen. Marsha Blackburn (R-TN):

How did he respond when you broke it out by age or did he take the time to look at it?

Arturo Bejar:

It is my experience, from all the years at Meta, that an executive gets that email, reads it thoroughly, and looks at all of the attachments. And so it would be my expectation that he had read it. In my conversation with him, he demonstrated understanding of everything I spoke about, and we specifically talked about the button for a teen girl who received unwanted advances.

Sen. Marsha Blackburn (R-TN):

Okay, thank you for that. I think that what troubles me is knowing that harm was being done to kids, and then, and I quoted back to you some of his comments from the testimony that he gave to us, for him to allude to, to give the impression, that they've built tools that prevent these adverse activities. But then it's that old thing of the truth, the whole truth, and nothing but the truth. It was true, they had built tools; you built them. That was true, but they chose to remove them. And in doing that, there are hundreds of children, and we have met with their parents, and we have heard about the suicides, the attempted suicides, and the adverse impact on these children. Thank you.

Sen. Richard Blumenthal (D-CT):

Thanks, Senator Blackburn. Thank you so much for being here today. As you can tell from the turnout, there is very strong bipartisan support for reform, because actions do speak louder than words. And my hope is that colleagues will join Senator Hawley and me and Senator Blackburn and Senator Durbin and others in seeking action on a very doable, practical, politically achievable bill that targets the design of this product, much as we would a safer car, or stopping addiction to cigarettes and tobacco and nicotine. Big tech is very much in danger; I would say it is the next Big Tobacco, and I am hoping that it will join in this effort to make its product safer. In some ways, what we face here is a garden variety challenge to improve the reliability and safety of a product. It uses a black box that very few people understand, which makes it more complex and mysterious, but no less urgent, and it is ultimately understandable by everyday Americans.

Everyday Americans understand the harm that's being done. We have seen and heard it from moms and dads, from teenagers who have come to us and pleaded, absolutely implored us, to act now, not at some distant point in the future. And so by the end of the year, I'm very hopeful that we will have a vote and that it will be an overwhelmingly bipartisan vote, in part thanks to the testimony that you have offered today. It has been tremendously impactful and moving, and very powerful in its science-based persuasion. You're an engineer, as you have stated; you're not a lawyer. But ultimately, engineering is what may save Facebook from the perils and dangers that it's creating, along with other social media. It's not alone. And my hope is that we will move forward so that, in effect, we can make big tech the next big tobacco in terms of a concerted effort to reduce its harm and inform the public about how they can do so as well. So thank you for your testimony today. This hearing will be adjourned now, but the record will remain open for a week in case colleagues have any questions they want to submit in writing. And in the meantime, again, my thanks to you for your very impactful and important testimony today. The meeting is adjourned.
