Once identified in a television interview as the “woman in charge of your child’s safety on Facebook,” Antigone Davis is not yet among the company’s better-known representatives. But with concerns about social media’s effect on kids and teens renewed by Wall Street Journal reports on Instagram’s impact on teen mental health and Facebook’s efforts to market products to young children, that may be about to change.
Davis, Facebook’s Global Head of Safety, will represent the company in a hearing on Thursday, September 30th before the Senate Commerce Subcommittee on Consumer Protection, Product Safety, and Data Security. The news that Davis would represent the company came after Senator Richard Blumenthal (D-CT) demanded Facebook send a “ranking, qualified and knowledgeable representative” to the upcoming hearing.
At Thursday’s hearing, Davis will no doubt face difficult questions. The first Facebook executive to appear before Senators since the Wall Street Journal revelations, Steve Satterfield, Vice President of Privacy and Public Policy, was raked over the coals last week by Senators from both parties at a hearing that was ostensibly about privacy. Senator Ted Cruz (R-TX) used his moment on camera to ask Satterfield whether he could quantify the number of teens who had committed suicide because of Facebook’s products.
Davis joined Facebook in October 2014, according to a LinkedIn profile. A speaker’s bureau bio says that before joining the social media company, she spent a decade working for the Maryland Attorney General, where “she helped establish the office’s first online privacy and safety unit, and led the National Association of Attorneys General’s 2012-2013 presidential initiative ‘Privacy in the Digital Age.’” She earned her B.A. from Columbia University and J.D. from the University of Chicago. Her bio also mentions she received a master’s degree in education and taught in middle and high school classrooms.
The speaker’s bureau bio says she “spearheads the efforts of Facebook’s Safety Advisory Board, a team of leading safety organizations from around the world who provide Facebook with cutting edge research and advice on best practices, as well as its Global Safety Network.” Her portfolio includes addressing child exploitation, revenge porn, suicide and cyberbullying, among other problems that occur on the company’s platforms.
In talks and media appearances, she gives tips to parents on issues such as screen time and how to use parental controls. She frequently refers to her experiences as a mother. In a recent discussion with Larry Magid, the CEO of ConnectSafely, a nonprofit that Facebook supports, Davis described various informational websites Facebook has deployed around COVID-19, mental health and emotional well-being, and media literacy, as well as tools to address harassment and other ills.
Davis has served on the International Advisory Board for WePROTECT, an alliance of “governments, the private sector, civil society and international organizations” that addresses online child sexual exploitation and abuse. She has also served on the boards of the National Cybersecurity Alliance, where Facebook is a “board member company”; and the National Network to End Domestic Violence, which Facebook funds. She was previously on the board of the Family Online Safety Institute, which Facebook sponsors.
In prior media appearances, Davis is disciplined about pivoting to key messages often repeated by senior executives at the company. Davis affirms that the company takes issues such as misinformation, harassment, and hate speech “really seriously” and promises the company is taking steps to address such problems. She points to the responsibility of parents in monitoring how children and teens use Facebook’s products. And she promises the company puts “people over profits,” emphasizing it has taken decisions in the interests of its users that adversely affect its bottom line in the short term.
For instance, in 2018, Davis appeared on CBS This Morning to contest the idea that Facebook’s products are addictive.
John Dickerson (CBS):
Tell us about this question of addiction, the charges that Facebook actively tries to get people addicted.
Davis:

Well, first of all, there’s a lot of misinformation and miscommunication about the concept of addiction and how people are engaging online. We really care at Facebook, about making sure that people are engaging in a positive, safe way. You may have seen recently on Facebook, we changed the way your News Feed works to really encourage and support interactions with your family and friends and promote those meaningful conversations.
Dickerson:

But those meaningful conversations may draw me in even deeper. I mean, those are what I want to know about and therefore, wouldn’t that make me more addicted?
Davis:

Well, again, I think that we know that people are utilizing these technologies. We take this issue really seriously. We’ve invested in research to look at these issues, but we’re really trying to develop ways that our technology is meaningful and positive for people. Changing News Feed, or recently, we launched Messenger Kids. I don’t know how much you know about that, but that’s a video chat app for kids. It has no ads. It is parent controlled. It was really designed to figure out what people need and to make sure that we’re creating a positive experience.
A report from the Wall Street Journal reveals that internally, Facebook research showed that the changes to the News Feed were having the opposite effect, causing more division and motivating people and politicians towards outrage. Later in the CBS interview, Davis is asked whether Facebook has conducted research into problems on the platform.
Norah O’Donnell (CBS):
Has Facebook done its own research into tech addiction?
Davis:

Well, we’ve actually just put a lot of money investing into this research and we’ve done some research. What we do find is that people engage in different ways. I’ll give an example. Sometimes we see when people are posting about something that may distress them, that people will actually respond in a really positive way and provide support. Those are the kinds of connections that we want to ensure that we’re helping to support and create.
O’Donnell:

I’ve been looking at some of the studies, and one of them that’s been cited by the Association for Psychological Science found that nearly half of teens who spend more than five hours on electronic devices report feeling lonely or having planned or attempted suicide. Too much screen time is a bad thing. Does Facebook agree with that?
Davis:

Well, I think this is really a question for parents. Every parent knows their child best. They know how much time they think a child should spend online and how they want them to use those technologies. What we’re trying to do is really give parents control. For instance, today, for Internet Day, which is a day that marks online safety and being responsible with online technologies, we’re launching something called Parent Conversations to ensure that parents have the tools that they need. Parent Conversations give parents the latest in academic research, the latest in research from child development experts to really give them the tools that they need to make those decisions wisely.
The latest in a series of Wall Street Journal reports suggests that in 2018, the company was actively researching how to appeal to younger children, as it sought “messaging primacy with U.S. tweens, which may also lead to winning with teens,” according to internal Facebook documents. Other research inside the company has reportedly shown a connection between its products and anxiety, depression, and suicidal thoughts in teens. “The tendency to share only the best moments, a pressure to look perfect and an addictive product can send teens spiraling toward eating disorders, an unhealthy sense of their own bodies and depression, March 2020 internal research states,” according to the Wall Street Journal report on Instagram.
As in past hearings featuring Facebook executives, Senators can expect Davis to admit the company is not perfect. Not unlike Facebook CEO Mark Zuckerberg, Davis is disciplined about acknowledging the company’s “mistakes.” For instance:
- In a 2018 video interview with Sky News, Davis addressed the problem of hate speech on the platform. The interviewer showed her a post depicting a ritual murder. “Will we make mistakes? Absolutely. We’re gonna make mistakes. Are we always aiming to get it right and to do better? Absolutely,” said Davis.
- Appearing on the Indian broadcaster NDTV, Davis was asked about the use of social media for abusive purposes and to incite violence. Davis said, “First of all, we take these issues really seriously. We want people to be using our platform as a way of building community, bringing people closer together, and making sure that the information they are exchanging is, in fact, of course, in this case, you want to make sure it is accurate.” The interviewer went on to ask her about the WhatsApp lynchings. Davis said the “right policies, the right tools” will address the problem.
- Appearing on ITV, Davis was asked about the growing problem of cyberbullying. She said, “Well first of all, we have a job to do. We have to have policies in place. We need to take down bullying content and harassing content.” Later, when asked whether Facebook is investing enough in safety, she said, “We have heard from Mark himself that we are putting people over profits.”
In a recent Twitter post, Davis amplified a statement from Nick Clegg, Facebook’s Vice President of Global Affairs, calling it an “important qualifier.” In his blog post, Clegg claimed the Wall Street Journal had mischaracterized the Facebook research it obtained.
Wednesday evening, the company made the research reviewed by the Wall Street Journal available to the public (see below). The whistleblower who provided material to the Journal has been working with Senators to prepare for Thursday’s hearing, and is scheduled to appear in person at a hearing on Tuesday, October 5th.
Knowledge that her testimony will be scrutinized in that subsequent hearing may put additional pressure on Davis to choose her words carefully. Senators can expect Davis to underscore the importance of safety to the overall mission of the company.
“We used to talk about our company as a company that was designed to connect people,” Davis said in a speech at the World Anti-Bullying Forum in 2019. “We now talk about our company as a company that is designed to give people the power to build community and bring the world closer together. The reason this matters to me so much is it really reflects the goal of the company to take safety inside of our mission. It’s one thing to connect people, but if people are not connecting in positive ways, then we really haven’t brought safety into the mission.”

[Embedded document: Instagram Teen Annotated Research Deck]
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Innovation. He is an associate research scientist and adjunct professor at NYU Tandon School of Engineering. Opinions expressed here are his own.