Transcript: Mark Zuckerberg Announces Major Changes to Meta's Content Moderation Policies and Operations

Justin Hendrix / Jan 7, 2025

Meta founder and CEO Mark Zuckerberg.

In a video posted to Facebook and a text post on Threads today, Meta founder and CEO Mark Zuckerberg announced sweeping changes to the company's approach to content moderation. After years of developing 'trust and safety' and content moderation systems and policies, Zuckerberg now asserts that "we've reached a point where it's just too many mistakes and too much censorship." Zuckerberg has decided to return to a posture that he believes will prioritize free expression.

Specific policy changes announced include:

  • Eliminating fact-checkers in the US and replacing them with a "community notes" system similar to X (formerly Twitter);
  • "Simplifying" content policies by removing certain restrictions on topics like immigration and gender;
  • Changing the enforcement approach for policy violations:
    • Focusing automated filters only on illegal and high-severity violations;
    • Requiring user reports before taking action on lower-severity violations;
    • Increasing the confidence threshold required before removing content;
  • Reintroducing civic and political content into recommendation systems on Facebook, Instagram, and Threads;
  • Relocating trust and safety and content moderation teams from California to Texas. "This will help remove the concern that biased employees are overly censoring content," Zuckerberg wrote on Threads.

Zuckerberg says he is planning to work with President-elect Donald Trump to oppose global "censorship" pressures. He makes claims about such pressures mounting in multiple regions, including:

  • Europe's "ever-increasing number of laws, institutionalizing censorship";
  • Secret courts in Latin America "that can order companies to quietly take things down";
  • Chinese censorship.

He further claims that the only way Meta "can push back on this global trend is with the support of the US government, and that's why it's been so difficult over the past four years when even the US government has pushed for censorship."

Zuckerberg's statement was timed with the release of a post by Meta Chief Global Affairs Officer Joel Kaplan:

We want to undo the mission creep that has made our rules too restrictive and too prone to over-enforcement. We’re getting rid of a number of restrictions on topics like immigration, gender identity and gender that are the subject of frequent political discourse and debate. It’s not right that things can be said on TV or the floor of Congress, but not on our platforms.

Notably, Kaplan says the company is now deploying "AI large language models (LLMs) to provide a second opinion on some content before we take enforcement actions."

Timed with the announcement, Kaplan also provided an "exclusive" interview with the Fox News morning program "Fox & Friends."

Below is a transcript of Zuckerberg's remarks, followed by a transcript of the Kaplan appearance on Fox & Friends:

Mark Zuckerberg:

Hey everyone. I want to talk about something important today because it's time to get back to our roots around free expression on Facebook and Instagram. I started building social media to give people a voice. I gave a speech at Georgetown five years ago about the importance of protecting free expression, and I still believe this today, but a lot has happened over the last several years.

There's been widespread debate about the potential harms from online content. Governments and legacy media have pushed to censor more and more. A lot of this is clearly political, but there's also a lot of legitimately bad stuff out there. Drugs, terrorism, child exploitation. These are things that we take very seriously, and I want to make sure that we handle them responsibly. So we built a lot of complex systems to moderate content, but the problem with complex systems is they make mistakes, even if they accidentally censor just 1% of posts.

That's millions of people, and we've reached a point where it's just too many mistakes and too much censorship. The recent elections also feel like a cultural tipping point towards, once again, prioritizing speech. So, we're going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms. More specifically, here's what we're going to do.

First, we're going to get rid of fact-checkers and replace them with community notes similar to X starting in the US. After Trump first got elected in 2016, the legacy media wrote nonstop about how misinformation was a threat to democracy. We tried in good faith to address those concerns without becoming the arbiters of truth, but the fact-checkers have just been too politically biased and have destroyed more trust than they've created, especially in the US. So, over the next couple of months, we're going to phase in a more comprehensive community notes system.

Second, we're going to simplify our content policies and get rid of a bunch of restrictions on topics like immigration and gender that are just out of touch with mainstream discourse. What started as a movement to be more inclusive has increasingly been used to shut down opinions and shut out people with different ideas, and it's gone too far. So, I want to make sure that people can share their beliefs and experiences on our platforms.

Third, we're changing how we enforce our policies to reduce the mistakes that account for the vast majority of censorship on our platforms. We used to have filters that scanned for any policy violation. Now, we're going to focus those filters on tackling illegal and high-severity violations, and for lower-severity violations, we're going to rely on someone reporting an issue before we take action. The problem is that the filters make mistakes, and they take down a lot of content that they shouldn't. So, by dialing them back, we're going to dramatically reduce the amount of censorship on our platforms. We're also going to tune our content filters to require much higher confidence before taking down content. The reality is that this is a trade-off. It means we're going to catch less bad stuff, but we'll also reduce the number of innocent people's posts and accounts that we accidentally take down.

Fourth, we're bringing back civic content. For a while, the community asked to see less politics because it was making people stressed, so we stopped recommending these posts, but it feels like we're in a new era now, and we're starting to get feedback that people want to see this content again. So we're going to start phasing this back into Facebook, Instagram, and Threads while working to keep the communities friendly and positive. Fifth, we're going to move our trust and safety and content moderation teams out of California, and our US-based content review is going to be based in Texas. As we work to promote free expression, I think that will help us build trust to do this work in places where there is less concern about the bias of our teams.

Finally, we're going to work with President Trump to push back on governments around the world. They're going after American companies and pushing to censor more. The US has the strongest constitutional protections for free expression in the world. Europe has an ever-increasing number of laws, institutionalizing censorship, and making it difficult to build anything innovative there. Latin American countries have secret courts that can order companies to quietly take things down. China has censored our apps from even working in the country. The only way that we can push back on this global trend is with the support of the US government, and that's why it's been so difficult over the past four years when even the US government has pushed for censorship.

By going after us and other American companies, it has emboldened other governments to go even further. But now we have the opportunity to restore free expression, and I'm excited to take it. It'll take time to get this right, and these are complex systems. They're never going to be perfect. There's also a lot of illegal stuff that we still need to work very hard to remove. But the bottom line is that after years of having our content moderation work focused primarily on removing content, it is time to focus on reducing mistakes, simplifying our systems, and getting back to our roots about giving people voice. I'm looking forward to this next chapter. Stay good out there, and more to come soon.

Transcript of Joel Kaplan's appearance on "Fox & Friends," January 7, 2025.

Lawrence Jones:

All right, buckle up. We got a Fox News alert moments ago. Meta, the parent company of Facebook and Instagram, announcing they are changing their policies to focus on free speech. CEO Mark Zuckerberg saying this.

Mark Zuckerberg:

We're going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms. More specifically, we're going to get rid of fact-checkers and replace them with community notes similar to X starting in the US.

Ainsley Earhardt:

Wow, this is a big deal. Changes include ending its third-party fact-checking program and lifting restrictions on topics such as immigration and gender identity.

Brian Kilmeade:

Well, is there anything more important to talk about? Also, until now, Meta used automated systems to scan for violations, but found they resulted in too much censorship. So from now on, they'll focus on tackling illegal and severe violations, like terrorism and fraud, to name a few.

Steve Doocy:

And joining us right now for a Fox & Friends exclusive, Meta's Chief Global Affairs Officer, Joel Kaplan, joins us live here on the couch. Joel, welcome.

Joel Kaplan:

Thanks. Great to be here on the couch on Fox and Friends.

Steve Doocy:

It's great to have you here. Okay, so you are announcing these big changes as of two minutes ago. If you're making big changes, does that mean you were doing something wrong before?

Joel Kaplan:

Look, this is a great opportunity for us to reset the balance in favor of free expression. As Mark says in that video, what we're doing today is we're getting back to our roots in free expression. There's a number of changes we're making, but if I could, I'd just highlight three. First, as you heard, we're eliminating the third-party fact-checking system. Well-intentioned at the outset, but there's just been too much political bias in what they choose to fact-check and how, so we're just scrapping it entirely.

Brian Kilmeade:

You say "they," what do you mean? You mean "you."

Joel Kaplan:

The fact-checkers.

Brian Kilmeade:

You set them up, but they're your fact-checkers, right?

Joel Kaplan:

So the idea was they're independent, but they've just been too biased. And so what we're going to do instead is adopt a system like X has of community notes. So we're just going to rely on our own community of users to provide people more information about what they're seeing and we think that's going to work great.

Ainsley Earhardt:

So give us an example. If the headline is "Donald Trump won the presidency," then anyone in the community can write notes underneath it?

Joel Kaplan:

Yeah, great question. So somebody can write a note and then the way it works is different people on the platform can sort of vote on that note. And if you get people who usually disagree who all say, yeah, that sounds right, then that note gets put on the post and people see it. X has been doing it for a while, we think it's working really well, and we're going to adopt that system.

Brian Kilmeade:

And what are the other ones? I know you want to go through them.

Joel Kaplan:

Thank you. So the second one is about the rules that govern content on our platform. They've just become too restrictive over time about what people can say, including about those kind of sensitive topics that you mentioned that people want to discuss and debate. Immigration, trans issues, gender. We want to make it so that, bottom line, if you can say it on TV or on the floor of Congress, you certainly ought to be able to say it on Facebook and Instagram without fear of censorship. So we're changing those rules. And then the last thing, if I could, we're also going to change how we enforce the rules, and we're going to make it so that there's way less over-enforcement and way fewer mistakes, which actually make up the vast majority of censorship that people experience on our platforms. So big changes, all in service of getting us back to our values of free expression.

Lawrence Jones:

So Joel, I guess the big question is where are the changes coming from? Because, I mean, it's hard not to notice there's been a change in Mark Zuckerberg. You've seen him change, respectfully, from the nerdy kid over to the jiu-jitsu. He's put on some lean muscle. His viewpoints have changed, his posting content. When he does speak, he's meeting with Trump, taking a different stance on certain things. Are these changes coming from him or from what the public is saying?

Joel Kaplan:

Well, there's no question that the things that happen at Meta are coming from Mark, but there's also no question that there has been a change over the last four years. We saw a lot of societal and political pressure, all in the direction of more content moderation, more censorship, and we've got a real opportunity now. We've got a new administration and a new president coming in who are big defenders of free expression, and that makes a difference. One of the things we've experienced is that when you have a US president and administration that's pushing for censorship, it just makes it open season for other governments around the world that don't even have the protections of the First Amendment to really put pressure on US companies. We're going to work with President Trump to push back on that kind of thing around the world.

Brian Kilmeade:

So Joel, why now? I know you said it's a change, but did you feel restricted over the last four years, and is this the first time you've felt it's in your business interests to express this quest for freedom?

Joel Kaplan:

Well, there's no question that there's an opportunity here with a new president taking office, as I said, who really believes in free expression. And that's just going to give us the space to get back to those values that Mark has talked about for a long time. Mark gave a big speech six years ago at Georgetown about free expression, about these values. Unfortunately, there's been a lot of political and societal pressure here and around the world that has pushed away from those values. We've got a real opportunity to reset, get back to them, and really provide that space.

Ainsley Earhardt:

How was the Biden administration putting pressure on you specifically? Were they calling when they didn't like a post? What did they say, take it down?

Joel Kaplan:

Yeah. So Mark talked about this in a letter he sent to the House Judiciary Committee a few months ago that sort of outlined the way in which we got a lot of pressure around Covid in particular to take down more content, even things like humor and satire about the pandemic and about vaccines. So we did experience that kind of pressure. The decisions we made ultimately were our own. But there's a real opportunity here, with President Trump coming into office with his commitment to free expression, for us to get back to those values and really provide space for people to have the kind of discourse and debate they want to have.

Brian Kilmeade:

Would you do what X did? And that is, unleash independent journalists like Matt Taibbi and Bari Weiss and Michael Shellenberger to forensically go through what you guys were actually repressing, shielding, shadow banning over the last few years. Wouldn't that be part of this cleansing process?

Joel Kaplan:

So we've had an opportunity to do a lot of that with the House Judiciary Committee and Chairman Jordan over the last few months and years. And he put out a really thorough report on the experience that we and other companies have had. Honestly, we're pretty focused on the future and the opportunity we have right now with the changes we're making today to really open up the space for debate and expression on our platform.

Lawrence Jones:

Is the idea to get out of politics totally and focus on just the enterprise of the free market, people debating things? Or is it for Mark Zuckerberg to become Elon Musk and be influential as well, work with the next administration? Or a little bit of both? I don't know.

Joel Kaplan:

Well, Mark's always going to be Mark Zuckerberg, but I think there is a real opportunity to work with the new administration, both on free expression, but also on American leadership in technology and innovation. This is something we've seen President Trump is really focused on, making sure that we maintain US competitiveness and leadership on technology, things like AI. Those are obviously super important to our company and our industry, and we'll look forward to working with the new administration to advance those goals.

Steve Doocy:

And Joel, when you talk about content moderation, I think a lot of people watching right now are going, well, it sounds better if they're not taking instructions from the federal government, the Joe Biden administration saying, hey, this is wrong, take that down. So people like the idea that they're going to have a say, where you can add the community notes going forward. But ultimately, is this one of the things you are doing as a company to make sure that Meta and the other social media companies aren't regulated by Washington? That's the last thing you guys want.

Joel Kaplan:

Yeah. I think when it comes to expression, especially in the United States, we've got a long tradition of the First Amendment, and we think that the right thing is for individual users to have the ability to decide themselves what to say. Look, there's bad stuff that people do online like they do in the real world, and we want to make sure we don't have that. So we're still going to be enforcing against terrorism and drug sales and child sexual exploitation, things everybody agrees have no business being on our platform.

Ainsley Earhardt:

What about the bullying of all the young little girls? I remember, Mark, remember he was on the Hill, and he was kind enough to turn around and apologize to the families, but all those families have lost their kids from bullying. I have a nine-year-old, and I'm starting to get concerned because eventually, when she's in high school or even after, she'll be faced with these situations. So what do you say to those families?

Joel Kaplan:

Yeah, well, I'm a parent of two teenagers.

Ainsley Earhardt:

Gosh, you're in it.

Joel Kaplan:

Yeah, I'm in it. So these are issues that I take really seriously, and that a lot of my colleagues who are also parents at Meta, and those who aren't parents, take really seriously. We've got to make sure teens are protected online. We launched a few months ago a product called Teen Accounts that really puts parents in the driver's seat and makes it so that they can see what their kids can see and do online, who can contact them. We put kids in sort of built-in default settings, the most protective settings we have. And then if you're under 16, you can't change those settings on Instagram without your parents' permission. We think that's a really good innovation and something we're hopeful about. It's an area where Congress can act.

Ainsley Earhardt:

How do we do that setting? How do the parents find that setting on the phone before they give it to their child?

Joel Kaplan:

Yeah, so we will send notifications basically to the parents to let them know that this opportunity exists, and the kid can't sign up for Facebook, or excuse me, for Instagram without being put into a teen account. And then their parent gets a notice that says, hey, your kid's trying to sign up, and if you want to change any of the settings, you can do it here.

Brian Kilmeade:

So Joel, what you got to do is reestablish trust. I mean, people remember the Zuckerbucks that went into those key areas in 2020 that were deciding what candidate was going to win what state. And one of the ways you are trying to do that is, number one, making this announcement here. The other thing is changing your board, to let people know, maybe by the people that you're hiring, that you asked to join, that their agenda doesn't match maybe the 2020 agenda. What have you done?

Joel Kaplan:

Well, we made a big announcement yesterday. We put three new board members on, great board members, including Dana White, the president and CEO of UFC. The guy's a legend, an incredible businessman, incredible entrepreneur, marketing genius, and we're super excited to have him. A couple other great board members with real deep business experience that we put on the board. So those are important voices. Mark has always been great about getting different perspectives, so that we sort of cover the waterfront of ideas and perspectives that we want to make sure are getting fed into the process.

Lawrence Jones:

Joel, can you tell the audience and the world that this is not a temporary stance? And what I mean by that is this just because you got Donald Trump in there for four years, are you guys going to continue this after his presidency or is this just a temporary thing for the moment?

Joel Kaplan:

This is an opportunity for us to get back to the values that Mark founded the company on. This runs deep for Mark. It's a great opportunity to rebalance and to get back to free expression.

Steve Doocy:

When you say rebalance: historically, the big tech companies and social media companies have leaned to the left. Is this dragging it closer to the center? I mean, with Dana White, obviously a Trump ally, as part of the board, is the company becoming more centrist rather than leaning to one side or the other?

Joel Kaplan:

I think the way to look at it is we are a platform for all people and all ideas.

Steve Doocy:

I know, but some people don't feel that certain and haven't felt historically that certain social media companies were hospitable to their points of view.

Joel Kaplan:

No question. And that's what I meant by rebalance. We want to make sure that they understand that their views are welcome and that we're providing a space for them to come onto our platforms, express themselves, engage in the important issues of the day, or not, and just whatever it is they want to talk about and share. And so we need to make sure they feel welcome. We need to make sure we rebuild trust, and that's a big part of what we're doing today.

Brian Kilmeade:

So also, there's something else that could be in front of the Supreme Court, and that's the future of TikTok, which is China-controlled. So you're an American company, and you decide to make this announcement. How does this relate to what TikTok could possibly do?

Joel Kaplan:

So look, we like competition. We think it makes us better. I know there are concerns that Congress has expressed, and that the President and Congress and the courts are working through with TikTok. We're going to stay focused on what we're doing and how we make our platform as open and welcoming a place as possible for all points of view.

Brian Kilmeade:

Do you think it should be banned?

Joel Kaplan:

Well, that's something I'm going to leave to the president and the courts and Congress. We're focused, again, we're focused on what we can do to make our platforms the best they can be.

Brian Kilmeade:

If Elon Musk didn't buy X and expose what he exposed and put community notes in like that, do you think we'd be here? What did he, in buying this and doing what he did, do for the introspection that Facebook is showing right now?

Joel Kaplan:

I think Elon's played an incredibly important role in moving the debate and getting people refocused on free expression. And that's been really constructive and productive, and we're just glad that we've got the opportunity now to make these kind of changes and to get back to our roots in free expression.

Ainsley Earhardt:

Maybe all social media companies can learn a lot from this, because y'all were the original. Mark was one of the originals, at least. And so y'all have had to see all the changes, and what works and what doesn't work and what Americans really want. So thank you for your honesty. Thank you for doing this for children and all your customers.

Brian Kilmeade:

Australia has banned social media up until you're 16 years old. How do you feel about that?

Joel Kaplan:

We think the right way to do it is to trust parents. That's what we're doing with the teen accounts that I mentioned. And obviously we've got to work with governments and follow the laws that they pass, but I think there's a better way, and that's to put parents in charge. They know what's best for their teens and their kids, and that's what we're trying to do.

Ainsley Earhardt:

What's your biggest customer? What age group?

Joel Kaplan:

Oh, that's a good question. I mean, we've got 3.2 billion people using our services, so we really have people across all ages, and we want to make sure we're serving them all and letting them express themselves in a way where, especially with teens, they're still kept safe.

Brian Kilmeade:

Did you talk to David Sacks about this? Who's going to be coming in as the AI czar?

Joel Kaplan:

Yeah, we've talked to David. We're really excited about that post. We think he's going to do great things. He really understands the industry. He really understands crypto and AI, and we can't wait to start working with him.

Steve Doocy:

Well, Joel, thank you very much for stopping by the couch and making a big announcement today.

Lawrence Jones:

Hopefully you'll come back as well.

Joel Kaplan:

Absolutely. Thank you all for having me.

Brian Kilmeade:

A fair and balanced social media world. There you go. Thank you so much.

Ainsley Earhardt:

Thank you. Nice to meet you.
