Meta is ending its fact-checking program in favor of a 'community notes' system similar to X's
Meta CEO Mark Zuckerberg announced a series of major changes to the company's moderation policies and practices Tuesday, citing a shifting political and social landscape and a desire to embrace free speech, El.kz reports, citing nbcnews.com.
Zuckerberg said Meta will end its fact-checking program with trusted partners and replace it with a community-driven system similar to X’s Community Notes.
The company is also changing its content moderation policies around political topics and undoing changes that reduced the amount of political content in user feeds, Zuckerberg said.
The changes will affect Facebook and Instagram, two of the largest social media platforms in the world, each boasting billions of users, as well as Threads.
"We're going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms," Zuckerberg said in a video. "More specifically, here's what we're going to do. First, we're going to get rid of fact-checkers and replace them with community notes similar to X, starting in the U.S."
Zuckerberg pointed to the election as a major influence on the company's decision and criticized "governments and legacy media" for, he alleged, pushing "to censor more and more."
"The recent elections also feel like a cultural tipping point towards, once again, prioritizing speech," he said. "So we're going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms."
He also said the systems the company had created to moderate its platforms were making too many mistakes, adding that it would continue to aggressively moderate content related to drugs, terrorism and child exploitation.
"We built a lot of complex systems to moderate content, but the problem with complex systems is they make mistakes," Zuckerberg said. "Even if they accidentally censor just 1% of posts, that's millions of people, and we've reached a point where it's just too many mistakes and too much censorship."
Beyond the end of the fact-checking program, Zuckerberg said, the company will eliminate some content policies around immigration, gender and other hot-button issues and refocus its automated moderation systems on what he called "high severity violations," relying on users to report other violations.
Facebook will also move its trust and safety and content moderation team from California to Texas.
"We're also going to tune our content filters to require much higher confidence before taking down content," he said. "The reality is that this is a trade-off. It means we're going to catch less bad stuff, but we'll also reduce the number of innocent people's posts and accounts that we accidentally take down."
Meta and social media companies broadly have in recent years reversed course on content moderation, in part because of the politicization of moderation decisions and programs. Republicans have long criticized Meta’s fact-checking system and fact-checking in general as unfair and favoring Democrats — a claim that is in dispute.
Conservatives have celebrated X’s Community Notes system, which CEO Elon Musk has used to replace the company's previous efforts around misinformation, a system that has allowed for a mixture of fact-checking, trolling and other community-driven behavior.
CEOs and business leaders across sectors are currying favor with the incoming administration of President-elect Donald Trump. Meta, along with other tech companies, donated $1 million to Trump's inaugural fund, and ahead of the election, Zuckerberg praised Trump in an interview with Bloomberg Television without offering an outright endorsement. Ahead of Trump's inauguration, Meta has reportedly appointed Republican Joel Kaplan to lead its policy team, and on Monday, it announced that UFC's Dana White, a longtime supporter of Trump’s, would join its board.