Meta ditches fact-checkers amid Trump ties, admits “We’re going to catch less bad stuff”

Published 9 Jan 2025


Meta, the company that owns Facebook and Instagram, announced Tuesday that it will stop using independent fact-checkers to review posts in the United States. Instead, the company plans to let users add notes and context to questionable content, similar to the system used on X (formerly Twitter).

Major shift in content moderation policies

Mark Zuckerberg, Meta’s CEO, revealed the change as part of a broader move to reduce content restrictions on the platform. “We’ve reached a point where it’s just too many mistakes and too much censorship,” Zuckerberg said in a video announcement. “It’s time to get back to our roots around free expression.”

The decision caught many fact-checking organizations off guard. “We heard the news just like everyone else,” said Alan Duke, editor of Lead Stories, which has worked with Meta since 2019. These organizations have helped Meta identify false claims and add warning labels to misleading posts since 2016.

The timing and nature of the changes also suggest closer ties between Meta and President-elect Donald Trump’s team. The company recently donated $1 million to Trump’s inauguration fund and last week promoted Republican policy executive Joel Kaplan to head of global affairs. Meta also added Dana White, a close Trump ally and CEO of Ultimate Fighting Championship, to its board.

When asked if Meta’s changes were a response to his previous criticisms, Trump responded, “Probably.”

“I think they’ve come a long way. Meta. Facebook,” Trump told CNN. “I watched it, the man was very impressive.”

How community notes will replace expert fact-checkers

Under the new system, users will be able to add “community notes” to posts they think need more context or correction. Meta won’t write these notes or decide which ones appear. Instead, a note will only be shown once contributors with differing viewpoints agree on it, a requirement intended to prevent bias.

The change comes with risks that Zuckerberg openly acknowledged. “The reality is this is a tradeoff,” he said. “We’re going to catch less bad stuff, but we’ll also reduce the number of innocent people’s posts and accounts that we accidentally take down.”

Safety experts worry that the new approach could spread more harmful content online. “This is a major step back for content moderation at a time when disinformation and harmful content are evolving faster than ever,” said Ross Burley, who leads the Centre for Information Resilience, a nonprofit organization.

Some fact-checkers strongly disagree with Zuckerberg’s claim that they showed political bias. “Fact-checking journalism has never censored or removed posts; it’s added information and context to controversial claims,” said Angie Drobnic Holan, director of the International Fact-Checking Network.

Meta will continue using automated systems to catch serious violations like terrorism and child exploitation. For less severe issues, the company will only review content after users report it. The company is also moving its content policy teams from California to Texas, which Zuckerberg says will help address concerns about potential bias.

The changes will roll out in the United States over the next few months. Meta plans to keep its fact-checking program in regions with stricter content regulations, such as the European Union, where the company has faced multiple fines for violating these rules.

Meta says users can sign up now through Facebook, Instagram, or Threads to be among the first contributors to the new community notes program when it launches.