Meta is set to introduce significant changes to its content moderation policies. According to reports, founder Mark Zuckerberg disclosed plans to phase out the company's fact-checking teams in favour of a community-driven model akin to that of X (formerly Twitter). The shift, starting in the United States, aims to emphasise free speech and minimise political bias in moderation practices.
In a video message, Zuckerberg noted that Meta's fact-checkers had become "too politically biased," undermining trust rather than fostering it. Under the new policy, content filters will target only illegal and high-severity violations, while users will be encouraged to report less critical issues.
As part of the overhaul, Meta will relocate its content moderation teams from California to Texas, a move Zuckerberg said reflects "less concern about the bias." While the strategy seeks to reduce censorship, Zuckerberg acknowledged it could result in less effective filtering of harmful content, admitting the platform would "catch less bad stuff."
The policy shift coincides with escalating political tensions in the US and marks a more restrained approach to moderation following Donald Trump’s return to the White House.
Meta also revealed leadership changes, with Joel Kaplan succeeding Nick Clegg as the company’s head of global affairs. The appointment is seen by many as a potential pivot toward conservative-leaning policies.
Meta's oversight board welcomed the decision to revisit fact-checking but emphasised the need for greater transparency and external input in the new moderation framework. The board urged Meta to incorporate user feedback to ensure the policy's effectiveness.
The changes may have international ramifications, especially in regions like Europe, where Meta faces strict content regulation. In the UK, authorities expressed concerns over the revamped approach, reminding Meta of its obligations under the Online Safety Act, which mandates the removal of illegal and harmful content, particularly involving children.