Meta has announced the shutdown of its U.S. fact-checking program, a decision the company attributes to perceived biases and errors within the system, announced in the wake of the 2024 presidential election. In its place, Meta plans to adopt a Community Notes model that lets users collaboratively flag misinformation. Critics fear the shift will amplify misinformation and further divide public discourse, and global fact-checking organizations warn it could damage the integrity of the news ecosystem.
Announcing the change, Meta cited a "cultural tipping point" following the 2024 U.S. presidential election as a key reason for the decision. The company said the existing system, which relied on third-party fact-checkers, made too many errors and contributed to perceived bias.
Meta emphasized its intention to prioritize free speech and reduce reliance on external oversight, part of a broader shift toward community-driven content moderation. The decision also takes its cue from Elon Musk's X, whose Community Notes system lets users collaboratively flag and contextualize misinformation. Critics warn, however, that this approach could heighten misinformation risks, particularly in politically charged environments.
The new Community Notes system will replace the third-party program in the U.S., with less obtrusive labels directing users to additional information. Meta will continue moderating content related to drugs, terrorism, child exploitation, fraud, and scams, but will lift restrictions on contentious topics such as immigration and gender. The shift follows Donald Trump's re-election and reflects Meta's alignment with conservative political sentiment, which has drawn both praise and criticism.
The company's trust and safety team is also relocating from California to other U.S. locations, signaling a broader restructuring of its content moderation strategy. Concerns persist, however, that the Community Notes system has had little preparation or testing, leaving it vulnerable to manipulation by foreign actors, bots, and influence campaigns.
Global fact-checking organizations have expressed alarm over the decision, citing potential financial impacts and the erosion of trust in the news ecosystem. Fact-checkers in countries including Brazil, Croatia, Italy, Nigeria, Ukraine, and the Philippines worry about losing critical revenue streams. Meta's move also comes amid political tensions, with the company reportedly seeking to mend relations with the incoming Trump administration, an effort that included a $1 million donation to Trump's inaugural fund.
Critics argue that the decision could undermine efforts to combat health-related misinformation, such as content that promotes eating disorders or endangers users' mental health, while further polarizing public discourse. As Meta adapts to its new model, the broader implications for content moderation and misinformation risks remain uncertain.