In a significant shift in content moderation policy, Meta CEO Mark Zuckerberg announced on Tuesday that Facebook will replace its fact-checking program with a community-driven system called "Community Notes". The change, which will initially roll out in the United States, marks a departure from the company's long-standing collaboration with third-party fact-checkers and signals a return to what Zuckerberg describes as the platform's "roots around free expression".
The new approach will involve:
- Elimination of third-party fact-checkers
- Introduction of Community Notes, similar to the system used by X (formerly Twitter)
- Simplification of content policies
- Increased focus on reducing moderation mistakes
The Community Notes feature will allow users to:
- Call out potentially misleading posts
- Add context to content that may require additional information
- Rate the helpfulness of notes provided by other users
This system aims to empower the community to decide when posts need more context and what kind of information is helpful for other users to see.
Zuckerberg cited several reasons for this significant policy shift:
- Addressing concerns about political bias in fact-checking
- Reducing over-enforcement and censorship
- Restoring free expression on Meta's platforms
Joel Kaplan, Meta's chief global affairs officer, said the third-party fact-checkers had shown "too much political bias in what they choose to fact-check".
The decision has sparked mixed reactions:
- Supporters view it as a step towards greater free speech on the platform
- Critics worry about the potential spread of misinformation
Meta executives have suggested that this change aligns with a perceived cultural shift towards valuing free speech, particularly in light of recent political developments.
The new Community Notes system will be rolled out over the next few months, starting in the United States.
This policy change comes amid a series of modifications to Meta's content moderation strategies:
- Lifting restrictions on certain topics like immigration and gender
- Relocating content moderation teams
- Adjusting automated systems to focus on "high-severity" violations
These changes reflect a significant shift in Meta's approach to content moderation and free speech on its platforms, potentially reshaping the landscape of social media discourse in the coming years.