Meta Drops Fact-Checking, Eases Content Moderation Rules
Meta, the parent company of Facebook, Instagram, and WhatsApp, has made significant changes to its content moderation policies. These shifts come after criticism over the company’s role in spreading misinformation, particularly around political and health-related topics.
Key Changes in Meta’s New Content Moderation Approach
Meta has introduced three primary changes, which were outlined by the company’s Chief Global Affairs Officer, Joel Kaplan, in a blog post titled “More Speech, Fewer Mistakes.”
1. Ending Third-Party Fact-Checking Program
Meta is phasing out its third-party fact-checking program in favor of a Community Notes model, similar to the one adopted by X (formerly Twitter). This change is expected to affect how misinformation is addressed and how users interact with content on the platform.
2. Focusing on Severe Violations
Meta is also easing restrictions on topics it considers part of "mainstream discourse." Enforcement efforts will now focus on severe violations, such as terrorism, child sexual exploitation, drugs, fraud, and scams, rather than on controversial but less harmful content.
3. Personalized Political Content
Meta is shifting toward a more personalized approach to political content. Users will be able to shape their feeds to show more of the political content that fits their views, a change that could narrow the range of opinions they encounter and allow people to create their own echo chambers.
The Context Behind Meta’s Shift
These changes come just ahead of a new U.S. presidential administration, which will have its own approach to free speech and political discourse. Meta’s previous content moderation practices, particularly its efforts to combat misinformation after the 2016 U.S. presidential election, drew significant criticism. Some argued that these policies suppressed legitimate political debate, while others felt they were too lenient in tackling harmful content.
Meta had introduced its fact-checking program in 2016, working with third-party organizations to reduce the spread of fake news. However, some critics argued that the fact-checking process was flawed and politically biased, leading to over-enforcement in some areas.
Meta’s Current Approach
The company now aims to reduce “over-enforcement” of its rules, acknowledging that even experts who managed the fact-checking process made errors in judgment. This resulted in the suppression of legitimate content and political discussions, which Meta now hopes to correct.
As Meta takes a more hands-off approach, critics argue that the company is opening the door to more misinformation and political bias. However, Meta maintains that the platform is built for free expression, even if it leads to some degree of messy or contentious content.
The Future of Meta’s Content Policies
Along with these policy changes, Meta is restructuring its leadership. UFC CEO Dana White has joined the board of directors, and the company's trust and safety teams will be relocated from California to Texas. These shifts reflect Meta's broader strategy to align with a changing political climate and meet the demands of its user base.
For more details on Meta’s new direction, visit Meta’s official blog post.