
What Meta’s move to community moderation could mean for misinformation

  • Written by Denitsa Dineva, Senior Lecturer (Associate Professor) in Marketing and Strategy, Cardiff University

Meta, the parent company of Facebook, Instagram, WhatsApp and other services, has announced it will discontinue its third-party factchecking[1] programmes, starting in the US. Journalists and anti-hate speech activists have criticised the decision as an attempt to curry favour with[2] the incoming US president, Donald Trump, but there could be an even more cynical reason: the strategy could be a calculated move for greater user engagement and income.

This decision marks a significant shift in how the social media giant addresses misinformation on its platforms.

Meta’s official rationale[3] for ending its independent factchecking in favour of crowdsourced contributions[4] centres on promoting free expression. Chief executive Mark Zuckerberg said the company seeks to reduce censorship and will concentrate its enforcement efforts on illegal or highly harmful content.

This move aligns with broader discussions among governments, social media companies, civil society groups and the public on balancing freedom of expression and content moderation. These debates have become more urgent as evidence mounts that content moderation is subject to bias.

For example, a 2023 University of Cambridge study[5] discusses how biases in content moderation disadvantage the cultural, social, and economic rights of marginalised communities.

The crowdsourcing model does encourage participatory moderation. But professional factchecking can be more effective[6] at ensuring accuracy and consistency in content moderation, due to the expertise and rigorous methods of trained factcheckers or automated models.

However, social media platforms, including Meta, make their revenue[7] from user engagement. The type of content flagged as misleading or harmful often attracts more attention due to platform algorithms amplifying its reach.

A 2022 US study[8], for instance, shows that political polarisation increases truth bias[9], the human tendency to believe that people we identify with are telling the truth. This can lead to higher user engagement with disinformation, which is further amplified by algorithms that prioritise attention-grabbing content.

What might this mean for our digital information ecosystem?

1. Increased exposure to misinformation

Without professional factcheckers, the prevalence of false or misleading content will probably rise. Community-driven moderation may be inclusive and decentralised, but it has its limitations.

As shown by X’s community notes[10], the success of crowdsourced moderation[11] relies on both participation from informed users and users reaching a consensus on the notes, neither of which is guaranteed. Without independent factchecking mechanisms, users may find it increasingly difficult to distinguish credible information from misinformation.

Meta has been accused of changing its policy to get closer to Donald Trump. Camilo Concha/Shutterstock[12]

2. The burden of verification

As professional oversight diminishes, the responsibility for assessing content accuracy falls on users. But many social media users don’t have the media literacy, time, or expertise needed to evaluate complex claims. This shift risks amplifying the spread of falsehoods, particularly among audiences[13] who are less equipped to navigate the digital information landscape.

3. The risk of manipulation

Crowdsourced moderation is vulnerable to coordinated efforts by organised groups. A 2018 study[14] examined millions of messages over several months to explore how social bots and user interactions contribute to the spread of information, particularly low-credibility content. The study found that social bots played a significant role in amplifying content from unreliable sources, especially during the early stages, before an article went viral.

This evidence shows that organised groups can exploit crowdsourced moderation to amplify the narratives that suit them. Such a dynamic could undermine the credibility and objectivity of the moderation process, eroding trust in the platform. Millions of X users have already migrated to its rival Bluesky[15] for similar reasons.

4. Impact on public discourse

Unchecked misinformation can polarise communities, create distrust, and distort public debate. Governments, academics and social groups have already criticised social media platforms for their role in amplifying divisive content[16], and Meta’s decision could intensify these concerns. The quality of discussions on Facebook and Instagram may decline as misinformation spreads more freely, potentially influencing public opinion and policy-making.

There is no perfect solution to the challenges of content moderation. Meta’s emphasis on free expression resonates with longstanding debates about the role of tech companies in policing online content.

Critics of censorship[17] argue that overly aggressive moderation suppresses important discussions. By reducing its reliance on factcheckers, Meta aims to create a platform that fosters open dialogue and minimises the risk of such suppression.

However, the trade-offs are clear. Free expression without proper safeguards can enable the unchecked proliferation of harmful content, including conspiracy theories, hate speech and medical misinformation.

Achieving the right balance between protecting free speech and ensuring the integrity of information is a complex and evolving challenge. Meta’s shift from professional factchecking to crowdsourced community moderation risks undermining this balance by amplifying the spread of disinformation and hateful speech.

References

  1. ^ discontinue its third-party factchecking (theconversation.com)
  2. ^ attempt to curry favour with (www.bbc.co.uk)
  3. ^ official rationale (www.theguardian.com)
  4. ^ crowdsourced contributions (nypost.com)
  5. ^ a 2023 University of Cambridge study (www.cambridge.org)
  6. ^ more effective (academic.oup.com)
  7. ^ revenue (www.investopedia.com)
  8. ^ 2022 US study (onlinelibrary.wiley.com)
  9. ^ truth bias (www.thebehavioralscientist.com)
  10. ^ As shown by X’s community notes (www.washingtonpost.com)
  11. ^ success of crowdsourced moderation (ojs.aaai.org)
  12. ^ Camilo Concha/Shutterstock (www.shutterstock.com)
  13. ^ particularly among audiences (www.tandfonline.com)
  14. ^ 2018 study (www.nature.com)
  15. ^ X users have already migrated to its rival Bluesky (www.theguardian.com)
  16. ^ role in amplifying divisive content (www.ft.com)
  17. ^ Critics of censorship (onlinelibrary.wiley.com)

Read more https://theconversation.com/what-metas-move-to-community-moderation-could-mean-for-misinformation-247016
