Business Daily Media


Elon Musk is right that Wikipedia is biased, but his AI alternative will be the same at best

  • Written by Taha Yasseri, Workday Professor of Technology and Society, Trinity College Dublin

Elon Musk’s artificial intelligence company, xAI, is about to launch the early beta version of Grokipedia, a new project to rival Wikipedia.

Grokipedia has been described by Musk[1] as a response to what he views as the “political and ideological bias” of Wikipedia. He has promised[2] that it will provide more accurate and context-rich information by using xAI’s chatbot, Grok,[3] to generate and verify content.

Is he right? The question of whether Wikipedia is biased has been debated since its creation in 2001.

Wikipedia’s content is written and maintained by volunteers who can only cite material that already exists in other published sources, since the platform prohibits[4] original research. This rule, which is designed to ensure that facts can be verified, means that Wikipedia’s coverage inevitably reflects the biases of the media, academia and other institutions it draws from.

This is not limited to political bias. For example, research has repeatedly shown a significant gender imbalance[5] among editors, with around 80%–90% identifying as male in the English-language version.

Because most of the secondary sources used by editors have also historically been authored by men, Wikipedia tends to reflect a narrower view of the world: a repository of men’s knowledge rather than a balanced record of human knowledge.

The volunteer problem

Bias on collaborative platforms often emerges from who participates rather than top-down policies. Voluntary participation introduces what social scientists call self-selection bias[6]: people who choose to contribute tend to share similar motivations, values and often political leanings.

Just as Wikipedia depends on such voluntary participation, so does, for example, Community Notes[7], the fact-checking feature on Musk’s X (formerly Twitter). An analysis of Community Notes[8], which I conducted with colleagues, shows that its most frequently cited external source – after X itself – is actually Wikipedia.

Other sources commonly used by note authors cluster mainly around centrist or left-leaning outlets. Note authors even use the same list of approved sources[9] as Wikipedia – the crux of Musk’s criticism of the open online encyclopedia. Yet no one calls out Musk for this bias.

Elon Musk's X profile
The problem with Community Notes … Tada Images[10]

Wikipedia at least remains one of the few large-scale platforms that openly acknowledges and documents its limitations. Neutrality is enshrined as one of its five foundational principles[11]. Bias exists, but so does an infrastructure designed to make that bias visible and correctable.

Articles often include multiple perspectives, document controversies, and even dedicate sections to conspiracy theories such as those surrounding the September 11 attacks[12]. Disagreements are visible through edit histories and talk pages, and contested claims are marked with warnings. The platform is imperfect but self-correcting, and it is built on pluralism and open debate.

Is AI unbiased?

If Wikipedia reflects the biases of its human editors and their sources, AI has the same problem with the biases of its data.

Large language models (LLMs)[13] such as those used by xAI’s Grok are trained on enormous datasets collected from the internet, including social media, books, news articles and Wikipedia itself[14]. Studies have shown that LLMs reproduce existing gender, political and racial biases[15] found in their training data.

Musk has claimed that Grok is designed to counter such distortions, but Grok itself has been accused of bias. One study[16], in which four leading LLMs were each asked 2,500 questions about politics, showed that Grok is more politically neutral than its rivals, but it still has a left-of-centre bias (the others lean further left).

Study showing the bias in LLMs
Michael D'Angelo/Promptfoo, CC BY-SA[17][18]

If the model behind Grokipedia relies on the same data and algorithms, it is difficult to see how an AI-driven encyclopedia could avoid reproducing the very biases that Musk attributes to Wikipedia.

Worse, LLMs could exacerbate the problem. They operate probabilistically, predicting the most likely next word or phrase based on statistical patterns rather than deliberation among humans. The result is what researchers call an illusion of consensus[19]: an authoritative-sounding answer that hides the uncertainty or diversity of opinions behind it.

As a result, LLMs tend to homogenise[20] political diversity and favour majority viewpoints over minority ones. Such systems risk[21] turning collective knowledge into a smooth but shallow narrative. When bias is hidden beneath polished prose, readers may no longer even recognise that alternative perspectives exist.

Baby/bathwater

Having said all that, AI can still strengthen[22] a project like Wikipedia. AI tools already help the platform to detect vandalism, suggest citations and identify inconsistencies in articles. Recent research highlights[23] how automation can improve accuracy if used transparently and under human supervision[24].

AI could also help transfer knowledge across different language editions and bring the community of editors closer together. Properly implemented, it could make Wikipedia more inclusive, efficient and responsive[25] without compromising its human-centred ethos.

Wikipedia on a laptop
How much bias can you live with? Michaelangeloop[26]

Just as Wikipedia can learn from AI, the X platform could learn from Wikipedia’s model of consensus building. Community Notes allows users to submit and rate notes on posts, but its design limits direct discussion among contributors.

Another research project[27] I was involved in showed that deliberation-based systems inspired by Wikipedia’s talk pages improve accuracy and trust among participants, even when the deliberation happens between humans and AI[28]. Encouraging dialogue rather than the current simple up-or-down voting could make Community Notes more transparent, pluralistic and resilient against political polarisation.

Profit and motivation

A deeper difference between Wikipedia and Grokipedia lies in their purpose and, perhaps, business model. Wikipedia is run by the non-profit Wikimedia Foundation[29], and the majority of its volunteers are motivated mainly by public interest[30]. In contrast, xAI, X and Grokipedia are commercial ventures.

Although profit motives are not inherently unethical, they can distort incentives. When X began selling its blue check verification[31], credibility became a commodity rather than a marker of trust. If knowledge is monetised in similar ways, the bias may increase, shaped by what generates engagement and revenue.

True progress lies not in abandoning human collaboration but in improving it. Those who perceive bias in Wikipedia, including Musk himself, could make a greater contribution by encouraging editors from diverse political, cultural and demographic backgrounds to participate – or by joining the effort personally to improve existing articles.

In an age increasingly shaped by misinformation, transparency, diversity and open debate are still our best tools for approaching truth.
References

1. Musk (theconversation.com)
2. He has promised (news.northeastern.edu)
3. xAI’s chatbot, Grok (www.geekextreme.com)
4. prohibits (en.wikipedia.org)
5. significant gender imbalance (meta.wikimedia.org)
6. self-selection bias (en.wikipedia.org)
7. Community Notes (communitynotes.x.com)
8. An analysis of Community Notes (arxiv.org)
9. list of approved sources (en.wikipedia.org)
10. Tada Images (www.shutterstock.com)
11. five foundational principles (en.wikipedia.org)
12. September 11 attacks (en.wikipedia.org)
13. Large language models (LLMs) (theconversation.com)
14. Wikipedia itself (arxiv.org)
15. gender, political and racial biases (www.nature.com)
16. One study (www.promptfoo.dev)
17. Michael D'Angelo/Promptfoo (www.promptfoo.dev)
18. CC BY-SA (creativecommons.org)
19. illusion of consensus (verfassungsblog.de)
20. homogenise (osf.io)
21. Such systems risk (verfassungsblog.de)
22. still strengthen (tahayasseri.com)
23. Recent research highlights (linkinghub.elsevier.com)
24. transparently and under human supervision (arxiv.org)
25. inclusive, efficient and responsive (www.cell.com)
26. Michaelangeloop (www.shutterstock.com)
27. research project (dl.acm.org)
28. deliberation happens between humans and AI (arxiv.org)
29. Wikimedia Foundation (wikimediafoundation.org)
30. by public interest (dl.acm.org)
31. blue check verification (elmmarketing.co.nz)

Read more https://theconversation.com/grokipedia-elon-musk-is-right-that-wikipedia-is-biased-but-his-ai-alternative-will-be-the-same-at-best-267557
