Wikipedia:Wikipedia Signpost/2024-12-12/In the media

In the media

Right-wing politician opines: "Like the BBC, often useful but not impartial"

"As biased as the BBC"

The Telegraph analyzes David Rozado's paper "Is Wikipedia Politically Biased?", which can be viewed as answering the question with a qualified "yes". The study was covered in The Signpost on July 4 by Tilman Bayer, who answered the same question with a qualified "perhaps". The Telegraph cites Bayer's article, and much of the rest of the article quotes British conservatives on why they think that Wikipedia has a liberal bias. Toby Young says:

"My own Wikipedia entry looks like it’s been written by [Left-wing commentator] Owen Jones," says Toby Young, the director of the Free Speech Union. "I used to check it from time to time and painstakingly remove all the inaccuracies that had been introduced since the last time I’d looked, but I’ve now given up."

Tory Jacob Rees-Mogg says he is "not surprised" by Rozado's results: "I have always thought of Wikipedia as if it were the BBC: often useful but not impartial."

Nigel Farage of Reform UK is quoted as saying "There’s an inherent liberal bias in all of these sites. Whether it’s old-fashioned stuff like Wikipedia or newer sites like ChatGPT, it all has a huge bias – and that’s why it’s an absolute joy that Elon Musk’s bought X to give it a bit of balance."

Not to be outdone in comparing Wikipedia to the BBC, GB News's article "'Wikipedia is just as biased as the BBC': Research finds platform associates 'more negative' words with right-wing public figures" summarizes most of the reactions from the Telegraph article, including Bayer's, and then plays fast and loose with the facts.

Wikipedia's own co-founder, Larry Sanger ... has accused the site of being taken over by left-leaning volunteers.

Last year, he told The Telegraph: "Wikipedia has, just like academia, tended to drive away people on the right," he says, "because conservatives tend to self-select out of communities that are deeply hostile to them."

Wikipedia's other co-founder, Jimmy Wales, who is chief executive of its parent company, the Wikimedia Foundation, has insisted the website is not "woke" and its [sic] "not true" to suggest it has become a standard bearer for left-wing causes.

As good as it's ever been?

Jimmy Wales, interviewed in New York magazine's "Jimmy Wales on Why Wikipedia Is Still So Good", answers a broad range of questions, including how the platform is working now, explosive topics, the two sides of AI, and why people shouldn't edit their own pages.

  • On current challenges in the information environment: "the decline of trust in journalism and politics is quite severe, which then, in some cases, translates into people feeling more angry and lost because they don't know what to trust and what to believe. That's all challenging, but the Wikipedia community, we just plug away, trying to be neutral, trying to be clear."
  • On accusations of a liberal bias: "the main thing I think about there is what I call 'community health.' What that really means is, Is the community happy? Is it doing quality work, productive work? Is it feeling supported in that work? That's obviously crucial because if you don’t have the right people, what things can descend into — and we see this all over the internet — is just a battleground."
  • On why biographies of minor celebrities haven't been updated since 2012 or 2013: "that is almost certainly the case because that was when we had the peak number of editors. It stabilized, so it didn't collapse after that, but it stabilized and is pretty steady. Again, it would require a job that runs all the time, that's scanning through Wikipedia one page at a time. And now that you can augment it with search — ChatGPT by default will often go and do a search on the web. I'm going to play with this. That’s actually a great idea, just to say, 'Here's a page that hasn't been updated and is citing sources from 2012, but there’s actually new important information in the past five years.' "
  • On COI editors getting caught editing their own articles: "if you're in the public eye, that could become a problem. But I also understand it. One reason we've never outright banned it, rather than just recommend against it, is that sometimes people see something terrible that's wrong in their entry. They click 'edit' and they take it out. They sometimes yell at people: 'Why did you say this about me?' And you're like, Well, that's not really how to be a good Wikipedian, but it's not your job to be a good Wikipedian. You're just somebody who had a bad false claim in your Wikipedia entry and you're upset. We sort of have to go, 'Yeah, actually, that's fine.' There's probably a better way, but we don't want Wikipedia to be wrong."

Institutional capture from organized editing groups and other problems noted with genocide articles, again

See prior Signpost coverage: September 4, August 14, and July 4.

"Wikipedia editors include Palestine in genocide of indigenous peoples article": A report by The Jewish Journal of Greater Los Angeles includes the editing of at least one editor who is a party to the new Israel-Palestine case opened by Arbcom, which we reported on in issue 16.

In an op-ed in Toronto's National Post by Neil Seeman and Jeff Ballabon, titled "Wikipedia has it out for Israel, and we've got the data to prove it", the authors say their data-driven analysis found

biases [that] contradict the spirit of a "wiki" — an ethos of bottom-up collaboration and respect expressed toward all its volunteer editors. These biases include: elite theory bias, that is, a preference for academic sources over grassroots knowledge; high-contributor frequency bias (disproportionate influence of frequent editors); citation gaming (strategic use of citations to push particular viewpoints); temporal bias (over-representation of recent events or perspectives); institutional capture (systematic bias from organized editing groups); language complexity bias (use of complex language to obscure bias); and source selectivity bias (selective choice of sources to support particular views).

The New York-based Algemeiner Journal has an article titled "Wikipedia's Quiet Revolution: How a Coordinated Group of Editors Reshaped the Israeli-Palestinian Narrative".

It covers the allegations that editing was coordinated by Tech for Palestine, labeling it "ideological subversion at scale", and describes how "After Rindsberg's [Pirate Wires] report was published, Ïvana was 'summoned' — in her words — by Wikipedia's Arbitration Committee and is reportedly facing a potential lifetime ban from the platform." That very Arbitration Committee action is covered here in The Signpost, both in the previously mentioned issue 16 and in this month's Arbitration report.

– B

Politics and business in the pages of Wikipedia

EU Reporter gives an encyclopedic review of "article manipulation" on Wikipedia, which we might call paid and conflict-of-interest editing by activists, corporations, political parties and governments. Their examples include the article Comparisons between Donald Trump and Fascism, Asian News International, the Warsaw death camp story, and the takeover of Croatian Wikipedia, with mentions of articles about Israel and the war in Gaza. You might even think that you are reading an article in The Signpost. Nevertheless, we'll raise a red flag: EU Reporter's business model includes a free online newspaper popular among the EU political class, as well as bespoke information gathering for other clients. We could not identify the authors of the article, which is credited to the International Foundation for Better Governance; our two requests for information went unanswered. – S

In brief

Terrible towel on the International Space Station

Wikipedia is a disaster. It purports to be an objective source of the world’s knowledge, and in reality it is a propaganda machine funded by unwitting citizen donors.
Let’s bring back the Encyclopedia Britannica.
- Bill Ackman on X

Actually, Bill, Encyclopedia Britannica has never gone away. You can access it at Britannica.com; it costs less than $75 per year. – S

WMDE plans to make Wikidata’s data easily accessible for the Open Source AI/ML Community via an advanced vector search by expanding the functionality with fully multilingual models, such as Jina AI through DataStax’s API portal, to semantically search up to 100 of the languages represented on Wikidata. To vector embed a large, massively multilingual, multicultural, and dynamic dataset is a hard challenge, especially for low-resource, low-capacity open source developers. With DataStax’s collaboration, there is a chance that the world can soon access large subsets of Wikidata’s data for their AI/ML applications through an easier-to-access method. Although only available in English for now, DataStax’s solution provided a valuable initial experiment ~10x faster than our previous, on-premise GPU solution. This near-real-time speed will permit us to experiment at scale and speed by testing the integration of large subsets in a vector database aligned with the frequent updates of Wikidata.
- Dr. Jonathan Fraine, Chief Technology Officer, Wikimedia Deutschland.

An earlier presentation gives further details.
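For readers wondering what "semantically searching" Wikidata means in practice, here is a minimal illustrative sketch, not WMDE's actual pipeline: it embeds a few toy item descriptions with an off-the-shelf multilingual model and ranks them by cosine similarity against a query in a different language. The model name and the tiny in-memory "index" are assumptions for demonstration only; the project described above uses Jina AI embeddings served through DataStax's vector database.

# Illustrative sketch only; requires: pip install sentence-transformers numpy
import numpy as np
from sentence_transformers import SentenceTransformer

# A generic multilingual model stands in for the production embedding service (assumption).
model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

# Toy stand-ins for Wikidata items: QID -> label/description in various languages.
items = {
    "Q42":   "Douglas Adams, English writer and humorist",
    "Q64":   "Berlin, Hauptstadt und Land der Bundesrepublik Deutschland",
    "Q90":   "Paris, capitale de la France",
    "Q2013": "Wikidata, free knowledge base hosted by the Wikimedia Foundation",
}

qids = list(items)
# Pre-compute normalized embeddings so a dot product equals cosine similarity.
embeddings = model.encode(list(items.values()), normalize_embeddings=True)

def search(query: str, top_k: int = 3):
    """Return the top_k items ranked by cosine similarity to the query."""
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = embeddings @ q
    best = np.argsort(-scores)[:top_k]
    return [(qids[i], items[qids[i]], float(scores[i])) for i in best]

# Cross-lingual query: German text should still surface the French-described Paris item.
for qid, text, score in search("Stadt in Frankreich"):
    print(f"{qid}\t{score:.3f}\t{text}")

In the real system, the in-memory array above would be replaced by a vector database that is kept in sync with Wikidata's frequent updates, which is the part of the problem the quote describes as hard at scale.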



Do you want to contribute to "In the media" by writing a story or even just an "in brief" item? Edit next week's edition in the Newsroom or leave a tip on the suggestions page.