Wikipedia:Wikipedia Signpost/2024-12-12/In the media
Right-wing politician opines: "Like the BBC, often useful but not impartial"
"As biased as the BBC"
The Telegraph analyzes David Rozado's paper "Is Wikipedia Politically Biased?", which can be viewed as answering the question with a qualified "yes". The study was covered in The Signpost on July 4 by Tilman Bayer, who answered the same question with a qualified "perhaps". The Telegraph cites Bayer's article, and much of the rest of the piece quotes British conservatives on why they think that Wikipedia has a liberal bias:
"My own Wikipedia entry looks like it’s been written by [Left-wing commentator] Owen Jones," says Toby Young, the director of the Free Speech Union. "I used to check it from time to time and painstakingly remove all the inaccuracies that had been introduced since the last time I’d looked, but I’ve now given up."
Tory politician Jacob Rees-Mogg says he is "not surprised" by Rozado's results: "I have always thought of Wikipedia as if it were the BBC: often useful but not impartial."
Nigel Farage of Reform UK is quoted saying "There’s an inherent liberal bias in all of these sites. Whether it’s old-fashioned stuff like Wikipedia or newer sites like ChatGPT, it all has a huge bias – and that’s why it’s an absolute joy that Elon Musk’s bought X to give it a bit of balance."
Not to be outdone in comparing Wikipedia to the BBC, GB News's article "'Wikipedia is just as biased as the BBC': Research finds platform associates 'more negative' words with right-wing public figures" summarizes most of the reactions from the Telegraph article, including Bayer's, and then plays fast and loose with the facts:
Wikipedia's own co-founder, Larry Sanger ... has accused the site of being taken over by left-leaning volunteers.
Last year, he told The Telegraph: "Wikipedia has, just like academia, tended to drive away people on the right," he says, "because conservatives tend to self-select out of communities that are deeply hostile to them."
Wikipedia's other co-founder, Jimmy Wales, who is chief executive of its parent company, the Wikimedia Foundation, has insisted the website is not "woke" and its [sic] "not true" to suggest it has become a standard bearer for left-wing causes.
As good as it's ever been?
Jimmy Wales, interviewed in New York magazine's "Jimmy Wales on Why Wikipedia Is Still So Good", answers a broad range of questions, including how the platform is working now, how it handles explosive topics, the two sides of AI, and why people shouldn't edit their own pages.
- On current challenges in the information environment: "the decline of trust in journalism and politics is quite severe, which then, in some cases, translates into people feeling more angry and lost because they don't know what to trust and what to believe. That's all challenging, but the Wikipedia community, we just plug away, trying to be neutral, trying to be clear."
- On accusations of a liberal bias: "the main thing I think about there is what I call 'community health.' What that really means is, Is the community happy? Is it doing quality work, productive work? Is it feeling supported in that work? That's obviously crucial because if you don’t have the right people, what things can descend into — and we see this all over the internet — is just a battleground."
- On why biographies of minor celebrities haven't been updated since 2012 or 2013: "that is almost certainly the case because that was when we had the peak number of editors. It stabilized, so it didn't collapse after that, but it stabilized and is pretty steady. Again, it would require a job that runs all the time, that's scanning through Wikipedia one page at a time. And now that you can augment it with search — ChatGPT by default will often go and do a search on the web. I'm going to play with this. That’s actually a great idea, just to say, 'Here's a page that hasn't been updated and is citing sources from 2012, but there’s actually new important information in the past five years.' "
- On COI editors getting caught editing their own articles: "if you're in the public eye, that could become a problem. But I also understand it. One reason we've never outright banned it, rather than just recommend against it, is that sometimes people see something terrible that's wrong in their entry. They click 'edit' and they take it out. They sometimes yell at people: 'Why did you say this about me?' And you're like, Well, that's not really how to be a good Wikipedian, but it's not your job to be a good Wikipedian. You're just somebody who had a bad false claim in your Wikipedia entry and you're upset. We sort of have to go, 'Yeah, actually, that's fine.' There's probably a better way, but we don't want Wikipedia to be wrong."
Institutional capture from organized editing groups and other problems noted with genocide articles, again
- See prior Signpost coverage September 4, August 14, and July 4
"Wikipedia editors include Palestine in genocide of indigenous peoples article": A report by The Jewish Journal of Greater Los Angeles includes the editing of at least one editor who is a party to the new Israel-Palestine case opened by Arbcom, which we reported on in issue 16.
In an op-ed in the Toronto-based National Post by Neil Seeman and Jeff Ballabon, titled "Wikipedia has it out for Israel, and we've got the data to prove it", the authors say their data-driven analysis found
biases [that] contradict the spirit of a "wiki" — an ethos of bottom-up collaboration and respect expressed toward all its volunteer editors. These biases include: elite theory bias, that is, a preference for academic sources over grassroots knowledge; high-contributor frequency bias (disproportionate influence of frequent editors); citation gaming (strategic use of citations to push particular viewpoints); temporal bias (over-representation of recent events or perspectives); institutional capture systematic bias (from organized editing groups); language complexity bias (use of complex language to obscure bias); and source selectivity bias (selective choice of sources to support particular views).
The New York-based Algemeiner Journal has an article titled "Wikipedia's Quiet Revolution: How a Coordinated Group of Editors Reshaped the Israeli-Palestinian Narrative".
It covers the allegations of editing being coordinated by Tech for Palestine, labeling it "ideological subversion at scale", and describes how "After Rindsberg's [Pirate Wires] report was published, Ïvana was 'summoned' — in her words — by Wikipedia's Arbitration Committee and is reportedly facing a potential lifetime ban from the platform." That very Arbitration Committee action is covered here in The Signpost, both in the previously mentioned issue 16 and in this month's Arbitration report.
– B
Politics and business in the pages of Wikipedia
EU Reporter gives an encyclopedic review of "article manipulation" on Wikipedia, which we might call paid and conflict-of-interest editing by activists, corporations, political parties and governments. Their examples include the article Comparisons between Donald Trump and Fascism, Asian News International, the Warsaw death camp story, and the takeover of Croatian Wikipedia, with mentions of articles about Israel and the war in Gaza. You might even think that you are reading an article in The Signpost. Nevertheless, we'll raise a red flag. EU Reporter's business model includes a free online newspaper popular among the EU political class, as well as bespoke information gathering for other clients. We could not identify the authors behind the article's byline, the International Foundation for Better Governance: our two requests for information went unanswered. – S
In brief
- So terrible, yet so good: Alt-weekly Pittsburgh City Paper finds the article Terrible Towel terrific. See Yinzer for a possible explanation.
- Questions about truth, authority, and shared responsibility: How Creative Commons licenses can transform Rwandan storytelling
- Art models life training art, imitating life: Infinite Craft, a sandbox game, allows the creation of an in-game "Wikipedia" (Game Rant [1]). In an odd bit of life-imitating-art recursion, the game is based on Meta's LLaMA large language model, and LLaMA was in turn trained on Wikipedia.
- The Wikimedia Foundation's crucial spot on the frontlines of the disinformation war: PEN America interviews Maggie Dennis, vice president of community resilience and sustainability at the Wikimedia Foundation.
- WikiEd–Mellon collab: "Look It Up: Humanities Students are Filling Wikipedia’s Content Gaps", Mellon Foundation
- An Indian voice: Comedian meghnerd lets it all hang out.
- The US politics-driven vandalism is just beginning: A vandalism edit that lived for all of one minute was reported on by Live Mint: [2]. Bill Ackman, who had trouble with the meaning of "plagiarism", took the one-minute example of potty-talk as enough evidence to post on X:
Wikipedia is a disaster. It purports to be an objective source of the world’s knowledge, and in reality it is a propaganda machine funded by unwitting citizen donors.
Let’s bring back the Encyclopedia Britannica.
- Bill Ackman on X
- Actually, Bill, Encyclopedia Britannica has never gone away. You can access it at Britannica.com; it costs less than $75 per year. – S
- Fourteen leading reading rollercoasters: Comic Sands summarizes a Reddit thread about Wikipedia articles that take you on a rollercoaster. What's a reading rollercoaster, you ask? An article with lots of ups and downs that will occasionally turn your stomach. The top 14 were Colonel Sanders, Jerrie Cobb, Jack Parsons, Shi Pei Pu, Michael Angelo Aquino, Juan Pujol García, Julie d'Aubigny, Arthur Rimbaud, Northern Calloway, Jake Weber, Tarrare, the 1904 Olympic marathon, Jim Thorpe (the world's greatest athlete, not the borough), and Roy Sullivan.
- DataStax and WMDE announced a new project: "Wikimedia Deutschland Launches AI Knowledge Project in Collaboration with DataStax Built with NVIDIA AI", DataStax.
WMDE plans to make Wikidata’s data easily accessible for the Open Source AI/ML Community via an advanced vector search by expanding the functionality with fully multilingual models, such as Jina AI through DataStax’s API portal, to semantically search up to 100 of the languages represented on Wikidata. To vector embed a large, massively multilingual, multicultural, and dynamic dataset is a hard challenge, especially for low-resource, low-capacity open source developers. With DataStax’s collaboration, there is a chance that the world can soon access large subsets of Wikidata’s data for their AI/ML applications through an easier-to-access method. Although only available in English for now, DataStax’s solution provided a valuable initial experiment ~10x faster than our previous, on-premise GPU solution. This near-real-time speed will permit us to experiment at scale and speed by testing the integration of large subsets in a vector database aligned with the frequent updates of Wikidata
- Dr. Jonathan Fraine, Chief Technology Officer, Wikimedia Deutschland.
- An earlier presentation gives further details.
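For readers curious what "semantic search" over a massively multilingual dataset like Wikidata looks like in practice, here is a minimal sketch using the open-source sentence-transformers library and plain cosine similarity. It illustrates the general technique only; it is not the DataStax/Jina AI pipeline described in the quote, whose API has not been released, and the model name and toy item data are assumptions chosen for the example.

```python
# Minimal sketch: multilingual semantic search over Wikidata-style item
# descriptions. Illustrative only -- not the DataStax/Jina AI product.
import numpy as np
from sentence_transformers import SentenceTransformer

# One example of a freely available multilingual embedding model.
model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

# Toy stand-ins for Wikidata item labels/descriptions in several languages.
items = {
    "Q64":   "Berlin: capital and largest city of Germany",
    "Q90":   "Paris : capitale de la France",
    "Q1490": "東京都 - 日本の首都",
    "Q5":    "human: common name of Homo sapiens",
}

# Embed the item texts once; a real system would store these vectors in a
# vector database rather than in memory.
ids = list(items)
vectors = model.encode([items[i] for i in ids], normalize_embeddings=True)

def search(query: str, top_k: int = 3):
    """Return the top_k item IDs whose descriptions are semantically
    closest to the query, regardless of the query's language."""
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = vectors @ q  # cosine similarity, since vectors are normalized
    best = np.argsort(-scores)[:top_k]
    return [(ids[i], float(scores[i])) for i in best]

# A German query should still retrieve the French-language Paris entry.
print(search("Hauptstadt von Frankreich"))
```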
- Top-read Wiki articles: You'll never believe what's #1: A number of media outlets wrote pieces on the year-end list of the most-read (English) Wikipedia articles, including ITV, CNN, and Boing Boing. Mediaite's clickbait-friendly title was "Here Are the Top 25 Wikipedia Searches for 2024 — And #1 is BLEAK". Spoiler: Deaths in 2024 topped the list.
Discuss this story
As the title is a quote from a politician with an agenda, it should not be in WP’s voice. Either put it in quote marks, with appropriate inline attribution, or choose a different title. - SchroCat (talk) 22:02, 12 December 2024 (UTC)
It was not really in The Signpost's voice since it was in quotes.
It definitely was not in Wikipedia's voice. To tell you the truth, I more or less agree with the quote: Wikipedia is very useful, and if folks want to say that we have a bias something like the BBC's, that's ok with me. Everybody has some "bias", and if ours is comparable to the BBC's, we've done a pretty good job minimizing bias.
Since I am being mentioned: I think this Signpost story could have been a bit clearer in summarizing what the Telegraph and GB News were citing me/the Signpost for, namely as leveling "accusations of bias" against Rozado's report itself - although that's also not quite what I actually said in the review:
As laid out in the review, I think it's worth taking the report seriously, but also - like with various other research that is being perceived as showing incontrovertible evidence of Wikipedia being biased against various groups (like conservatives here) - one may want to retain some healthy skepticism about causality claims. Regards, HaeB (talk) 22:44, 12 December 2024 (UTC) (Tilman)
Describing people who deny science, reality, and facts as denying science, reality, and facts is not bias. – Jonesey95 (talk) 03:09, 14 December 2024 (UTC)
"DataStax and WMDE"
Besides the company's blog post and the press release, it might have been useful to also link WMDE's own announcement (which we already briefly covered in last issue's "Recent research", search for "Wikimedia Deutschland"). It covers additional aspects of interest to Wikimedians, e.g. "A further goal of the project is to more easily detect vandalism on Wikidata". Generally, it is worth being aware that these are all still announcements only and the actual product has not yet been released. WMDE folks have been talking about this project for about a year already (including at some conferences and about three months ago in the "Wikimedia AI" Telegram channel, where they provided some valuable additional background in response to questions from community members). Regards, HaeB (talk) 22:44, 12 December 2024 (UTC) (Tilman)
"Politics and business in the pages of Wikipedia"
This may be of interest:
This is about the International Foundation for Better Governance (IFBG). Peaceray (talk) 19:30, 15 December 2024 (UTC)