The Signpost
Technology report

Wikimedia tech, the Asian News International case, and the ultra-rare BLACKLOCK

Wikipedia editors have a right to privacy as described in the Wikimedia Foundation Privacy Policy. Still, be aware that computer and network surveillance is everywhere. Right now the Wikimedia Foundation is responding to a court order to reveal the identity of some editors.

Wikipedia is a social machine in which people and software tools interact to build an encyclopedia. In considering the court case Asian News International vs. Wikimedia Foundation, here are some of the technologies which affect this case.

WP:BLACKLOCK

The one English Wikipedia article with a black lock which prevents all editing is Asian News International vs. Wikimedia Foundation.

The "black lock" is a tool which the Wikimedia Foundation applies to completely disallow anyone from editing a given Wikipedia article. Right now, the only English Wikipedia article with a black lock on it is Asian News International vs. Wikimedia Foundation.[blacklock 1] Currently, there is no easy way to determine which Wikipedia articles have ever had a black lock. The Wikipedia community used to maintain a log at Wikipedia:Office actions/Log, but had trouble managing it. There are data feeds which contain Wikimedia Foundation office actions which are difficult for humans to interpret, such as at Meta-Wiki, but in general, the only way to get information about Wikimedia Foundation locked articles is to find a human who already knows and ask them.

"Wikipedia is the free encyclopedia which anyone can edit", and the Wikipedia community wants to encourage discussion and development for all Wikipedia articles. Applying a black lock to an article immediately maximizes attention to that topic. There is no discreet way to use this feature, and attempts to use the black lock to gain privacy will only trigger the Streisand effect.

The black lock is part of Wikipedia:Protection policy, which also covers other article-locking mechanisms used for other purposes. Those other locks are under the control of volunteer Wikipedia administrators, not the staff of the Wikimedia Foundation. Administrators almost always use locks to halt unproductive conflict within Wikipedia articles and to redirect editorial disputes to the associated talk page, where editors can permanently and publicly log their article critiques while seeking editorial consensus with their colleagues.

  1. ^ The article's talk page is also protected in this way; see Category:Wikipedia Office-protected pages for the full list of pages.

User privacy features

The Wikipedia community places high value on global access to Wikipedia for both readers and editors. Everyone has the right to read Wikipedia, and everyone has the right to edit Wikipedia. Regarding readers, Wikipedia is the rare website which seeks to avoid spying on and tracking its readers. In the context of the Asian News International case, the more relevant right is safety and privacy for Wikipedia editors.

This safety and privacy protection extends to editors who are here to build an encyclopedia. The usual activity for that is identifying reliable sources, then summarizing and citing those sources in Wikipedia articles. Based on the information which the Wikipedia community has, the editors in the Asian News International case are good editors doing good editing in the Wikipedia way.

CheckUsers are a class of Wikipedia moderators who can examine IP addresses in limited circumstances.

Editors have a right to privacy as described in the Wikimedia Foundation Privacy Policy. One way that Wikipedia offers privacy is by allowing editors to register user accounts and then edit under a username rather than their offline legal identity. Information which might be associated with a user account includes an email address and the IP addresses from which that user edits. IP addresses are private, and the Wikimedia Foundation resists sharing them.
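As an illustration of the boundary between public and private data, here is a small Python sketch, again assuming the third-party requests library, that pulls the public revision metadata of an article from the MediaWiki API. For edits made from a registered account, the "user" field contains only the username; the IP address behind that account is held privately by the Wikimedia Foundation and does not appear anywhere in this public data.

# Minimal sketch: fetch public revision metadata for an article. For
# registered accounts the "user" field is a username, not an IP address;
# IP data for such accounts is private to the Wikimedia Foundation.
# Assumes the third-party `requests` library.
import requests

API = "https://en.wikipedia.org/w/api.php"

def recent_revisions(title, limit=5):
    """Return public metadata (user, timestamp, comment) for recent edits."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "user|timestamp|comment",
        "rvlimit": str(limit),
        "format": "json",
    }
    data = requests.get(API, params=params, timeout=30).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])

if __name__ == "__main__":
    for rev in recent_revisions("Asian News International"):
        print(rev.get("timestamp"), rev.get("user"), rev.get("comment", ""))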

In the case of Asian News International, that organization asked the Delhi High Court to order the Wikimedia Foundation to reveal the IP addresses of three editors who had edited the article about the organization. Editors who have edited this article have reported distress at this news. The Wikipedia community strongly objects to revealing the IP addresses of users, as this privacy protection is one of the foundations of trust between Wikimedia users and the Wikimedia Foundation. For more reactions, see the public community discussion forums on the subject.

English Wikipedia intensely and continuously discusses editor privacy, including in the context of moderation and a class of volunteers who investigate misconduct in the Wikipedia:CheckUser role. It is also common knowledge that Wikimedia projects generally disallow editing through virtual private networks or Tor, per the meta:No open proxies rule. Right now in October 2024, the big news in this domain is that the Wikimedia Foundation is rolling out a new type of account called "Temporary Accounts", which permits users to edit for a limited time and then have certain private account information deleted. All of these features reflect a complex interplay in which Wikipedia editors govern the project through a mix of social consensus and technological tools that manifest the community's design and wishes.

Archive Today

Archive.today, formerly called archive.is, is a mysterious archiving website which an unknown, non-Wikimedia entity operates. It has no formal relationship with Wikimedia projects. In practice, however, the website is the best, easiest, or only option which Wikipedia editors have to gain access to certain deleted Wikipedia articles. To access the last version of "Asian News International vs. Wikimedia Foundation" all you need to know is archive.is/ and a five letter code. It is impossible to gain a basic understanding of this story for journalistic purposes without seeing the deleted article. Discussions about this case throughout the Internet and social media point to this archival copy hosted at that site.
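To show how little is involved, here is a minimal Python sketch, again assuming the requests library, of fetching such a snapshot. The five-letter code below is a hypothetical placeholder; The Signpost is not reproducing the real one, and archive.today may rate-limit or challenge automated requests in ways a simple script does not handle.

# Minimal sketch: fetch an archive.today snapshot at archive.is/<code>.
# SNAPSHOT_CODE is a hypothetical placeholder, not the real code, and the
# site may rate-limit or challenge automated requests.
import requests

SNAPSHOT_CODE = "XXXXX"  # hypothetical five-letter placeholder

def fetch_snapshot(code):
    """Download the HTML of the snapshot hosted at archive.is/<code>."""
    url = "https://archive.is/" + code
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    html = fetch_snapshot(SNAPSHOT_CODE)
    print(len(html), "characters retrieved")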

Accessing the deleted article is necessary because Wikipedians tell stories through Wikipedia articles. The Signpost is attempting to create journalism about this story, but the best explanation would be simply linking to the Wikipedia article on the topic, were it not deleted. One insight a reader would gain from seeing the Wikipedia article is that many journalists have already written about this court case. Observing that, a reader should then wonder why the court feels it is problematic for Wikipedia editors to summarize and cite that journalism, when the information in Wikipedia originates from external journalism.

The Signpost is an independent newspaper and has no special private access to anything in the Wikimedia platform. There are bureaucratic processes for publicly asking for such things. The Signpost requested access to the deleted text from Wikipedia Administrators and Wikipedia Oversighters, who are volunteer community moderators with specific and different user permissions. Administrators have no access at all due to the black lock. Oversighters have access, but are bound by a non-disclosure agreement to share content only at the direction of the Wikimedia Foundation. Neither of these groups could share the text, so no reviewers in the Wikimedia community have access to the deleted text through the Wikimedia platform.

Questions arise: our Wikipedia editor colleagues whose privacy is under threat appear to be in danger. To what extent should the Wikipedia community of editors support them? When organizations are sued, they typically do not publicly discuss their court cases. The Signpost does not have access to the inner workings and thoughts of the Wikimedia Foundation, and while the Wikimedia community generally trusts the Wikimedia Foundation for protection, the community and the foundation are different entities with different values, objectives, and motivations. If the Wikimedia community of editors organizes its own advocacy and responses, then we need our own sources of information, our own conversations, and our own leadership.

A number of Wikipedia editors contacted The Signpost to recommend that journalists here not produce journalism about this story without the Wikimedia Foundation's permission, suggesting that the lack of access to the article through the Wikimedia platform was the Foundation's way of ordering the volunteer community of editors to avoid discussing it within the platform. The end result is that Wikipedia editors now turn to anonymously run third-party websites like archive.today to access a copy of the deleted article. The Signpost would like to clarify its understanding that accessing this article, discussing it, and producing journalism about it is aligned with Wikimedia community interests and not contrary to any Wikimedia Foundation rule. There is no conflict here. It just happens that in this case, part of the Wikipedia community and Signpost technology workflow includes a third-party archive run by a mysterious anonymous Internet entity.