[Graph: BRFA activity by month]
Welcome to the tenth issue of the English Wikipedia's Bots Newsletter, your source for all things bot.
Overall
In the last half of 2020, there were 54 BRFAs. Of these, 40 were approved and 14 were unsuccessful (5 denied, 5 withdrawn, and 1 expired).
New requirements for user signatures were implemented on July 6, with the aim of reducing linting errors and eliminating signatures that didn't link to local accounts.
EmausBot, which had been indefinitely blocked in May, was brought up at the bot noticeboard by wbm1058, who argued for its unblock, saying that it had been making "perfectly valid and what should have been totally uncontroversial technical edits". These arguments proved persuasive, and EmausBot was let out of bot jail on the 14th. It promptly returned to its prior hobby of fixing double redirects.
A well-attended village pump (policy) discussion, started in June by Nat Gertler, concerned InternetArchiveBot's recent "aggressive" linking of mentions of books to Internet Archive copies of those books, whose legality Gertler questioned. A lack of consensus for the links resulted in the discussion being closed by DMacks in July with a recommendation for the bot to be stopped (with "no prejudice against future manual additions or removals of IA links by uninvolved editors"). At the bot noticeboard discussion regarding this closure, IABot operator Cyberpower678 asked that BAG not rescind the approval of the InternetArchiveBot 3 task, approved in 2019, and said "I am confused how the most recent discussion which had no consensus would overturn the original proposal and bot approval which both had unanimous consensus [...] There were some concerns raised that a pending lawsuit should change our approach; however, the Internet Archive is functioning no differently today than it was when the bot was approved". Mark Graham, who manages the IA's Wayback Machine, said that "for the last 5 years InternetArchiveBot has been linking to archived snapshots of web pages that no longer function or will soon not function [...] we take the wishes of the Wikipedia community very seriously, and we would very much like to continue helping the community with its inspiring mission". As of December 2021, the task is still listed as "approved".
Denied requests included Area code bot (proposed to repair and maintain pages for American telephone area codes) and NotPlanter (proposed to automatically replace "planter" with "agricultural enslaver" on biographies pertaining to the Antebellum South).
A GitLab consultation for MediaWiki was started, prompted by a developer survey showing "some level of dissatisfaction with Gerrit" (to which MediaWiki development had migrated from SVN in 2012). The page said that "while Gerrit’s workflow is in many respects best-in-class, its interface suffers from usability deficits, and its workflow differs from mainstream industry practices"; it also pointed out that many developers had already abandoned it for GitHub (with Wikimedia's GitHub organization already hosting 152 projects). GitLab's self-hosting capabilities, as well as its overall high quality, made it the most prominent choice for a replacement; the consultation was scheduled to end on October 23.
A proposal was made for a Twitter bot to post DYKs. As of December 2021, the Twitter account is posting regular updates.
The August Village Pump thread about Citation bot was closed on September 18. Barkeep49 said, in the closing note:
While some wished to do so only in certain circumstances (e.g. if there was a free version), there is consensus for including a link, when an online source is available, from the title text. There is consensus that removing a link requires human judgement.
[...]
There is consensus that what to link should be accorded respect when made by humans (see more in the following section). When deciding what to link in the title, the consensus is that sources which verify the information being cited and sources which are free are top priorities; a lesser factor to consider is whether it is a link to full text or partial (e.g. abstract only). At no time should we be linking to copyright violations.
[...]
There is consensus against removing a parameter just because it is a duplicate. However, there is an agreement that sometimes removing duplicates can be appropriate. This decision should be made on an article by article basis, unless it is a copyright violation for which there is consensus to remove.
The same day the discussion was closed, a thread at the Administrators' Noticeboard started by Levivich said that the bot was "still making unauthorized edits" and that "these edits do not appear to be authorized by any of the bot's authorizations". After two days of discussion, Levivich said that "the RFC has been closed (thanks BK) and the bot has been updated to reflect the close (thanks AMWNP), which I think resolves this. Thanks again everyone".
A thread at the Village Pump (proposals), for a "Cosmetic Bot Day", was started by GreenC. Consensus is generally against the use of bots to perform cosmetic edits (i.e. those which change an article's source code but don't change the rendered output for readers). Many editors dislike these edits, as they tend to clutter article histories and watchlists; others like them, as they make editing and template maintenance much easier. GreenC's proposal was to set aside one day per year (or month, etc.) for a large number of cosmetic bot edits to be done at once. This proposal enjoyed broad consensus.
New bot approvals in October were mostly for minor syntax correction and template modifications. SDZeroBot 5 was approved to populate Category:Content moved from mainspace to draftspace, and WugBot 5 was approved to apply fixes for a bug where article talk pages were erroneously being added to passed/failed DYK nomination categories (which should only contain the nomination templates).
On Meta, an RfC was introduced to refine global bot policy. The proposal, made by Martin Urbanec, noted that global bot policy only allowed bots to maintain interlanguage links or fix double redirects (meaning that, de facto, virtually no global bot authorizations occurred). He proposed an amendment to authorize steward approval for uncontroversial global bot tasks. The RfC would run for several months.
Two new bots were introduced in November: VahurzpuBot, whose first approved task was to remove the Redirect-Class designation from articles that weren't redirects, and Dapperbot, whose task was to use the COVID-19 API to sweep COVID-19 articles and update disease rates.
Edits made by Monkbot (specifically those made as part of its Task 18) were examined at the bot noticeboard in a December 9 thread started by Nikkimaria, who requested a re-examination of its BRFA, saying that "given the scale of this task, impacting over a third of the site's articles, the maintainer was asked to open an RfC to seek wider community consensus on this issue; thus far no such RfC has taken place, but the task has resumed regardless". Operator Trappist the monk countered that the task had already accrued more than 225,000 edits and that they "might consider an RfC were it obvious that a goodly portion of the community were up in arms about this bot task [...] all of those edits appear on a large number of watch lists yet there has been no uprising". The thread would continue for quite some time.
A discussion at Wikipedia talk:Bot policy, started by Izno, brought up the issue of users removing software restrictions by forking or modification. Specifically, tools like AutoWikiBrowser and Huggle have built-in verification routines that prevent editors from using the software unless they're on a central list (usually managed at WP:PERM)... but since their source code is openly accessible, anyone who cares enough can just change a 0 to a 1 and ignore the whole system. In general, participants tended to share the view that this could cause problems, but noted that it was hard to draw a distinction between code written to bypass the checks in these tools and code that simply functioned similarly to them. Ultimately, most agreed that such editing was covered under WP:MEATBOT at any rate.
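The verification routine being discussed amounts to a client-side allowlist lookup: the tool fetches a central "check page" and refuses to start unless the user's name is on it. A minimal sketch of that pattern (all names here are illustrative, not the actual AWB or Huggle code):

```python
# Hypothetical sketch of a client-side access gate like the one
# AutoWikiBrowser performs against its check page. The allowlist and
# usernames are made up for illustration.

APPROVED_USERS = {"ExampleBot", "TrustedEditor"}  # stands in for the central list


def is_authorized(username: str, approved=APPROVED_USERS) -> bool:
    """Return True if the user appears on the central allowlist."""
    return username in approved


def run_tool(username: str) -> str:
    # The whole restriction is this single client-side branch; anyone
    # with the source can invert the check, which is the loophole the
    # discussion raised.
    if not is_authorized(username):
        return "access denied"
    return "editing session started"
```

Because the check runs entirely on the user's machine, it is an honor-system gate rather than an enforcement mechanism, which is why participants fell back on conduct policies like WP:MEATBOT.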
There is consensus for the relaxing of WP:COSMETICBOT so that bots may be approved, using the normal process, to allow cosmetic edits that would run only on designated cosmetic bot days (in addition to typical trials/testing done during the Bot Request process). These bots must still have appropriate consensus for their cosmetic tasks. Several editors, both those in favor and especially those opposed, worried that the tasks would impose preferences rather than improvements.
At the moment there is consensus for a single cosmetic bot day as a trial when these bots may run, as there was no consensus about a frequency among participants of this RfC. There was no consensus, largely due to a lack of discussion, about when such a cosmetic bot day trial should occur, but to the extent it was discussed editors indicated a preference for days that experience fewer edits. In any event, appropriate notification should be given to warn editors of the trial.
When approving bot(s) that may run on cosmetic bot days, the Bot Approvals Group should also take into consideration the preference expressed for bots/cosmetic bot day structures which will minimize watchlist disruption (for example Enterprisey suggested a theoretical way cosmetic edits would be aggregated before being implemented by a single bot).
Following this trial, subsequent discussion (and/or follow-up RfC) may establish consensus for further cosmetic bot days including a proposed frequency and any other supporting structure/limitations as the community may deem appropriate.
ShortDescBot, created by MichaelMaggs as part of an effort to address the 3.5 million articles lacking short descriptions, saw its first task approved in December: adding short descriptions to 26,000 pages about moths.
Conclusion
What's next for our intrepid band of coders, maintainers and approvers?
Will the global bot policy amendments be approved?
What will be the fate of Monkbot's infamous Task 18?
Will ShortDescBot prove victorious in its tireless crusade to describe moths?
How big does ClueBot NG's watchlist get after eleven years? Is anybody going to do something about that?
These questions will be answered — and new questions raised — by the January 2022 Bots Newsletter. Tune in, or miss out!