Wikipedia:Bots/Requests for approval/Cyberbot II 3
- The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Withdrawn by operator.
Operator: Cyberpower678 (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 15:17, Thursday June 6, 2013 (UTC)
Automatic, Supervised, or Manual:
Programming language(s): PHP
Source code available: Not yet
Function overview: Scans selected namespaces from a control interface and delinks redlinks. Currently only the article namespace is selected.
Links to relevant discussions (where appropriate): https://en.wikipedia.org/w/index.php?title=Wikipedia:Bot_requests&oldid=558578137#Current_method_I.27m_using_to_remove_redlink_entries_in_a_set_of_lists_-_need_faster_method
Edit period(s): Every 24-48 hours
Estimated number of pages affected: Perhaps several thousand to 10,000 at first.
Exclusion compliant (Yes/No): Yes
Already has a bot flag (Yes/No): Yes
Function details: This task is managed through a control interface, so the bot delinks redlinks except those it has been told to ignore. The control interface also controls which pages and which namespaces the bot skips. The purpose of this script is to remove redlinks from articles, especially articles that are flooded with them, and tidy them up to be more appealing to the reader. This eliminates confusion for readers who are not familiar with what red links mean. The bot will run every 24 hours to conserve resources. The interface will have approved managers, and everyone will be able to submit requests for exemption. A link to the interface can be found in the bot's edit summaries.
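A minimal sketch of the delinking step, assuming the MediaWiki API is used to check which link targets exist (the function names and structure below are illustrative only; the bot's source has not been published):

<?php
// Illustrative sketch only -- not the bot's actual source, which has not
// been published. Asks the MediaWiki API which linked titles are missing,
// then unlinks those redlinks in the wikitext.

// Return the titles from $titles that do not exist (a real bot would batch
// titles, since the API limits how many can be queried at once).
function findMissingTitles(array $titles): array {
    $url = 'https://en.wikipedia.org/w/api.php?' . http_build_query([
        'action' => 'query',
        'titles' => implode('|', $titles),
        'format' => 'json',
    ]);
    $data = json_decode(file_get_contents($url), true);
    $missing = [];
    foreach ($data['query']['pages'] as $page) {
        if (isset($page['missing'])) {
            $missing[] = $page['title'];
        }
    }
    return $missing;
}

// Replace [[Target]] and [[Target|Label]] with their visible text when the
// target is on the missing list (title normalisation is ignored here).
function delinkRedlinks(string $wikitext, array $missing): string {
    return preg_replace_callback(
        '/\[\[([^\[\]|]+)(?:\|([^\[\]]*))?\]\]/',
        function ($m) use ($missing) {
            $target = trim($m[1]);
            $label  = isset($m[2]) && $m[2] !== '' ? $m[2] : $target;
            return in_array($target, $missing, true) ? $label : $m[0];
        },
        $wikitext
    );
}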
Discussion
Do you have a mockup of said control interface? ·addshore· talk to me! 18:19, 7 June 2013 (UTC)
- Not yet. Details are still being worked out, but it will allow anyone to submit a request to add or remove a namespace, page, or link for the bot to ignore during its runs. It also allows the managers to approve requests that are reasonable; a reasonable request would be, for example, exempting articles pending creation. With those exemptions in place, the bot should accurately remove only the redlinks it is supposed to. I believe this method best accomplishes accurate removal of redlinks.—cyberpower ChatOnline 18:24, 7 June 2013 (UTC)
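As a rough illustration of how such an exemption list might be consulted during a run (the structure and names below are invented, since the interface does not exist yet):

<?php
// Invented illustration of the exemption check described above; the real
// interface has not been built, so the field names and values are guesses.

$exemptions = [
    'namespaces' => [10],                              // e.g. the Template namespace
    'pages'      => ['List of articles pending creation'],
    'links'      => ['Some planned article'],
];

// True if the bot should leave a given redlink alone on a given page.
function isExempt(array $exemptions, int $namespace, string $page, string $link): bool {
    return in_array($namespace, $exemptions['namespaces'], true)
        || in_array($page, $exemptions['pages'], true)
        || in_array($link, $exemptions['links'], true);
}

var_dump(isExempt($exemptions, 0, 'Some article', 'Some planned article')); // bool(true)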
- So you're saying that on the first run the bot will remove most, if not all, redlinks from pages? ·addshore· talk to me! 18:29, 7 June 2013 (UTC)
- The first run will be a controlled test. It will remove a lot of redlinks, but once it knows which ones to ignore, it will delink only the ones it should. The bot will be throttled to a very slow edit rate to allow for fixing issues.—cyberpower ChatOnline 18:33, 7 June 2013 (UTC)
- I feel this needs more community input. Personally, I would feel better if the bot worked the other way around: a user specifies a link to remove, and the bot removes it :/ ·addshore· talk to me! 18:45, 7 June 2013 (UTC)
- I have considered that, but I feel it would require even more work in the long run, specifying every link to remove. I do understand what you're saying, though, so I will ask the requester whether he would still like this and is willing to broadcast an RfC for it. If not, I'll withdraw this request.—cyberpower ChatOnline 18:52, 7 June 2013 (UTC)
- If you continue with this, I'd like a clear explanation of exactly what the bot is supposed to be doing (the description currently given is horribly unclear, unless you really intend to remove all non-"whitelisted" redlinks from all articles) and why that doesn't go against WP:REDDEAL. Anomie⚔ 20:22, 7 June 2013 (UTC)
- It doesn't appear the requester is going to go through with an RfC. I am hereby marking this request as Withdrawn by operator. Anomie, thanks for the link, by the way; I wasn't aware of it to begin with.—cyberpower ChatOnline 22:24, 16 June 2013 (UTC)
- The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.