
Wikipedia:Bot requests/Archive 48


Database reports/Unused non-free files

In the past there was a bot that took care of tagging the images at Wikipedia:Database reports/Unused non-free files as orphaned. I've been trying to keep up while it wasn't working, but given the number there, it's getting to be overwhelming. If someone can remember which bot it was, I'd be happy (with a bit of assistance in getting it started) to take over running the bot. Skier Dude (talk) 03:13, 2 May 2012 (UTC)

Hey Skier Dude. I could write code that would do that for you (shouldn't be too hard), but if you'd rather do it yourself, that's still OK. :). Ceradon talkcontribs 03:29, 2 May 2012 (UTC)
I might be able to get this done; if I think my code is good enough, I'll make a BRFA. It was done by Fbot, but the owner, Fastily, retired. The BRFA was Wikipedia:Bots/Requests for approval/Fbot 5.  Hazard-SJ  ✈  02:58, 3 May 2012 (UTC)
Yep, still interested. I was looking for the Fbot5 source code, but didn't find anything :( Skier Dude (talk) 02:58, 6 May 2012 (UTC)
User_talk:Fastily#Fbot5_source_code Bulwersator (talk) 08:05, 7 May 2012 (UTC)
I think you're mixing up the bots and I also think you're mixing up "Fbot 5" with "Fbot 8". If I remember correctly, User:DASHBot is the bot handling the {{orphaned fair use}} tagging whereas User:Fbot used to tag files with {{subst:orfurrev}}. Both tasks are of course needed. --Stefan2 (talk) 11:22, 7 May 2012 (UTC)
I have everything Fastily ran, but I'm waiting for his clearance to set it up on Labs. And no, I don't have the source code, and even if I did, he's not sharing. Sven Manguard Wha? 17:15, 7 May 2012 (UTC)

Bot for Women's History project stubs

There is an existing category: Stub-Class Women's History articles. Could someone please run a bot that puts the template {{Women's-History-stub}} at the bottom of the articles within that category? Thank you for your help. Maile66 (talk) 12:43, 5 May 2012 (UTC)

 Doing... using AWB. Should be done in an hour or two. Rcsprinter (gas) 13:49, 5 May 2012 (UTC)
Thank you very much. Maile66 (talk) 14:14, 5 May 2012 (UTC)
Actually, make that a day or two. A slight underestimate there. Wasn't bargaining on there being three and a half thousand. Rcsprinter (talk) 21:48, 5 May 2012 (UTC)
And whoa, hold on a second. It's worth talking to the stub people about this - I think they would probably prefer a lower-case "h" for history, and may indeed already have a suitable stub. They are pretty amazing when it comes to stubs. Rich Farmbrough, 02:08, 6 May 2012 (UTC).(Using some automation)
I left them a message. Rich Farmbrough, 02:11, 6 May 2012 (UTC).(Using some automation)
Indeed, please stop until Wikipedia:Stub types for deletion#Women.27s history stubs has been resolved. SeveroTC 15:11, 6 May 2012 (UTC)

Actually, this has all been a big mistake. I went to bot requests, because someone suggested I run an AWB on my own. I didn't want to do that, so I put the request here. But it's a longer story than that. Refer to the Stub types for deletion link above to see. Rcsprinter123, I apologize for putting this here. Aren't you all glad I didn't just run my own AWB? Maile66 (talk) 14:58, 7 May 2012 (UTC)

mdash --> spaced mdash

Hi, similar to Wikipedia:Bot requests/Archive 47#ndash --> spaced ndash, I'm looking for a bot to switch over {{mdash}} to {{spaced mdash}} per the outcome of Template talk:Spaced mdash#Requested move. Jenks24 (talk) 15:11, 7 May 2012 (UTC)

I believe I'll do this. I'll get it approved soon and get to it.  Hazard-SJ  ✈  23:34, 7 May 2012 (UTC)
BRFA filed  Hazard-SJ  ✈  23:56, 7 May 2012 (UTC)
Thanks very much, hope the BRFA goes smoothly. Jenks24 (talk) 23:59, 7 May 2012 (UTC)

Bot to change dates in sortable tables to {{dts}}

Would it be better if we had a bot that could change dates in sortable tables to the {{dts}} template, so they can be sorted correctly? Many people, even if they know how to make sortable tables, do not know how the JavaScript sorting algorithm actually works, and when sortable tables are created with raw dates they are sorted alphabetically. Such a bot could read the existing date format and use the same format in the template. (Delhi Daredevils is one example of an article with raw dates in sortable tables). jfd34 (talk) 15:22, 8 May 2012 (UTC)
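For illustration, a minimal AWB-style module along the lines of such a bot, assuming the dates are written as "1 January 2012" and sit in table cells on their own lines; other date formats and inline "||" cells would need extra patterns.

public string ProcessArticle(string ArticleText, string ArticleTitle, int wikiNamespace, out string Summary, out bool Skip)
{
    Skip = false;
    Summary = "Wrap raw dates in sortable table cells with {{dts}}";
    string original = ArticleText;

    // Only touch lines that look like table cells (start with "|") to avoid dates in prose.
    ArticleText = Regex.Replace(ArticleText,
        @"(?m)^(\|\s*)(\d{1,2} (?:January|February|March|April|May|June|July|August|September|October|November|December) \d{4})\s*$",
        "$1{{dts|$2}}");

    Skip = (ArticleText == original);  // skip pages where nothing was converted
    return ArticleText;
}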

Changing <br/> to <poem> on Sonnet pages

See [1]. <poem></poem> tags are, IMO, more efficient and less distracting in the editing window than <br> or similar tags on every line. So could a bot go through Shakespearean sonnet pages, remove all instances of <br>, </br>, or <br/>, and place <poem></poem> around the text, inside the {{sonnet}} template? It Is Me Here t / c 17:47, 28 April 2012 (UTC)
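For illustration, a minimal AWB-style sketch of the conversion, assuming the verse is the second unnamed parameter of a simple {{sonnet|<number>|<verse>}} call with no nested templates; anything more complicated should be skipped and done by hand.

public string ProcessArticle(string ArticleText, string ArticleTitle, int wikiNamespace, out string Summary, out bool Skip)
{
    Skip = false;
    Summary = "Replace <br /> line breaks inside {{sonnet}} with <poem> markup";
    string original = ArticleText;

    ArticleText = Regex.Replace(ArticleText,
        @"(\{\{\s*[Ss]onnet\s*\|[^|]*\|)(.*?)(\}\})",
        m =>
        {
            // Turn every <br>, </br> or <br/> into a plain newline, then wrap the verse in <poem> tags.
            string verse = Regex.Replace(m.Groups[2].Value, @"<\s*/?\s*br\s*/?\s*>\s*", "\n").Trim();
            return m.Groups[1].Value + "\n<poem>\n" + verse + "\n</poem>\n" + m.Groups[3].Value;
        },
        RegexOptions.Singleline);

    Skip = (ArticleText == original);
    return ArticleText;
}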

This is a sensible request, though I suspect false positives may be a problem. Might be better done through AWB. --Izno (talk) 03:21, 29 April 2012 (UTC)
Could you do that for me, perchance? I don't think I have AWB. Thanks It Is Me Here t / c 01:13, 30 April 2012 (UTC)
While it makes sense to me, the template instructions do explicitly state: "Each line should end with <br />." Just want to be sure that there is consensus for this change. GoingBatty (talk) 01:26, 30 April 2012 (UTC)
WikiProject Shakespeare and WikiProject Poetry notified. It Is Me Here t / c 18:12, 30 April 2012 (UTC)
I agree with the change. It's what <poem> is for. Span (talk) 22:33, 30 April 2012 (UTC)
Makes sense to me. I'll just note that the initial link shows
<poem> First line text...
Last line text. </poem>
Whereas it will make more sense to end up with
<poem>
First line text...
Last line text.
</poem>
...as currently seems to be implemented in Sonnet 1. Wow. I haven't seen that many colons since my previous job as an endoscope. Phil wink (talk) 23:02, 30 April 2012 (UTC)

OK, so do we now have consensus? It Is Me Here t / c 19:34, 4 May 2012 (UTC)

Coding... It looks like a non-controversial task, and I see no objection. I'll probably run it as assisted editing in a bit. — madman 18:22, 10 May 2012 (UTC)
Doing... Please confirm this edit; I used the same format as in Sonnet 1. If it looks good, I'll run it on all the rest. — madman 19:42, 10 May 2012 (UTC)
 Done - looks good. It Is Me Here t / c 21:27, 10 May 2012 (UTC)
I have taken the liberty of boldly updating the human-readable instructions for {{Template:sonnet}} to conform with this new standard. Please take a peek and confirm or rebuke. Thanks. Phil wink (talk) 21:57, 10 May 2012 (UTC)
 Done [2]. Also, something's broken about those docs, at least for me. I'll try to fix it. — madman 01:00, 11 May 2012 (UTC) The problem's that there's no default value for sonnet number or text. I can fix that, but I don't know what a sensible default would be. — madman 01:04, 11 May 2012 (UTC)
18? If the choice does not affect any article pages, I suppose you may as well choose that one, if I have understood what you're saying correctly. It Is Me Here t / c 08:50, 11 May 2012 (UTC)

Sounds good.  Done (Anyone can feel free to change if they feel the defaults shouldn't be so large, but now there aren't big red error messages at Template:Sonnet.) — madman 16:21, 11 May 2012 (UTC)

I've slightly reformatted the template page, so as to remove the overlaid "pre" box. Again, feel free to revert and rebuke if I've screwed anything up. Phil wink (talk) 02:57, 12 May 2012 (UTC)
Looks great! madman 03:15, 12 May 2012 (UTC)

blog.taragana.com -> gaeatimes.com

The links at Special:LinkSearch/*.blog.taragana.com are now all redirects to subdomains at gaeatimes.com. Not sure if anyone is able to work out a means to convert these to direct links to the articles. — billinghurst sDrewth 11:20, 6 May 2012 (UTC)
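A possible approach: follow each old link's HTTP redirect and substitute the final URL. A minimal sketch, assuming the redirects resolve cleanly (dead links returning errors should be left for a human); the example URL is hypothetical.

using System;
using System.Net.Http;
using System.Threading.Tasks;

class RedirectResolver
{
    // Follows HTTP redirects and returns the final URL, so that
    // http://blog.taragana.com/... links can be replaced with their gaeatimes.com targets.
    static async Task<string> ResolveAsync(string oldUrl)
    {
        var handler = new HttpClientHandler { AllowAutoRedirect = true };
        using (var client = new HttpClient(handler))
        using (var response = await client.GetAsync(oldUrl))
        {
            // RequestMessage.RequestUri holds the URL we actually ended up at after redirects.
            return response.RequestMessage.RequestUri.ToString();
        }
    }

    static async Task Main()
    {
        // Hypothetical example URL; the real list comes from Special:LinkSearch/*.blog.taragana.com.
        string finalUrl = await ResolveAsync("http://blog.taragana.com/example-post");
        Console.WriteLine(finalUrl);
    }
}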

161 to do, shouldn't take too long, so I'm  doing... using AWB. Rcsprinter (rap) 16:20, 12 May 2012 (UTC)
 Done Rcsprinter (rap) 19:50, 12 May 2012 (UTC)

user:N

Is it possible to get all links/signatures to my former usernames, user:N, and User:Nardman1, and their respective talk pages, updated? There is a new user who wishes to use the User:N moniker and I would like to avoid any confusion in the future. Please also include subpages, most notably User:N/modern Jesus -Nard 14:33, 11 May 2012 (UTC)

It is possible, and usurpation is IMO an acceptable exception to WP:R2D. Coding... Anomie 17:30, 11 May 2012 (UTC)
I was sure that there was a specific policy/guideline that said not to do this, but it may have gone away since 2007. Am I just making it up? (In any case, I think there's rarely any justification to update any archived pages, so that's a consideration too.) — madman 18:06, 11 May 2012 (UTC)
There's WP:R2D, not sure if there is anything else. The only reason I'm considering this case is because someone is asking to usurp User:N, and it would be confusing if all the old links to User:N (from comments made by User:Nard the Bard when he/she used User:N) went to this new person instead of Nard. If it was just someone changing their username and the old name could be left as a redirect, I'd say no. Anomie 18:10, 11 May 2012 (UTC)
Indeed, that would seem to make sense. — madman 18:21, 11 May 2012 (UTC)

BRFA filed. I also went ahead and advertised the discussion at WP:VPR and WT:R. Nard, if this gets approval for trial, would you like the bot to log pages it can't handle to a particular subpage, or should it just spam your main talk page? Anomie 02:51, 12 May 2012 (UTC)

  • Spamming my talk page is fine. I'll go through and clean up from there. A similar request on Commons went rather well. They started with a simple find and replace on my old signatures which knocked out 99% of the hits. -Nard (Hablemonos)(Let's talk) 12:41, 12 May 2012 (UTC)
Hah! Wasn't it you (Anomie) and Xeno that complained when I did this for a RL-stalked user? But go for it. Rich Farmbrough, 02:23, 13 May 2012 (UTC).
For the curious: Wikipedia:Bot_requests/Archive_35#Username_Change. →Στc. 04:25, 14 May 2012 (UTC)

Talk page notification

Is there a way to make a bot that can notify my alternate account's talk page when I have a message on my main account, since my alternate account's talk page redirects there? CTJF83 23:57, 11 May 2012 (UTC)

You could put your main account's talk page on your alternate account's watchlist; I don't think it would be worth wasting a bot on. Doesn't the software email you when either of your accounts' talk pages is edited anyway? Rcsprinter (deliver) 16:08, 12 May 2012 (UTC)
Yes, Femto Bot could do this with a suitable BRFA; it copies stuff around my talk pages. However, you may find that if you wait a few months, Mirror Bot would be a better bet, as you could mirror the entire page. Rich Farmbrough, 02:21, 13 May 2012 (UTC).
Nah, e-mail annoys me lol CTJF83 02:09, 14 May 2012 (UTC)
Ok, thanks! CTJF83 02:06, 14 May 2012 (UTC)

Remove template

Template:Category TOC, which is intended for categories with more than 400 pages, is transcluded onto over 48,000 pages. I have had a look at some of the categories where it is used and many of them actually had fewer than 200 pages in them. Having the template in these categories is unneeded and confusing clutter. Can we get a bot to remove it if there are fewer than 400 pages in a category? Alternatively, slip me a list of the pages and I will do it with AWB and take FULL responsibility for the task (which goes without saying). -- Alan Liefting (talk - contribs) 04:37, 14 May 2012 (UTC)
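A rough sketch of the page-count check such a bot (or an AWB pre-filter) would need, using the MediaWiki API's categoryinfo property; the example category is hypothetical, and the actual template removal is left out.

using System;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

class CategoryTocCheck
{
    // Asks the MediaWiki API how many pages a category holds; {{Category TOC}} would only be
    // kept where the count is at least 400.
    static async Task<int> GetPageCountAsync(string categoryTitle)
    {
        string url = "https://en.wikipedia.org/w/api.php?action=query&prop=categoryinfo&format=json&titles="
                     + Uri.EscapeDataString(categoryTitle);
        using var client = new HttpClient();
        using var doc = JsonDocument.Parse(await client.GetStringAsync(url));
        foreach (var page in doc.RootElement.GetProperty("query").GetProperty("pages").EnumerateObject())
        {
            if (page.Value.TryGetProperty("categoryinfo", out var info))
                return info.GetProperty("pages").GetInt32();
        }
        return 0;
    }

    static async Task Main()
    {
        int count = await GetPageCountAsync("Category:Example");  // hypothetical category
        if (count < 400)
            Console.WriteLine("Fewer than 400 pages: {{Category TOC}} could be removed here.");
    }
}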

Kenyan football league tables

I need a bot that will update the league tables and results of the 2012 Kenyan Premier League, the 2012 FKF Division One, the 2012 Kenyan Women's Premier League and their future seasons on a regular basis (every 4 days). Manually updating them gets extremely tedious, as I normally have other tasks to perform in or outside Wikipedia. Also, I, and any other person who edits the articles, will not always be available to update them. I know coming up with code for this can take a long time, so, if possible, could it be ready in the next 2-3 months? Davykamanzitalk  · contribs 17:18, 8 May 2012 (UTC)

I don't think updating content (especially league tables, which aren't supposed to be there anyway) is what bots are supposed to be used for. They need repetitive tasks which don't require human input - how would they know what to put in the tables anyway? Can't see it working. Rcsprinter (converse) 16:12, 12 May 2012 (UTC)
Note that this is in response to Wikipedia:Bots/Requests for approval/DavykamanziBot. —  HELLKNOWZ  ▎TALK 16:27, 12 May 2012 (UTC)
With some Excel skills it is not that hard: paste a Soccerway page (or maybe just the table) into Excel, which then outputs Wikipedia code. -Koppapa (talk) 20:11, 14 May 2012 (UTC)

People to pick up after RF bot's task

Following the recent ARBCOM drama, we'll need some people to pick up many of the tasks done by SmackBot/Helpful Pixie Bot. Specifically:

  • Add dates to cleanup tags.
  • Replace tags {{Disambig}} + {{Cleanup}} with {{Disambig-cleanup}}.
  • Fixing common section names:
    • See Also / Also see / Also See → See also
    • External Links → External links
    • Reference → References
    • Add a missing References section when <ref> tags are found in the article
  • Fix/tag ISBNs (a rough sketch of the check-digit validation follows this comment):
    • Properly hyphenate ISBNs
    • Switch ISBN: 0123456789 / ISBN:0123456789 / ISBN0123456789 to ISBN 0123456789 (hyphenated) so the wiki's ISBN magic link can kick in
    • Tag invalid ISBNs with {{please check ISBN}}
    • Move {{please check ISBN}} outside of various citation templates (i.e. change {{cite xxx|isbn=0123456789}} to {{cite xxx}} {{please check ISBN}})

Some other tasks can be found at User:Helpful Pixie Bot#Tasks and authorisations, but the above seem to be the most important. Headbomb {talk / contribs / physics / books} 19:05, 15 May 2012 (UTC)
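A minimal sketch of two of the ISBN sub-tasks, the magic-link normalisation and the check-digit test; proper hyphenation needs the full ISBN range tables and is not attempted here, and the regex is deliberately loose, so a real implementation would need more guards.

using System;
using System.Linq;
using System.Text.RegularExpressions;

static class IsbnFixes
{
    // Turns "ISBN:0123456789" or "ISBN0123456789" into "ISBN 0123456789" so MediaWiki's ISBN magic link works.
    public static string NormalizeIsbnPrefix(string wikitext) =>
        Regex.Replace(wikitext, @"ISBN[:\s]*([0-9][0-9Xx\- ]{8,16})", "ISBN $1");

    // Check-digit test for a 10- or 13-digit ISBN; invalid numbers would be tagged with {{please check ISBN}}.
    public static bool IsValidIsbn(string isbn)
    {
        isbn = isbn.Replace("-", "").Replace(" ", "").ToUpperInvariant();
        if (isbn.Length == 10)
        {
            int sum = 0;
            for (int i = 0; i < 10; i++)
            {
                int digit = isbn[i] == 'X' && i == 9 ? 10 : isbn[i] - '0';
                if (digit < 0 || digit > 10) return false;
                sum += (10 - i) * digit;
            }
            return sum % 11 == 0;
        }
        if (isbn.Length == 13 && isbn.All(char.IsDigit))
        {
            int sum = 0;
            for (int i = 0; i < 13; i++)
                sum += (isbn[i] - '0') * (i % 2 == 0 ? 1 : 3);
            return sum % 10 == 0;
        }
        return false;
    }

    static void Main()
    {
        Console.WriteLine(NormalizeIsbnPrefix("ISBN:0306406152"));  // -> ISBN 0306406152
        Console.WriteLine(IsValidIsbn("0306406152"));               // -> True (valid ISBN-10)
    }
}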

AnomieBOT already performs the first task, I think. -- Magioladitis (talk) 19:51, 15 May 2012 (UTC)
There are also several tasks associated with User:Femto Bot, including updating the recent-changes lists for WikiProjects, which act as a sort of watchlist that helps the WikiProjects better manage their articles. Kumioko (talk) 19:52, 15 May 2012 (UTC)
Interesting observation: many of the tasks described above are done by AWB's general fixes (dating tags, tag replacement, section name corrections, adding a missing references section, etc.). WP:COSMETICBOT must be revisited. -- Magioladitis (talk) 08:58, 16 May 2012 (UTC)
See discussion at WP:BON#Rich Farmbrough's bots. Anomie 00:58, 16 May 2012 (UTC)

RFC: Deploying 'Start date' template in infoboxes

May I draw your attention to RFC: Deploying 'Start date' template in infoboxes on this page's talk page? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:37, 17 May 2012 (UTC)

Auto updating data of articles

I know I already brought this up in the past, but here we go again. Is there a bot that can read some specific data from a website and automatically update an article with that data? I am thinking of articles such as Rosetta@home, where infobox stats such as Active users, Total users, Active hosts and Total hosts could be auto-updated with info from http://boincstats.com/en/stats/14/project/detail or Wieferich@Home, where the infobox stats could also be auto-updated. I guess there are other types of articles (about things other than DC projects) where this could be useful. This might also be useful, for example, for company articles, to keep things like revenue and profit up to date. -- Toshio Yamaguchi (tlkctb) 17:14, 19 May 2012 (UTC)

update support for ISO 639-3 codes

We have an article List of ISO 639-3 codes with a search button. However, it relies on stubs having been created for each ISO code, and many of these are missing or link to the wrong page.

Could we create the missing stubs, and redirect the outdated/misdirected ones, so that our ISO search interface works properly?

The full list of ISO codes (as of February), with their ISO names, is at Wikipedia:WikiProject Languages/Articles by code, in the piped format [[ISO name|ISO code]].

Request

My request is as follows:

  1. Create an ISO redirect page for all blue links which don't yet have one, using the ISO name in the link on the WProject page above.
  2. Update any existing ISO redirect page to the current ISO name (not to be saved until the following are considered).
  3. Follow any existing redirects from the ISO name, so that we link to the current WP article. (We often use something other than the ISO name.)
  4. Verify that the article being linked to has an {{infobox language}} template, with either |iso3= or |lc(#)= set to the ISO 639-3 code in question
    (where (#) is a one- or two-digit number, such as |lc1=, |lc20=).
  5. If the article does not have the proper ISO code in the infobox, direct the link instead to the 'Articles by code' page above. Add the ISO name, commented out, to the redirect, so that it is easy to review. These will be reviewed manually.
Examples

For example, the very first ISO code in the list at Wikipedia:WikiProject Languages/Articles by code is aaa. This is a blue link, piped to Ghotuo language. The ISO redirect, ISO 639:aaa, also links to Ghotuo language. These match, and the article has an infobox with iso3 set equal to aaa, which also matches. The link is therefore correct, so our ISO search will find the proper article, and nothing needs to be done by the bot.

The second code in the list, aab, has no corresponding ISO 639:aab stub, so one needs to be created. In the code list, aab is piped to Alumu-Tesu language, which redirects to Alumu language, which has an infobox with iso3=aab. That matches, so ISO 639:aab should be directed to Alumu language. (Note the format of the previous ISO redirect, which has {{R from ISO 639|AAA}} in it. All of our ISO redirects have this, and this should be replicated by the bot. If there are any ISO redirects which are missing that note (which triggers a hidden category), they should be fixed by the bot.)

aai is a red link, so the bot would do nothing.

The stub for aaq, ISO 639:aaq, already exists, and links to Abnaki language. The code in the list is piped to Eastern Abnaki language, which redirects to Abnaki language. That article has an info box with two ISO codes. The first is ours: lc1=aaq (ignoring white space), so again everything matches and the bot does nothing.

The stub for abo, ISO 639:abo, also exists, but it links to Tivoid languages. On the code page, the abo link is instead piped to Abon language. Now, Abon language has an info box with iso3=abo, so that's where the ISO stub should direct. The bot should therefore edit ISO 639:abo so that it points to Abon language.

For abs, the ISO redirect and the link on the ISO list both ultimately link to Malay trade and creole languages#Ambonese Malay, though the piped link goes through a redirect at Ambonese Malay. The linked section of that article is the correct destination for both, so the bot does nothing. There are several info boxes on that page, but I don't expect the bot to be sophisticated enough to verify it's linking to the correct section. (If you can do that, or at least verify that they both link to the same section, that would be great, but probably not necessary.)

For adg, the ISO redirect ISO 639:adg links to Aranda language. This has the proper ISO code in the info box. However, the piped link in the list, Andegerebinha language, redirects to a different article, Andegerebinha dialect, which also has the proper ISO code in the info box. This is presumably too complex for the bot to work out, so it would edit the ISO rd stub to link instead to Wikipedia:WikiProject Languages/Articles by code, and add the comment <!--ISO 639-3 code for Andegerebinha language-->.

For zaa, ISO 639:zaa links to Zapotec languages. The code list links to Sierra de Juárez Zapotec language, which also redirects to Zapotec languages. However, the infobox in that article does not have an iso3 or lc# field set equal to zaa, so ISO 639:zaa should be redirected to Wikipedia:WikiProject Languages/Articles by code, with a comment containing the ISO name.

If the ISO redirect exists, but the corresponding article does not, that would be a mismatch, and so it should also be relinked to Wikipedia:WikiProject Languages/Articles by code. If the ISO redirect does not exist, and the language article does but does not have the code in the infobox, then again there is a mismatch; the ISO redirect would be created and linked to Wikipedia:WikiProject Languages/Articles by code. Etc.

This is something that might be repeated every year, as articles are created, merged, split, etc. — kwami (talk) 06:06, 20 May 2012 (UTC)
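A deliberately simplified sketch of the per-code decision described above, reduced to the two main rules (matching infobox code vs. not); the conflicting-targets case in the adg example, the {{R from ISO 639}} tag and the hidden comment are left to the actual implementation.

using System;

static class IsoRedirectCheck
{
    // Decides where an "ISO 639:xxx" redirect should point.
    // listTarget:     the article the code is piped to on the WikiProject list (after following redirects)
    // redirectTarget: where ISO 639:xxx currently points (null if the stub does not exist yet)
    // infoboxHasCode: whether that article's {{Infobox language}} has iso3= or an lc#= equal to the code
    public static string DecideTarget(string code, string listTarget, string redirectTarget, bool infoboxHasCode)
    {
        const string reviewPage = "Wikipedia:WikiProject Languages/Articles by code";

        // No matching infobox code: send the redirect to the review page for manual attention.
        if (!infoboxHasCode)
            return reviewPage;

        // Stub missing or pointing somewhere else: (re)point it at the article from the list.
        if (redirectTarget == null || redirectTarget != listTarget)
            return listTarget;

        // Everything already matches; nothing for the bot to do.
        return redirectTarget;
    }

    static void Main()
    {
        // The "aab" example: no stub yet, list resolves to "Alumu language", infobox matches.
        Console.WriteLine(DecideTarget("aab", "Alumu language", null, true));
        // The "zaa" example: infobox has no matching code, so send it to the review page.
        Console.WriteLine(DecideTarget("zaa", "Zapotec languages", "Zapotec languages", false));
    }
}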

Removal of deprecated template

Can a bot please be tasked to remove {{WikiProject Natural phenols and polyphenols}} from all the talk pages it currently resides on? The associated WikiProject was deleted/userified, so the template should no longer be used. See the discussions at Wikipedia:Miscellany for deletion/Category:WikiProject Natural phenols and polyphenols articles and User_talk:Nihonjoe#MFD (who suggested this bot task). Thank you. ChemNerd (talk) 13:30, 16 May 2012 (UTC)

This only involves about 400 articles, so IMO it could also be done as an AWB task if no one has a problem with that. I would say that several of the articles only have this one banner, so it might be better to replace it with something else rather than empty the page. Kumioko (talk) 13:54, 16 May 2012 (UTC)
If there is no other template on the talk page, {{WikiProject Chemicals}} would be the appropriate one to replace it with. If someone is willing to run this as an AWB task, that is perfectly fine with me (I don't know how to do that myself). ChemNerd (talk) 14:00, 16 May 2012 (UTC)

Given my still-fresh history of bot tasks that have ruffled some feathers, I don't feel it's appropriate for me to do this personally, but here is some simple code for an AWB module that should do most of what you are asking, if someone wants to use it. It may not be 100% correct, so the usual AWB disclaimers apply.

public string ProcessArticle(string ArticleText, string ArticleTitle, int wikiNamespace, out string Summary, out bool Skip)
{
    Skip = false;
    Summary = "Deprecate WPNPP and Replace WPNPP with WPChemistry";

    // Replace any of the WPNPP banner names with {{WikiProject Chemistry}}, keeping the rest of the banner intact
    ArticleText = Regex.Replace(ArticleText,
        @"{{\s*(WikiProject[ _]+Natural[ _]+phenols[ _]+and[ _]+polyphenols|Wikiproject[ _]+NPP|WikiProject[ _]+NPP)\s*([\|}{<\n])",
        "{{WikiProject Chemistry$2", RegexOptions.IgnoreCase);

    return ArticleText;
}

I hope this helps. Kumioko (talk) 15:41, 16 May 2012 (UTC)

I don't have access to AWB, so I'll just leave it to someone else. Thanks for your help. ChemNerd (talk) 18:41, 16 May 2012 (UTC)
I left a comment on the AWB tasks page to see if anyone is interested. A link to that is here. Kumioko (talk) 18:59, 16 May 2012 (UTC)
User:AnomieBOT can probably help with this task if an AWBer doesn't get around to it. I've pinged the bot's talk page about this thread. --Izno (talk) 04:16, 17 May 2012 (UTC)
I think someone might have already done it. I looked at it yesterday and there were only a couple left. Kumioko (talk) 16:50, 21 May 2012 (UTC)
Oh, right. I won't then. :) Rcsprinter (tell me stuff) 18:57, 21 May 2012 (UTC)

Task is now  Done. Thank you to all that helped. ChemNerd (talk) 20:53, 21 May 2012 (UTC)

Femto bot

This WP Hawaii Recent changes page is linked to This Page, which in turn looks like it's being updated daily by User:Femto Bot. This information definitely is not accessible by just clicking "Recent Changes" over in the left-hand Toolbox - all that does is pull up anything and everything linked to the project page.

I think this is a valuable page, and want to know what it would take to get a Recent Changes page set up for two WikiProjects: Texas and Women's History. I believe both projects could benefit from being able to access this information. Maile66 (talk) 19:13, 21 May 2012 (UTC)

For what it's worth, Femto Bot also updates WikiProject United States and several of its other supported projects. This was one of Rich F's bots that was blocked and shut down due to his arbitration case. It's just one more example of how the ArbCom decision is adversely affecting Wikipedia. Kumioko (talk) 19:31, 21 May 2012 (UTC)
Yeah, I know it's Rich's. But I'd still like to know if there's a way some other sysop can create similar pages for Texas and Women's History. Maile66 (talk) 19:36, 21 May 2012 (UTC)
Oh sorry, I wasn't sure if you knew. Kumioko (talk) 19:43, 21 May 2012 (UTC)
No biggie. I just figure that if Rich is out of the picture, then anything productive he was doing needs to be taken on by others. This would be one of them. Women's History is a little new on the scene, and could maybe use help with these tools. They list "recent changes", but it isn't really that - all that link shows is talk page postings. And if Texas has anything at all like Hawaii's Recent Changes, I can't find it. Maile66 (talk) 19:57, 21 May 2012 (UTC)

Removing flag templates from infoboxes

According to MOS:FLAGBIO, flags should never be used to indicate a person's place of birth, residence, or death, as flags imply citizenship and/or nationality. Is it possible for a bot to remove {{Flag}}, {{Flagcountry}}, and {{Flagicon}} automatically from the {{Infobox person}} template? -- Luke (Talk) 23:02, 14 May 2012 (UTC)

AnomieBOT already does this for {{Infobox company}}, {{Infobox dam}}, {{Infobox power station}}, and {{Infobox World Heritage Site}}. Show me a discussion showing strong consensus for doing this for {{Infobox person}} specifically (with input solicited from WP:WPBIO and any other affected projects), and it can easily be added. Anomie 02:39, 15 May 2012 (UTC)
I'll ask for language infoboxes as well. Flags are fine for where the language is official, just not IMO for everywhere it's spoken. — kwami (talk) 05:58, 15 May 2012 (UTC)
Note that bots aren't too good at context. AnomieBOT can handle something like "replace anywhere except in |official language=", but it can't read the article to find out whether the language is official. At any rate, if you can show me consensus for a specific list of infoboxes, I should be able to add it. Anomie 11:28, 15 May 2012 (UTC)
I understand. It would simply be removing them from the infobox unless they were in the fields for official languages. I've posted on the WP page, but so far no response. If no-one objects, does that count as consensus? — kwami (talk) 04:13, 17 May 2012 (UTC)
Flags should not be used for language. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:28, 17 May 2012 (UTC)
Support. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:28, 17 May 2012 (UTC)
Looks like Andy is our only response from WPLANG. I updated the documentation to advise against flag icons on April 1,[3] and there have been no objections. I say go for it: Remove any flag icons from the fields "state"/"states" and "region". They are acceptable under "nation" and "minority". If they appear anywhere else, perhaps they could be flagged for human attention? — kwami (talk) 20:29, 22 May 2012 (UTC)
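For illustration, an AWB-style sketch of that rule for {{Infobox language}}, assuming one infobox parameter per line; only the state/states/region lines are touched, nation= and minority= are left alone, and anything nested or unusual should go to a human.

public string ProcessArticle(string ArticleText, string ArticleTitle, int wikiNamespace, out string Summary, out bool Skip)
{
    Skip = false;
    Summary = "Remove flag icons from language infobox state/states/region fields";
    string original = ArticleText;

    ArticleText = Regex.Replace(ArticleText,
        @"(?m)^(\s*\|\s*(?:states?|region)\s*=.*)$",
        m =>
        {
            string line = m.Groups[1].Value;
            // {{flagicon|X}} shows only the icon, so drop it entirely.
            line = Regex.Replace(line, @"\{\{\s*flagicon\s*\|[^{}]*\}\}\s*", "", RegexOptions.IgnoreCase);
            // {{flag|X}} / {{flagcountry|X}} show the name, so keep a plain link to X.
            line = Regex.Replace(line, @"\{\{\s*flag(?:country)?\s*\|\s*([^{}|]+?)\s*(?:\|[^{}]*)?\}\}", "[[$1]]", RegexOptions.IgnoreCase);
            return line;
        });

    Skip = (ArticleText == original);
    return ArticleText;
}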

Bot for Template:Cleanup

Can a bot be programmed to help with {{cleanup}}? It needs to leave a message for anybody who has just tagged an article with this template without filling in the |reason= parameter. Something similar to what DPL bot (talk · contribs) does with DABs. The message is:

"You recently added a cleanup tag to Article name without specifying a reason. To help other editors identify the problems please add a reason to the tag using the |reason= parameter or replace it with a more specific tag. Cleanup tags without reasons will be automatically removed after a week."

If a message has been left and the {{cleanup}} template still has no reason supplied a week later, the bot can then remove the template. Only templates added after the notification system has been implemented should be removed.

The relevant discussions and RfCs can be found at Template talk:Cleanup. AIRcorn (talk) 09:54, 6 May 2012 (UTC)
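A minimal sketch of the detection step, checking whether a {{cleanup}} tag has a non-empty |reason=; it assumes the tag's parameters contain no nested templates, and the notification and later removal are not shown.

using System;
using System.Text.RegularExpressions;

static class CleanupTagCheck
{
    // True if the wikitext contains a {{cleanup}} tag with no non-empty |reason= parameter.
    // Such an edit would trigger the talk-page notice quoted above, and removal after a week.
    public static bool HasUnexplainedCleanupTag(string wikitext)
    {
        foreach (Match m in Regex.Matches(wikitext, @"\{\{\s*[Cc]leanup\s*(\|[^{}]*)?\}\}"))
        {
            string parameters = m.Groups[1].Value;
            // A reason counts only if something follows the equals sign.
            if (!Regex.IsMatch(parameters, @"\|\s*reason\s*=\s*[^|}\s]"))
                return true;
        }
        return false;
    }

    static void Main()
    {
        Console.WriteLine(HasUnexplainedCleanupTag("{{cleanup|date=May 2012}}"));                      // True
        Console.WriteLine(HasUnexplainedCleanupTag("{{cleanup|reason=poor sourcing|date=May 2012}}")); // False
    }
}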

If this can't be done for some reason (even the notification part would be useful), is there a way (AWB, toolserver) to semi-automatically notify people who placed the tag, from Category:Cleanup tagged articles without a reason field from May 2012 and any subsequent monthly category? It was a lot of work to get to this stage and I would rather not let it die here. AIRcorn (talk) 00:58, 16 May 2012 (UTC)

Retrieved from archive. — Martin (MSGJ · talk)

Fixing Protection Templates

Not sure if this already exists, but could a bot be made that checks Category:Wikipedia pages with incorrect protection templates and then tries to fix them? Not sure exactly how many of the necessary fixes could be automated, but here's a list of possible tasks in order of how easy I'm guessing they would be to implement (a rough sketch of the first check follows this comment):

  1. If any of the pages in that category are unprotected, simply remove the protection templates
  2. If the protection level matches the template, check that the date format in the template's expiry parameter is in 24 May 2012 format, and that it matches the actual expiry date of the protection
  3. Try to figure out if the protection template is being transcluded onto the page (I'm guessing that this would not even be feasible, but I'm putting it here in order to justify the use of a numbered list, which I already started.)

— Preceding unsigned comment added by Millermk90 (talkcontribs)
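A rough sketch of the check behind task 1, asking the MediaWiki API whether a page is actually protected; if it is not, the {{pp-*}} template on it could simply be removed. The example page title is hypothetical, and the template removal itself is left out.

using System;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

class ProtectionCheck
{
    // Task 1 above: if a page carrying a protection template is not actually protected,
    // the template should simply be removed. This asks the API whether any protection is in place.
    static async Task<bool> IsProtectedAsync(string title)
    {
        string url = "https://en.wikipedia.org/w/api.php?action=query&prop=info&inprop=protection&format=json&titles="
                     + Uri.EscapeDataString(title);
        using var client = new HttpClient();
        using var doc = JsonDocument.Parse(await client.GetStringAsync(url));
        foreach (var page in doc.RootElement.GetProperty("query").GetProperty("pages").EnumerateObject())
        {
            if (page.Value.TryGetProperty("protection", out var protection) && protection.GetArrayLength() > 0)
                return true;
        }
        return false;
    }

    static async Task Main()
    {
        // Hypothetical example page from the tracking category.
        if (!await IsProtectedAsync("Example"))
            Console.WriteLine("Not protected: the {{pp-*}} template can be removed.");
    }
}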

I believe there are bots that do some of these already, specifically User:DumbBOT and User:Lowercase sigmabot. — Martin (MSGJ · talk) 09:42, 24 May 2012 (UTC)

Bot to alert WikiProjects about image issues

I'm requesting a bot to warn WikiProjects about images that are at risk for deletion. Many images, particularly older non-free images, get deleted for things like inadequate fair use rationale, and nobody notices until they're deleted. User:CommonsNotificationBot currently places notifications on article talk pages, but these don't always get viewed in the week before deletion. Ideally, this bot would maintain a page at each WikiProject like Wikipedia:WikiProject Example/File alerts (similar to many article deletion pages currently maintained). This bot would update the page when a new image is at risk for deletion, and automatically remove the images, or mark them in some way, when the image is either deleted or no longer at risk. This would not need to run more than once per day (I don't think it could or probably should keep up with CSD). The WikiProjects to notify would be those on the article's talk page (if the image itself isn't project-tagged). ▫ JohnnyMrNinja 11:49, 24 May 2012 (UTC)

This capability already exists through ArticleAlertBot. The catch, though, is that the WikiProject must be signed up with the bot and the file must be tagged as belonging to the WikiProject. If these two things are done, it does a pretty good job of letting the project know if a file has been submitted for deletion. Kumioko (talk) 15:01, 24 May 2012 (UTC)
I'm not quite sure what you mean. Are you saying that I should request the bot operator to do this (through the user's talk page)? Or are you saying that this already happens with tagged images? Because the images that inspired this request were tagged with WikiProjects, but I don't believe anyone was alerted. ▫ JohnnyMrNinja 21:49, 24 May 2012 (UTC)
A WikiProject has to subscribe to get the Article Alerts, check to see if the projects on the files are getting alerts. Keith D (talk) 22:29, 24 May 2012 (UTC)
Well, for example File:Spooky the Tuff Little Ghost.jpg was labeled as "no fair use", and was set to be deleted Friday. Wikipedia:WikiProject Comics/Article alerts does not list the file, nor does Wikipedia:WikiProject Comics/Article alerts/Archive. ▫ JohnnyMrNinja 22:41, 24 May 2012 (UTC)
If you have Twinkle, there are a set of tags for files under DI which is delayed speedy deletion. These are the deletions that those interested are least likely to see. ▫ JohnnyMrNinja 22:45, 24 May 2012 (UTC)
Just had a look at the bot info, and it only alerts on Files for deletion entries; there was discussion in 2010 about extending the scope of alerts to files, but it looks like nothing came of it. Keith D (talk) 22:57, 24 May 2012 (UTC)

Correcting category keys

All articles in the categories Western Football League seasons and Southern Football League seasons need their category keys tweaked so they list correctly. They currently have (for example):

[[Category:Southern Football League seasons]]
[[Category:1928–29 domestic association football leagues]]
[[Category:1928 in England]]
[[Category:1929 in England]]

and what's needed is:

{{DEFAULTSORT:Southern Football League, 1928–29}}
[[Category:Southern Football League seasons|1928–29]]
[[Category:1928 in England]]
[[Category:1929 in England]]

Could a bot do this? There are a couple of hundred of them - rather too many to easily do by hand. Colonies Chris (talk) 09:19, 25 May 2012 (UTC)
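For illustration, an AWB-style sketch of the tweak, assuming article titles of the form "1928–29 Southern Football League" (season first, league name after); whether the "domestic association football leagues" category should also be dropped, as in the wanted example, is left to the operator.

public string ProcessArticle(string ArticleText, string ArticleTitle, int wikiNamespace, out string Summary, out bool Skip)
{
    Skip = false;
    Summary = "Add DEFAULTSORT and season sort key to league season article";
    string original = ArticleText;

    // Pull the season and league name out of the article title.
    Match m = Regex.Match(ArticleTitle, @"^(\d{4}–\d{2})\s+(.+)$");
    if (!m.Success) { Skip = true; return ArticleText; }
    string season = m.Groups[1].Value;
    string league = m.Groups[2].Value;

    // Add a {{DEFAULTSORT}} just above the first category if the article doesn't have one yet.
    if (!ArticleText.Contains("{{DEFAULTSORT"))
        ArticleText = new Regex(@"(?m)^\[\[Category:").Replace(ArticleText,
            "{{DEFAULTSORT:" + league + ", " + season + "}}\n[[Category:", 1);

    // Give the "... seasons" category the season as its sort key.
    ArticleText = Regex.Replace(ArticleText,
        @"\[\[Category:(" + Regex.Escape(league) + @" seasons)\]\]",
        "[[Category:$1|" + season + "]]");

    Skip = (ArticleText == original);
    return ArticleText;
}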

 Doing... with AWB. Rcsprinter (shout) 15:14, 27 May 2012 (UTC)
 Done Rcsprinter (speak) 20:28, 27 May 2012 (UTC)

Count of Requested articles

Is there a bot available that could take on the ongoing task of counting the number of requests on the various Wikipedia:Requested articles pages? Counting the number of * characters on each page should be enough to get a semi-accurate number. A bonus would be if the bot could also do a count per each ==Section==. Thanks -- Eclipsed (talk) (COI Declaration) 12:51, 25 May 2012 (UTC)

There is probably a piece of software or a magic word that can do this; I don't think it is really a bot task. Rcsprinter (message) 15:15, 27 May 2012 (UTC)
There's the {{PAGESIZE:page}} magic word that lists the size of pages in bytes, and that is useful (but not a direct count of requests). See: Wikipedia:WikiProject Requested articles/Page sizes
If there were some type of {{character count |page= |character= }} thing, that'd be great. But I haven't found anything like that in my search so far. Thanks. -- Eclipsed (talk) (COI Declaration) 15:32, 27 May 2012 (UTC)
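A minimal sketch of the counting itself, assuming the page wikitext has already been fetched: it counts lines beginning with "*", both overall (via the per-section totals) and per ==Section==, which is the rough request count asked for above.

using System;
using System.Collections.Generic;

static class RequestCounter
{
    // Counts lines beginning with "*" in a page's wikitext, grouped by ==Section==.
    public static Dictionary<string, int> CountPerSection(string wikitext)
    {
        var counts = new Dictionary<string, int>();
        string section = "(lead)";
        foreach (string rawLine in wikitext.Split('\n'))
        {
            string line = rawLine.Trim();
            if (line.StartsWith("==") && line.EndsWith("=="))
                section = line.Trim('=', ' ');
            else if (line.StartsWith("*"))
            {
                counts.TryGetValue(section, out int n);
                counts[section] = n + 1;
            }
        }
        return counts;
    }

    static void Main()
    {
        var counts = CountPerSection("== Artists ==\n* [[Foo]]\n* [[Bar]]\n== Bands ==\n* [[Baz]]\n");
        foreach (var kv in counts)
            Console.WriteLine($"{kv.Key}: {kv.Value}");   // Artists: 2, Bands: 1
    }
}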

Years

A lot of year-based topic categories have been moved on Wikimedia Commons. The category "Years in sports" has become "Sports by year". Please use a bot to move them all. J 1982 (talk) 09:59, 26 May 2012 (UTC)

If I'm understanding you correctly, this is not Wikimedia Commons, but rather, Wikipedia (you said "Wikipedia Commons").  Hazard-SJ  ✈  03:43, 27 May 2012 (UTC)
Yes, it's Wikimedia Commons I mean. J 1982 (talk) 11:25, 27 May 2012 (UTC)

Signature Adder

I am not sure about SineBot (I have seen posts that have no signature), but can anyone please make a bot that signs all unsigned posts? Thank you. — Preceding unsigned comment added by 24.87.148.68 (talk) 02:55, 27 May 2012 (UTC)

Guess what? SineBot signed your message above. The bot supports various ways of opting out, which is why it ends up not signing some posts.  Hazard-SJ  ✈  03:46, 27 May 2012 (UTC)

Wikipedia:Version 1.0 Editorial Team/Olympics articles by quality statistics

I would like to see this bot available for other wikis as well. I would particularly be interested to see this implemented for the Bengali wiki. So the basic requirement would be to have another flag to denote the language of the Wikipedia in which it would search for the data. http://toolserver.org/~enwp10/bin/list2.fcgi?run=yes&projecta=Olympics&importance=Top-Class&quality=FA-Class The URL above would take another parameter, maybe. Please help. -Pasaban (talk) 18:19, 20 May 2012 (UTC)

If you talk to the bot's developer, User:CBM, on his talk page, he might be able to help. Kumioko (talk) 21:39, 20 May 2012 (UTC)
I don't always see things on this page so asking me about WP 1.0 things is usually good. Regarding the request, the WP 1.0 bot is seriously understaffed, and I have been looking for a second maintainer for some time. I am able to maintain the current functionality for the time being but adding more functionality is not possible. At the same time, the source code is published, and someone else could in principle run the bot for another wiki, although it would take some coding to modify it to do that. Sorry about that, — Carl (CBM · talk) 01:57, 21 May 2012 (UTC)
Hi Carl, the thing is I am from a .NET background and your Perl scripts are tough for me to decipher, especially as I am still pretty new to the Wikipedia framework. I would be happy to fill the gap for that project but need a little hand-holding. First of all, please guide me on where I can test-run the bot. And is there any documentation for the code? I see some inline comments, but I would rather not touch the code itself as it is heavily used on the English wiki. If you could guide me in adapting it for the Bengali wiki without breaking things, that would be of great help. - Pasaban (talk) 05:28, 28 May 2012 (UTC)

Create a bunch of redirects for Indiana townships

Every Ohio township article is entitled "___ Township, ___ County, Ohio", but each one has either a corresponding "___ Township, Ohio" redirect (when the township name is unique) or a "___ Township, Ohio" disambiguation page when there are multiple townships with the same name. This is only partially true for Indiana townships — all of the disambiguation pages have been created, but "___ Township, Indiana" redirects to uniquely named townships are nonexistent. I'd like it if someone could write a bot to go through List of Indiana townships (which has a "___ Township, ___ County, Indiana" link for each township) and create each "___ Township, Indiana" as a redirect if it doesn't already exist. Since there might be an ambiguous name or two that hasn't been created, it would be good for the bot to log all its page creations so that I could go through and repair errors of this sort. The code for the redirects could simply be #REDIRECT [[pagename]]; no categorisation of redirects is needed. Nyttend (talk) 01:37, 23 May 2012 (UTC)
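A minimal sketch of the planning step, grouping the full "___ Township, ___ County, Indiana" titles by short name so that only unambiguous names get a redirect and the rest are logged for review; the sample titles are illustrative, and the page-creation step is not shown.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.RegularExpressions;

static class TownshipRedirects
{
    // Works out which short "X Township, Indiana" names are unambiguous; only those
    // would get a #REDIRECT page, the rest are logged for manual disambiguation.
    public static void Plan(IEnumerable<string> fullTitles)
    {
        var groups = fullTitles
            .Select(t => new { Full = t, Match = Regex.Match(t, @"^(.+? Township), .+? County, Indiana$") })
            .Where(x => x.Match.Success)
            .GroupBy(x => x.Match.Groups[1].Value + ", Indiana");

        foreach (var g in groups)
        {
            if (g.Count() == 1)
                Console.WriteLine($"Create [[{g.Key}]] as: #REDIRECT [[{g.First().Full}]]");
            else
                Console.WriteLine($"Ambiguous, log for manual review: {g.Key}");
        }
    }

    static void Main()
    {
        // Hypothetical sample titles; the real list comes from List of Indiana townships.
        Plan(new[]
        {
            "Adams Township, Carroll County, Indiana",
            "Washington Township, Marion County, Indiana",
            "Washington Township, Morgan County, Indiana",
        });
    }
}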

Assuming I understand the request, there would need to be some disambiguation for quite a lot of these. I've just sorted by article here; I haven't yet checked for any redirects. If no one else takes this up, I'll try to get it done over the next few days. Avicennasis @ 09:16, 5 Sivan 5772 / 09:16, 26 May 2012 (UTC)
I've created any missing redirects from the unique set, now just need to get the disambig pages created and in order. Avicennasis @ 11:07, 6 Sivan 5772 / 11:07, 27 May 2012 (UTC)
Looks great! I've gone through all of the ambiguous names, checked for the existence of "___ Township, Indiana" disambiguation pages, and created the ones that didn't already exist. Nyttend (talk) 15:03, 27 May 2012 (UTC)
Cool beans. I think we're  Done here, then, right? Avicennasis @ 06:00, 7 Sivan 5772 / 06:00, 28 May 2012 (UTC)
Yes; thanks a lot. Nyttend (talk) 11:25, 28 May 2012 (UTC)

User sandboxes in content categories

It seems there is an increase in the number of user-namespace sandbox pages appearing in content categories, probably related to the new sandbox link. I would like to see a bot monitor all user sandbox pages that are created or modified and strip out all the categories and interwiki links. Any stub templates should be removed as well. -- Alan Liefting (talk - contribs) 02:16, 28 May 2012 (UTC)

BattyBot is approved to do the first, but not the latter two. GoingBatty (talk) 02:56, 28 May 2012 (UTC)
Content cats are more important of course, but it would be good to get approval for project categories as well; they get cluttered up with unneeded user-namespace stuff. Why is it that I am finding some cases of user sandboxes in content cats? Does BattyBot go batty and not do her job properly? -- Alan Liefting (talk - contribs) 03:22, 28 May 2012 (UTC)
Commonly what is needed is not for pages to be removed from the category but to change [[Category:Category name|sort key]] to [[:Category:Category name|sort key]]. This is particularly important for drafts where new users have put in the time and effort to find appropriate cats. Even better, the bot could also post an explanation to the user's talk page. Stuartyeates (talk) 03:34, 28 May 2012 (UTC)
Agree on all that. -- Alan Liefting (talk - contribs) 03:49, 28 May 2012 (UTC)
Yes, Alan, BattyBot doesn't deal well with
<!--
header
several external links
nav templates
DEFAULTSORT

[[Category:Foo]]
[[Category:Bar]]
[[Category:Baz]]

language links
-->
so I would have to run it manually by category, and I haven't done so in a while. GoingBatty (talk) 14:58, 28 May 2012 (UTC)
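A minimal sketch of the [[:Category:]] trick Stuartyeates describes, while leaving categories that are already inside HTML comments, like the example just above, untouched; interwiki links and stub templates would need separate handling.

using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

static class SandboxCategoryFix
{
    // Turns live [[Category:...]] links in a user sandbox into harmless [[:Category:...]] links,
    // so drafts keep their chosen categories without actually populating them.
    public static string DisableCategories(string wikitext)
    {
        // Set aside HTML comments first: categories inside <!-- ... --> are already inert.
        var comments = new List<string>();
        wikitext = Regex.Replace(wikitext, @"<!--.*?-->", m =>
        {
            comments.Add(m.Value);
            return "\u0001" + (comments.Count - 1) + "\u0001";
        }, RegexOptions.Singleline);

        // Prefix a colon onto any remaining live category link (already-escaped ones don't match).
        wikitext = Regex.Replace(wikitext, @"\[\[\s*Category\s*:", "[[:Category:", RegexOptions.IgnoreCase);

        // Put the comments back.
        wikitext = Regex.Replace(wikitext, "\u0001(\\d+)\u0001", m => comments[int.Parse(m.Groups[1].Value)]);
        return wikitext;
    }

    static void Main()
    {
        Console.WriteLine(DisableCategories("Draft text\n[[Category:Living people]]\n<!-- [[Category:Foo]] -->"));
    }
}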

WikiProject Hungary – Sports and games task force

Hello!

I'd request a bot to tag articles that are within the scope of the recently created Wikipedia:WikiProject Hungary/Sports and games task force with {{WikiProject Hungary|class=|importance=|sports=yes}}. I don't know how it goes exactly, but from what I managed to figure out I should give a category whose articles would get the banner. If so, this should be Category:Sport in Hungary and all of its sub-categories. I don't know if it's possible, but if so, it would also be great to auto-assess these articles according to the ratings of other banners. Since this is my very first time, I may well have misunderstood something, so please be forbearing. :) Thanks for your answer and help! — Thehoboclown (talk) 15:18, 3 June 2012 (UTC)

Just a recommendation, but you might want to limit it a little more than Category:Sport in Hungary and its subcats. Just glancing through, there are going to be some collateral articles on people outside the project's scope otherwise. For example, there are some sports-competition-related categories that include people from other countries, like the US and South Africa, and I'm not sure if you mean to include those. Kumioko (talk) 15:28, 3 June 2012 (UTC)
I'd simply suggest you take it to one of these bots who already do this and just tell them or their operator your exact specs. Rcsprinter (converse) 15:50, 3 June 2012 (UTC)
Thanks for the answers! I had also quickly checked the categories and their content before naming them for tagging here, but found nothing unrelated. If there are some, I think it is still better to include a few possibly non-related articles than to exclude a whole category. Sooner or later the misplaced banners will be removed by someone if they really don't belong there. Now I'm right on my way to find a friendly tagging bot.
Thanks again for your helpful words, Thehoboclown (talk) 16:29, 3 June 2012 (UTC)
Your best bet would be either AnomieBOT or Hazard-Bot; most of the rest of those are not presently working for a variety of reasons. By the way, according to that category there are about 4705 articles relating to sports in Hungary, of which 3510 aren't already tagged for Hungary at all. You should probably mention that to the bot op so they know how many articles are affected. This will also help them write the code; the code changes a little depending on whether you are adding a whole project tag or just a parameter. Good luck. Kumioko (talk) 17:24, 3 June 2012 (UTC)

Additional CFD bot(s) needed

The regular bot that handles Wikipedia:Categories for discussion/Working has had several stalls in the last week. At least one additional bot that can process category tasks is needed to ensure full coverage continues. Thanks in advance. Timrollpickering (talk) 02:27, 27 May 2012 (UTC)

I can have AvicBot pick up part of this task - it can move categories - but as AvicBot is not an AdminBot, it cannot delete the empty category pages once it's done. Avicennasis @ 03:26, 6 Sivan 5772 / 03:26, 27 May 2012 (UTC)
I'd love to have Hazard-Bot help with cat renaming/removal as well.  Hazard-SJ  ✈  03:45, 27 May 2012 (UTC)
BRFA filed here. The more the merrier, right? Avicennasis @ 08:58, 6 Sivan 5772 / 08:58, 27 May 2012 (UTC)

Cydebot is operating again. I agree that additional bots would be helpful, but I am curious about something: What steps need to be taken to prevent the bots from competing (i.e., attempting the same task at the same time)? Thank you, -- Black Falcon (talk) 21:35, 27 May 2012 (UTC)

With good programming, nothing. For example, if both bots are set to only replace [[Category:foo]] with [[Category:bar]], and skip the page if [[Category:foo]] is not found, then it doesn't matter if both bots work on the same task - it'll merely skip the page if the other bot got to it first. No conflicts should arise, and if anything, the task will get done sooner than normal. Avicennasis @ 05:11, 7 Sivan 5772 / 05:11, 28 May 2012 (UTC)
I have possibly the same code Cydebot uses, and so could do more than what I mentioned above. As for conflicts, what would Cydebot do if a human attempted to do its task? Same as what the bot would do, I expect? However, my bot would have to tag for deletion, I believe, instead of actually deleting. I'll have to go through the script.  Hazard-SJ  ✈  01:40, 29 May 2012 (UTC)
Redundant I'm willing to help with this, but as Avicennasis already made a BRFA for it and Cydebot is alive, I'll abstain. I doubt BAG would approve two exceptionally similar tasks like this at once.  Hazard-SJ  ✈  02:07, 29 May 2012 (UTC)

Note: It is often (not always) better to replace the old category with a {{Category redirect}}. Rich Farmbrough, 15:40, 5 June 2012 (UTC).

Moving coordinates on the Welsh Wikipedia (cy)

Can someone help to put a pin on the map on all 700 Welsh peaks, please? The articles have been created without a map.

There has been some discussion on this - and a solution to the problem here.

All I now need is a BOT to upload the instructions to copy the geotags. Many thanks. Llywelyn2000 (talk) 07:52, 27 May 2012 (UTC)

What's the problem with this one: is it because it's not on the English Wiki or that it's too complicated? Llywelyn2000 (talk) 08:43, 3 June 2012 (UTC)
Or that it's not interesting to anyone who has time to do it, or some combination of all of these. Anomie 12:04, 3 June 2012 (UTC)
Another problem is that Wikipedia is banishing bot programmers faster than new ones can be created/recruited, and it's a rather niche skill. Kumioko (talk) 15:45, 3 June 2012 (UTC)
Thanks for responding! Bots were created to upload many of the foundation articles on en and other languages. In certain circumstances, such as ours, where we only have 36K articles (and a handful of men at arms), we need their help! Sorry: your help! It would be very laborious to geotag these articles manually - more than 3,000 - yet having a map to show each location is crucial. We have not one single bot on Welsh Wikipedia, and I see many less useful bots being applied to en. Take a look at some of the above! Take up the challenge in the face of a ban! Or do we stoop so low as to accept a monolingual, grey, grey Coko-kola kultur thrown at us? Llywelyn2000 (talk) 06:03, 4 June 2012 (UTC)
I believe I am permitted to do this little task. Please confirm that the appropriate cy:wikipedia approvals are in place. Rich Farmbrough, 01:18, 5 June 2012 (UTC).
A bot that says YES! Thanks. Which bot needs approving? Llywelyn2000 (talk) 06:33, 5 June 2012 (UTC)
Helpful Pixie Bot is the article editing bot by and large. Rich Farmbrough, 15:20, 5 June 2012 (UTC).

Redirects for file metadata links

MediaWiki has a neat feature built into file pages that automatically links to certain pages in the metadata. For instance, if you go to File:Cervus elaphus Luc Viatour 3.jpg here on WP you will see the links to NIKON CORPORATION and NIKON D3S (common links, both blue) and Bibble Pro 5 5.2.2 (currently red). On Commons, all of these links are blue, because they are interwiki links, so finding that the page is not present is unexpected. I'm hoping for a bot to do two things: 1) compile a list of all "wanted pages" linked to from metadata, so that appropriate redirects can be created; 2) maintain redirects with a template like {{R for file meta-data}} (similar to {{R from move}} and the like), so that these redirects aren't deleted as unneeded (as Bibble Pro was). ▫ JohnnyMrNinja 01:54, 1 June 2012 (UTC)

Is there any way to easily get a list of all of these? Anomie 08:33, 2 June 2012 (UTC)
The fields are filled out by the camera and/or photo editing software, so they can be any model of hardware or any version of software; there is no standard list. ▫ JohnnyMrNinja 09:42, 2 June 2012 (UTC)
The obvious approach is to list all the files in use on en:wp and parse their pages' HTML; this is easy but slow. I have no idea how the EXIF module works; clearly, if it stores the EXIF data in the database, things become relatively trivial. Rich Farmbrough, 17:07, 5 June 2012 (UTC).
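An alternative to HTML parsing would be to read the EXIF fields through the API's imageinfo metadata and then check whether the corresponding pages exist. A rough sketch of that idea; whether Make, Model and Software are exactly the fields MediaWiki links is an assumption worth checking, and rate limiting and batching are omitted.

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

class MetadataLinkCheck
{
    static readonly HttpClient Client = new HttpClient();

    // Pulls the camera/software EXIF fields that the file page turns into links.
    static async Task<List<string>> GetLinkedMetadataValuesAsync(string fileTitle)
    {
        string url = "https://en.wikipedia.org/w/api.php?action=query&prop=imageinfo&iiprop=metadata&format=json&titles="
                     + Uri.EscapeDataString(fileTitle);
        using var doc = JsonDocument.Parse(await Client.GetStringAsync(url));
        var values = new List<string>();
        foreach (var page in doc.RootElement.GetProperty("query").GetProperty("pages").EnumerateObject())
        {
            if (!page.Value.TryGetProperty("imageinfo", out var infos)) continue;
            foreach (var info in infos.EnumerateArray())
            {
                if (!info.TryGetProperty("metadata", out var metadata)) continue;
                foreach (var item in metadata.EnumerateArray())
                {
                    string name = item.GetProperty("name").GetString();
                    // Make, Model and Software are the fields linked in the Nikon example above.
                    if ((name == "Make" || name == "Model" || name == "Software")
                        && item.GetProperty("value").ValueKind == JsonValueKind.String)
                        values.Add(item.GetProperty("value").GetString());
                }
            }
        }
        return values;
    }

    // True if the wiki page with this title is missing, i.e. the metadata link would be red.
    static async Task<bool> PageMissingAsync(string title)
    {
        string url = "https://en.wikipedia.org/w/api.php?action=query&format=json&titles="
                     + Uri.EscapeDataString(title);
        using var doc = JsonDocument.Parse(await Client.GetStringAsync(url));
        foreach (var page in doc.RootElement.GetProperty("query").GetProperty("pages").EnumerateObject())
            if (page.Value.TryGetProperty("missing", out _)) return true;
        return false;
    }

    static async Task Main()
    {
        foreach (string value in await GetLinkedMetadataValuesAsync("File:Cervus elaphus Luc Viatour 3.jpg"))
            if (await PageMissingAsync(value))
                Console.WriteLine("Wanted page from metadata: " + value);
    }
}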

Infobox musical artist deprecated parameters

I recently noticed that Template:Infobox_musical_artist has a category of associated articles with deprecated parameters that currently numbers nearly 4,000. The parameters that need fixing are "Born =" (replaced by Template:Birth date and age for the living and Template:Birth date for the deceased) and "Died =" (replaced by Template:Death date and age). I have begun plugging away at them but I am slowed by my obsession with actually reading the articles and making whatever improvements I can. Is the template cleanup described here something that a bot could handle? - UnbelievableError (talk) 03:00, 1 June 2012 (UTC)

A feature request was made for AWB last year on the same subject - the stumbling block at the time was that Born and Died include the date and place in the same field, and if they don't use templates the bot has no way of separating them. There are also many different ways to separate the information, which any bot must account for as well. See below for some of the different uses I've seen, which also apply to Died and can be with or without date templates. (A rough parsing sketch for the first form follows this comment.)
|Born = DATE, PLACE
|Born = DATE<br>PLACE
|Born = DATE in PLACE
|Born = DATE
PLACE

(I vaguely remember that User:Yobot used to work on this sometimes, but it's currently blocked.) 1ForTheMoney (talk) 13:48, 1 June 2012 (UTC)
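A minimal sketch of the separation step for the simplest of those layouts; anything that doesn't parse cleanly should be left for a human, and whether the result goes into {{Birth date and age}} or {{Birth date}} still depends on whether the person is living.

using System;
using System.Text.RegularExpressions;

static class BornFieldParser
{
    // Tries to split a "|Born = DATE, PLACE" value into its parts. The other layouts
    // (<br>, " in ", newline) would need patterns of their own.
    public static bool TrySplit(string bornValue, out string date, out string place)
    {
        Match m = Regex.Match(bornValue.Trim(),
            @"^(\d{1,2} \w+ \d{4}|\w+ \d{1,2}, \d{4})\s*(?:,|<br\s*/?>| in )\s*(.+)$");
        date = m.Success ? m.Groups[1].Value : null;
        place = m.Success ? m.Groups[2].Value : null;
        return m.Success;
    }

    static void Main()
    {
        if (TrySplit("9 October 1940, Liverpool, England", out string date, out string place))
        {
            // For a living artist this would become {{Birth date and age|...}}; {{Birth date}} for the deceased.
            Console.WriteLine($"date = {date}");    // 9 October 1940
            Console.WriteLine($"place = {place}");  // Liverpool, England
        }
    }
}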

Yep, this is pretty trivial, but only Mag and I have been interested in doing it. I have developed a slight aversion to that task, since no-one has sold me on the benefits of meta-data, though I would consent to clear up the relatively small category shown here were I allowed. Rich Farmbrough, 01:05, 5 June 2012 (UTC).
You may not have to bother; User:Avicennasis started work on this the same day this request came up. As of now, we're at ~2,900 articles and dropping. 1ForTheMoney (talk) 08:44, 5 June 2012 (UTC)
 Doing... Slowly but surely. Avicennasis @ 09:56, 15 Sivan 5772 / 09:56, 5 June 2012 (UTC)
Excellent! Rich Farmbrough, 16:45, 5 June 2012 (UTC).

Redirects that have Interwiki!

Hi, I listed redirect pages that have interwikis here. I got the list with this query:

SELECT /*SLOW OK */ page_title
FROM page
JOIN langlinks ON page_id = ll_from
WHERE page_namespace = 0
  AND page_is_redirect = 1
GROUP BY page_title
ORDER BY COUNT(ll_from) DESC;

Please remove these interwikis by bot. Reza1615 (talk) 19:28, 4 June 2012 (UTC)

Please don't do this blindly. If the best match here for another article is our redirect, then the interwiki should be on the redirect, not on the target. For example, suppose that author X wrote book Y. If our wiki redirects X to Y, but some other wiki has separate articles for X and Y, then we should have one interwiki on our redirect X and one on our article Y, not two on Y and none on X.
Examples of this from the list include:
  • 4278 Harvey is a redirect here but an independent article on dewiki. The langlink here should be on the redirect page.
  • We redirect cryptology to cryptography but dewiki has separate articles on them. Again, our langlink should be on the redirect page.
— Carl (CBM · talk) 19:37, 4 June 2012 (UTC)
The major issues are:
1 - we read articles on en.wiki, not on other wikis;
2 - this kind of interwiki confuses readers, because someone reads article A on de.wiki and is guided to B on en.wiki, but on en.wiki the article is linked to C!
We have many issues such as this list, which shows that the first column and the third column are not the same; for example A (on fa.wiki) -> B (on de.wiki) but B (on de.wiki) -> C (on fa.wiki).
This connection is like a W shape. In my opinion, one of the main causes of interwiki conflicts is redirects that have interwikis. Reza1615 (talk) 21:19, 4 June 2012 (UTC)
It is known that there are limitations in what interwikis can do, and that interwiki bots' assumption that interwikis (should) form a partition is naive. But redirects do actually simplify and reduce this problem. Though "we" (en: readers) may never see the interwikis on the redirect page, they confirm that the redirect is the correct target for the other languages. The interwiki on the other language is also functional, as it leads to the correct section (usually it is a section) of the appropriate article. Rich Farmbrough, 00:54, 5 June 2012 (UTC).
(Incidentally http://fa.wikipedia.org/wiki//%D9%88%DB%8C%DA%A9%DB%8C%E2%80%8C%D9%BE%D8%AF%DB%8C%D8%A7:%D9%81%D9%87%D8%B1%D8%B3%D8%AA_%D9%85%D9%82%D8%A7%D9%84%D9%87%E2%80%8C%D9%87%D8%A7%DB%8C%DB%8C_%D8%AF%D8%A7%D8%B1%D8%A7%DB%8C_%D9%85%DB%8C%D8%A7%D9%86_%D9%88%DB%8C%DA%A9%DB%8C_%D8%B6%D8%B1%D8%A8%D8%AF%D8%B1%DB%8C#fa_.D8.A8.D9.87_de is an empty page for me.) Rich Farmbrough, 00:56, 5 June 2012 (UTC).
I think https://fa.wikipedia.org/wiki/%D9%88%DB%8C%DA%A9%DB%8C%E2%80%8C%D9%BE%D8%AF%DB%8C%D8%A7:%D9%81%D9%87%D8%B1%D8%B3%D8%AA_%D9%85%D9%82%D8%A7%D9%84%D9%87%E2%80%8C%D9%87%D8%A7%DB%8C%DB%8C_%D8%AF%D8%A7%D8%B1%D8%A7%DB%8C_%D9%85%DB%8C%D8%A7%D9%86_%D9%88%DB%8C%DA%A9%DB%8C_%D8%B6%D8%B1%D8%A8%D8%AF%D8%B1%DB%8C#fa_.D8.A8.D9.87_de was meant.  Hazard-SJ  ✈  05:23, 5 June 2012 (UTC)
Thanks. From what I could see these are mostly legitimate errors (if that makes sense) - capitalisation, confusing "meteor" and "meteoroid" (I think), and so forth. Rich Farmbrough, 17:10, 5 June 2012 (UTC).
@Rich Farmbrough: when we want to connect pages that exist in another language, we can use a subsection interwiki. It has two benefits: 1) when a user follows this interwiki, they can understand that it covers that part of the text, not the whole article (interwikis on redirects don't show this); 2) we can cover an article that does not exist on our wiki (as interwikis on redirects do).
@CBM: interwikis on redirect pages also have another problem: standard interwiki bots cannot edit these pages, so if another wiki's interwiki needs to be changed they cannot fix it, and that creates interwiki conflicts. Reza1615 (talk) 14:05, 5 June 2012 (UTC)
No idea about the technical issues - from a user perspective I'd rather have the iwl on the redirect, giving the option to eventually expand it into a full-blown article without having to remember to move the iwl. Agathoclea (talk) 15:49, 5 June 2012 (UTC)
CBM is an expert in the Peano axioms so I'd tend to follow his lead on the structure of whole page links (unless we were talking about the Giano peano).
The issues with interwikiing to a subsection are that, firstly, you can't interwiki from a section, and secondly, sections can change names. Both are handled by redirects.
If standard interwiki bots can't edit redirect pages, then we need better bots. (Or to dump the whole mechanism which I am assured will be done some time in the next year.)
Rich Farmbrough, 16:30, 5 June 2012 (UTC).

Wikiproject Katy Perry

I would like to get an article alerts bot for the WikiProject Katy Perry please. teman13 TALKCONTRIBUTIONS 05:02, 5 June 2012 (UTC)

See Wikipedia:Article alerts/Subscribing.  Hazard-SJ  ✈  05:22, 5 June 2012 (UTC)

DRV bot

WP:DRV could use a bot creating the daily review pages. They are currently hand-created each day by the first person to click a handy "create" link on the DRV main page, but a bot would be more convenient and consistent. T. Canens (talk) 10:30, 5 June 2012 (UTC)

Enforce Wikipedia's alumni notability guideline against namechecking

I have been working to clean up the alumni sections of university articles. I commonly find many non-notable names listed in violation of Wikipedia's alumni notability guideline. WP:ALUMNI requires that the person listed either (a) already have a Wikipedia article, or (b) have a reference showing notability. I have written an essay, "Your alma mater is not your ticket to Wikipedia", about namechecking in university articles. I am requesting a bot to flag names on university pages which do not meet either of the two stipulations (i.e., an existing Wikipedia page or a reference), which could then be removed. I suggest that the name-removal feature not be fully automated, because there are occasional cases of misspellings or disambiguations where a person has their own Wikipedia article but it doesn't link properly. NJ Wine (talk) 22:28, 30 May 2012 (UTC)
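A minimal sketch of the flagging pass over an alumni section, assuming one list entry per "*" line; it only checks the two WP:ALUMNI criteria textually, so whether a linked article actually exists (blue vs. red link) would still need an API existence check, and the example names are made up.

using System;
using System.Text.RegularExpressions;

static class AlumniListCheck
{
    // Reports list entries that have neither a wikilink nor a reference.
    public static void ReportUnsupportedEntries(string sectionWikitext)
    {
        foreach (string line in sectionWikitext.Split('\n'))
        {
            string entry = line.Trim();
            if (!entry.StartsWith("*")) continue;
            bool hasLink = Regex.IsMatch(entry, @"\[\[[^\]]+\]\]");
            bool hasRef = Regex.IsMatch(entry, @"<ref[ >]", RegexOptions.IgnoreCase);
            if (!hasLink && !hasRef)
                Console.WriteLine("Flag for review: " + entry);
        }
    }

    static void Main()
    {
        ReportUnsupportedEntries("* [[Jane Doe]], astronaut\n* John Smith, class of 1985\n");
        // Only the unlinked, unreferenced "John Smith" line is flagged.
    }
}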

If I see an issue, I generally remove it. My rule of thumb is that if they are redlinked (and don't have a good reason to be created in the future) or have no link, I remove them. Still, it might be easier to just remove them as we find them, as I don't think a bot would be able to do all of this. Can you give some examples of this being a large issue? I really have not seen this as a major problem, and would assume people are good at policing this issue on any school article (because universities are not the only ones having this issue). Kevin Rutherford (talk) 23:15, 31 May 2012 (UTC)
There are plenty of examples such as Institute of Rural Management Anand which are just nightmares. A bot to help enforce alumni notability guideline would be a great help in encouraging consistency across wikipedia, particularly in new articles. Stuartyeates (talk) 23:32, 31 May 2012 (UTC)
The worst article that I have seen was List of New Jersey Institute of Technology alumni. Before I started working on it, the article had roughly 110 names on it, less than a third of whom were notable. The problem with these university alumni articles is that they don't get a lot of traffic, and they aren't really maintained. Every so often a name is added, but it's rare for a name to be removed. NJ Wine (talk) 04:41, 1 June 2012 (UTC)
The word "guideline" has a special meaning on Wikipedia. The page you link is an essay that "contains the opinion of one or more WikiProject" --92.6.202.54 (talk) 23:32, 5 June 2012 (UTC)
I looked at a similar problem, and I concur that this is, by normal standards, a better candidate for a bot report than bot removal. Rich Farmbrough, 01:16, 5 June 2012 (UTC).
Agreed. If the bot just flagged or reported names which didn't meet the WP:ALUMNI standard, I think editors would be far more likely to remove the non-notable names. NJ Wine (talk) 22:51, 5 June 2012 (UTC)
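A rough sketch of what the reporting pass could look like, assuming pywikibot; it flags list entries in an "Alumni"/"Notable alumni" section that are either redlinks or unlinked with no reference. The section heading names are an assumption, and the output would still need a human to review it, per the discussion above:

# Sketch: flag alumni entries that are redlinks, or unlinked and unreferenced.
import re
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def suspect_entries(page):
    text = page.text
    # Heading names are an assumption; adjust to whatever the articles actually use.
    m = re.search(r'==+\s*(Notable alumni|Alumni)\s*==+(.*?)(?===|\Z)', text, re.S | re.I)
    if not m:
        return []
    flagged = []
    for line in m.group(2).splitlines():
        if not line.startswith('*'):
            continue
        links = re.findall(r'\[\[([^|\]]+)', line)
        if not links and '<ref' not in line:
            flagged.append(line)                      # no link and no reference
        elif links and not pywikibot.Page(site, links[0]).exists():
            flagged.append(line)                      # first link is a redlink
    return flagged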

Chart archiving bot

Some charts cannot be indexed (example: Record Report), so it would be very useful to have a bot that auto-archives the chart page each time it is refreshed (for example, Record Report is refreshed every Saturday), so as to have a complete chart archive to use on singles and albums pages on Wikipedia. --Hahc21 [TALK][CONTRIBS] 20:21, 1 June 2012 (UTC)

To be more specific, what I want the bot to do is, given a day, load the given URL (as an example, the Record Report URL: http://webcitation.org/archive?url=http://recordreport.com.ve/publico/?i=top100&email=nowyouknowmymail@hotmail.com) and return the archive URL (which in this case is http://www.webcitation.org/686PeUioN). The bot would do this for several URLs, given a day for each of the chart parameters. Is that possible? --Hahc21 [TALK][CONTRIBS] 20:45, 1 June 2012 (UTC)
Where would it put the old data? You can't have subpages of articles, like you can do with user talk archives. Rcsprinter (chatter) 20:51, 1 June 2012 (UTC)
And could the old data be written on a specific talk subpage of the main chart article page? --Hahc21 [TALK][CONTRIBS] 22:52, 1 June 2012 (UTC)
Bonus points may be awarded if the information can be incorporated into an easy-to-use referencing template. Just sayin. (Also, why daily? That chart suggests it updates weekly, is that enough?) --joe deckertalk to me 23:00, 1 June 2012 (UTC)
The bot is supposed to update each chart weekly. So, one day per week, it could update each chart it is programmed to archive, and then write the output URL onto the talk subpage of each of the record charts it is tasked to archive. The information is then supposed to be incorporated into {{Singlechart}} and {{Albumchart}} when the chart IDs of such charts are called. I'm thinking about a template to retrieve the archived URL into the chart table when the ID is called. --Hahc21 [TALK][CONTRIBS] 23:11, 1 June 2012 (UTC)
Gotcha. --joe deckertalk to me 17:32, 3 June 2012 (UTC)

If I understand this correctly, it is to force an external site to archive another external site and record the archive location. There have been previous projects along these lines, but on a much larger scale (for which reason they were stopped), although I do not know the details. Digging or consulting community memory might save some work. Rich Farmbrough, 01:11, 5 June 2012 (UTC).

But is it possible? Or is it not possible? =( --Hahc21 [TALK][CONTRIBS] 04:13, 6 June 2012 (UTC)
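It should be technically possible, at least at the level of requesting an archive and capturing the resulting link. A very rough sketch, assuming the WebCite archive form accepts url/email query parameters as in the example link above and that the confirmation page contains the new short URL (both assumptions need checking, as do WebCite's terms of use):

# Sketch: ask WebCite to archive a chart page and return the archive URL it reports.
import re
import urllib
import urllib2

def webcite_archive(target_url, email):
    request_url = ('http://www.webcitation.org/archive?url=%s&email=%s'
                   % (urllib.quote(target_url, safe=''), urllib.quote(email)))
    html = urllib2.urlopen(request_url).read()
    # Assumption: the archive ID is a short alphanumeric string, as in 686PeUioN above.
    m = re.search(r'http://www\.webcitation\.org/[0-9A-Za-z]{9}', html)
    return m.group(0) if m else None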

Removal

Can I get a bot to remove the |parking= field from {{Infobox shopping mall}}? The field was removed due to misuse and irrelevance. Ten Pound Hammer(What did I screw up now?) 20:24, 4 June 2012 (UTC)

If it has no visible output/effect it would put the botowner in hot water. Anything else that could be done to related articles at the same time? Agathoclea (talk) 20:30, 4 June 2012 (UTC)
Oh sure, you could do other fixes and get a goodly percentage of them done. But the botowner will still get in hot water with Carl if he sees you doing it. Because, you know, it's the intention that counts. Rich Farmbrough, 16:39, 5 June 2012 (UTC).
You can have this added to AWB template parameter handling. Rich Farmbrough, 16:39, 5 June 2012 (UTC).
Hi Rich! I know that WP:AWB/RTP will replace parameters. Where were you thinking of adding the ability to remove |parking=? Thanks! GoingBatty (talk) 03:29, 6 June 2012 (UTC)
It's built into the code; there's no page for it. Kumioko (talk) 03:35, 6 June 2012 (UTC)
Ah, OK - those are documented on WP:AWB/GF. Thanks! — Preceding unsigned comment added by GoingBatty (talkcontribs) 03:46, 6 June 2012 (UTC)

Femto6 Redux

My post here about having two projects (Texas and Women's History) added to Femto6 was archived, so maybe whoever reads this thinks it is more appropriately listed at Wikipedia:Bot owners' noticeboard#Rich_Farmbrough.27s_bots, which is where I have now posted it. But there is so much of Rich's work being sorted out over there that I can't tell from what is being posted whether Femto6 (updating recent changes pages for projects) was taken on by anyone. Is there anyone reading this who can clarify that for me? Since Special:RecentChangesLinked/Wikipedia:WikiProject_Hawaii/Recent_changes continues to be updated, the bot must still be working. What do I do to get these projects added? Maile66 (talk) 17:51, 3 June 2012 (UTC)

Although Femto bot hasn't made any changes since May 5th, I agree this bot needs to be taken on by someone. This affects quite a few projects, including several, and potentially all, of the WikiProject United States supported ones. Kumioko (talk) 13:43, 4 June 2012 (UTC)
So, based on what you've seen over at the noticeboard, are you saying that no one has yet taken it on? Maile66 (talk) 13:54, 4 June 2012 (UTC)
I am noticing one thing about the Recent Changes list over at Hawaii. I just ran a test. It continually updates changes like a regular Watchlist, but only for articles which existed prior to Rich's de-frocking. Any articles created (or changes to same) after that are not picked up on this. Interesting. Maile66 (talk) 14:57, 4 June 2012 (UTC)
No no one yet. As far as I can tell only a couple of the tasks have attracted much interest. Kumioko (talk) 15:06, 4 June 2012 (UTC)
Then I guess I'm gonna keep trying over here to get something going like this for Texas and Women's History. It's a need/want, whether it's Rich's bot or not. Guess we won't be seeing Rich around for a month. Maile66 (talk) 23:28, 6 June 2012 (UTC)

The Drug Enforcement Agency, a part of the government of Liberia, is linked by over 100 pages, but nearly all of the links are mistakes and meant to go to the US government's Drug Enforcement Administration. I understand that there would be some false positives if a bot went around and replaced all of them, so I've created a userspace page with all of the current links; I'm looking at each page myself and removing entries from the userspace page if there's a link that really should go to the Liberian entity. Would it be possible for a bot to change all of the links from pages linked here once I've checked all of them and removed the irrelevant ones? I doubt that there would be many errors, and what errors exist would purely be my fault; you could insert a note into the edit summary specifying that errors should be reported at my talk page. If this idea proves workable, I can leave a note here when I'm done checking all of the links. Nyttend (talk) 18:26, 6 June 2012 (UTC)

I just decided to go ahead with the link checking. Unless I've made a mistake, all of the pages currently linked at User:Nyttend/DEA have links (and a few have the same text in other contexts) that need to be changed from "Drug Enforcement Agency" to "Drug Enforcement Administration". Nyttend (talk) 18:43, 6 June 2012 (UTC)
 Done - based on context clues, there were no false positives that I found. Avicennasis @ 20:58, 16 Sivan 5772 / 20:58, 6 June 2012 (UTC)

tag some Fb templates

Could someone append {{tfd|{{subst:PAGENAME}}}} to the templates listed here? They should all be orphaned, so it shouldn't make a difference whether the tag is included in <noinclude>...</noinclude> or not. Frietjes (talk) 20:56, 7 June 2012 (UTC)

 Doing... After a brief goof (didn't see the collapsed list at first). Avicennasis @ 04:13, 18 Sivan 5772 / 04:13, 8 June 2012 (UTC)
 Done Avicennasis @ 05:51, 18 Sivan 5772 / 05:51, 8 June 2012 (UTC)

Hello. Per the requested move at Talk:Air21 Express (2011-present) could all current wikilinks to Air 21 Express please be updated to link to Barako Bull Energy? This would be a one time run that I assume could be done with AWB, so I don't think a BRFA would be necessary. Jenks24 (talk) 09:23, 12 June 2012 (UTC)

P.S. It would be best if, for example, [[Air 21 Express]] was changed to [[Barako Bull Energy|Air 21 Express]]. Thanks in advance, Jenks24 (talk) 09:26, 12 June 2012 (UTC)
There's only 14 of them, unless someone is already doing this and is almost done. You could probably have it done manually by the time someone would get around to AWBing it. Also, I note it is unnecessary to change all links to Air21 Express (2011-present) due to this move. Anomie 13:50, 12 June 2012 (UTC)
N Not done per above. This can manually be done with ease.  Hazard-SJ  ✈  02:29, 13 June 2012 (UTC)
Huh, I was sure there were way more incoming links than that ... maybe someone else did them? Anywho, yeah I can certainly do the last 14 myself without any automation. Also, just to be clear, I wasn't asking for any links to Air21 Express (2011-present) to be altered. Jenks24 (talk) 21:27, 13 June 2012 (UTC)

Ah, I see my mistake now, there shouldn't be a space between "Air" and "21". Air21 Express has many more incoming links. So, could [[Air21 Express]] please be changed to [[Barako Bull Energy|Air21 Express]] in preparation for the move? Cheers, Jenks24 (talk) 21:36, 13 June 2012 (UTC)

 Doing... --Ceradon talkcontribs 23:24, 13 June 2012 (UTC)
Thanks for this. And just to emphasize, the link that has to be changed is Air21 Express to Barako Bull Energy. I dunno if it's OK to link Air21 Express (2011–present) to Air21 Express, though. –HTD 05:18, 14 June 2012 (UTC)

Categories and main articles

During a discussion on a proposed category page MOS at Wikipedia_talk:Manual_of_Style/Category_pages#Cat_main, the issue was raised of missing main articles in categories. As an example, science should be in Category:Science with a sort key of a space. Due either to forgetting in the case of new categories, or to removal by vandalism, this important link is sometimes missing. It seems to me that the task of checking and correcting this is ideal for a bot. It would only be used in cases where there is a direct correlation. Cases where both singular and plural category names exist (e.g. Category:Murder and Category:Murders) may have to be left to an actual human. -- Alan Liefting (talk - contribs) 06:16, 13 June 2012 (UTC)

Generating a report along the lines of this one, User:AnomieBOT/Afd-mergefrom report, would be cool. -- Alan Liefting (talk - contribs) 05:56, 14 June 2012 (UTC)
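A minimal sketch of such a report, assuming pywikibot: for a given category name it checks whether a same-named article exists and whether that article is already in the category, and only reports the clear-cut cases:

# Sketch: report categories whose same-named main article is missing from the category.
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def missing_main_article(cat_title):
    # Returns (article, category) if the main article exists but is not in the category.
    cat = pywikibot.Category(site, 'Category:' + cat_title)
    article = pywikibot.Page(site, cat_title)
    if not article.exists() or article.isRedirectPage():
        return None                      # no direct correlation; leave to a human
    if any(c.title() == cat.title() for c in article.categories()):
        return None                      # already categorised
    return (article, cat)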

Time template purge bot

The documentation of some of the cite templates (e.g., Template:Cite web, Template:Cite news) uses {{ #time}} in the example |accessdate= parameter. When I create a cite, I usually copy the example string. Before doing that, I usually have to purge the template page to get the accessdate to be the current date. At this moment, the documentation for Template:Cite news shows accessdate=May 31, 2012, whereas today's date is June 1, 2012. I think there are a lot of pages that use {{#time}} (Template:Time?) that could use an automatic, once-a-day purge when the time changes from 23:59 to 00:00. Template:Time notes, "Most Wikipedia pages display a cached version of the page to reduce server load, so the template will only display the current time as of when the page was last parsed." Would you please develop a time template purge bot that, at about 00:01 UTC each day, purges pages that use time templates such as {{ #time: j F Y}} (Template:Time?) and/or pages that transclude those pages (such as template pages that transclude documentation pages using, for example, {{ #time: j F Y}} (Template:Time?)). Maybe just limit the bot to template namespace if server load is a problem. Thanks! -- Uzma Gamal (talk) 13:35, 1 June 2012 (UTC)

{{ #time}} is not referring to the template {{time}} - it is a Parser Function. As such, any bots would have to purge the templates using the actual parser manually. (As a side note, I'm not sure if the bot would have to purge every page using that template, rather than the template itself.) Avicennasis @ 01:51, 12 Sivan 5772 / 01:51, 2 June 2012 (UTC)
Yes, I think you only need to purge the template page itself. I'm not that familiar with how bots actually work. When you say using the actual parser manually, is it possible for a bot to 1) gather template pages using #time into a list and then 2) automatically go through that list once a day and purge each template page on that list? (E.g., create strings like http://en.wikipedia.org/w/index.php?title=Template:Cite_book&action=purge and http://en.wikipedia.org/w/index.php?title=Template:Cite_news&action=purge and run them?) -- Uzma Gamal (talk) 08:05, 2 June 2012 (UTC)
I think the API is a better choice for this. See mw:API:Purge - a user only has to open this link to purge the Coal, Ball, and Coal ball pages. So the link this task would be using would look something like http://en.wikipedia.org/w/api.php?action=purge&titles=Template:Cite_news%7CTemplate:Cite_book. Στc. 08:16, 2 June 2012 (UTC)
In general, MediaWiki should be handling this itself; it already does set the cache time lower on pages using various time-related magic words and parser functions (although I don't know if #time is one of them). If this is not happening, submit a bug. Anomie 08:32, 2 June 2012 (UTC)
I'll post a note at bugzilla if I notice that MediaWiki is not purging the pages. -- Uzma Gamal (talk) 10:50, 5 June 2012 (UTC)
Was a bug ever filed for this? It seems they are not updating - I just purged the pages; beforehand they were still showing dates from June 7 and 8th. Avicennasis @ 08:49, 20 Sivan 5772 / 08:49, 10 June 2012 (UTC)
Once I get the ability to use automation back I will take care of this. Rich Farmbrough, 01:14, 5 June 2012 (UTC).
You're only prohibited from using automation,[4] not developing automation for others to use. Bot policy allows bots operated by multiple users, so I don't see why you can't pass operation of your bots to one or more users. -- Uzma Gamal (talk) 10:50, 5 June 2012 (UTC)
Anybody who cares to can code this in a few moments. If they are too idle to do that I would not want them running the bot. Rich Farmbrough, 16:44, 5 June 2012 (UTC).
I would be happy to run this bot. This is a very simple task. However, I have not run a Wikipedia bot before and I do not know whether this task even warrants a BRFA.  TOW  talk  03:34, 16 June 2012 (UTC)
I thought every bot needed approval - see Wikipedia:Creating a bot#Overview of the process of developing a bot. Good luck! GoingBatty (talk) 14:47, 16 June 2012 (UTC)
As per what you said, I have submitted a request for approval for this bot.  TOW  talk  20:24, 16 June 2012 (UTC)
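For reference, the purge step itself is only a few lines, assuming pywikibot and a hand-maintained list of the affected template pages; scheduling it shortly after 00:00 UTC is a matter for cron or the bot host:

# Sketch: purge a fixed list of template pages once a day (equivalent to action=purge).
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
TEMPLATES = ['Template:Cite web', 'Template:Cite news', 'Template:Cite book']  # example list

for title in TEMPLATES:
    pywikibot.Page(site, title).purge()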

Bare reference conversion

I'm updating bare links using (http://toolserver.org/~dispenser/view/Reflinks/NoTitle), but it occurred to me that because it's very procedural it could be done with a bot? Sfan00 IMG (talk) 12:51, 15 June 2012 (UTC)

You would need to look at the date format as the tool usually puts the dates in ISO format rather than honouring the date format of the article and you have to manually change it before saving. Keith D (talk) 12:59, 15 June 2012 (UTC)
Yes it could; it's actually reflinks.py in the standard pywikipedia distribution. User:DumZiBoT used to run it before NicDumZ retired. It would be nice to have a bot like that running again; it would be very easy for me to add that as another task to RscprinterBot. Would you like me to BRfA? Rcsprinter (lecture) 16:31, 15 June 2012 (UTC)
That would be helpful. If I recall correctly, there also used to be a bot that flagged up possible dead links in refs as well; it might be useful to have that running again. Sfan00 IMG (talk) 19:06, 15 June 2012 (UTC)
Reflinks is a good tool, but there are many parameters which Reflinks doesn't populate properly. Besides date formats, there are incorrect author parameters (e.g. YouTube references) and magazines/newspapers that are listed with |publisher= instead of |work=. Therefore, editors should run Reflinks manually and check/fix each reference before saving. GoingBatty (talk) 15:01, 16 June 2012 (UTC)

3 million null edits

See commons:Commons:Bots/Work requests#3 million null edits. — Dispenser 13:46, 16 June 2012 (UTC)

Is it possible to have a bot to clean up deleted duplicate info tags?

http://en.wikipedia.org/wiki/Special:WhatLinksHere/Template:Duplicate_file_info

has over 1,000 entries, and I'm finding most of the duplicates are already deleted. Sfan00 IMG (talk) 23:00, 16 June 2012 (UTC)

Working on it...may take a bit :) deleting the duplicate templates Skier Dude (talk) 23:59, 16 June 2012 (UTC)
This list is full of copyright violations - blatant, no permissions, no FURs - as well as incorrect licenses. I've gone through a couple hundred and think that this is something that should be reviewed by people, not a bot. Skier Dude (talk) 04:10, 17 June 2012 (UTC)
Hmmm, most are easily re-viewed though.. Sfan00 IMG (talk) 08:08, 17 June 2012 (UTC)

Sports Bot(?)

As far as I have seen, most articles have single sources for statistical info on sports and records. I was wondering if a bot could be created which would update the relevant page with the relevant information. I don't have any programming experience and haven't studied programming at all, but I'd like to contribute in any way I can!--Harsh Mujhse baat kijiye(Talk)(Contribs) 21:09, 13 June 2012 (UTC)

Idea is not well explained.  Hazard-SJ  ✈  00:45, 14 June 2012 (UTC)
I'll try and elaborate a bit. The idea is that sports events take place each and every day and records may keep changing, in each and every sport. This is sometimes very difficult when large events/sports competitions are going on. Now to tackle that, a bot could be made which would
  1. Check the date on which the record info has been last updated.
  2. Access either a pre-programmed site or access the site cited as reference.
  3. Retrieve information from the accessed site's source code (as seen by clicking the view page source button)
  4. Update article with relevant information.

This could potentially save a lot of time and brings into its scope a lot of articles that are primarily concerned with detailing the records of the game. --Harsh Mujhse baat kijiye(Talk)(Contribs) 05:41, 14 June 2012 (UTC)

You will need to be more specific than that if you want to get help here. You will need to explain how you would update the scores manually and suggest how it can be automated. For example, take a sport like Cricket and a particular page you would want updated on a daily basis. Ganeshk (talk) 06:13, 14 June 2012 (UTC)
You can try manually updating stats at the end of the season. Updating stats while the season is ongoing might violate WP:NOTSTATS. –HTD 06:52, 14 June 2012 (UTC)
OK then. I'll try to keep the List of Test cricket records and List of One Day International cricket records up to date manually. Will see after that. Thanks! You may archive this thread. Actually, since I am not a programmer myself, I cannot explain in terms of language/code how this can be done, although I guess it would be similar to what this bot does. --Harsh Mujhse baat kijiye(Talk)(Contribs) 07:01, 14 June 2012 (UTC)
User:DrTrigonBot is designed to do exactly what you want; please see User:DrTrigonBot/Subster for further info and examples! ;) Greetings --DrTrigon (talk) 18:14, 18 June 2012 (UTC)

Bot needed to count infoboxes on a very long list

A bot is needed to answer the following question:


How many of the articles on this list, Wikipedia:WikiProject_Astronomy/Candidates_for_redirection_new, have infoboxes?


An example of what is meant here by “infobox” can be seen in the article 25001 Pacheco. Please have a look at the infobox along the right side, including statistics such as "orbital period" and "inclination". Here it is again, the example of an article that has an infobox: 25001 Pacheco.


How many of the articles at Wikipedia:WikiProject_Astronomy/Candidates_for_redirection_new have such infoboxes containing orbit info? It cannot be counted by hand as there are far too many articles on the list. This is why a bot is requested to complete this task.


Also, in the process of doing this, could the bot keep track of which articles on the list have such orbit infoboxes so that, if need be, they could be sorted out? This would result in two lists, perfect subsets of the above list, which added together would include all articles on the list. One subset would contain only articles which have infoboxes, and the other only those which are on the above list but do not have any orbit infobox. Both sub-lists would be given appropriate descriptive titles.

Thank you for your kind attention to this matter. Chrisrus (talk) 17:51, 18 June 2012 (UTC)

If it's just to count, and not actually edit, wouldn't a database scan be better and quicker (and easier?)?  Hazard-SJ  ✈  21:06, 19 June 2012 (UTC)
Very, very, few articles on that list do not have infoboxes. Over 500 appear to not have one, but in fact virtually all of them are redirects. The only examples I've been able to find so far are 198110 Heathrhoades and 201777 Deronda. - Jarry1250 [Deliberation needed] 21:58, 19 June 2012 (UTC)
@Hazard-SJ, I don't know. If it is, please understand the above to read "bot or database scan needed.." and make all other appropriate edits to get across that the important thing is to have the work done, not whether a bot or a database scan is the proper tool. Chrisrus (talk) 06:10, 20 June 2012 (UTC)
Jarry1250, I'm surprised! How many of the list are redirects and how many are articles? If I understand you correctly, you seem to be saying that all of User:Jarry1250/Planets_no_infobox are redirects except 198110 Heathrhoades and 201777 Deronda, but that you're not quite sure. Is that correct? Chrisrus (talk) 06:10, 20 June 2012 (UTC)
Those two are the only non-redirects. —  HELLKNOWZ  ▎TALK 15:35, 20 June 2012 (UTC)
I see. Then I need to make a new botreq.

Work needed to update very long list

To understand this request better, see the above section.

We need to convert the unfortunately named Wikipedia:WikiProject_Astronomy/Candidates_for_redirection_new into new lists.

All those which have already been redirected should be removed and stored under a title like "Minor planet articles converted to redirects on (date) by (person)". These are "done". (Terribly sorry, but if I may just interject here: HOORAY for Wikipedia.) This should probably have an explanatory intro at the top.

The rest would form the new "Candidates for Redirection" list, but it definitely should not be named that. It should be named maybe "Minor planet article candidates for Redirection as of (date)" or some such, as "new" is not going to be true forever, obviously. It should include the entire "list history" that Wikipedia:WikiProject_Astronomy/Candidates_for_redirection_new has, but that obviously should be updated so that the history of the new list is up to date. Chrisrus (talk) 17:42, 20 June 2012 (UTC)

Copying B-class checklists from one project template to another

Enquiring about the possibility/feasibility of using a bot to put a dent in the backlog at Category:Military history articles with incomplete B-Class checklists. If the talkpage had a completed B-class checklist for WP:SHIPS ({{WikiProject Ships}}), it would copy the B1-B5 fields into the WP:MILHIST template ({{WPMILHIST}}).

  • Edit example 1: copying the Ships checklist to Milhist: [5]
  • Example edit 2: adding the checklist if it doesn't exist in Mihist yet [6]

I first suggested the idea at Wikipedia talk:WikiProject Military history/Archive 112#Category:Military history articles with incomplete B-Class checklists where it seemed to have support, primarily because of the almost-identical assessment standards for warships the two projects have. Although that particular discussion was archived without action (in the most recent archive), the checklist backlog is a recurring subject for comment (twice more in the archive, and currently on the main talkpage).

Thoughts? -- saberwyn 02:43, 19 June 2012 (UTC)

I personally think this is a pretty good idea and it's possible, but you would probably need to do each assessment parameter separately. I do think that it needs to have some solid support; in the past the general comment of the MILHIST project was to not use bots to do assessments. Just my thoughts though, and I will be out of town the next couple of weeks. Kumioko (talk) 16:21, 20 June 2012 (UTC)

It seems there's still some low-hanging fruit in the orchard of interlanguage linking. Yesterday I hit "Random article" and found the need for this edit to link two articles entitled en:Tapah Road and ms:Tapah Road, both of long standing. Where wikis in two languages have articles with the identical title and much content in common (e.g. geocoding, dates, inwikilinks) or have a history of overlapping human editors there is a high probability they have the same subject. It strikes me that a tool to identify these would start with a sorted union list of article titles, subset those seen in multiple wikis, then subset those without interlanguage links. Depending how long the list is, either auto- or semi-automatic replacement would need a closer look at content. LeadSongDog come howl! 19:59, 20 June 2012 (UTC)

To decrease inaccuracies, the bots use links from other wikis and follow them and check for more interwiki links. I think this would be safer if done semi-automatically than automatically, due to the possibility of error (well...I'm not sure how many false positives there would be, that's why we have trials). Also, see Special:WithoutInterwiki.  Hazard-SJ  ✈  23:27, 20 June 2012 (UTC)
If it worked properly, Special:WithoutInterwiki might be helpful, but the 1000-page limit defeats the purpose. I agree that at least initially a semi-auto approach to editing would be safer, but even before that comes the determination of scale. How many candidate pairs of same-named pages need to be considered for linking? LeadSongDog come howl! 06:00, 21 June 2012 (UTC)
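For scoping the problem, a rough sketch of the first pass, assuming all-titles dump files for the two wikis and pywikibot for the langlink check; only the intersection of the two title sets ever needs to be examined, and everything it yields would still be a candidate for human review:

# Sketch: same-titled en/ms pages where the en article has no ms interlanguage link.
import pywikibot

en = pywikibot.Site('en', 'wikipedia')

def candidates(en_titles_file, ms_titles_file):
    with open(en_titles_file) as f:
        en_titles = set(line.strip() for line in f)
    with open(ms_titles_file) as f:
        ms_titles = set(line.strip() for line in f)
    for title in sorted(en_titles & ms_titles):
        page = pywikibot.Page(en, title)
        if page.exists() and not page.isRedirectPage():
            if not any(link.site.lang == 'ms' for link in page.langlinks()):
                yield title              # candidate pair for a human to check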

Backlog Priority Bot

Could a bot be created that lists "backlog priorities"? I am thinking of something based on page view statistics. For example, Firefighting averages around 250 views a day (usually more) and it has 3 tags on it. I'm sure there are similar articles with even more average daily views. Would it be possible for a bot to take page view information for a 30-day period, divide that number by 30, and add and remove articles with maintenance tags to and from pages like Wikipedia:Articles with maintenance tags receiving over 1000 views a day, Wikipedia:Articles with maintenance tags receiving over 500 views a day, and Wikipedia:Articles with maintenance tags receiving over 100 views a day? It wouldn't be necessary for the bot to remove the pages if that was done manually. Ryan Vesey Review me! 03:09, 21 June 2012 (UTC)

consolidate dates in lang infobox

I've requested this about four times now, but since Rich has been banned it doesn't seem that there's anyone to take care of maintenance. So this is just one piece of the earlier request.

Can we remove date'= from the language infoboxes? Any non-breaking space should be replaced with a normal space. Most of the entries are notes that the date is for a census. There are 200–300 of these. — kwami (talk) 06:52, 18 June 2012 (UTC)

Hey Kwami. I would be willing to do it, but could you provide an example so I could setup AWB properly (and to see if it can be done, properly)? Thanks! Kevin Rutherford (talk) 18:33, 18 June 2012 (UTC)
Sure. Here's an example:[7] I can give you a list of articles that need changing, if you like. There are (or were) 254 of them. — kwami (talk) 10:08, 19 June 2012 (UTC)
Could you provide another example, if they differ from that? I tried doing it one way, but I am probably doing it wrong. Also, I saw that you AWB'd that example, so if you could add what you did, that might help. Thanks! Kevin Rutherford (talk) 04:05, 21 June 2012 (UTC)

My special search&replace changes are \s*\| *date' *= * census census (w a leading space), \s*\| *date' *= *  * → (delete), \s*\| *date' *= *(-|–) → – (en dash), \s*\| *date' *= * → (delete), all within templates.

The article list I have from an old pre-parse is:

Nogai language Soddo language Waray-Waray language Maranao language Inupiat language Tzotzil language Keresan languages Kumyk language Karachay-Balkar language Dakota language Yokutsan languages Bussa language Albay Bikol language Aluku Nganasan language Kumam language Sidamo language Valley Yokuts Huastec language Ajië language Ngangikurrunggurr language Adhola language Xamtanga language Harari language Selti language Pintupi language Drehu language Achang language Sungor language Mentawai language Aklan language Libido language Busa language (Papuan) Godwari Chontal Maya language Tojolab'al language Huichol language Meriam language Mailu language Maisin language Isthmus Zapotec Bahing language Woleaian language Wolaytta language Gwere language Arop-Lokep language Otomi language Naxi language Central Bikol language P'urhépecha language Aghul language Kaugel language Pilagá language Mocho’ language Nyanga language Hamer language Lampung language Banjar language Malasanga language Maleu-Kilenge language Mapoyo language Macaguán language Guahibo language Cuiba language Mursi language Rutul language Yerukala language Aringa language Kangean language Abom language Upper Chinook language Iaai language Mezquital Otomi Chamula Tzotzil Dhuwal language Gnau language Qimant language Ayi language (Papua New Guinea) Gataq language Taos dialect Picuris dialect Southern Tiwa language Arbore language Daasanach language Dime language Karo language Chara language Barein language Basketo language Maale language Shinasha language Tsamai language Oyda language Sheko language Dizin language Gumuz language Chepang language Wayu language Tumak language Cua language (Mon–Khmer) Orok language Nayi language Alamblak language Touo language Ndrumbea language Anuak language Kachama-Ganjule language Kafa language Totontepec Mixe Hodï language Nyangatom language Kabalai language Yem language Luwo language Oroch language Hidatsa language Konjo Meadow Mari language Eastern Oromo language Kaikadi language Daju Mongo language Embu language Numanggang language Laha language Mamanwa language Kwama language Kwegu language Shekkacho language Zayse-Zergulla language Koore language Dargwa language Nepalese Sign Language Guhu-Samane language Fas language Baibai language Nobonob language Pal language Maia language Anamgura language Mudbura language Mountain Koiali language Dedua language Yopno language Yipma language Vanimo language Siane language Kamano language Gadsup language Agarabi language Kopar language Yerakai language Tuwari language Heyo language Juwal language Yil language Yangum language Mekmek language Zimba language Simbari language Kunyi language Adjora language Ebrié language Werni language Barambu language Bwa languages Raji language Khiamniungan language Sema language Central Nicobarese languages Tày language Caolan language Tai Ya language Tai Hongjin language Vaiphei language Gangte language Kom language (India) Sangtam language Yimchungrü language Angor language Xaracuu language Yessan language Sanio language Kwasengen language Central Banda language Gulay language Sar language Markweta language Sabiny language Gungu language Samia dialect (Luhya) Kwang language Budza language Mesme language Ngbundu language Koi language Jagoi language Bukar Sadong language Yakan language Chiapas Zoque Chimalapa Zoque Komering language Irish language in Britain Burum language Mesem language Ngaing language Borong language Bamu language Morawa language Keoru language Orokaiva language Kewa language Narak language Sepik Iwam 
language Baramu language Davawenyo language Numee language Yuaga language Babalia Creole Arabic Maramba language Foia Foia language Hoia Hoia language Kobol language Rembarunga language Binumarien language Bitur language Pei language Yawiyo language Pahi language Pasi language Bisis language Berinomo language Koiwat language Edolo language Dibiyaso language Safeyoka language Doghoro language Seta language Ningil language Amol language Bauwaki language Binahari language Kein language Coyotepec Popoloca language North Bolivian Quechua Quapaw language Hrê language Setaman language Suganga language Pochuri Naga language Dobase language Tai Mène language Tlaxcala–Puebla Nahuatl Michoacán Nahuatl Ometepec Nahuatl Temascaltepec Nahuatl

kwami (talk) 16:32, 22 June 2012 (UTC)

 Done Thanks for your help! Kevin Rutherford (talk) 05:52, 25 June 2012 (UTC)

Great. I'm glad this is finally done. If you're interested in further fixes to the info boxes, I have several uncontroversial requests that never went anywhere; just drop me a note. There's also a larger project here for making sure there are proper redirects from the ISO-code search engine. — kwami (talk) 06:37, 25 June 2012 (UTC)

Broken Redirects

Hello. I need help creating a bot for another wiki site that I work on (www.imfdb.org). I am wondering if there is anyone willing to help me out? The bot would help to locate broken redirects.

#REDIRECT [[PAGENAME#SECTION]]: I want to find the redirects that don't work because the SECTION part is wrong or doesn't exist. I know that there is the "Special:BrokenRedirects" page, but this will only tell you if the PAGENAME part of the redirect is wrong. Anyone willing to help me understand how to do this would have my undying gratitude! No but seriously, I would love some help...

--Zackmann08 (talk) 18:52, 25 June 2012 (UTC)
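One way to approach it is with the MediaWiki API rather than a full bot framework: list the redirects, read each one's target and section, and compare the section against the headings the parser reports for the target page. A rough sketch in Python; the api.php path is an assumption, continuation of long result sets is not handled, and the anchor normalisation is only approximate:

# Sketch: find #REDIRECT [[Target#Section]] pages whose Section does not exist.
import json
import re
import urllib

API = 'http://www.imfdb.org/api.php'   # assumption: adjust to the wiki's real api.php path

def api(**params):
    params['format'] = 'json'
    return json.load(urllib.urlopen(API + '?' + urllib.urlencode(params)))

def section_anchors(title):
    data = api(action='parse', page=title.encode('utf-8'), prop='sections')
    return set(s['anchor'] for s in data.get('parse', {}).get('sections', []))

def broken_section_redirects():
    listing = api(action='query', list='allpages',
                  apfilterredir='redirects', aplimit='500')
    for entry in listing['query']['allpages']:
        rev = api(action='query', prop='revisions', rvprop='content',
                  titles=entry['title'].encode('utf-8'))
        text = rev['query']['pages'].values()[0]['revisions'][0]['*']
        m = re.search(r'#REDIRECT\s*\[\[([^\]#|]+)#([^\]|]+)', text, re.I)
        if m:
            anchor = m.group(2).strip().replace(' ', '_')   # rough normalisation only
            if anchor not in section_anchors(m.group(1).strip()):
                yield entry['title']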

WikiProject tagging

The new project Wikipedia:WikiProject_Globalisation needs a bot to tag articles in Category:Globalization down to 3-4 levels. Bot help appreciated. Meclee (talk) 22:41, 25 June 2012 (UTC)

List images from Commons

Hello! I have an idea for a bot but I don't know where to start. The idea would be to build a list of images from a certain category on Commons that could be inserted into articles without images. We have lots of articles on, for example, animals, that could have an image in the infobox but don't have one, even though you can find something on Commons. Maybe this is not very common on en:wp, but it happens in other languages. Would it be possible to make something like that? -Theklan (talk) 11:55, 25 June 2012 (UTC)

Actually, there is something to that. It is safe to say that many articles on ENWP that have free images are language-linked to an article on another wiki without images. It would be easy enough for a bot on ENWP to mark which ENWP articles have free images and link to non-English wiki articles that do not, but what would be done with this info? A list would be massive, and mostly useless, as there is no way to have a bot from this wiki add images to every article in every other wiki. Perhaps a bot could tag the talk page, saying "This article contains images uploaded under a free license. These images might also be appropriate for the foreign-language version(s) of this article (French, Japanese, & Norwegian) which currently contain no images." Or something like that. ▫ JohnnyMrNinja 04:50, 30 June 2012 (UTC)

Deploying 'Start date' template in infoboxes

#RFC: Deploying 'Start date' template in infoboxes was closed, with unanimous support to implement the following:

A great many infoboxes already emit microformats, and have for months, or even years. However, in some articles, these are incomplete, because the dates which form part of them do not use an appropriate sub-template in order to emit the date in the correct metadata format. A bot (or bots - this task could be subdivided) is required to complete the task of converting opening-, release-, first shown-, incident- and similar dates from plain text to use {{Start date}}, as seen in this example edit for a year, and this one for a full date, and as described in the various infoboxes' documentation. Note that {{start date}} allows for YYYY, YYYY-MM, YYYY-MM-DD and in a few cases YYYY-MM-DD HH:MM formats. Note that Smackbot was approved to do this, and started, but failed to complete the task. A list of affected templates is available.

Can someone assist, please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 16:11, 30 June 2012 (UTC)
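In case it helps whoever takes this on, the core conversion is roughly as below. This is a sketch only: it assumes plain "Month D, YYYY" / "D Month YYYY" or bare-year values in the relevant infobox parameter, and it ignores the df=yes and ambiguous-format cases that make the real task fiddly:

# Sketch: turn a plain-text infobox date into a {{Start date}} call.
import datetime
import re

def to_start_date(value):
    value = value.strip()
    if re.match(r'^\d{4}$', value):                       # bare year
        return '{{Start date|%s}}' % value
    for fmt in ('%B %d, %Y', '%d %B %Y'):                 # "June 30, 2012" / "30 June 2012"
        try:
            d = datetime.datetime.strptime(value, fmt)
            return '{{Start date|%d|%d|%d}}' % (d.year, d.month, d.day)
        except ValueError:
            pass
    return value                                          # leave anything unrecognised alone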

reordering lang infobox

Another piece of my former request, which got sidetracked during Rich's ban.

Could someone reorder the parameters in transclusions of {{infobox language}} to match the order on the documentation page?

The problem with having them mixed up is that sometimes they end up getting duplicated, which causes problems: a blank entry will override an earlier, filled, entry in the expected position, for example, and there is some bizarre stuff hidden in such duplicates.

It would be nice to have a separate line for each parameter (which, for the most part, we already have), and also for the closing "}}". The two exceptions would be latitude/longitude and for multiple ISO codes, which are stacked like this in most articles and are expected this way:

|lc1=abc |ld1=name a
|lc2=def |ld2=name b
|lc3=ghi |ld3=name c

(lc comes before ld because it's always the same length, and so lines up better this way.)

If there are duplicate parameters, could they be tagged with a fix-it category for manual review? (Even if the 2nd is empty, because we'd need to review whatever it is hiding in the 1st instance before we display that.)

Any unsupported params (not included in the documentation) should be ordered at the end and also tagged for review. (Unless they're empty, in which case they can be deleted.)

kwami (talk) 21:48, 29 June 2012 (UTC)

That is a cosmetic change and such a bot should not be approved or deployed. The params are keyword arguments and can be supplied in any order as a matter of design. 69.228.171.149 (talk) 07:33, 2 July 2012 (UTC)
It's not a cosmetic change when it interferes with the display of the template. I've also tried to get this to ride on more substantive edits, but getting anything done here takes months, so I was hoping it would be easier if it were broken up. — kwami (talk) 11:36, 2 July 2012 (UTC)
Could you explain how it interferes with the display of the template? The wiki system (as far as I have ever seen) allows named parameters to appear in any order, with no change in the way that the page is rendered. If you mean that changing the order changes the appearance of the source code, that's certainly true, but changes that only affect the source code are not generally approved for bots. Tagging duplicate parameters would likely be approved, because that is a genuine maintenance issue, but that task would only affect the pages that have duplicate parameters, rather than all pages that use the template. — Carl (CBM · talk) 23:43, 2 July 2012 (UTC)
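The duplicate-parameter tagging mentioned above is easy to detect; a minimal sketch, assuming the mwparserfromhell wikitext parser:

# Sketch: list duplicated parameter names in {{Infobox language}} transclusions.
import mwparserfromhell

def duplicate_params(wikitext):
    dupes = []
    for tpl in mwparserfromhell.parse(wikitext).filter_templates():
        if str(tpl.name).strip().lower() != 'infobox language':
            continue
        seen = set()
        for param in tpl.params:
            name = str(param.name).strip()
            if name in seen:
                dupes.append(name)
            seen.add(name)
    return dupes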

Delete attack/test pages

Get a bot to delete user pages that meet WP:U1. Admins have better stuff to do.--Otterathome (talk) 16:31, 1 July 2012 (UTC)

Just a comment, if this was created, the bot would need to recognize who added the tag, otherwise we'd run the risk of vandals placing U1 on the subpages of another user. It would also need to be designed not to delete the user's talk page. Ryan Vesey Review me! 16:35, 1 July 2012 (UTC)
It would also need to be able to handle a case where, for example, a vandal moves a page into their userspace and then tags it as U1. Jenks24 (talk) 17:36, 1 July 2012 (UTC)
There already is one: 7SeriesBOT. Anomie 23:24, 1 July 2012 (UTC)

What about attack/test pages? Or does edit filter already stop this? (section renamed)--Otterathome (talk) 19:36, 5 July 2012 (UTC)

How is this bot supposed to determine whether the page actually is an attack or test page, versus some troublemaker incorrectly placing {{db-attack}} or {{db-test}} on a non-attack or non-test page? Anomie 20:29, 5 July 2012 (UTC)

Changing !scope="col" to !scope="row"

Hello-

Can I get a bot to replace all !scope="col" tags in the table at User:Albacore/Sandbox with !scope="row" tags? This is advisable since double bolding in tables is discouraged, and it's easy enough to change the !scope="row" tags back to !scope="col" tags for the columns (only five). Tony Award for Best Featured Actor in a Play and Tony Award for Best Featured Actress in a Play need this as well. Thanks. Albacore (talk) 04:16, 5 July 2012 (UTC)

If I understand this properly, I don't think this needs a bot. Could you copy the table or article text to a word processor, use find/replace, and then paste the revised text back into the article? GoingBatty (talk) 04:44, 5 July 2012 (UTC)
Yes, thank you. However when I saved the page in my sandbox on the secure server Wikipedia gave me a "database error". My edit went through according to the history but I can't see the page.

Technical details about this error: Last attempted database query: (SQL query hidden) Function: SqlBagOStuff::set MySQL error: 1114: The table 'pc000' is full (10.0.6.50) Albacore (talk) 13:37, 5 July 2012 (UTC)

Unrelated. Albacore (talk) 14:50, 5 July 2012 (UTC)

User sandboxes in content categories - again

I am forever cleaning up polluted categories (see Wikipedia:Database reports/Polluted categories) by removing pages from the wrong namespace out of content categories. A big culprit is user sandboxes, especially now that they are more easily used. I would like to get a bot to keep an eye on it and remove any categories (and interwiki links if possible). BattyBot can do it but apparently it is only semi-automated. It should be an easy bot task especially if it is only done for user sandboxes. Any takers? -- Alan Liefting (talk - contribs) 04:41, 29 June 2012 (UTC)

Under what circumstances should any user subpage be in any category anyways? Users are in user categories, but why should a subpage ever be categorized? Maybe essays, but that's all I can think of. It seems that a bot should have the list of acceptable categories, rather than a list of unacceptable ones. That list would be needed before a bot could run, I would think. ▫ JohnnyMrNinja 05:48, 29 June 2012 (UTC)
Firstly, there should be no reason why a bot couldn't take any category out of a sandbox. They are after all sandboxes. Should I try and get policy sorted on it? The main reasons why user pages turn up in content categories is either usified articles or draft articles that have been given categories. -- Alan Liefting (talk - contribs) 06:01, 29 June 2012 (UTC)
I just posted on your talk page as well, but I'll repeat the basics here: cats shouldn't be blindly removed from userspace pages; they should be neutralized by putting a ":" before the "Category". This makes it much easier for the people creating articles in userspace to move the pages to the mainspace and to check that the categories they want to use actually exist, while still having the wanted benefit of removing non-mainspace pages from mainspace categories. Please don't bot-remove cats from userspace (or other namespace) pages. Fram (talk) 07:31, 29 June 2012 (UTC)
I'm with Fram on this. Shredding an article's categories during a brief userfication is unhelpful. Either insert a ':' or comment out the cats (does commenting out work for cats? I'm not sure). Stuartyeates (talk) 07:47, 29 June 2012 (UTC)
Yep, yep. Using the colon or commenting is just as good. Achieves the same thing.-- Alan Liefting (talk - contribs) 07:53, 29 June 2012 (UTC)
@JohnnyMrNinja: Userboxes in userspace may be in appropriate userbox categories, for one thing. Other types of "templates" might also. User scripts are another thing that might be usefully categorized, and categories used by bots might be legitimately added to user subpages. And occasionally a sandbox is placed in a (non-content) category specifically to test something related to categorization. Anomie 14:13, 29 June 2012 (UTC)
I think it would be great if someone could automate this. I suggest you read User talk:BattyBot#Bot changing things on user's sandbox for the issues I ran in to. Good luck! GoingBatty (talk) 16:58, 29 June 2012 (UTC)
I think I'm already approved for that with my (fully automated) bot's fourth task. I've only been doing Category:Living people but could easily do more from your polluted categories page. Consider it done. Rcsprinter (deliver) 15:28, 6 July 2012 (UTC)
Yes, you are. Anomie 18:03, 6 July 2012 (UTC)

While many such issues can be resolved by commenting out the category from the sandbox page, some categories are embedded inside templates. Is it possible to add namespace detection in such templates so that the categories are only included on article pages? See categories with the hidden template {{polluted category}} for many examples. Thanks! GoingBatty (talk) 19:51, 30 June 2012 (UTC)
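For whoever ends up running this, the neutralisation Fram describes is essentially a one-line substitution. A sketch; it deliberately leaves already-neutralised links alone and does nothing about categories added through templates, which is the separate problem raised just above:

# Sketch: neutralise bare category links in a userspace draft by prefixing a colon.
import re

def neutralise_categories(wikitext):
    # [[Category:Foo|sortkey]] -> [[:Category:Foo|sortkey]]; [[:Category:...]] is untouched.
    return re.sub(r'(?i)\[\[\s*(Category\s*:)', r'[[:\1', wikitext)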

broken external links, discovery.co.uk

All old links to discovery.co.uk now redirect to dsc.discovery.com. That means that 112 links to discovery.co.uk, most of them deep links in refs, need fixing through archive.org or such. I'm not sure how best to sort this out; one possibility might be to simply add {{wayback}} to every link to discovery.co.uk. From the few tests I made it seems many, but not all, of these pages are archived at archive.org. Just adding wayback without a date isn't ideal, but it should be pretty straightforward? Finn Rindahl (talk) 22:13, 7 July 2012 (UTC)

Recently, some concerns were raised on the External links noticeboard about links to Wikimapia:- [Wikipedia:External_links/Noticeboard#Wikimapia]

The response from there was that, based on the criteria, links to Wikimapia weren't eligible.

I then checked here: http://en.wikipedia.org/w/index.php?title=Special:LinkSearch&limit=5000&offset=0&target=http%3A%2F%2F*.wikimapia.org

And found there were quite a few pages using them, sometimes as references, sometimes as external links.

I was told that a bot might be able to handle removals. Sfan00 IMG (talk) 17:54, 8 July 2012 (UTC)

IMO external links are probably OK, but it shouldn't be used as a reference. Kumioko (talk) 18:09, 8 July 2012 (UTC)

New Anime/Manga Project Cleanup listing bot needed.

It appears that the bot used to generate the pages Wikipedia:WikiProject Anime and manga/Assessment/Cleanup listing and Wikipedia:WikiProject Anime and manga/Cleanup task force/Cleanup listing stopped working in March 2010 and has been down ever since. Is there a bot that can be used to replace the old one and keep these pages auto-updated each month? - Knowledgekid87 (talk) 22:17, 9 July 2012 (UTC)

See User:Svick/WikiProject cleanup listing. The one for Anime and manga is here. LegoKontribsTalkM 10:43, 10 July 2012 (UTC)

Can a bot create a table from a list of users?

Is there a bot that could create a table with a couple of columns from the list of users at What links here for {{retired}}? The table would need columns for "last contact attempt" and "notes". Ryan Vesey Review me! 19:45, 6 July 2012 (UTC)

I don't know about needing a bot for this. I could do it when I get home pretty quickly but if you need it to be updated frequently then a bot might be needed. Kumioko (talk) 19:56, 6 July 2012 (UTC)
Once the table is created, the rest will be manual. There's 2.5 thousand transclusions, so unless you know something about making tables that I don't, you're in for a lot of work. I considered using AWB to pull up a list, which would be an acceptable alternative and could be turned into a table slowly over time. Ryan Vesey Review me! 20:03, 6 July 2012 (UTC)
If you need it "to be updated frequently" (as Kumioko mentioned), then User:DrTrigonBot could copy the content of What links here to any page you want in any format (more or less ;) that you like... Please let me know when you need it or if you need help setting it up! Greetings --DrTrigon (talk) 21:13, 6 July 2012 (UTC)
Is there any reasonable way for DrTrigonBot to only copy new occurrences? Otherwise, I could create a list page for the bot and we could update the table based on the changes. (If it did it like once a week or something along those lines). Ryan Vesey Review me! 22:42, 6 July 2012 (UTC)
What do you mean by "reasonable way"? Just take a look at the target pages version history and e.g. the last diff and you can easily see the new added entries... (there's no need to re-invent the wheel ;)) Greetings --DrTrigon (talk) 08:59, 7 July 2012 (UTC)

Here you go, I hope this is close to what you want. I added an extra column for the talk page because I noticed some didn't have a main page and thought it might be helpful. Here are a few other things I noticed:

  • Of the 2500 some were dupes so I removed them
  • Of the 2002 that remained, some were templates and other non-user stuff, so I removed that as well.
  • Some of these are IP's
  • Some of these are indef blocked or banned and not retired (I assume they added the retired template prior to their being indeffed or banned).
  • At least a couple are deceased as in the case of Cumulus clouds.
  • Some are archives and sandboxes that IMO shouldn't have the retired template.

Please let me know if you need anything else. Kumioko (talk) 22:22, 6 July 2012 (UTC)

Thank you very much for that. I'll move it to the relevant project page soon. Ryan Vesey Review me! 22:42, 6 July 2012 (UTC)
No problem, you're welcome. Kumioko (talk) 00:55, 7 July 2012 (UTC)
When the Toolserver catches up, you might find this tool of limited use, tools:~dispenser/cgi-bin/useractivity.py/Wikipedia:WikiProject Editor Retention/Retired editors list. — Dispenser 20:03, 7 July 2012 (UTC) Updated link — Dispenser 18:58, 11 July 2012 (UTC)
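If the list does end up needing periodic refreshes, the table generation itself is simple; a sketch assuming pywikibot (the getReferences namespace filter keeps it to User and User talk pages, and the empty cells are for the manual "last contact attempt" and "notes" columns requested above):

# Sketch: build a wikitable of pages transcluding {{retired}}.
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
template = pywikibot.Page(site, 'Template:Retired')

rows = []
for page in template.getReferences(onlyTemplateInclusion=True, namespaces=[2, 3]):
    rows.append('|-\n| [[%s]] || || ' % page.title())

table = ('{| class="wikitable sortable"\n'
         '! Page !! Last contact attempt !! Notes\n'
         + '\n'.join(rows) + '\n|}')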

Bot to target icon usage of non-free images

I'm sure this won't pass, so I'm not holding my breath. Not meaning to be sarcastic per se, just cognizant that every effort to enforce NFCC is shot down these days with nauseating regularity. But anyway...

WP:MOSLOGO notes that non-free images are "nearly always prohibited" as icons. Yet, I routinely see non-free images being used as icons.

Case example; File:Hezbollah Flag.jpg came up at Wikipedia:Non-free_content_review#File:Hezbollah_Flag.jpg recently. I found this interesting because I have, in the past, removed the image from many articles for failing WP:NFCC #10c and WP:MOSLOGO. It keeps getting restored anyway, especially in uses as icons. It is in fact used 21 times as an icon in various articles.

I propose, therefore, that a bot be created that patrols mainspace looking for icon uses of non-free images such that the rendered image is 30 pixels or less. For example, use in conjunction with {{flagicon}}. In operation, the bot would remove the use. If applicable, it would remove the template where the non-free image is used as a parameter to that template. Further, a notification be placed on the talk page of the article in question explaining why the image was removed.

I hope you prove my first paragraph wrong. --Hammersoft (talk) 23:08, 11 July 2012 (UTC)

I think the problem with these types of requests (and why they get declined) is that bots are really bad at making judgement calls, and even worse at deciding consensus. Once the NFCR is finished, if the decision is to remove all the links, a bot could do that quite easily (and many are already approved for such tasks). -- Selket Talk 17:01, 12 July 2012 (UTC)
  • This isn't about the NFCR. That's only about less than two dozen uses. I just used it as an example. I could wipe those out easily enough by hand. The issue is global, and we already have guidance, as quoted, that such use is almost never permitted. We already have consensus for removal. --Hammersoft (talk) 17:47, 12 July 2012 (UTC)
I think a bot is needed, but the only question to ask is whether there would be any false positives if we targeted images that render at 30px or less:
  • Is there any case where we'd render an image (outside of the Wikipedia: namespace) at 30px and it would not be treated as a logo/icon?
  • Are there any valid non-free images that are already at only 30px or less?
We may need an opt-out template for these if there's more than a handful. --MASEM (t) 18:02, 12 July 2012 (UTC)
It seems to me that this task would require scanning a database dump to find images rendered at 30px or less, and then cross-referencing the list against Category:All non-free media. So if someone is interested in running this bot, the thing to do to answer those questions would be to run that query, post the list of such images somewhere, and look through them for false positives. It may even turn out that such a report is the best way to go about this. Anomie 20:08, 12 July 2012 (UTC)
  • I believe Anomie is only suggesting a one-time data dump to make sure that we're not talking 10%+ of false positives, before deciding whether a continuous bot is a good idea. If a sizable chunk of images that meet that metric are not icon-type uses, then yes, a bot may not be the best way to proceed. --MASEM (t) 23:48, 12 July 2012 (UTC)
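A sketch of the one-off report Anomie describes, assuming the wikitext of each dump page is scanned for explicitly sized image links and the hits are cross-checked against Category:All non-free media with pywikibot. It only catches [[File:...|Npx]] syntax; sizes set inside templates such as {{flagicon}} would need a separate pass:

# Sketch: from one page's wikitext, collect file links rendered at 30px or less,
# then keep only the ones tagged as non-free.
import re
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
SIZE_RE = re.compile(r'\[\[(?:File|Image):([^|\]]+)[^\]]*?\|\s*(\d+)px', re.I)

def small_images(wikitext):
    return set(name.strip() for name, px in SIZE_RE.findall(wikitext) if int(px) <= 30)

def nonfree_small_images(wikitext):
    hits = []
    for name in small_images(wikitext):
        filepage = pywikibot.Page(site, 'File:' + name)
        if any(c.title() == 'Category:All non-free media' for c in filepage.categories()):
            hits.append(name)
    return hits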