
Wikipedia:Bot requests/Archive 49

From Wikipedia, the free encyclopedia

Creating a report

Is there a bot that can scan all Wikipedia articles and make a report of all cases, where links to Wikipedia articles exist between ref-tags? See for example ref no.7 at Data (Star Trek) for what I mean. Then I could go through that list and fix them systematically. I don't know whether the fixing of this also could/should be done by a bot or not. If it could/should be fixed by a bot, then of course there is no need to do it manually. -- Toshio Yamaguchi (tlkctb) 13:50, 15 July 2012 (UTC)

Also, I don't know how common a problem that is. Perhaps it's just a few cases but I have no idea how to find out. -- Toshio Yamaguchi (tlkctb) 14:40, 15 July 2012 (UTC)

I think there are at least three main cases to look for:
  1. References that contain wikilinks (e.g. <ref>[[Clues (Star Trek: The Next Generation)]]</ref>)
  2. References that contain URLs (e.g. <ref>[https://en.wikipedia.org/wiki/Clues_%28Star_Trek:_The_Next_Generation%29 Clues (Star Trek: The Next Generation)] wikipedia.org</ref>)
  3. References that contain citation templates (e.g. <ref>{{cite web|url=https://en.wikipedia.org/wiki/Clues_%28Star_Trek:_The_Next_Generation%29 |title=Clues (Star Trek: The Next Generation) |publisher=Wikipedia |accessdate=July 15, 2012}}</ref>)
GoingBatty (talk) 14:48, 15 July 2012 (UTC)
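The three cases above could be detected mechanically. A minimal sketch (the patterns are illustrative only, and assume well-formed, non-nested <ref>...</ref> pairs; a real scan would need proper wikitext parsing):

```python
import re

# Illustrative patterns for the three cases listed above; not exhaustive.
REF = re.compile(r'<ref[^>/]*>(.*?)</ref>', re.DOTALL | re.IGNORECASE)
WIKILINK = re.compile(r'\[\[')                             # case 1
WP_URL = re.compile(r'https?://en\.wikipedia\.org/wiki/')  # cases 2 and 3

def self_referencing_refs(wikitext):
    """Return the bodies of <ref> tags that link back to Wikipedia."""
    hits = []
    for body in REF.findall(wikitext):
        if WIKILINK.search(body) or WP_URL.search(body):
            hits.append(body)
    return hits

sample = '<ref>[[Clues (Star Trek: The Next Generation)]]</ref>'
print(self_referencing_refs(sample))
```

A reporting bot would run this over a dump and emit one line per article with at least one hit.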
Would it be more appropriate to post this on Wikipedia talk:Database reports? GoingBatty (talk) 14:59, 15 July 2012 (UTC)
Perhaps .... to be honest, I don't know. I guess that depends on how this report would be generated and updated. If it were maintained and updated by a bot, then I guess here is the correct place. If it should be a database report on the toolserver, then perhaps Wikipedia talk:Database reports is the more appropriate place. -- Toshio Yamaguchi (tlkctb) 15:09, 15 July 2012 (UTC)

US cities update to 2010

I've already coded a bot for updating US cities to 2010 census data, and I've been informed I'm supposed to request permission to run it. I've never done any other significant editing on Wikipedia before, just a few small edits here and there for current events, so I was unaware of this policy. I've already updated Iowa, North Dakota, South Dakota, Nebraska, Kansas and half of Missouri if you want to see what my work looks like. The person who posted on my talk page said I could get banned if I don't get permission, so here I am. Is it okay with the community if I continue to update cities to the 2010 census data or not? Jamo2008 (talk) 17:16, 15 July 2012 (UTC)

Done rightly, this is an extremely valuable possibility. Our US cities' demographics were created by bot with 2000 data, and a 2010 update is sorely needed. However, Jamo is deleting the old information wholesale, while consensus on the issue has held that it should be kept as a separate subsection, and he's also changing around geography numbers against the sources. A properly done bot will only add sourced information and will not ignore consensus; given Jamo's ability, I am confident that he can reprogram his bot to be within the bounds of what has previously been accepted. Nyttend (talk) 17:43, 15 July 2012 (UTC)
The actual place to go would be WP:Bots/Requests for approval. --Izno (talk) 18:06, 15 July 2012 (UTC)

Biography articles without WPBiography tags

Is it possible to generate a list of biography articles (all those with a 'XXXX births' category) that lack a {{WPBiography}} tag on their talk pages? I've noticed a lot of these around, and was wondering if anyone was able to help clear that backlog or find out how large it is? I think there were bots or editors who dealt with this in the past, but it doesn't seem to be being done any more. Carcharoth (talk) 12:12, 2 July 2012 (UTC)

Yobot was doing this but the last I heard that bot was blocked due to some petty complaints about it doing minor edits. Kumioko (talk) 13:35, 2 July 2012 (UTC)
Blocked on 13 April 2012. I've been trying to load the talk page contributions of Yobot to find when it last did this, but it is taking a long time to load for some reason. Ah, here we go. General fixes are not what I'm looking for; I'm looking for a dedicated bot specifically for this task. And my other question was whether there is a way to generate a list of articles where this task could be done, i.e. is anyone available to actually help fulfil the request? I could provide a list of about 50 articles right now, but I'd like to actually learn how to scan categories and generate lists - anyone willing to provide guidance there? Carcharoth (talk) 15:24, 2 July 2012 (UTC)
Try using catscan to find the article list. Keith D (talk) 15:48, 2 July 2012 (UTC)
There are thousands of articles that need it without a doubt. I have added it to a couple hundred in the last month or so but there are tons more. 138.162.8.57 (talk) 17:36, 2 July 2012 (UTC)
I can generate a list of articles using the toolserver; give me a few minutes to see what I can do. — Carl (CBM · talk) 23:45, 2 July 2012 (UTC)
If at all possible, could you identify living and nonliving? IMO all the other fields can be populated as we find the time, but it's important to identify living people. Kumioko (talk) 00:03, 3 July 2012 (UTC)
That's not a problem. But the replication lag on the toolserver is very high, so the queries I am running to make the list keep getting terminated before they finish. Once I find a way to make it work I'll post a link and give the people who posted here a talk page note. Probably tomorrow at the earliest. — Carl (CBM · talk) 01:00, 3 July 2012 (UTC)
No problem Carl thanks. Kumioko (talk) 03:12, 3 July 2012 (UTC)
On a side note, this probably should have been requested at WP:Database reports. --Izno (talk) 23:24, 5 July 2012 (UTC)
Would you or someone be able to make that request? Is the difference that this would repeat periodically and not be dependent on a bot that may or may not be blocked (or may or may not be busy with other stuff)? Carcharoth (talk) 01:24, 9 July 2012 (UTC)

Yobot can add the banner to all the pages that need it and don't have it. In the old days I was doing only this, but at some point we ran out of pages. :) -- Magioladitis (talk) 03:50, 6 July 2012 (UTC)

I generated a list of untagged bio articles and put it at http://toolserver.org/~cbm/untagged.txt . The process I used to create it is described at [1] so that someone else with toolserver access could recreate it later. I tried to save it to a wiki page but the system choked on it :(. There is a 0 or 1 beside each title to say whether the page is in Category:Living people. — Carl (CBM · talk) 12:29, 7 July 2012 (UTC)
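For anyone scripting against Carl's list, here is a sketch of consuming it. The tab-separated "title, then flag" layout and the exact banner text are assumptions for illustration, not confirmed details of the file:

```python
# Sketch of consuming the untagged-bio list: one "<title>\t<flag>" line
# per entry is assumed here; the actual file layout may differ.
def parse_untagged(lines):
    """Yield (title, living) pairs, where living is True for a '1' flag."""
    for line in lines:
        line = line.rstrip('\n')
        if not line:
            continue
        title, flag = line.rsplit('\t', 1)
        yield title, flag == '1'

def banner_for(living):
    """Talk-page tag a bot could add (parameters beyond |living= omitted)."""
    return '{{WPBiography|living=%s}}' % ('yes' if living else 'no')

sample = ['Alan Turing\t0', 'Some Person\t1']
for title, living in parse_untagged(sample):
    print(title, banner_for(living))
```

The living flag matters most, per Kumioko's point above; listas and work-group parameters can be filled in later passes.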

Can you please unblock Yobot to perform this task? I won't use Yobot for CHECKWIKI error fixes until there is a clear consensus. -- Magioladitis (talk) 13:13, 7 July 2012 (UTC)
 Done in good faith to allow it to handle this task. Be sure it doesn't violate WP:COSMETICBOT in any way. Anomie 15:17, 7 July 2012 (UTC)
I for one am glad to hear that Yobot has been unblocked, but I still think that too many are finding reasons why edits "shouldn't be made". The checkwiki errors have been determined to be edits that a lot of editors think should be made, so a minority of high-profile and high-ranking (for lack of a better word) editors shouldn't be putting up walls to enforce their opinion that these edits shouldn't be done. IMO these edits should be done by bot rather than by people. Kumioko (talk) 15:56, 7 July 2012 (UTC)
Yobot was not unblocked to resume that task, and this isn't the place for that discussion. Let's keep things on-topic here. Anomie 19:44, 7 July 2012 (UTC)
No worries, just saying. No one wants to hear it anywhere no reason this venue would be any different. Kumioko (talk) 21:10, 7 July 2012 (UTC)
Thanks Anomie. I'll start adding the banners of WPBiography tomorrow. Now I'm a bit busy with Wikimania :) -- Magioladitis (talk) 00:25, 8 July 2012 (UTC)

Thanks to Carl for generating the list. How many articles are on that list? It is not particularly urgent, but is there any reason Yobot has to do it rather than another bot? It seems a bit pointless unblocking a bot to do this if the owner is busy doing other stuff. I'd be happy to wait until after Wikimania, but I think Izno is right that this should be a database report that would allow more people to help do this, rather than leaving it all to one person or one bot. Carcharoth (talk) 01:24, 9 July 2012 (UTC)

55,500 pages. The bot started just now. The task is not urgent; the project already has 800,000 pages to take care of. :) After finishing the tagging, I'll do another run to add living yes/no and fix any other parameter problems. -- Magioladitis (talk) 01:51, 9 July 2012 (UTC)
I was also going to add that there are a couple of DB reports that capture them but they only show the first 1000 so its very limited use. Kumioko (talk) 02:05, 9 July 2012 (UTC)
The problem with Carl's report is that unicode characters don't show correctly. I am unable to load pages with unicode characters in their titles from this list. -- Magioladitis (talk) 02:59, 9 July 2012 (UTC)


Grrrr, thanks a lot guys. I've had Category:Biography articles without listas parameter and Category:Biography articles without living parameter cleaned out for a while. Now you go and make work for me again. Think of the poor schmuck (aka me) that gets to add parameters to the tag.

I do have one request. Could LaraBot go through the list first? LaraBot will add living and stub class to the Biography banner for living biographies. That will help save me some time. I'm currently the only one that tags new living articles, adds living and adds listas, plus I add class and work-groups. Anyone want to volunteer to help out? I use AWB with some scripts by Magioladitis and Kumioko that make things go faster. Bgwhite (talk) 04:39, 9 July 2012 (UTC)

Yobot will add the living parameter to most of them. I can help with the listas parameter afterwards. I don't know how fast I can finish with the 55k pages. If I have wi-fi in Washington D.C. I'll be running the bot day and night to finish faster. -- Magioladitis (talk) 11:29, 9 July 2012 (UTC)
I'll help out too. I recommend we give your bot a few days to do what it can and then we can come in behind it and clean up the rest. Kumioko (talk) 14:02, 9 July 2012 (UTC)
Thanks guys. It is really depressing to see the category go back up. Sure... give the bot a few days. Hmmmm, partying hard after Wikimania sessions every night would have nothing to do with it. :) Bgwhite (talk) 20:27, 9 July 2012 (UTC)
Thanks from me as well. I had thought this backlog had accumulated, but hadn't realised it had gone over 50,000. Hope not too many of those are PROD or AfD material. Carcharoth (talk) 23:04, 9 July 2012 (UTC)
I found a few AFDs, but it's hard to tell how many there are. Kumioko (talk) 01:01, 10 July 2012 (UTC)

The bot is running as fast as it can, but I wonder if we should use group runs in the future to reduce the time huge tasks take to finish. -- Magioladitis (talk) 04:06, 17 July 2012 (UTC)

bot to fix redirects caused by a site moving?

The New Zealand Electronic Text Centre recently moved domain from http://www.nzetc.org/ to http://nzetc.victoria.ac.nz/. The old links redirect to the new ones; no content was removed, and all the links should redirect to the same content on the new domain. There are 2455 links in en Wikipedia (mainly in references) to this site. Some of the links include fragment identifiers pointing to specific places in long pages of text. Would it be possible to change these in an automated or semi-automated fashion? Ideally I'd also like (a) to be able to check that links actually work, putting a list of broken links in my user space or user talk space for me to fix, and/or (b) to check for links on foreign wikipedias. See previous discussion of the move. I have a conflict of interest with this website. Stuartyeates (talk) 00:19, 13 July 2012 (UTC)

Is there any indication that the old links will eventually break? Some editors might feel that since all the links redirect, it's WP:NOTBROKEN. (And yes, I know that applies to Mediawiki redirects, but the same arguments can used here.) Avicennasis @ 03:51, 27 Tamuz 5772 / 03:51, 17 July 2012 (UTC)
Yes, the old URLs are going to go away, but only in the long term (5-10 years). Stuartyeates (talk) 20:34, 17 July 2012 (UTC)
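A host swap that preserves paths, queries, and the fragment identifiers Stuart mentions could look like the following sketch (the set of old hostname variants handled is an assumption):

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOSTS = ('www.nzetc.org', 'nzetc.org')  # assumed variants of the old domain
NEW_HOST = 'nzetc.victoria.ac.nz'

def migrate(url):
    """Swap the host, keeping path, query, and fragment identifier intact."""
    parts = urlsplit(url)
    if parts.hostname not in OLD_HOSTS:
        return url
    return urlunsplit((parts.scheme, NEW_HOST, parts.path,
                       parts.query, parts.fragment))

# fragment identifier survives the rewrite
print(migrate('http://www.nzetc.org/tm/scholarly/tei-corpus.html#name-1234'))
```

A link-checking pass could then issue a HEAD request for each rewritten URL and log failures to a user-space page, as requested.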

Looking for a Commons bot operator to do a simple category merger

I've made a request at Commons:Commons:Bots/Work requests, section "Category merger", for a bot to merge categories for nearly two thousand images, but the only help I've gotten so far is a pointer to a bot that hasn't been active in six months. Could anyone here write a bot to help there? Nyttend (talk) 02:45, 17 July 2012 (UTC)

BRFA filed here. Avicennasis @ 03:16, 27 Tamuz 5772 / 03:16, 17 July 2012 (UTC)
Thanks for the help! Nyttend (talk) 03:27, 17 July 2012 (UTC)

ChzzBot IV

Wikipedia:Bots/Requests for approval/ChzzBot IV (edit | talk | history | links | watch | logs)

Chzz has been retired since January/February and his bot stopped in May, so would somebody take over this task? mabdul 15:45, 11 July 2012 (UTC)

Sure, if someone hands me the code. Rcsprinter (state the obvious (or not)) @ 21:35, 14 July 2012 (UTC)
Although Chzz is retired, he is still answering emails. I bet that he would hand over his C# code if you asked him; depending on my time I will ask him. Sadly, he didn't publish it... mabdul 21:53, 14 July 2012 (UTC)
So far I haven't gotten any response from Chzz... mabdul 17:55, 20 July 2012 (UTC)

Wikilivres site move

Wikilivres is a site operated by Wikimedia Canada to host texts and images that are free in Canada but can't be hosted on WMF projects (i.e. Commons requires works to be free both in the US and the source country; due to differing copyright durations and the URAA restoration, many works are now free in their source country but remain copyrighted in the US). It appears that they recently moved from wikilivres.info to wikilivres.ca. The old URLs are not forwarding, and this has left us with a fair number of broken links that could be easily fixed, either by replacing "wikilivres.info" with "wikilivres.ca" or by using the interwikilink prefix wikilivres:. Thanks, cmadler (talk) 15:51, 18 July 2012 (UTC)

 Doing... Easy AWB fix. Rcsprinter (state) @ 16:01, 20 July 2012 (UTC)
 Done Rcsprinter (chat) @ 09:29, 21 July 2012 (UTC)

Create a table from a bunch of data

Is someone with a bot able to turn the data at User:Ryan Vesey/Minnesota State Legislators into a table with 4 or 5 cells? Right now the information exists as "Legislator and link", "Term/district/body of legislature", "link to website", "citation". It would be great if that could be turned into a table and a notes cell could be added. Ryan Vesey Review me! 15:57, 20 July 2012 (UTC)

I don't need a bot to do that, but if you could tell me what columns you want and what you want in them, I can make that for you. Kumioko (talk) 17:42, 20 July 2012 (UTC)
This is how the first row would look. I wouldn't suggest that you create the table manually unless you really want to and enjoy making them. It's a time saver for me, but not entirely necessary. Ryan Vesey Review me! 18:12, 20 July 2012 (UTC)
{| class="wikitable"
! Legislator !! District !! Link !! Citation !! Note
|-
| Bruce W. Anderson || House 1977-82 (District 26A); House 1983-84 (District 28A) || Anderson, Bruce W. "Buzz" || {{cite web|title=Anderson, Bruce W. "Buzz"|url=http://www.leg.state.mn.us/legdb/fulldetail.aspx?ID=10014|work=Legislators Past and Present|publisher=Minnesota Legislative Reference Library|accessdate=19 June 2012}} ||
|}
Done, take a look here. Feel free to revert if you don't like it. Kumioko (talk) 01:35, 22 July 2012 (UTC)
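For future requests of this kind, the conversion can be scripted rather than done by hand. A minimal sketch, assuming the source data can already be read as (legislator, district, link, citation) tuples (the field layout is an assumption based on Ryan's description):

```python
def make_table(rows):
    """Render (legislator, district, link, citation) rows as a wikitable,
    adding an empty Notes column as requested."""
    out = ['{| class="wikitable"',
           '! Legislator !! District !! Link !! Citation !! Notes']
    for legislator, district, link, citation in rows:
        out.append('|-')
        out.append('| %s || %s || %s || %s ||'
                   % (legislator, district, link, citation))
    out.append('|}')
    return '\n'.join(out)

rows = [('Bruce W. Anderson',
         'House 1977-82 (District 26A); House 1983-84 (District 28A)',
         '[http://www.leg.state.mn.us/legdb/fulldetail.aspx?ID=10014 Anderson, Bruce W. "Buzz"]',
         '{{cite web|...}}')]
print(make_table(rows))
```

The output pastes directly into a wiki page; the empty final cell in each row is the notes cell to be filled in later.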

replacement RMbot, or a fix for RMbot

User:RM bot is currently down, and causing problems for WP:RM. So, a fix, or replacement would be useful. See the discussions at Wikipedia:Village_pump_(technical)#RM_bot_inactive and WT:RM#What's the deal? -- 76.65.131.160 (talk) 08:58, 23 July 2012 (UTC)

Tagging Latin music project

Okay, the Latin music task force was converted to its own WikiProject. When it was still a task force, it used the Template:WikiProject Latin America banner, but now it has its own banner. What I need is for a bot to replace the previous banner with the new banner. If it's possible, carry the assessment over from the old banner to the new one; that is, the quality rating from the Latin American articles and the music-importance parameter. Here are the categories that need to be replaced: Category:South American music Category:Central American music Category:Mexican music Category:Cuban music Category:Dominican Republic music Category:Puerto Rican music

In addition to the above, I need a bot to tag these articles as the scope of the project has expanded: Category:Spanish music Category:Portuguese music Category:Cape Verdean music

Thanks! Erick (talk) 02:52, 24 July 2012 (UTC)

Try Anomie. His bot is the best one still running for this sort of task. Most of the rest of them have gone inactive. Kumioko (talk) 03:03, 24 July 2012 (UTC)

Smackbot and Femtobot tasks

As far as I can tell, the majority of the tasks previously done by Rich F's bots have not been picked up by anyone. I'm not going to write a long comment about it, because I believe few if any besides me care, so there's no reason to waste my time. I just want to say that there are still a lot of tasks that Rich's bots did that need adopting, if anyone feels so inclined. One in particular is the WikiProject Watchlists that Femtobot was creating. Kumioko (talk) 18:13, 7 July 2012 (UTC)

What's useful about these WikiProject Watchlists? What's the advantage over something like tools:~tim1357/cgi-bin/wikiproject_watchlist.py? Or just a better WikiProject Dashboard? — Dispenser 19:41, 7 July 2012 (UTC)
Well, I like Tim's tool, but I don't think the list can be pulled into AWB; at least I haven't figured out how. To answer the first question, it's useful in the same way as an individual's watchlist, only for a project. Kumioko (talk) 21:02, 7 July 2012 (UTC)
BTW, that was just one example. As far as I know only a couple of the tasks were taken up, and that was by Anomie dating the maintenance tags. Kumioko (talk) 15:19, 9 July 2012 (UTC)
In a perfect world, all users would have perfect vision and could see the Toolserver list the same way, and would have the latest computers, faster than lightning. It's a lot more difficult to look at than the Femtobot project list, and it takes longer to load on the page. Do you know a way users can customize the skin on the Toolserver list? Also, many - but not all - projects already have the Femtobot watchlist previously set up, and generate a list for the projects. The problem now, since Femtobot is down, is that those daily generated lists don't include any articles added since Femtobot went down. So, Wikipedia is delivering incomplete information to the projects. Maile66 (talk) 16:22, 9 July 2012 (UTC)
The list can be built using Catscan: select the desired namespaces, under templates use the project banner name, and select "Wiki" as the output format. This'll work for projects up to 30,000 pages. — Dispenser 04:17, 12 July 2012 (UTC)
That may be true, but I have never been able to use that application with any degree of confidence that the data I am pulling in is right. There always seems to be something wrong, usually articles in the list that shouldn't be there. With the block-first-and-ask-questions-later mentality that a lot of people have on WP today, I just don't trust it. Plus it's just hard to use. Kumioko (talk) 11:18, 12 July 2012 (UTC)
Also, this assumes two factors my experience does not find in place on any project: (1) a designated person consistently in charge of a given project overseeing the processes; or (2) identical technical skills and mental functionality for all editors, experienced or not. Maile66 (talk) 14:42, 12 July 2012 (UTC)

Kumioko, I just ran Catscan according to the above instructions. There was an output, all right. It's not in any kind of format that would be usable to the projects. For the targeted audience - the casual lurker or reader at a given project - this is all gobbledygook that doesn't make sense. I don't even know the why or the wherefore of its output. It's just a long list of anything and everything in the project, with no rhyme or reason why it's there. There is nothing clickable in the output, so that right there torpedoes its usefulness. And when you tell your browser to go back to the Catscan page, there's a list (yes, clickable) that just seems to be a random listing of any old article in the project. Big whoop. But to give you an example: in the last week, I've created three different articles for the Hawaii project. None of them showed up in the output. So, if I narrow the search to one of those three articles I actually created, the output is this:

== Subset ==
{| border='1'
!
|-
|}
----
;querytime_sec:1.480544090271
;total_categories_searched:0
;query_url:http://toolserver.org/~magnus/catscan_rewrite.php?ns%5B0%5D=1&ns%5B4%5D=1&templates_yes=mahi+beamer&format=wiki

If clicked, that just takes you back to Catscan and nothing useful. Whatever else this tool is, it's no comparison to what Femtobot did for the projects. At least Rich's bots were automated. Even if Catscan were more user-friendly, it would still take a dedicated individual to run it manually. Do you know any projects that have someone with nothing else to do with their time, who plans on being around and available to the projects indefinitely? Somewhere, I swear I hear Rich Farmbrough singing, "One of these days...you're gonna miss me, honey..." Maile66 (talk) 13:35, 19 July 2012 (UTC)

Yeah, I hate to say it, but I pretty much knew it was gonna happen this way and tried to explain it. Catscan is an OK app for certain things, but one thing it's not is easy to use. As for the fallout from Rich: ArbCom makes a decision, and the rest of us users who are trying to be productive and get things done are left with the fallout, because no one wants to take up the tasks and ArbCom doesn't really care. They're mostly out of touch with the realities of editing and just trying to clear their plate. It happened when Betacommandbot stopped working (a lot of those tasks haven't been replaced either) and it's happened again. That his bots were banned because a few users got annoyed with his bots filling their watchlists with minor edits has got to be one of the worst decisions and worst things to happen to WP in quite a while. It only makes me wonder which bot is next. Almost every bot that actually makes edits to articles is being systematically taken out by a few editors who don't like bots. I gotta get off my soapbox now, but the whole thing makes me very angry and makes me question where WP is gonna be in a couple more years. People say they want to recruit more editors and work on editor retention, and then they do stupid stuff like this. "shaking head"... Kumioko (talk) 14:58, 19 July 2012 (UTC)
I advocate the Toolserver for simple avoidance of the enwiki bull. Now, you haven't given any specifics about what you're doing, and you seem unfamiliar with parts of AWB (you missed the advanced HTML scraper and the list builder). You don't get any results in your example because Template:Mahi beamer doesn't exist. Assuming you want pages tagged with Template:WikiProject Hawaii banner, Catscan can easily find those pages, and we can exclude stubs, photo requests, infoboxes, and other fun stuff. (Note: thyme's copy of enwiki is a week behind due to rev_sha1 population; this could be improved by bringing money to Wikimedia-DE.) — Dispenser 06:22, 25 July 2012 (UTC)
I agree there is more bull in enwiki these days than on a West Texas cattle ranch, and actually I am very familiar with AWB, though admittedly I don't know every function. I have tried to use the HTML scraper and a couple of other functions against the toolserver for this but couldn't get them to work. I have described what we want, and no, I am not trolling. What I am trying to do is get back the functionality we already had, which was eliminated when ArbCom made the decision to kill a useful bot by blocking its operator for stupid reasons. Rather than rail on me for "trolling" or assuming bad faith: if you want to see what I want, take a look at what Femtobot did in its contributions here, or take a look here at how it worked, or take a look here at the edit history of a page it updated. Now I know that you are trying to find a simple solution, but the solution of using the toolserver is not easier for us, it's easier for you. I don't want to use something that requires a complicated series of steps to run, another complicated series of steps to pull into AWB, and a bunch of instructions for the average user. I am trying to make it simple for the project members that use it. If you aren't interested or don't know how to make it work, then that's OK, just say so, but don't blame or insult me for not accepting a solution that doesn't meet the requirements. The reason I didn't give any specifics is that everyone here should know how a watchlist works; they are capable of looking at the links I included above or at the history of Femtobot. As useful as those toolserver tools are, I also don't want to use week-old data, because that defeats the purpose of a watchlist, and lately the toolserver has been up and down like a roller coaster, so until they get that problem straightened out it's not a reliable solution. Kumioko (talk) 11:28, 25 July 2012 (UTC)
My edit summaries tend toward sarcastic. Anyway, looking closer at Femtobot's work, I see that it's junk, as it splits into as many as 20 pages, requiring each of those to be checked separately. Now, being dissatisfied with Tim's tool, I wrote my own: tools:~dispenser/cgi-bin/transcluded_changes.py/Template:WikiProject_Hawaii. It uses the RR server, which has no replag at the moment, and you can pull its changes into AWB using <span class="mw-title"><a href="[^<">]*?/wiki/([^<">]*)" with the "HTML Scraper (Advanced Regex)" with Group 1. — Dispenser 13:36, 25 July 2012 (UTC)
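Dispenser's scraper pattern can be sanity-checked offline against a sample of the kind of HTML the tool would emit (the sample markup below is an assumption for illustration, not the tool's actual output):

```python
import re

# Dispenser's AWB "HTML Scraper (Advanced Regex)" pattern; group 1 is
# the page title that AWB would pull into its list.
PATTERN = re.compile(r'<span class="mw-title"><a href="[^<">]*?/wiki/([^<">]*)"')

# Assumed sample of the tool's output markup.
html = ('<span class="mw-title">'
        '<a href="https://en.wikipedia.org/wiki/Hawaii" title="Hawaii">Hawaii</a>'
        '</span>')
print(PATTERN.findall(html))
```

Note the lazy `[^<">]*?` before `/wiki/` lets the pattern match both absolute and relative hrefs.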
Just for clarification on what the various projects need, it would be to resume what was already in place under "Bot status" on this page: Wikipedia:WikiProject_Council Femto Bot replaced WolterBot. And when that is up and running, I would like the Texas project and Women's History project added to it. This would need to be run for all projects by one person on a daily basis, and not dependent upon any one project or its transitory participants. In theory, like TedderBot is run. I'm not a technical person, so this would not be something I could do. Maile66 (talk) 14:03, 25 July 2012 (UTC)
@Dispenser, I do agree that it would have been better if it didn't break it up, but I never had a problem with it not showing the recent changes under Project/Recent changes. I think that was because it just cloned the titles of the most recently edited articles from the project. Personally I wish there were a better way to build a watchlist for a project, so if you have a suggestion that does not require every project to manually pull the list when needed and doesn't require a multi-step process, I am more than willing to listen and work with it. Although the tools you suggest are very useful in a variety of ways, those suggestions are simply not realistic as a routine maintenance tool for the WikiProjects. Kumioko (talk) 17:44, 25 July 2012 (UTC)
And in regard to Dispenser being dissatisfied with Tim's tool, it's a moot point now. First Tim's tool stopped working. Now it says the account expired after being inactive for over 6 months. Maile66 (talk) 18:37, 25 July 2012 (UTC)
So I do not have a complete understanding of what's going on here, but if you need e.g. results from catscan (or other tools) added here to Wikipedia on, say, a daily basis, please consider using User:DrTrigonBot, which is already doing this for other pages, projects and wikis. Maybe this is of some use to you?! Please contact me if you have problems figuring out how to set it up. Greetings --DrTrigon (talk) 14:43, 28 July 2012 (UTC)
Hello, DrTrigon. Can you please post links to examples of output, in regards to what your bot does. My issue may be only part of what Kumioko has in mind. But I can illustrate for you what I'm talking about with the projects. FemtoBot produced Watchlists for the Projects. Rich Farmbrough didn't get around to all the projects, but here's the Hawaii output. Since FemtoBot has been disabled, what it produces is somewhat out of kilter. It only lists changes for old pages, not picking up the new ones. It's an incomplete run. I have no way of knowing how many projects FemtoBot was linked to for this purpose, but the project Watchlists should be giving output just like your personal user Watchlist. I do know I wanted the Texas project and the Women's History project added, but FemtoBot got sidelined before that happened. Can DrTrigonBot duplicate the project Watchlists? If so, one person needs to set them all up. Not left up to the individual projects to set up themselves. They didn't set up the ones run by FemtoBot - Rich Farmbrough did that for them. Maile66 (talk) 19:29, 28 July 2012 (UTC)
If there are not more than 5000 pages to link, then User:DrTrigon/sandbox should work. But you have to be patient until the bot does its daily run. The output format can be changed at any time. Greetings --DrTrigon (talk) 20:39, 28 July 2012 (UTC)
Ok thanks, so is the bot limited to 5000 or was that just an arbitrary number you set for the test? Kumioko (talk) 20:53, 28 July 2012 (UTC)
No, it's an arbitrary number the wiki API sets for bots (look here). Everything else would involve requesting further pages, which cannot be done out of the box at the moment. Maybe I could/should include such a mode in the future... Greetings --DrTrigon (talk) 10:37, 29 July 2012 (UTC)
...but unfortunately, at the moment the API does not recognize the bot and returns only 500 results... does not look good. Sorry. Greetings --DrTrigon (talk) 10:45, 29 July 2012 (UTC)
Well, OK, Thanks. Kumioko understands this better than I do. But as I say, whatever is in place needs one single person (programmer) to oversee it as one unit for all the projects. It needs to be run daily for all the projects. And it should not be an "as needed" type of thing. A lot of the projects, if not most of them, don't really have anybody in charge for anything on the project page. It's kind of like that old "God is dead" theory - somebody sets up a project and then often never participates again - it's left up to the casual visitor on any given moment of any given day. Maile66 (talk) 15:22, 29 July 2012 (UTC)

updated number of articles in Wikipedia:WikiProject Chennai and their respective ratings

Hi, Can you please provide the updated number of articles in Wikipedia:WikiProject Chennai and their respective ratings? thank you so much.Challengethelimits (talk) 11:06, 24 July 2012 (UTC)

You could use User:WP 1.0 bot/Tables/Project/Chennai instead of the current one. LegoKontribsTalkM 01:41, 25 July 2012 (UTC)

Thank you so much Challengethelimits (talk) 04:10, 25 July 2012 (UTC)

ReflistBot

I think when we get drive-by IPs adding articles and the basis for a stub, we put a lot of them off editing because they don't know how to add references or how to see the finished result. So what about a bot (similar to the "this article cites no sources" message) which adds the references section to the bottom of articles that don't already have one, to make it easier? It's a pain for me to do without copying and pasting from another article, so a bot would be a big help. Thanks Jenova20 (email) 10:45, 25 July 2012 (UTC)

Hmmm - this is already part of AWB's general fixes. What would be the best way to find articles that are missing {{reflist}}? Thanks! GoingBatty (talk) 17:10, 25 July 2012 (UTC)
Adding it automatically to new articles? Or scanning them and adding them as they're found randomly? Sorry, not too experienced in this bit; it's just a shame that there are so many articles without reflists, and I think it puts editors off improving articles. Thanks Jenova20 (email) 17:19, 25 July 2012 (UTC)
I believe they're automatically categorized into Category:Pages with missing references list. The list was usually handled periodically by Helpful Pixie Bot, but that has been stopped by ArbCom decision or something. Jafeluv (talk) 17:55, 25 July 2012 (UTC)
Well, I just cut it from 22 to 13. The rest appear to be templates or something. So we can't get a bot to redo this?
Thanks Jenova20 (email) 19:35, 25 July 2012 (UTC)
I just picked off a few more using AWB, but some had to be fixed manually. GoingBatty (talk) 00:35, 26 July 2012 (UTC)
It's a never-ending job without a bot, though... Thanks Jenova20 (email) 08:10, 26 July 2012 (UTC)
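The check itself is simple to script. A rough Python sketch of what such a bot (or AWB's general fixes) would do — the function name is invented, and placement is simplified to appending at the end, whereas a real bot would insert the section before stub tags and categories:

```python
import re

def add_reflist(wikitext):
    """Append a References section with {{Reflist}} when the page uses
    <ref> tags but has no reference list yet. Simplified sketch: a real
    bot must place the section before stub templates and categories."""
    has_refs = '<ref' in wikitext
    has_list = re.search(r'\{\{\s*reflist|<references', wikitext, re.I)
    if has_refs and not has_list:
        wikitext = wikitext.rstrip() + '\n\n==References==\n{{Reflist}}\n'
    return wikitext
```

Pages already containing {{Reflist}} or a <references /> tag are left untouched.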

antivandalbot

I want to make a bot able to detect vandalism. (Charlie22712 (talk) 11:22, 25 July 2012 (UTC))

There is one, at least: User:ClueBot NG. Jafeluv (talk) 12:04, 25 July 2012 (UTC)
There are several - see Category:Wikipedia anti-vandal bots. GoingBatty (talk) 17:08, 25 July 2012 (UTC)
Yes, RscprinterBot, ClueBot NG and 28bot do handle reverting of vandalism, editing tests (though not continuously) and multiple instances of tests, respectively. However, regarding GoingBatty's point, not all of the bots in that category are operational. --Ceradon talkcontribs 22:51, 25 July 2012 (UTC)
Some of them are; ClueBotNG just edited a minute or two ago. Charlie, the problem with some vandalism is that it's too hard for a computer program to catch; you and I, as humans, can understand that it's vandalism, but if we wrote a program that would see it as vandalism, it would see large numbers of good edits as vandalism as well. Nyttend (talk) 21:28, 27 July 2012 (UTC)

narrow down digraph redirects

For those which don't already have it, could you add a section to redirects to List of Latin-script digraphs? That is, Dr (digraph) should redirect to List of Latin-script digraphs#D (just use the first letter of the title of the redirect page, ignoring acute accents on vowels: anything funky has been done manually). There are about 250 of them. Exceptions are variants of the article title: anything which has "digraphs" in the plural.

Also, could rd's to Trigraph (orthography) be redirected to List of Latin-script trigraphs, also with a letter-section link? (Don't worry about matching; there are anchors for each letter.)

Thanks, — kwami (talk) 20:58, 17 July 2012 (UTC)
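The mapping described above is mechanical enough to sketch. A hypothetical Python helper (the function name is invented; the accent-stripping rule follows the request's note about acute accents on vowels, and anything funkier stays manual):

```python
import unicodedata

def digraph_redirect_text(title):
    """For a redirect title such as 'Dr (digraph)', build the section
    redirect wikitext, using the first letter of the title and stripping
    accents such as acutes (per the request; odd cases stay manual)."""
    # NFD decomposition splits 'á' into 'a' + combining acute accent
    first = unicodedata.normalize('NFD', title[0])[0].upper()
    return '#REDIRECT [[List of Latin-script digraphs#%s]]' % first
```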

Coding... LegoKontribsTalkM 06:32, 20 July 2012 (UTC)
Followed up on requester's talk page. LegoKontribsTalkM 09:29, 24 July 2012 (UTC)
Thanks! — kwami (talk) 03:55, 29 July 2012 (UTC)

A bot to remove puffery

Hi, I've seen this: Wikipedia:PEACOCK#Puffery,

and I think that a bot could be implemented to automatically remove some of these words. For example, the word "virtuoso" could be easily removed. This word appears almost exclusively on the first sentence, and the formula is always similar:

Ferdinand David (19 June 1810 – 18 July 1873[1]) was a German virtuoso violinist and composer.

or:

Shlomo Mintz (born October 30, 1957) is an Israeli violin virtuoso and conductor.

In the first example, the word "virtuoso" could simply be removed, whereas in the second, word should be removed and "violin" turned into "violinist".

What do you think?

Thank you--Fauban 16:25, 21 July 2012 (UTC)

I think it would be hard to remove all of them by bot. Maybe some. I think it would be possible to add a tag to the article so that it can be reviewed but there are a lot of discussions about the use of tags lately so I'm not sure that would get much support. IMO the best course would be to create a page for AWB similar to the Typos page that would generate an alert about it so that it can be reviewed that way. Kumioko (talk) 16:29, 21 July 2012 (UTC)
I like Kumioko's suggestion about an alert on AWB. In addition, if a bot was fed a list of words that are commonly puff words, couldn't it create a table of pages with those words and list the words they contain? I'm assuming we'd have to limit the table size, or start with only a subset of articles that the bot was checking. Ryan Vesey Review me! 16:38, 21 July 2012 (UTC)
A report is certainly possible, most reports are limited to the first 1000 but using AWB or the toolserver we could generate a list of articles affected by at least some of the obvious ones. I could probably write the code for it but I don't have toolserver access so someone else would have to implement it. Kumioko (talk) 16:52, 21 July 2012 (UTC)
Considering "virtuoso", you'd have to somehow have the bot know enough to not remove uses of the word in direct quotes; or in sourced discussions that reliable sources describe someone as a virtuoso; or references to music albums, plays, television episodes, comic book characters, computer programs, violin-playing machines, or various people who happen to have that word in their name. In other words, this is WP:CONTEXTBOT. A report (as suggested above) that humans can go through would be more appropriate. Anomie 16:49, 21 July 2012 (UTC)

Well, I've chosen the word "virtuoso" because of the formulaic nature of most sentences where it appears. It's always the first sentence, and it involves the wordings "virtuoso + [instrumentalist]" or "[instrument] + virtuoso".

The bot's actions could be restricted to the first sentence, and only when it's one of these two kinds:

[person's name] ([dates]) [was/is] [a] [nationality... (may not appear)] VIRTUOSO [pianist/violinist...] [other stuff (may not appear)] [.]

[person's name] ([dates]) [was/is] [a] [nationality... (may not appear)] [piano/violin...] VIRTUOSO [other stuff(may not appear)] [.]

By restricting the bot's actions to this, we would avoid 100% of mistakes.--Fauban 17:34, 21 July 2012 (UTC)
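As a rough illustration only, the two formulas above could be expressed with regexes like the following (the function name and instrument table are invented for the example, and this is precisely the kind of edit that would still need human review per the objections raised in this thread):

```python
import re

# Pattern 1: "... was a German virtuoso violinist ..."  -> drop "virtuoso".
# Pattern 2: "... is an Israeli violin virtuoso ..."    -> fold the
#            instrument and "virtuoso" into the player noun.
INSTRUMENTS = {'violin': 'violinist', 'piano': 'pianist', 'guitar': 'guitarist'}

def depuff_first_sentence(sentence):
    # "virtuoso violinist" -> "violinist" (any "-ist" word qualifies)
    sentence = re.sub(r'\bvirtuoso\s+(?=\w+ist\b)', '', sentence)
    # "violin virtuoso" -> "violinist", per the instrument table
    for instrument, player in INSTRUMENTS.items():
        sentence = re.sub(r'\b%s\s+virtuoso\b' % instrument, player, sentence)
    return sentence
```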

Your proposal appears to be broken for the case where the musician's virtuoso status is discussed in detail elsewhere in the article body, with references to reliable sources for the claim. Stuartyeates (talk) 20:53, 21 July 2012 (UTC)

Sorry, but if you check some good articles about undisputed virtuosos (e.g. Franz_Liszt), you will see that the situation is different, and that if the suggestions I wrote above are followed, these sourced claims would be left untouched.--Fauban 09:48, 24 July 2012 (UTC)

Did you consider using e.g. mw:Manual:Pywikipediabot/replace.py from the pywikipedia bot framework? It's very likely that it already supports what you are trying to do... Greetings --DrTrigon (talk) 14:30, 28 July 2012 (UTC)
I think "virtuoso" is probably a bad example here because that could be appropriate. However, there are a lot of others on the linked puffery page, like legendary, extraordinary, heroic, etc., that I think are better examples of this functionality. I have suggested a logic change to AWB to allow this to be an alert which, as I suggested above, would allow the user to manually review and make the edit if appropriate. Of course this doesn't mean that the developers will agree or act on the request, but I asked for it, so we'll see. I just wanted to follow up. Kumioko (talk) 23:03, 29 July 2012 (UTC)

Bot request for adding oldid parameters to good articles

Hello, everyone.

Category:Good articles without an oldid collects articles which have been tagged as good articles, but do not have oldids in the talk page box linking to the reviewed version of the article. I have been regularly removing articles by going through this list and adding the oldids, but considering that the process is relatively straightforward, I wanted to inquire whether it was possible to automate the process. I'm not sure if this is something that GA bot might do or not.

Here is the process I use:

  1. Look to see whether there are any talk pages in Category:Good articles without an oldid.
  2. If there is at least one talk page in the category, open the first talk page in a new tab.
  3. When the talk page has loaded, open the talk page's related article page in a new tab.
  4. When the article page has loaded, open the article's View History page in the same tab.
  5. Starting with the latest version and working backwards, find the version in which the {{good article}} tag was added to the article. (Instead of going through each revision, I often can figure it out based on the edit summaries, but a bot might need to check through the versions one by one or use the history search function.)
  6. Copy the oldid from the URI of the revision page where the {{good article}} tag was added to the article.
  7. Close the Version History tab.
  8. Switch back to the tab with the talk page.
  9. Edit the talk page.
  10. Add the oldid as a new parameter to the existing {{GA}} tag.
  11. Save the edit with an edit summary of "Added oldid to GA tag".
  12. Close the talk page tab.
  13. Go back to Step #1.

There are several ways to find the oldid of a GA version and add it to the talk page, but this is the process I use. I figured that this might be possibly automated.
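Steps 5, 6 and 10 of the process above (locating the revision that added the tag, then writing the oldid into the {{GA}} banner) reduce to string handling once the revision texts are in hand. A hypothetical sketch, with function names invented and the API calls that fetch revisions omitted:

```python
import re

def find_ga_oldid(revisions):
    """revisions: (oldid, wikitext) pairs, newest first. Return the oldid
    of the revision in which {{good article}} first appeared, i.e. the
    oldest revision, walking back, that still contains the tag."""
    candidate = None
    for oldid, text in revisions:
        if '{{good article}}' in text.lower():
            candidate = oldid      # tag present; keep walking back
        else:
            break                  # tag absent here, so `candidate` added it
    return candidate

def add_oldid(talk_text, oldid):
    """Write |oldid= into the first {{GA|...}} banner on the talk page."""
    return re.sub(r'(\{\{GA\b[^{}]*)\}\}', r'\g<1>|oldid=%d}}' % oldid,
                  talk_text, count=1)
```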

Just a suggestion! Keep up the great work, bot owners! Kind regards, Matt (talk) 22:01, 29 July 2012 (UTC)

Olympic Participants

I'd like to get this test-page working: here, which is a list of countries participating in the Olympics this year. I'll then copy and paste it to: here, where one of our users has started doing it manually. You would need to use both country names (in Welsh) and the IOC column which you will find on this page, and redirect them to the existing flags (same Welsh names). Any problems or can we do it? Many thanks. Llywelyn2000 (talk) 15:03, 29 July 2012 (UTC)

Bot to create redirects starting with "The" for some articles

Hello-- I clear a lot of red links, and we could eliminate a substantial number with bot-created redirects for certain articles starting with "The," specifically rivers and battles for starters. For instance:

I think this would be a great bot job. I did some for the major rivers by hand the other day (a dozen redirects fixed about a dozen red links), but it was way too tedious. This is a pretty standard procedure in indexing, and it would make the site more navigable. River seems like a pretty easy one: any article ending in text string "River." There may be false positives, but they are just redirects and would not be visible to most readers. Battles seem a little more complicated, but I suppose the string "battle of" would get most of them. If it goes well, we could also do ships or other things down the road. Let me know your thoughts. Thanks! Jokestress (talk) 01:10, 30 July 2012 (UTC)
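The redirect generation itself is trivial once a title qualifies; a sketch (helper name invented — the hard part, as this thread shows, is deciding which titles to feed it):

```python
def the_redirect(title):
    """For an article title like 'Mississippi River', return the
    (redirect_title, redirect_wikitext) pair for the 'The ...' variant.
    Returns None for titles that already start with 'The'."""
    if title.startswith('The '):
        return None
    return ('The ' + title, '#REDIRECT [[%s]]' % title)
```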

Just from some cursory searches, a proposal like this really needs more discussion to establish consensus, since it would create a lot of redirects. Just by looking at Category:Rivers by country and Category:Battles by country, it would be thousands of new redirects. And for future reference, many river articles don't have "River" at the end of their titles; Spring River (Missouri) is a good example. I think going through categories is a better idea for identifying potential articles. LegoKontribsTalkM 01:40, 30 July 2012 (UTC)
Thanks for the reply, Legoktm. Where does one go to get consensus for this, and what does "going through categories" mean in this context? Jokestress (talk) 01:56, 30 July 2012 (UTC)
Try WT:RIVERS, WT:MILHIST? Both of those are active, so it should be easy to figure out whether creating all those redirects would be useful. I was suggesting that as a method of creating a page list for new redirects to be created. Just identifying articles by titles ending with "River" is neither efficient nor all-inclusive. My suggested way would be to go through every [[Category:Rivers of State or Country]] and create a list of sub-categories which only have rivers as articles, and then create redirects based off of that list. LegoKontribsTalkM 03:32, 30 July 2012 (UTC)
While it's reasonable to presume that someone might create a link such as The Spring River, do you see people creating links with parentheses such as The Spring River (Missouri)? I would have guessed that those editors who aren't checking their redlinks aren't trying to disambiguate them with parentheses. Thanks! GoingBatty (talk) 17:04, 30 July 2012 (UTC)
I personally feel that we'd only need The Spring River etc., which will redirect to a disambiguation page Spring River etc. unless it's a unique name or the most famous of all similarly-named rivers. Readers can then find the one they want, and editors can later disambiguate. Jokestress (talk) 23:47, 30 July 2012 (UTC)

Template usage overview?

 Not done - late.
Can I ask for a filtered template usage (transclusion) overview here? I'd like to check a template's param input (600 transclusions, so AWB would be cumbersome for me and still not secure).
Topic: {{IPAlink}} has to-be-deprecated params (awkward names & background). I'd like to have a list of pages that use these params irregularly.

  • 2= > or [ (param 2 has closing bracket)
  • bracket= > or [ (named param bracket has closing bracket)
  • errortext= (named param errortext is used)
  • name= (named param name is used)

-DePiep (talk) 23:31, 20 July 2012 (UTC)
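For what it's worth, this kind of filtering can be done offline on fetched wikitext. A rough Python sketch — the function name is invented, nested templates inside {{IPAlink}} are not handled, and the bracket check is an interpretation of the "closing bracket" bullet above:

```python
import re

DEPRECATED = ('bracket', 'errortext', 'name')

def flag_ipalink_uses(wikitext):
    """Return the {{IPAlink}} calls in `wikitext` that use a deprecated
    named parameter, or whose second positional parameter contains a
    closing bracket. Sketch only: nested templates are not handled."""
    flagged = []
    for m in re.finditer(r'\{\{\s*IPAlink\s*\|([^{}]*)\}\}', wikitext):
        params = [p.strip() for p in m.group(1).split('|')]
        named = dict(p.split('=', 1) for p in params if '=' in p)
        positional = [p for p in params if '=' not in p]
        if any(k in DEPRECATED for k in named):
            flagged.append(m.group(0))
        elif len(positional) > 1 and ']' in positional[1]:
            flagged.append(m.group(0))
    return flagged
```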

Anything wrong with my question? -DePiep (talk) 00:02, 26 July 2012 (UTC)
IMO the easiest way would be to add something like the below to the bottom portion of the template.

{{#if:{{{2|}}}{{{bracket|}}}{{{errortext|}}}{{{name|}}}|[[Category:IPAlink articles with invalid parameters|{{NAMESPACE}} {{PAGENAME}}]]}} If you want to do that let me know and I can do it in the sandbox and we can test to see if it will work. Kumioko (talk) 01:02, 26 July 2012 (UTC)

That would be my trick too, but a bit disturbing. I thought there would be a bot for these. Anyway, no question left for now; consider it solved. -DePiep (talk) 09:52, 26 July 2012 (UTC)
No problem. For what it's worth, the template only has about 615 transclusions, so I could run through with AWB and make a list of the ones that have those parameters pretty quickly. We could also do some search functions that could find a lot of them first to make it quicker. Kumioko (talk) 19:28, 26 July 2012 (UTC)
If you put the below code in the "skip if not" box and check the regex button, it should get you pretty close, but I haven't tested it yet so it may need some tweaking. I can do it if you're uncomfortable doing it; just let me know.
{{\s*IPAlink(.*?)(2|bracket|errortext|name)\s*([\|}{<\n])

Kumioko (talk) 19:35, 26 July 2012 (UTC)

Forget it.
I cannot tweak regex, let alone trust it to give what I request (how would I check?).
Of course I could add hidden if-categories to the source myself for ~600 transclusions. Of course I could run AWB myself for a while (but how to catch spaces in param1=x, or catch something like (param3)x?). AWB cannot catch these 100% (does the regex really catch the input value, or just the input?). That is why I turned to this bot page. I was only asking: can some bot do that more easily and systematically (catch & select the values of a param, 100%) and throw me a list? No. Also, the responses being this late - for such a simple question - is disappointing. -DePiep (talk) 01:10, 28 July 2012 (UTC)
Fair enough. Just offering some possibilities. Unfortunately, more and more bot operators are either retiring or being forced out. Many of the operators that are left, although they are great programmers and capable of doing a variety of tasks, are either too busy IRL to take on more work or are hesitant to taking on tasks which perform edits, because those cannot be guaranteed 100% without error and could lead to blocks or bans. Good luck though either way. Kumioko (talk) 20:56, 28 July 2012 (UTC)
I get it. Thank you, human. Things really change. Still this was also a try to get the id of a living template-track bot for future requests. ;-). -DePiep (talk) 23:54, 31 July 2012 (UTC)

Update articles to a more specific Olympic stub template

If you look at Category:Olympics stubs, you'll see it contains about 600 articles. About 1/3 of those have "2012" somewhere in their title (e.g., Canoeing at the 2012 Summer Olympics – Men's slalom C-1, Chad at the 2012 Summer Olympics).

Those articles all contain either {{Olympics-stub}} or {{Olympic-stub}} where they should have {{2012-Olympic-stub}}.

Is there a bot that can easily update all these articles? I've been doing them by hand slowly, but a bot seems to make a lot more sense.

Alternatively, given that this would likely be a one-time process, is there a simple way someone fairly geeky can babysit a process that does it semi-automatically? If it doesn't involve using Windows, I'm up for it.

Thanks, DoriTalkContribs 23:45, 31 July 2012 (UTC)
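For the record, the swap described above reduces to a title check plus one regex replacement; a hypothetical Python sketch (function name invented):

```python
import re

def upgrade_olympic_stub(title, wikitext):
    """Swap {{Olympics-stub}} or {{Olympic-stub}} for {{2012-Olympic-stub}}
    on articles whose title mentions 2012; leave other pages untouched."""
    if '2012' not in title:
        return wikitext
    return re.sub(r'\{\{\s*Olympics?-stub\s*\}\}', '{{2012-Olympic-stub}}',
                  wikitext)
```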

 Done using AWB Find & Replace. -GoingBatty (talk) 04:26, 1 August 2012 (UTC)
Thank you! DoriTalkContribs 04:29, 1 August 2012 (UTC)

Request for Specific article group Bot

Please refer to this TfD section where Template:Gotras of Jats is discussed.

The template is intended to be for the navigation of the entire set of articles contained in the article List of Jat Clans, and applied to all named articles in that list. However, the template does not contain all the articles in that list, and contains articles that are not in that list. For the template to be useful (and the discussion is heading towards a weak consensus that it is useful), it must be 100% in agreement with the list article, and the articles in the list and the template must have the template applied to them.

So the Bot should:

A bot is needed because the number of Gotras (clans) of Jats is likely to be 2,700 in the end, and manual maintenance is next to impossible. A full analysis may design a better procedure than my outline above. Fiddle Faddle (talk) 14:32, 27 July 2012 (UTC)

Not sure if I understand you right, but if you intend to synchronize List of Jat Clans and Template:Gotras of Jats, then User:DrTrigonBot could be of some use to you. The bot can also run permanently on IRC and check in real-time for page modifications, to become active and synchronize again. Have a look! Greetings --DrTrigon (talk) 09:12, 28 July 2012 (UTC)
Looks promising. I believe we need to synchronise three things: (1) a list, (2) a template, and (3) the group of articles in the list and/or template. If it can do even only the first two, that is an excellent start. If it can do the third, or be modified or extended to do the third, then we have struck gold. How do we take this forward, please? Fiddle Faddle (talk) 15:05, 28 July 2012 (UTC)
So what's the source? What's the dataflow? (1) → (2) or vice versa?
I do not understand how to integrate (3) here.
I would propose choosing one, either (1) or (2), as the source, which is then mirrored to the other one. Users should be advised to edit only the source; the bot edits the other one. If you choose (2) - the template - to be the source and use a suitable format, you could include the template into the list page and would not have any need for a bot here! ;)
My bot will not be able to crawl the list of pages and add a template to those missing it.
Btw.: What about the template being "considered for deletion"? Greetings --DrTrigon (talk) 17:33, 28 July 2012 (UTC)
At present the template and the list are wholly out of synch. Probably the list is the data source. Users, I fear, are impossible to control. We can advise them, but that is as far as I believe we can go, unless we can justify protecting the bot maintained item. The list changes regularly, with members added and subtracted often.
Would you consider extending your bot to check for the presence of a named template on named article pages, and to add it in the right location if absent? This is less pressing than making sure the template is kept up to date.
Why a bot? Because the list is growing vast, enormous. The template is imperfectly maintained, less well than the list (hence the choice of the list as data source). This area of interest is often edited by those less familiar with the ways of Wikipedia, those less diligent in keeping things up to date. Note also the frequency of edits to the list.
The template is currently up for discussion rather than deletion. The consensus is building towards the request for a bot to maintain it. Fiddle Faddle (talk) 18:06, 28 July 2012 (UTC)
In your place I would use the template to hold the data only, and use it on the other pages. This way you do not have to control users, since when they try to change the list they will see that the data comes from a template, and change that instead. One open point is how user-friendly this would be regarding newbies. But this is just my opinion and I am not the one to decide... ;)
If you insist on having a bot, mine can offer you the service to mirror/sync (1) → (2). This still needs the users to behave as supposed and edit (1) only in future...
You should read User:DrTrigonBot/Subster in order to understand how the bot works and what it offers to you. Everything beyond that has to be done by another bot or else. Greetings --DrTrigon (talk) 18:46, 28 July 2012 (UTC)
Insist? No, but it is advisable to use one. I suspect we need, somewhere, to build consensus to use one, though. Advice on where would be appreciated.
I don't really mind which is the data source, AFTER the initial synch.
If you say your bot can do the job, I have no need to read the documentation. I am not a Regex person. Fiddle Faddle (talk) 19:24, 28 July 2012 (UTC)
Yes, that sounds good! We should definitely build consensus on what workflow and bot setup to use! Now, the place to go ... I think this page here is a good place (but I somehow have the feeling only a few bot operators are active at the moment), and maybe we can go to the Wikipedia:Village pump to get some more attention and further hints?
So the bot is mainly intended to do frequently occurring data syncs, not one-time jobs. The first/initial sync has to be done by a human anyway, in order to clean things up and verify the data. After that either we are done (use the template) or the bot comes into action and does daily mirroring. (Technically the first would be more robust, always up-to-date, less space consuming... in one word "smarter"... that's why I came up with this... ;) --DrTrigon (talk) 11:10, 29 July 2012 (UTC)
The need for manual attention could be a drawback here. I am not the volunteer for that, and the pages are handled by a multiplicity of editors, not all of whom are diligent and not all of whom are competent. I see the problem, and know what the probable solution is, but I can't be the one to attend to it and babysit it daily. Fiddle Faddle (talk) 11:57, 29 July 2012 (UTC)
Or have I misunderstood and it is 'just' the initial synch that requires a human? Fiddle Faddle (talk) 12:00, 29 July 2012 (UTC)
2 possible solutions are discussed at the moment (better ones welcome):
  • use the template as source and remove any data from other pages but the template; technically speaking the best solution, but maybe not that user-friendly for newbies, since they might not understand how to edit the list (but we could provide "edit" links that point to the template directly)
  • use any page (the list (1), the template (2) or others) as data source, and the bot mirrors the data from there to the other pages; don't know... could be confusing to new users as well, since there are several identical lists and only one of them may be edited
Theoretically neither of those two solutions needs human attention (apart from the initial sync/merge). BUT to be honest and more realistic, both solutions bear the potential to confuse users (especially new ones) and thus will still need human attention, unless you decide to protect some of those pages (protecting will work with the template solution ONLY, since the bot needs to edit the page). So better suggestions are very welcome! ;) Greetings --DrTrigon (talk) 14:28, 29 July 2012 (UTC)
I'm going to stay out of this. I'm at the limit of my expertise. :) Always best to know when to stay silent. I'd like to defer to those with the skill to take it through to a solution. Remember we might get 2,700 articles for Jat Gotras. We have to be able to cope with this competently. The template may also need further subdivision to cope with that large number of articles. Fiddle Faddle (talk) 14:51, 29 July 2012 (UTC)
Sounds good! Please let me know if/when I can be of any use for you! E.g. inform me by dropping an e-mail if I do not respond... ;) Thanks and greetings! --DrTrigon (talk) 15:58, 2 August 2012 (UTC)

soccerdatabase.eu

Would it be possible for a bot to remove all links to http://soccerdatabase.eu - it appears to be an illegal website which is simply copying content from the now defunct http://www.playerhistory.com - the owner of the playerhistory website is Polarman (talk · contribs) and he has said he is launching legal action against soccerdatabse. Regards, GiantSnowman 13:52, 2 August 2012 (UTC)

Baronetcy articles

It was decided in a recent RM to use "X baronets" instead of "X Baronets" in all baronetcy article titles. This would mean renaming all the non-redirect pages in that category to use the lowercase version. I hope this can be done by bot as it involves moving a very large number of pages. Jafeluv (talk) 09:11, 25 July 2012 (UTC)

I'll file a BRFA for this in a day or two. LegoKontribsTalkM 21:02, 28 July 2012 (UTC)
BRFA filed: Wikipedia:Bots/Requests for approval/Legobot 16. LegoKontribsTalkM 21:55, 29 July 2012 (UTC)
Thanks for stepping up. Jafeluv (talk) 07:20, 3 August 2012 (UTC)

revamp redirects from ISO639-3 codes

I've requested this before, but perhaps I made it too elaborate.

We have a search engine that allows the reader to look up a language with its ISO639-3 code. However, it only works as well as the redirects are maintained, and they're badly out of date: either directing to the wrong page, or in many cases missing altogether.

Request

Create or update rd pages for all piped blue links at Wikipedia:WikiProject Languages/Articles by code:

For each link of the form

[[ABC|xyz]]

a rd should be located at

ISO 639:xyz

and the syntax should be

#REDIRECT [[ABC]]{{R from ISO 639|XYZ}}

For instance, the first link is

[[Ghotuo language|aaa]]

so the rd at

ISO 639:aaa

should read

#REDIRECT [[Ghotuo language]]{{R from ISO 639|AAA}}

(as indeed it does, so this one would be skipped as not requiring any change)
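The listing-line-to-redirect mapping above is mechanical; a Python sketch (function name invented; following redirect chains and checking infoboxes are not shown):

```python
import re

def iso_redirects(listing_wikitext):
    """From piped links like [[Ghotuo language|aaa]], yield
    (redirect_title, redirect_wikitext) pairs per the request's syntax."""
    for target, code in re.findall(r'\[\[([^\[\]|]+)\|([a-z]{3})\]\]',
                                   listing_wikitext):
        yield ('ISO 639:%s' % code,
               '#REDIRECT [[%s]]{{R from ISO 639|%s}}' % (target, code.upper()))
```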

Some of these links will themselves be to redirects, but that can be fixed by one of the automated link patrollers. (Unless you want this bot to follow the rd chain? That might be a better use of server load, as it would avoid changing rd's only to have them changed back for articles which are not at their ISO name.)

It would be nice if the bot would verify that the article at the end of the rd chain contains an infobox listing the ISO code in question, but that starts getting more involved. If it wouldn't be too much trouble, maybe flag the rd with an error category for manual review if it doesn't?

kwami (talk) 10:50, 3 August 2012 (UTC)

Create a lot of unpunctuated redirects

Many of the articles in Category:United States Supreme Court cases and its subcategories (all of which I've checked; there are no irrelevant subcategories, such as Category:American military personnel killed in the Gulf War being a subcategory of Category:Morocco) cover individual Supreme Court cases; almost all of these articles are entitled "PARTY1 v. PARTY2". One sometimes sees "PARTY1 v PARTY2" in ordinary writing, and it's easy to omit the punctuation after the "v". Could a bot go around and create redirects for all of these articles, using Virginia v West Virginia as a model? Nyttend (talk) 21:25, 27 July 2012 (UTC)
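The redirect text itself is mechanical; a sketch (helper name invented):

```python
def unpunctuated_redirect(title):
    """For a case title 'PARTY1 v. PARTY2', return the unpunctuated
    redirect title and its wikitext; None if the title has no ' v. '."""
    if ' v. ' not in title:
        return None
    return (title.replace(' v. ', ' v '), '#REDIRECT [[%s]]' % title)
```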

Has there been a discussion establishing consensus to do something like this? This task would be creating 2,000+ redirects just to fix a punctuation error. From a technical standpoint this is trivial to do, but consensus should be established first. LegoKontribsTalkM 21:33, 29 July 2012 (UTC)
No; this is the kind of simple maintenance that doesn't need a big discussion. You wouldn't object if I created them manually, so the only approval we need is technical, to ensure that the bot does what it should. Nevertheless, I've copy/pasted this to WP:VP/Pr. Nyttend backup (talk) 13:34, 30 July 2012 (UTC)
The only thing that really matters is if there is consensus to have articles at "Foo v Bar" or "Foo v. Bar". If there's consensus that one form is preferred over the other, then the bot can move things and create redirects. If it's a "it's fine either way", then the bot should just create the redirects for both variants when they are missing. Headbomb {talk / contribs / physics / books} 17:25, 31 July 2012 (UTC)
Coding... Looks good, I'll file a brfa soon. LegoKontribsTalkM 22:11, 31 July 2012 (UTC)
BRFA filed here. LegoKontribsTalkM 16:29, 4 August 2012 (UTC)

AFC reviewer informing if article is at AfD

As the AFC reviewer bears somehow a part of the responsibility, can somebody write a bot which monitors the articles listed at AfD (or XfD) and informs the AFC reviewer/accepter? The reviewer is listed in the AFC project template on the talk page, in the parameter |reviewer=. So rather a simple task. ;-) mabdul 11:58, 4 August 2012 (UTC)
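Extracting the reviewer from the talk-page banner is indeed simple; a sketch (regex only, assuming the |reviewer= parameter appears once in the AFC banner; watching AfD and delivering the notice are the larger part of the job):

```python
import re

def afc_reviewer(talk_wikitext):
    """Pull the reviewer name out of an AFC project banner's
    |reviewer= parameter, or return None if it is absent."""
    m = re.search(r'\|\s*reviewer\s*=\s*([^|}\n]+)', talk_wikitext)
    return m.group(1).strip() if m else None
```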

Need new operator for User:RM bot

RM bot has maintained the WP:Requested moves page for quite a while now, but its owner has stopped editing, and the bot has shut down. We desperately need someone to adopt it and start it up again, as the Requested moves page is currently being manually updated. The source code is at User:RM bot/requestedmoves.php for anyone who is capable of compiling and adopting it. Thank you.--Aervanath (talk) 14:49, 4 August 2012 (UTC)

Someone already requested this: Wikipedia:Bots/Requests for approval/RMCD bot. Anomie 01:21, 5 August 2012 (UTC)
Oh, I was not aware of that until now. Glad it's being handled already.--Aervanath (talk) 10:18, 5 August 2012 (UTC)

Request for code

Вы могли бы мне помоч сделать облако тегов вот для этого сайта adrenalin-css.3dn.ru Напишите код в ответе всиысле htm код блога который бы получился — Preceding unsigned comment added by 178.122.3.140 (talk) 20:18, 5 August 2012 (UTC)

Machine translated: "Could you help me make a tag cloud for this site's adrenalin-css.3dn.ru write the code in response vsiysle htm code to be turned blog" presented here without comment for others to handle Fiddle Faddle (talk) 20:25, 5 August 2012 (UTC)
(Translated from Russian:) And what does this have to do with Wikipedia in general, and with requests to bot operators in particular? --Basetalkсontr. 12:03, 9 August 2012 (UTC)
(Translated from Russian:) This page is for bot requests for the English Wikipedia and related sites only. —  HELLKNOWZ  ▎TALK 13:31, 9 August 2012 (UTC)

Protection notification bot

See the "Question about protection" section of WP:VP/T (which will probably end up at Wikipedia:Village pump (technical)/Archive 101 rather soon) — an admin recently put temporary full protection on an article that had been indefinitely semiprotected, but due to the nature of protection, the page will be completely unprotected at the expiration of full protection. Since (at least right now) software doesn't permit any alternatives, we'll have to go back to the page and add semiprotection after the full protection expires (or cut short the time for full protection), but this requires that someone realise that it's time to semiprotect. What if we had a bot to post a notice at WP:AN to remind admins to protect pages like this? The bot could have a page in its userspace where someone could list pages that would need protection and times; the bot could be programmed always to notify WP:AN a certain number of minutes before the protection was needed. Because I'm asking only for a notification bot, not a protection bot, we wouldn't need the bot to be an admin, and the list page wouldn't need to be protected.

Five days ago, I posted a request at the proposals VP asking for input on this idea; the only response was "I take it as obvious that the best approach would be a change to the software, but this is a temporary measure, until such changes are made. Sounds like a good idea to me." by SPhilbrick. Nyttend (talk) 00:37, 10 August 2012 (UTC)
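The scheduling logic such a bot would need is small; a sketch (function name and data shape invented; reading the list page and posting to WP:AN are omitted):

```python
from datetime import datetime, timedelta

def due_notices(schedule, now, lead_minutes=30):
    """Given a schedule of (page, reprotect_time) pairs, return the pages
    whose reminder should be posted now: within `lead_minutes` of the
    time protection is needed, and not yet past it."""
    lead = timedelta(minutes=lead_minutes)
    return [page for page, when in schedule if when - lead <= now <= when]
```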

User pages with excessive unrelated content

I often post at WP:MFD. Wikipedia:User pages lets users know that they may not have excessive unrelated content in their user pages. I'm looking for a bot to populate a series of administration categories entitled Category:User pages with excessive unrelated content having subcategories Category:User pages with excessive unrelated content 2010, Category:User pages with excessive unrelated content 2009, 2008, 2007, ..., 2001 (based on the date the page was created), each of which could be a subcategory of the MfD project, such as Category:Miscellaneous pages for deletion. The bot would look for only those user pages which meet each of the following four criteria:
1. of users who have only posted edits to any user space page (user and user talk),
2. user pages that are at least two years old (user pages that were created in the year 2010 or earlier),
3. of users that have at least 5 edits collectively in all user space pages, and
4. where the user page now has at least 10,000 bytes.
I got the idea from the user page User:Skverma1949's listing at MfD.[2] The Skverma1949 page originated in August 2009‎, the user has only posted edits to user space (three different user space pages of users having similar user names), has more than 5 edits to all user space pages, and the page now has 17,000+ bytes. There are many user pages where the user has one or two edits total, usually to their user page. I'm looking to screen those out, at this time, via the requirement of at least 5 edits collectively in all user space pages, to focus further human screening of each page listed in the categories on Wikipedia:User pages and WP:NOTWEBHOST issues. -- Uzma Gamal (talk) 15:07, 10 August 2012 (UTC)

Please use bullets :-). — Dispenser 16:07, 10 August 2012 (UTC)
I revised the above post. -- Uzma Gamal (talk) 07:51, 11 August 2012 (UTC)
Categories are probably not the best plan here, as that would require the bot to edit every one of these user pages to add the [[Category:...]] tag, and then the bot might get reverted (although most of these users are probably long gone) and you'd never know. It's more likely you just want a list of these users/pages, which someone with Toolserver access could probably create for you easily enough. Anomie 20:07, 10 August 2012 (UTC)
Yes, no need to add the pages to a category. A list would be fine. -- Uzma Gamal (talk) 07:51, 11 August 2012 (UTC)

Keeping track of Good articles

It would be useful to us at WP:GA to have an easy way to keep track of who is reviewing Good articles and who is submitting them. It would potentially help in identifying trends, spotting new or inexperienced reviewers so we can guide them, fixing problems from poor or bad-faith reviewers, and identifying potential new reviewers. To submit a nomination someone puts the {{subst:GAN}} template on the page. This will list the article at WP:GAN. To review the article the reviewer will create the review page (it will be in the form Talk:articlename/GA<review number>, e.g. Talk:Prussian Homage (painting)/GA1). The more information we could get the better, but even the raw numbers (nominations/reviews) would be useful. Other details that may prove useful would be a link to the articles and/or reviews, how many have failed, passed, or been re-assessed, and the dates the nominations/reviews were conducted.

If a new page could be updated with this information that would be great. The most useful presentation, though, would be to have it next to new nominations at WP:GA. Something like:

  • My article (edit|talk|history|links|watch|logs) (start review) Aircorn (talk) 19:56, 21 July 2012 (UTC) N=4 R=12

where N=Nominations and R=reviews. That list is currently populated by GA bot (talk · contribs) so maybe it could pull information from the new page and add it when posting? Any help much appreciated. N.B. Chris (the owner of GA bot) has been notified of this thread. AIRcorn (talk) 09:39, 8 August 2012 (UTC)

I'm playing around with this atm. To clarify: in "*My article (edit|talk|history|links|watch|logs) (start review) Aircorn (talk) 19:56, 21 July 2012 (UTC) N=4 R=12", do N & R refer to the nominations/reviews made by the user (as opposed to how many times the article has been nominated)?
Also, is there any way of working out who nominated a current good article, without having to search through the entire talk page history? --Chris 10:53, 10 August 2012 (UTC)
You are correct in that N and R would refer to nominations and reviews for the editor. It would also be good to have the review info for the reviewers too
  • On hold: this article is awaiting improvements before it is passed or failed. Aircorn (talk · contribs) 15:36, 27 July 2012 (UTC) R=12
I don't know an easy way to get the nominators (Scottywong suggested that this would be tricky too when I asked about a toolserver approach). I would assume that after the list is populated GA Bot could just update it like it does with the WP:GAN page. AIRcorn (talk) 23:31, 11 August 2012 (UTC)

Detox palm.com urls

Looks like all varieties of palm.com urls have been discontinued. We have well over 100 links at Special:Linksearch/*.palm.com. It would be great if someone could run a detox run through to {{deadlink}} them all. Thanks. — billinghurst sDrewth 16:22, 11 August 2012 (UTC)

This could be done as an AWB task if the BAG folks have no objections. It's a pretty small group, and straightforward enough that it shouldn't need to go through the full BAG process IMO. Kumioko (talk) 23:34, 11 August 2012 (UTC)
Absolutely, ad hoc for these simple tasks is primarily getting light from rubbing two neurons together. — billinghurst sDrewth 02:17, 12 August 2012 (UTC)

DASHBot NFCC#9 replacement

I don't know whether DASHBot still performs this task or not, but according to User:DASHBot/Logs the bot hasn't removed any NFCC#9 violations since February 2012. If DASHBot is no longer performing this task, then I think a replacement is needed. -- Toshio Yamaguchi (tlkctb) 08:47, 12 August 2012 (UTC)

Noindex my user space

Could a friendly bot go through my userspace and add {{Noindex}} to every subpage wrapped in include only tags? (Some pages are transcluded or substituted). I keep trying to add them, but continually find ones I've missed. Thanks. Ryan Vesey 07:05, 12 August 2012 (UTC)

Have you worked through Special:Prefixindex/User:Ryan Vesey and Special:Prefixindex/User talk:Ryan Vesey? You could probably do that with AWB. --Izno (talk) 23:29, 12 August 2012 (UTC)

Updates to {{Infobox automobile}} et al.

As per Wikipedia talk:WikiProject Automobiles#Image formatting in articles, we have decided to implement some changes to:

However, the scope of these changes is beyond what could reasonably be done manually so I, on behalf of WikiProject Automobiles would like to request the necessary assistance.

Before a bot is required, Stepho-wrs will update the templates to include a new "image_file" parameter.

Stage 1:

  •  Not done Replace the parameter "image" with "image_file"
  • Delete all image formatting, for example:
| image = [[File:Example.jpg|250px|Example caption]]
Becomes:
| image_file = Example.jpg
  •  Done Delete historical parameters (including any data with them) — {{Infobox automobile}} only:
    • similar
    • parent_company
    • fuel_capacity
    • fuel_economy
    • electric_range
  •  Done Delete template redirects (ensure each article uses the template name Infobox automobile) — {{Infobox automobile}} only:
    • Infobox Automobile
    • Infobox Automobile generation
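The image-parameter cleanup in stage 1 is regex-friendly. A Python sketch, which assumes the simple "| image = [[File:...|size|caption]]" form shown above (no nested links):

```python
import re

# Matches |image = [[File:Name.ext|size|caption]] (caption/size optional);
# assumes no nested [[...]] inside the image parameter.
IMAGE_RE = re.compile(
    r"\|\s*image\s*=\s*\[\[(?:File|Image):([^|\]]+)(?:\|[^\]]*)?\]\]")

def strip_image_formatting(infobox_text):
    """Rewrite the image parameter to the bare-filename image_file form."""
    return IMAGE_RE.sub(lambda m: "| image_file = " + m.group(1).strip(),
                        infobox_text)
```

Captions would need to be captured and moved to the caption parameter separately; this only shows the filename extraction.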

 Not done Stage 2:

  • Stepho-wrs to edit the templates so the old "image" parameter functions the same as the new "image_file" parameter
  • Bot to replace the parameter "image_file" back to "image"
  • Stepho-wrs to then edit the templates to delete the now superfluous "image_file" parameter

Regards, OSX (talkcontributions) 09:29, 14 August 2012 (UTC)

I started doing a part of stage 1. I 'll keep you posted. -- Magioladitis (talk) 09:41, 14 August 2012 (UTC)
Wow, quick response. Thanks, OSX (talkcontributions) 10:20, 14 August 2012 (UTC)
User:Yobot/Automobile is the AWB custom module I am going to use to remove the old parameters. At the same time I am normalising file namespace to help doing the last part of the first stage. -- Magioladitis (talk) 13:18, 14 August 2012 (UTC)
For the image part we don't actually need to rename the parameter. We can first change the code in the infobox, then create a tracking category for the pages that still use the old format and then run a bot/tool to fix all instances. I am going to ask WOSlinker to help us with that. -- Magioladitis (talk) 13:30, 14 August 2012 (UTC)
Ok, that sounds excellent. You have been very helpful. And cutting down the number of steps to do this is more than welcome. OSX (talkcontributions) 13:50, 14 August 2012 (UTC)
I don't know how to check {{Infobox electric vehicle}} but that presumably should also be included in the above changes to ensure consistency? Warren (talk) 14:39, 14 August 2012 (UTC)
Probably, but if we touch the "Fuel_economy" field in there, we'll have another firestorm.  Mr.choppers | ✎  17:22, 14 August 2012 (UTC)

Stage 2 is not needed. I added support for bare filenames to the existing parameter. I also added a tracking category and after I fix all pages in there I'll update the code once more. -- Magioladitis (talk) 19:38, 14 August 2012 (UTC)

I updated my script to convert to bare filenames and move caption to the correct position. But the script still needs some adjustments to run automatically. Otherwise, I'll have to run it manually for 4,000 pages. -- Magioladitis (talk) 09:04, 15 August 2012 (UTC)

Why are we losing the parent company field? Thanks Jenova20 (email) 09:15, 15 August 2012 (UTC)
I think we should ask the WikiProject for that. -- Magioladitis (talk) 09:29, 15 August 2012 (UTC)
Parent company has been disabled for a while now. The bot is merely removing inactive code. OSX (talkcontributions) 09:33, 15 August 2012 (UTC)
Disabled? The field was in use as recently as yesterday on the C3 Picasso article and now I have to wikilink and add it to the article with a reference rather than just leaving the code in the infobox. Thanks Jenova20 (email) 09:47, 15 August 2012 (UTC)
The code may have been sitting dormant yesterday, but it would not have been displayed at all after August 2010. OSX (talkcontributions) 09:49, 15 August 2012 (UTC)
I thought it was...I suppose it doesn't make any difference since the information is already in the article in question. Thanks Jenova20 (email) 10:01, 15 August 2012 (UTC)

Pages that still use the old format can be found in Category:Infobox automobile image param needs updating. -- Magioladitis (talk) 11:53, 15 August 2012 (UTC)

WP:VG could use a bot to automatically archive our deletion page. This page features transclusions from the main deletion form, and keeping up with archiving completed discussions can be a bit much. Even if the bot could only move the discussions to the closed section at the bottom of the page, that would be appreciated. Currently the "goal" is to archive discussions from the closed section monthly into Wikipedia:WikiProject Video games/Deletion/2012, but if that part can't be automated, automating the move to the closed section alone would be appreciated. Thanks. --Teancum (talk) 14:11, 16 August 2012 (UTC)

Olympics athletes bot

There are a lot of Olympic athletes whose only link is to the main Olympics page. Can a bot be created that would crawl Category:Competitors at the 2012 Summer Olympics, find pages whose only source is http://www.london2012.com/, then find the link to the athlete? It might be something that a bot can't do without human assistance. Ryan Vesey 21:24, 15 August 2012 (UTC)

Yes, what we need is something which can search the athlete name, retrieve the url, replace the main page link and also read the birth date. It's definitely bot-programmable, but it's a matter of finding somebody to do it. ♦ Dr. Blofeld 21:37, 15 August 2012 (UTC)

Note that another frequent occurrence is athletes where the only reference is to www.sports-reference.com, which isn't much better. I've also edited hundreds of stubs by Banhtrung1 (talk · contribs) which only have www.london2012.com and www.sports-reference.com as references. That's again subpar, but in most cases I did not bother to track down more solid references. Pichpich (talk) 21:45, 16 August 2012 (UTC)

help in populating Category:Transport infrastructure by year of completion

Hello. I'd like a bot to help with the following task. There's a newly created series of by-year-categories Category:Transport infrastructure by year of completion. In part, these have to be populated manually but here's the part I'd like to have performed by a bot. The category Category:Transport infrastructure completed in XXXX should at a minimum contain the following six subcategories:

So it's a fairly straightforward task: add [[Category:Transport infrastructure completed in XXXX]] to the six corresponding categories. As an extra task: not all transport infrastructure by year categories already exist so the bot would have to create them if need be. They should be created using the following 1992 example.

{{10years|20th|Transport infrastructure completed in|1992|Transport infrastructure by year of completion}}

[[Category:Transport infrastructure by year of completion|1992]]

[[Category:Infrastructure completed in 1992]]

[[Category:1992 in transport|Infrastructure]]
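The page-creation part of this is mechanical. A Python sketch that generates the boilerplate from a year, following the 1992 example above (the century ordinal is computed; everything else is the source's own text):

```python
def ordinal(n):
    """1 -> '1st', 20 -> '20th', 21 -> '21st'."""
    if 10 <= n % 100 <= 20:
        suffix = "th"
    else:
        suffix = {1: "st", 2: "nd", 3: "rd"}.get(n % 10, "th")
    return str(n) + suffix

def transport_category_text(year):
    """Generate category page wikitext following the 1992 example."""
    century = ordinal((year - 1) // 100 + 1)
    return (
        "{{10years|" + century + "|Transport infrastructure completed in|"
        + str(year) + "|Transport infrastructure by year of completion}}\n\n"
        "[[Category:Transport infrastructure by year of completion|" + str(year) + "]]\n"
        "[[Category:Infrastructure completed in " + str(year) + "]]\n"
        "[[Category:" + str(year) + " in transport|Infrastructure]]"
    )
```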

It might make sense to start with the years of the 20th and 21st century as Vegaswikian are still trying to figure out the best way to populate these categories fully. Thanks in advance for any help. Pichpich (talk) 21:39, 16 August 2012 (UTC)

Bot(s) to catch bare urls and copyvio

I think I stumbled across a big problem that involves all referencing, and probably copyvio, by a blocked user named Billy Hathorn in the entire state of Louisiana. Louisiana is not my usual territory. In recently creating an article there, I found everything created by Billy Hathorn used bare urls that are now primarily dead links. One of the reasons he got blocked is cut-and-paste copyvio, so that's an additional possible problem there. He made over 100,000 edits, many of them in Louisiana. Big mess with the Louisiana articles. Is it possible - and is anyone willing - to create one or more bots to search all articles affiliated with Billy Hathorn in Louisiana? If you need examples, please read This Thread. Kumioko has created a table of Hathorn's edits to help. Maile66 (talk) 01:10, 17 August 2012 (UTC)

Just to clarify, it's only 25,000 of the edits. I can't pull them all in. Kumioko (talk) 01:32, 17 August 2012 (UTC)
This should be a comprehensive list. Noom talk stalk 14:29, 17 August 2012 (UTC)
Thank you. Kumioko (talk) 14:31, 17 August 2012 (UTC)

It seems that http://www.findarticles.com now redirects to search.com, and archive.org is blocked by robots.txt. Should a bot add {{dead link}} to all the urls pointing to findarticles? Smallman12q (talk) 21:07, 18 August 2012 (UTC)

The old ITF links don't work anymore. Here are the old and new links:

--Stryn (talk) 15:10, 15 August 2012 (UTC)

Quick search reveals none of the old-style links are in use. Noom talk stalk 16:43, 15 August 2012 (UTC)
What about [3]? --Stryn (talk) 17:03, 15 August 2012 (UTC)
I made a mistake, let me recheck. Noom talk stalk 17:06, 15 August 2012 (UTC)
Oh yes, there's about 6376 links in 3152 pages. Noombot can do this in three days as uncontroversial link fixing. Noom talk stalk 17:15, 15 August 2012 (UTC)
Where does [4] go to? Noom talk stalk 14:36, 18 August 2012 (UTC)
It goes also to [5]. And this one goes to [6]. --Stryn (talk) 16:13, 18 August 2012 (UTC)
 Doing... with slightly strict rules. May not catch all broken links. Will finish in a couple of hours. Noom talk stalk 15:59, 18 August 2012 (UTC)
 Done Noom talk stalk 13:17, 19 August 2012 (UTC)
Thanks! --Stryn (talk) 13:27, 19 August 2012 (UTC)

It seems that category is currently not used and empty. Bulwersator (talk) 11:37, 19 August 2012 (UTC)

Interwiki fixing

I recently tried getting good information on a particular concept that doesn't have "straightforward" translations: While the concept itself may be said to exist in several cultures and languages, it is not conceptualised in the same way; Germans conceptualise it with a link to "images" (bildung) while Norwegians conceptualise it with a link to "forming" (dannelse), and the English language has no particular translation (thus the German word is usually used to signify the concept in English). Now, then, if I go to the Norwegian wiki for it, and see that the information is both short and partly erroneous (or at the very least I haven't seen anything that documents the truth of what appears to be a common myth), and I want therefore to check out the other languages, I get redirected to manners in English, and Umgangsformen in German. If I go to the German page for bildung, and want to see whether there is some good information on it in English, I get sent to the page for education rather than the English entry for bildung (which does exist); if someone who speaks both English and a bit of German wants to know what the German wiki says on education, they will be sent to bildung (instead of ausbildung, which is more correct); and if they want to get the Danish sense of bildung from the German article, they'll get sent to the Danish page for Uddannelse (rather than dannelse); etc., etc.

In short, someone, at some time, has made some sort of link somewhere that seems to have taken on a life of its own -- probably through bots -- and that is now very difficult to untangle, simply because there are so many wrongly linked pages (not to mention the fact that it is highly unlikely that any single individual knows enough languages to know whether the link from the English page on education really links to both the German and the Malay page on education (rather than bildung or something completely different altogether)). However, the solution may be simple: If someone could make a bot that could be prompted to visit a certain page and erase all interwiki links from all the linked wikis (in a short enough time that no "competing" interwiki-linking bot could re-establish them), the linking process could be started anew by users (who are now effectively left helpless against the bots), and hopefully get it right this time around (possibly also with the added possibility of adding nobot-tags to all the pages to avoid autolinking if the concepts eventually turn out to be too confused). It does, of course, carry a real possibility for misuse, and I don't know enough about how bots work, but perhaps activation could be left up to some moderator?
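As a very rough illustration of the "erase all interwiki links" step, a Python sketch of the text transformation only (a real bot would check prefixes against the actual list of language codes, and would need to make this edit on every linked wiki):

```python
import re

# Whole-line links with a lowercase prefix, e.g. [[de:Bildung]].  This is a
# loose pattern: a real bot should match prefixes against the wiki's actual
# language-code list (this would otherwise also hit lowercase [[category:...]]).
INTERWIKI_RE = re.compile(r"^\[\[[a-z][a-z-]+:[^\]]+\]\][ \t]*\n?", re.MULTILINE)

def strip_interwiki(wikitext):
    """Remove whole-line interwiki links from a page's wikitext."""
    return INTERWIKI_RE.sub("", wikitext)
```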

Der Zeitgeist (talk) 09:29, 21 August 2012 (UTC)

A bot for requested moves

Resolved

Hey everyone! So, I have a question that I realized a bot would be best to fix. A few months ago, I tried to tackle the "Requested moves" category backlog, but to no avail. It is simply too large and populates itself too quickly to be adequately tackled by a user. One of the issues that I noticed is that a lot of times, someone would slap a tag onto the article, but not initiate a discussion. As a result, probably half of the tags do not have any reasoning behind them, not only showing a false positive but also making the categories unnecessarily huge. Is there any way that a bot could be coded to rectify this problem? If this is AutoWikiBrowser-doable, I would be willing to do it, but I fear that this is something only a bot can accomplish. Thanks a lot, and I look forward to your suggestions! Kevin Rutherford (talk) 20:42, 22 August 2012 (UTC)

Already in trial, see Wikipedia:Bot_requests/Archive_49#Need_new_operator_for_User:RM_bot and Wikipedia:Bots/Requests for approval/RMCD bot. Regards, mabdul 23:15, 22 August 2012 (UTC)

Request to update interwikis to Wikipedia namespace of hy:wp

Hello. This spring Armenian Wikipedia changed its name from "Վիքիփեդիա" to "Վիքիպեդիա", and so did its "Wikipedia" namespace. Articles were moved from the old namespace to the new one, without leaving redirects. That broke interwiki links and, it seems, confused IW bots. Some interwikis were removed from articles; some remain but lead to "nonexisting" pages. What I'd like to ask is to look for articles in en:wp containing [[hy:Վիքիփեդիա: and change it to [[hy:Վիքիպեդիա:. This will restore interwikis between the English and Armenian wikipedias, and I hope bots will update it in other Wikipedias after that. Thanks. --Aleksey Chalabyan a.k.a. Xelgen (talk) 01:44, 24 August 2012 (UTC)
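The fix itself is a literal string replacement; a trivial Python sketch (finding all the affected en:wp pages is the real work):

```python
# Old and new Armenian project-namespace prefixes, as given in the request.
OLD_PREFIX = "[[hy:Վիքիփեդիա:"
NEW_PREFIX = "[[hy:Վիքիպեդիա:"

def fix_armenian_interwiki(wikitext):
    """Swap the renamed Armenian project-namespace prefix in interwiki links."""
    return wikitext.replace(OLD_PREFIX, NEW_PREFIX)
```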

Orphan bot

Is there a bot that removes {{Orphan}} from pages with more than one incoming link? I've got a couple of articles that are currently orphaned, but will be de-orphaned as I create more articles; however, I might not know when they are de-orphaned. Does a bot currently remove the tag and if not, is there any reason one can't? Ryan Vesey 01:45, 24 August 2012 (UTC)

BattyBot uses AWB general fixes to remove {{Orphan}} if the article has more than 2 incoming links. GoingBatty (talk) 03:30, 24 August 2012 (UTC)
Cool, how often does it do this? Is there any reason for it to only remove the template if the article has more than 2 incoming links? I believe Orphan was modified a while ago to only refer to articles with zero incoming links. Ryan Vesey 03:34, 24 August 2012 (UTC)
I don't have any specific schedule for running it. I ran it last night for articles tagged in 2012, and am running it tonight for articles tagged in 2011. Template:Orphan says "This template should only be put on articles which meet the orphan criteria." Wikipedia:Orphan#Criteria says "Although a single, relevant incoming link is sufficient to remove the tag, three or more is ideal and will help ensure the article is reachable by readers." I guess the AWB developers are being conservative in the tag removal. If there's consensus to change AWB's behavior, you could certainly open a feature request. GoingBatty (talk) 03:42, 24 August 2012 (UTC)

Removing a template

Please remove {{Newinfobox}}/{{New infobox}} from these talk pages, because I have replaced infoboxes with new ones. Thanks. --MakecatTalk 07:47, 24 August 2012 (UTC)

Template:John Zorn

Please add the navbox {{John Zorn}} to all existing articles that the template refers to. --Marek Koudelka (talk) 16:30, 26 August 2012 (UTC)

Let's say there's a wikilink to a specific section of an article. Then, let's say that section is renamed. Is there a bot in place for this particular scenario? Said bot should search all of Wikipedia, upon any section being renamed, for any links to that section, and then rename the link to correspond to the new section name.

Example: For the article Determinism, section 5.4.2 Quantum world
If the line reading "====Quantum world====" were changed to "====Quantum realm===="
To avoid deadlinks, the bot would find any and all links that read "[[Determinism#Quantum_world]]" and change them to read "[[Determinism#Quantum_realm]]"

Bot would scan for edits to existing lines beginning and ending in equals signs. If the entire line were removed, and the bot found a link to the removed sectioning line, it could either rename any links to point to the above section that contained the removed section, or simply rename any links to the main article at large (i.e. "[[Determinism#Quantum_world]]" becomes "[[Determinism]]" if the sectioning line of code were removed).
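The link-rewriting step described above could look like the following Python sketch; handling of piped links and the space/underscore equivalence in section anchors are my assumptions about what such a bot would need:

```python
import re

def _flex(s):
    """Build a pattern fragment where spaces and underscores match either way."""
    parts = [re.escape(p) for p in s.replace(" ", "_").split("_")]
    return "[ _]".join(parts)

def rename_section_links(wikitext, article, old_section, new_section):
    """Retarget [[Article#Old section]] links (plain or piped) to a renamed section."""
    pattern = re.compile(
        r"\[\[(" + _flex(article) + r")#" + _flex(old_section) + r"(\]\]|\|)")
    return pattern.sub(
        lambda m: "[[" + m.group(1) + "#" + new_section + m.group(2), wikitext)
```

Detecting that a heading line (`==== ... ====`) changed between revisions, and doing so across all of Wikipedia, is the harder part of the request.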

If such a bot already exists, please post its link here, so that I might scrutinize it further. If, on the other hand, such a bot does not exist (and is not simply inactive at this time), I would be interested in attempting to make it myself. Thank you. JimsMaher (talk) 17:57, 27 August 2012 (UTC)

First, I'm renaming Determinism#Quantum_world to Determinism#Quantum_realm. Second, to clarify, as it stands at this time ... incorrect section links prefaced in correct article titles (or redirects) link to the top of the article, defeating the purpose of section linking. I am looking for a bot to remedy this, as described above. ... Working on it. JimsMaher (talk) 16:33, 28 August 2012 (UTC)

There are several articles with a broken link to the IUCN Red List (iucnredlist.org) entry. Is anyone able to generate a list of such articles? --Leyo 22:29, 28 August 2012 (UTC)

Bot to insert navbox on affiliated pages

I have created the navbox Template:Louisiana Political Museum and Hall of Fame and have inserted it at the bottom of Louisiana Political Museum and Hall of Fame. Can anyone please run a bot for me that will insert this template at the bottom of the existing articles listed on the template? And, if possible, at the same time remove any Orphan tags that may exist on those articles, as this navbox will resolve the orphan issue. Thanks for your help. Maile66 (talk) 14:59, 30 August 2012 (UTC)

May I suggest doing this as an AWB task? It seems a fairly low number of articles and non contentious enough to make it not worth going through the full bag process. Kumioko (talk) 15:13, 30 August 2012 (UTC)
If someone will do it for me. I don't use AWB myself. Maile66 (talk) 15:45, 30 August 2012 (UTC)
 Done. Template added. --Stryn (talk) 17:09, 30 August 2012 (UTC)

Thank you. Maile66 (talk) 17:12, 30 August 2012 (UTC)

bot to add NRHP nomination docs to NRHP articles

Could a programmer write a bot to go through NRHP place articles and check for availability of an NRHP nomination document at the National Park Service website, and add a reference to ones that are found, if not already included in the article? This would be hugely helpful in updating thousands of NRHP articles, as the National Park Service puts more and more states' nomination documents on-line.

For example, there are 2,543 NRHP-listed places in California, indexed from List of RHPs in CA. There are articles already for about half of those, I think. Few have references yet to the relevant online NRHP nomination documents, because the National Park Service only just made them all available for that state. The bot would draw a reference number from the NRHP infobox in an article, and use that to look up the nomination document at the National Park Service. It would add a reference (such as <ref name=nrhpdoc>{{cite web|url=http://pdfhost.focus.nps.gov/docs/NRHP/Text/REFNUM.pdf |title=National Register of Historic Places Inventory/Nomination: NAME |author= |date= |publisher=National Park Service}} and [http://pdfhost.focus.nps.gov/docs/NRHP/Photos/REFNUM.pdf accompanying photos]</ref> but with REFNUM and NAME filled in by field values from the NRHP infobox in the article) into the article, just above the references section. Such as in this diff adding NRHP nomination doc reference to a California NRHP article

A complication stems from the fact that the National Park Service's website returns a dummy document saying "Not yet digitised" in cases where the real document is not available. We want the reference to be added only when a real document is in fact available at the expected URL.
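The string-assembly part is straightforward; a Python sketch that fills the citation skeleton above from the infobox fields (the availability check against the "Not yet digitised" placeholder would have to happen before calling this, and is not shown):

```python
# URL patterns taken from the reference skeleton in the request above.
DOC_URL = "http://pdfhost.focus.nps.gov/docs/NRHP/Text/{refnum}.pdf"
PHOTO_URL = "http://pdfhost.focus.nps.gov/docs/NRHP/Photos/{refnum}.pdf"

def build_nrhp_ref(refnum, name):
    """Fill the reference skeleton with the infobox's refnum and name values."""
    return (
        "<ref name=nrhpdoc>{{cite web|url=" + DOC_URL.format(refnum=refnum)
        + " |title=National Register of Historic Places Inventory/Nomination: "
        + name + " |author= |date= |publisher=National Park Service}} and ["
        + PHOTO_URL.format(refnum=refnum) + " accompanying photos]</ref>"
    )
```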

I expect this is more difficult to program than #bot to bring photos from NRHP list-articles to individual NRHP place articles item above, but it is actually far more important and useful. --doncram 20:13, 31 August 2012 (UTC)

Wiki Loves Monuments India

Hi,

Wikimedia India Chapter is organising WLM in India, and hence would like to invite all Indian Wikipedians to participate. Find the invitation message here at my sandbox: User:Karthikndr/sandbox. The message needs to be sent to all the WikiProject India members.

Needs to be delivered by tomorrow. Thanks! -- ♪Karthik♫ ♪Nadar♫ 18:45, 31 August 2012 (UTC)

You should take a look at EdwardsBot. LegoKontribsTalkM 20:14, 31 August 2012 (UTC)

Request to support CfD close

Following Wikipedia:Categories for discussion/Log/2012 July 22#Category:Film redirects, we need the following changes for every member of Category:Film redirects (currently 80 or so):

  • the members are redirects; please update the redirect, not its target
  • create a talk page with, or replace the current contents by, {{WikiProject Film|class=redirect}}
    • making sure the class is redirect if the project is already present on the talk page (some at least are incorrectly allocated to another class such as stub for example)
    • the redirect {{Film}} may have been used instead of {{WikiProject Film}}
    • just checked, all the talk pages are either themselves redirects (which should be replaced by the project template), or to be created
  • remove the category from the redirect page itself

Once the category is empty we will deal with it speedily. Thanks. --Mirokado (talk) 21:20, 23 August 2012 (UTC)

Update: I've requested that we add support for redirect to {{WikiProject_Film/class}}. Probably better to wait until that support is in place before making these changes and I will post here once that has happened. --Mirokado (talk) 11:39, 28 August 2012 (UTC)
The redirect support has been added but I now notice that there is some error checking to be updated. I will post again once this is resolved. --Mirokado (talk) 22:06, 1 September 2012 (UTC)
Checks now added. I will do these changes by hand: several issues have arisen so far and 80 or so (but not many more) are manageable by hand. --Mirokado (talk) 21:15, 18 September 2012 (UTC)

Bot work

Hi. I was thinking we could create a bot that closes Wikipedia A-class reviews after two supports, and updates the article's talk page accordingly. TBrandley 02:19, 1 September 2012 (UTC)

Which WikiProject would this be for? I'm not sure that this would be a good task for a bot though. Just looking at MILHIST's A-Class review process, they require 3 "comprehensive" supports before closing, which would require human discretion. A bot could be used to clerk (move into archive subpages, update the article's talk page, etc) the process, but closing doesn't seem like a good task for a bot. LegoKontribsTalkM 18:52, 1 September 2012 (UTC)

Removal of {{wikify}}

Is there a way we can get a bot to remove all transclusions of {{wikify}} now that the template has been deprecated at TFD? Ten Pound Hammer(What did I screw up now?) 18:55, 30 August 2012 (UTC)

There is a discussion in WP:Wikiproject Wikify. As soon as we form a concrete strategy there and also make AWB rules for {{Dead end}} (there is a discussion in Template talk:Dead end for that), Yobot can run and fix/replace/remove accordingly. We also need a strategy for {{Wikify section}}. -- Magioladitis (talk) 19:00, 30 August 2012 (UTC)
Neither of those should happen. The purpose of deprecation was to keep the template from being added to any articles. It should be removed manually as the issues get solved. Ryan Vesey 04:39, 31 August 2012 (UTC)

I merely closed the discussion. If people want to have a discussion about the most prudent way to remove the template, that sounds like a good thing, regardless of whether the changeover is done manually from the start (replacing each instance with whatever more specific templates are appropriate), or automatically (removing every instance, and leaving a specific hidden category so that every page can be then dealt with manually eventually) or whatever. As long as we have conscientious Wikipedians working on this, I would hope that everything comes to a fair result. - jc37 20:26, 2 September 2012 (UTC)

Trivia

Is there any way to find every article that has a section reading "Trivia", "Miscellaneous", "Miscellany", or any variation thereof, and automatically tag it with {{trivia}}? Ten Pound Hammer(What did I screw up now?) 00:10, 3 September 2012 (UTC)

From {{trivia}}
"In accordance with Wikipedia:Trivia sections, this template should only be placed at the top of sections that contain indiscriminate and unselective lists."
So why do you think it would be appropriate to replace a considered assessment of the content of an article to be tagged, with an automatic labelling based on no more than a section heading? Andy Dingley (talk) 00:46, 3 September 2012 (UTC)
Because literally 100% of "trivia" sections that I've seen are "indiscriminate and unselective lists"? Ten Pound Hammer(What did I screw up now?) 01:28, 3 September 2012 (UTC)
You know... Someday people will understand the difference between "indiscriminate information", and "indiscriminately placing information".
Unfortunately, I'm guessing today won't be that day... - jc37 01:45, 3 September 2012 (UTC)

National Portrait Gallery links

The National Portrait Gallery, London has a website with a very well-organised index of its collection, which is also searchable. The database allocates a unique numerical ID (of the form mp01234) to each sitter (i.e. subject of a portrait), which allows the creation of a link to a list of whatever portraits of a particular person are in the catalogue. (see for example this list for Philip Rea)

In 2006 I created the template {{NPG name}} to facilitate adding external links from biographical articles to the site's collection, but so far it appears to be used in less than 1,000 articles ... whereas the NPG has 175,000 portraits, and their online catalogue appears to list over 100,000 sitters.

It seems to me that it should be possible to have a bot which at least identified possible matches between sitters in the NPG database and biographies on Wikipedia, and maybe added the links. --BrownHairedGirl (talk) • (contribs) 23:32, 31 August 2012 (UTC)

Here is a list of sitters against their LinkID (CSV format). The script tried to list possible name variations (based on brackets etc). Since there are probable errors because of that, here is the raw version.
I haven't tried to filter this list against Wikipedia articles. Would you like me to try? Noom talk stalk 15:28, 1 September 2012 (UTC)
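For anyone curious what such a filter might look like, here is a minimal sketch (not Noom's actual script): it reads rows of sitter name and LinkID and tries both the raw name and a bracket-stripped variant against a set of article titles. The column order, function name, and matching rule are assumptions for illustration.

```python
# Hypothetical sketch: match NPG sitter names from a CSV of
# "name,link_id" rows against a set of Wikipedia article titles,
# also trying a variant with any bracketed part stripped.
import csv
import io
import re

def match_sitters(csv_text, article_titles):
    """Return {link_id: matched_title} for sitters whose name (or a
    bracket-stripped variant) exactly matches an article title."""
    matches = {}
    for name, link_id in csv.reader(io.StringIO(csv_text)):
        variants = [name, re.sub(r"\s*\(.*?\)", "", name).strip()]
        for variant in variants:
            if variant in article_titles:
                matches[link_id] = variant
                break
    return matches
```

A real run would load the article titles from a database dump or the API rather than an in-memory set, and would likely need fuzzier matching than exact title equality.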
Many thanks for creating the list -- that's a great start.
It would be wonderful if you could try to filter it against Wikipedia articles. Even if there is a high error rate, it will still be a lot faster to add the links if the bot has made some suggestions. --BrownHairedGirl (talk) • (contribs) 15:54, 1 September 2012 (UTC)
 Doing... Seems to be taking longer than I expected; can't tell how much progress it's made as well. Will check back once it's finished. Noom talk stalk 00:18, 2 September 2012 (UTC)
Thanks! --BrownHairedGirl (talk) • (contribs) 12:21, 2 September 2012 (UTC)
If memory serves, a substantial amount of these are simply catalogue entries - recording the NPG has a portrait but not displaying it. Would it be worthwhile to limit the links only to cases where an image is actually visible on the NPG site? Andrew Gray (talk) 15:34, 1 September 2012 (UTC)
From my recent searches, it seems that the majority of entries do now have an image, which is a big change from the situation a few years ago.
I think that it is worth linking even if the NPG website currently does not display an image. Firstly, we can hope that images will be added to the website in due course ... and secondly, not everything need exist on the web to be useful to our readers. Many biographical articles link to the UK's National archives website, which catalogues papers rather than displaying them. This serves as a pointer for researchers, and the NPG links can serve the same function. --BrownHairedGirl (talk) • (contribs) 15:50, 1 September 2012 (UTC)
Fair enough - if there's a reasonably high hit-rate then it seems a good idea. I wonder if it might be worth systematically linking to the archive pages as well (location of papers, etc, is something we are very patchy on), but that's a question for another time... Andrew Gray (talk) 23:13, 3 September 2012 (UTC)

Done. The page is extremely large, if you're having problems viewing it in your browser, I can cut it down to a couple of pages. Some of the guess results are pretty good, whilst others are miles off. There were 57972 sitter names to check, 22613 had exact title matches, with the other 35359 being guessed. Noom talk stalk 16:50, 3 September 2012 (UTC)

Mark broken redirects

Could someone add a task to an existing bot so that it would monitor broken redirects? Wikipedia:Database reports/Broken redirects, updated daily, is a list of all redirects that (as of the update) existed but pointed to deleted pages. I'm imagining the bot looking at that page each time it's updated and marking each broken redirect with a template that would be a modified version of {{db-redirnone}}. I'd be willing to create said template, which should bear a warning to admins that the history should be checked and the page re-redirected to a suitable target instead of deletion, if possible. I'm thinking of something akin to the F8 tag that Multichill's bot was placing on also-on-Commons images some time ago, which bore a message that was essentially "This image is probably on Commons, but because it was placed by a bot, you must check the Commons image before deleting". Nyttend (talk) 22:20, 1 September 2012 (UTC)

Someone certainly could. OTOH, it seems the backlog is already fairly well handled. Do any of the (other?) admins who currently handle this backlog have anything to add to this discussion? Anomie 01:24, 2 September 2012 (UTC)
I've handled those now and again. I don't have a strong feeling about how helpful it would be, but I certainly don't see it as being particularly harmful, either, which leaves me ever-so-mildly supportive. --j⚛e deckertalk 20:40, 2 September 2012 (UTC)
We just had a notice at WP:AN yesterday from someone alerting us admins to the fact that there was a backlog. It's not often a problem, but this would presumably get rid of the "not often", and it would probably make even the day-to-day work a little easier. Nyttend (talk) 20:47, 2 September 2012 (UTC)
Well then, that strengthens my support. --j⚛e deckertalk 20:50, 2 September 2012 (UTC)
Standard pywikipedia already has a redirect.py script that can handle broken redirects (among other things). It currently tags with {{db-redirnone}}. I could modify the script to use a different template and read from the DBR page if that is what is wanted. LegoKontribsTalkM 21:06, 2 September 2012 (UTC)
I've created {{Db-redirnonebot}}, but there's probably something that will result in its deletion or complete rewording, so don't implement it yet. I've asked for help at WT:CSD. Could you explain for me what the pywikipedia script does in this situation? I know absolutely nothing about it and have no clue what would happen if you made the modification that you suggest — e.g. would it mean that bots would do the task I'm requesting, or is this something like AWB, or is this something completely different? Nyttend (talk) 03:54, 3 September 2012 (UTC)

This is what the script currently does: get a list of broken redirects from either a dump or Special:AllPages, then verify that each page is in fact a redirect and that the target does not exist. If both of those conditions are met, it marks the redirect for deletion. The modifications I would make are to get the list from the DBR page (instead of a dump), and to use your template instead of {{db-r1}}. LegoKontribsTalkM 04:04, 3 September 2012 (UTC)
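As a rough illustration of that check (not the actual redirect.py code), the core logic might look like the sketch below, using an in-memory dict as a stand-in for the wiki; the template name follows the one proposed above, and the redirect regex is simplified.

```python
# Hypothetical sketch of the broken-redirect check: pages maps
# title -> wikitext, and a missing key means the page does not exist.
import re

REDIRECT_RE = re.compile(r"^#REDIRECT\s*\[\[([^\]|#]+)", re.IGNORECASE)

def tag_broken_redirects(pages, tag="{{db-redirnonebot}}"):
    """Prepend a speedy-deletion tag to redirects whose target page
    does not exist. Returns the titles that were tagged."""
    tagged = []
    for title, text in pages.items():
        m = REDIRECT_RE.match(text)
        if not m:
            continue             # not actually a redirect any more
        target = m.group(1).strip()
        if target not in pages:  # target deleted -> broken redirect
            pages[title] = tag + "\n" + text
            tagged.append(title)
    return tagged
```

A real bot would of course fetch each page through the API and handle namespace prefixes, interwiki targets, and edit conflicts; this only shows the verify-then-tag shape described above.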

What or who runs the script? Is it done by a bot, or would a human need to do it? I'm not attempting to be annoying; I'm simply unaware of how this works. Nyttend (talk) 04:20, 3 September 2012 (UTC)
Sorry, a bot account would run the script, and take care of tagging all the redirects. LegoKontribsTalkM 04:32, 3 September 2012 (UTC)
Okay, thanks; scripts are something I've never touched. Nyttend (talk) 16:00, 3 September 2012 (UTC)
Is it necessary to have a separate template for a bot, when {{db-meta}} supports the addition of a |bot parameter? Σσς. 04:45, 3 September 2012 (UTC)
I don't know, and I don't care: I just needed some way to get a speedy deletion template to display this message, and the creation of a new template was the only way that I personally knew how to do it. As I said at CAT:CSD, "I'm fully aware that it probably will need substantial reworking, moving, deleting, or something else". If you have any suggestions, let me know, and I'll be happy to implement them if I understand how; if you think it necessary, I'll be happy to make a G7 speedy deletion request for the new template. Nyttend (talk) 15:59, 3 September 2012 (UTC)

Apply Timeline-event template to lists of aircraft incidents

[perhaps a job for AWB rather than a true bot; if there's a better place to ask, please advise]

I'd like someone, please, to convert lists in Category:Lists of aviation accidents and incidents (such as List of accidents and incidents involving the Vickers Viscount) to use {{Timeline-event}}, like in this similar edit. Some (like List of accidents and incidents involving the DC-3 in the 1980s) will need the years to be copied from subheadings. Date formatting (DMY vs. MDY) will need to be preserved. There's also scope to convert sections within articles, such as Bristol Britannia#Accidents and incidents. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:50, 4 September 2012 (UTC)

bot to bring photos from NRHP list-articles to individual NRHP place articles

Hi. I wonder if a programmer would be willing to write a bot that would go through NRHP geographic-based list-articles and, where a photo is present in the list-article, check whether it is present in the corresponding, linked NRHP place article, if that article exists. Often there is no individual article yet; the link can just be a red-link. Or vice versa: go through the existing NRHP place articles in one state, and check whether each corresponding list-article has a photo for it.

Some NRHP editors have been dissatisfied at times that new individual articles created by others did not immediately include available photos. Also there are often photos added to list-articles that don't immediately get added to already-existing individual place articles. This would address that complaint. I think it should be a one-way thing, just bringing pics from the list-articles to the individual NRHP place articles, not the other way around. During September there is a Wikipedia Loves Monuments (WLM) campaign going on which may bring a lot of new photos to the list-articles, by the way.

The system of NRHP geographic list-articles is indexed from List of RHPs. It links to state list-articles which link to county list-articles which in some cases link to city- and neighborhood list-articles. The list-articles are all identified in Category:List-Class National Register of Historic Places articles and should also have state or county or other geographic categories. The individual NRHP place articles all should have the NRHP infobox template, which could be tweaked to indicate something if useful, and also all fall within NRHP categories and county or other geographic categories.

The benefit would be to have a bot that could be run occasionally, and to allow quicker improvement of the individual NRHP articles. Any first reaction on whether this could be done? Have similar bots already been written? --doncram 15:09, 31 August 2012 (UTC)

There's a bot, ErfgoedBot, run by editor Multichill, now running daily, which checks whether pictures uploaded to Commons in the current WLM campaign are used in articles yet, and places unused ones into Wikipedia:WikiProject National Register of Historic Places/Unused images. This has been working very well. I guess my request here could possibly be addressed by Multichill as a variation on that, maybe after the WLM campaign is over. It could go through pics in NRHP-related Commons categories and identify those not used in any Wikipedia article besides NRHP list-articles. But I don't know if it could check whether the corresponding individual NRHP article has been created already. It would only be useful if it did that. I'll ask Multichill to comment here. --doncram 12:34, 6 September 2012 (UTC)

Adding {{DISPLAYTITLE}}

If the title of a page is "List of Foo bar" and "Foo" has {{italic title}} on it, can {{DISPLAYTITLE:List of ''Foo'' bar}} be added to it? This would be useful for pages like "List of Whatever TV Show characters". David1217 What I've done 02:06, 4 September 2012 (UTC)

Apparently AWB (and thus Italic title bot) can't do this task. Is there a way besides AWB to do this? David1217 What I've done 23:51, 7 September 2012 (UTC)
If you could give me a list, or even a category, it might be possible to do using regular expressions. LegoKontribsTalkM 00:34, 8 September 2012 (UTC)
Category:Lists of fictional characters by medium has quite a few. Is that okay? But there are other articles, like List of The New York Times employees, that aren't in that category. Will the method I described in my original post work? David1217 What I've done 03:14, 9 September 2012 (UTC)
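To make the regular-expression idea concrete, here is a hedged sketch; the function name and the exact pattern are illustrative, not anything AWB or Legobot actually uses. Given a list article's title and the work title that carries {{italic title}}, it builds the corresponding DISPLAYTITLE.

```python
# Hypothetical sketch: italicize a work's name inside a list title
# and wrap the result in {{DISPLAYTITLE:...}}.
import re

def displaytitle(page_title, work_title):
    """Return a {{DISPLAYTITLE:...}} string, or None if the work's
    name does not occur in the page title."""
    pattern = re.compile(re.escape(work_title))
    if not pattern.search(page_title):
        return None
    display = pattern.sub("''" + work_title + "''", page_title, count=1)
    return "{{DISPLAYTITLE:%s}}" % display
```

A real task would still need a human-checked mapping from each list article to the work whose title is italicized, which is why a list or category was requested above.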

Removing old redlinks

Could we get a bot to remove redlinks that have been tagged for, say, six months? The reason is obviously because links that have been there for that long have very little chance of being created, per Wikipedia:Red link. And of course if the article is created then the user can plug the link directly into the article (only after its creation). It doesn't have to be six months, necessarily; it can be shorter or longer depending on what the higher-ups say. Let me know if anybody has any questions. Thanks. Lighthead þ 07:35, 8 September 2012 (UTC)

I don't think this is a good task. Wikipedia has no time limits and red links can be useful. Moreover, when you request tasks like these you first need to gain consensus on some other page. -- Magioladitis (talk) 09:03, 8 September 2012 (UTC)
"The reason is obviously because links that have been there for that long have very little chance of being created" Do you have any empirical data to support this claim? It would be interesting if someone could compile a list for this, i.e. how long it takes from appearance of first red-link to article being created. —  HELLKNOWZ  ▎TALK 09:21, 8 September 2012 (UTC)
Declined Not a good task for a bot. This idea would require the bot to be able to determine whether any particular link is a "valid red link term". I don't think AI is that advanced yet. Anomie 13:08, 8 September 2012 (UTC)
Actually, I do realize that there are valid red links. For example red links for movies that are currently in production. Good call. :) Lighthead þ 19:44, 8 September 2012 (UTC)

Archiving bots

I'm pretty new at this stuff, but I wanted to know if it is possible for bots to archive urls. I think it can be very convenient for users. Sorry if this is a stupid proposal. Good day. ComputerJA (talk) 22:52, 8 September 2012 (UTC)

This actually is not a stupid proposal - a bot was previously approved for this task here, and a recent discussion occurred here. Also, Δ (talk · contribs) maintains this tool, for archiving all the links on a page. Σσς. 23:01, 8 September 2012 (UTC)
Woah. Thanks! I was doing things manually and it was taking FOREVER! ComputerJA (talk) 17:34, 9 September 2012 (UTC)
I've started doing some work on this ([7]), but it will be a while before this is ready for any major work. LegoKontribsTalkM 19:31, 9 September 2012 (UTC)

Great! So if I add an article here it will get archived by a bot in the upcoming days? ComputerJA (talk) 22:30, 9 September 2012 (UTC)

I'm posting this on behalf of Δ via IRC: "I will be running the archive bot shortly and will throw up a cgi script to add pre-archived URLs to a page" LegoKontribsTalkM 23:09, 9 September 2012 (UTC)
Δ has created a script to automate entering the |archiveurl= parameters (amongst other cleanup) once his bot has archived the urls, save the following as a bookmarklet:
javascript:window.location.href='http://toolserver.org/~betacommand/cgi-bin/sandbox?page='%20+%20encodeURIComponent(mw.config.get('wgPageName'));
It usually takes a minute or so to run, and after that should redirect you to a diff. LegoKontribsTalkM 00:01, 10 September 2012 (UTC)

Bot to generate lists from categories

I think there is one available? Will need to walk subcategories. Need not weed out duplicates or sort entries; I could easily do that. Thanks in advance. Churn and change (talk) 19:38, 11 September 2012 (UTC)

I can easily generate a simple bulleted list with links to all the articles. What category needs listify-ing? LegoKontribsTalkM 19:39, 11 September 2012 (UTC)
Category Psychologists. The list exists, but is obviously quite incomplete. There are no redlinks, but will need to ensure category-generated list is a true superset (I could do that with some Unix scripts). Thanks. Churn and change (talk) 19:41, 11 September 2012 (UTC)
Also, has there been any discussion of autogenerating lists from categories? I assume there has been; what was the general feeling? Churn and change (talk) 20:03, 11 September 2012 (UTC)
Here's a list of every single article in that category (recursively searched): User:Legobot/Psychologists. Feel free to move it/edit it/etc.
As far as consensus goes, CfDs often end with a listify consensus and get put on this page. I don't recall auto-generating lists ever being brought up, but it wouldn't be that controversial if it's just being used for comparison purposes. LegoKontribsTalkM 20:56, 11 September 2012 (UTC)
Hi, thanks a lot for that. Helped me clean up some category inclusions. Sorted the list; will now compare. Churn and change (talk) 22:21, 11 September 2012 (UTC)
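For the curious, a recursive category walk like the one described above can be sketched as follows. The data structure is a stand-in (a real bot would query the MediaWiki API for category members), and as noted above, duplicates are left in for the requester to weed out.

```python
# Hypothetical sketch: categories maps a category name to a pair of
# (subcategory names, member article titles). Cycle-safe via a
# visited set, since wiki category graphs can contain loops.
def listify(categories, root, seen=None):
    """Return wiki-style '* [[Title]]' lines for every article in
    root and its subcategories, duplicates included."""
    if seen is None:
        seen = set()
    if root in seen:
        return []
    seen.add(root)
    subcats, articles = categories[root]
    lines = ["* [[%s]]" % a for a in articles]
    for sub in subcats:
        lines += listify(categories, sub, seen)
    return lines
```

The bulleted output can then be sorted and diffed against the hand-maintained list, which is the comparison use case discussed in this thread.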

Per the suggestion at the idea lab

At the idea lab, I made a request for help in finding a way to track how long it's been since the baseball players infobox stats were updated. I was advised to come here. I would appreciate if someone could look over the brief request there, and determine if a bot would be helpful in any way. AutomaticStrikeout 21:24, 12 September 2012 (UTC)

Take a look at User:Legobot/Baseball. I generated it based on the first 5000 transclusions of {{Infobox MLB player}}, then filtered out players which were "retired" players. I didn't sort by namespace so userspace drafts will show up, however I can easily change that. Obviously this doesn't cover all baseball players, but as a quick example it works. I can easily expand it based on other templates/categories. I'll cross post this to the idea lab. LegoKontribsTalkM 23:23, 12 September 2012 (UTC)
Strikeout I replied to your post at the Village Pump. Rich Farmbrough, 15:36, 13 September 2012 (UTC).