Wikipedia:Bot requests/Archive 39
This is an archive of past discussions about Wikipedia:Bot requests. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current main page.
Move "Category:Lapurdi" to "Category:Labourd"
As decided on the WikiProject Basque discussion page, this Basque province's name is Labourd in the English-language Wikipedia, not Lapurdi. So all the contents of Category:Lapurdi need to be moved to Category:Labourd. Thanks in advance. --Xabier Armendaritz(talk) 12:32, 29 October 2010 (UTC)
- see WP:CFD and the speedy options. ΔT The only constant 12:33, 29 October 2010 (UTC)
- Thank you! --Xabier Armendaritz(talk) 12:39, 29 October 2010 (UTC)
Replace bare URLs with template
I don't know if this has been requested before, but there should be a bot that replaces bare URLs (in those articles that are marked with Template:bare URLs) with Template:Cite web. Essentially like this: http://www.website.com
replaced with {{Cite web |url=http://www.website.com |title=Website title <!-- added by bot --> |accessdate={{subst:DATE}} <!-- added by bot --> }}
. —bender235 (talk) 18:56, 29 October 2010 (UTC)
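For illustration only, a minimal Python sketch of the substitution such a bot (or the reflinks tool mentioned below) would perform; the fetch_title() helper is a placeholder, not an existing tool:

import re

def wrap_bare_ref_urls(wikitext, today="December 2010"):
    # Wrap refs that consist of nothing but a bare URL in {{Cite web}}.
    # fetch_title() is a hypothetical helper; a real bot would fetch the
    # target page and read its <title> element.
    def fetch_title(url):
        return "Website title"
    def replace(match):
        url = match.group(1)
        return ("<ref>{{Cite web |url=%s |title=%s <!-- added by bot --> "
                "|accessdate=%s <!-- added by bot --> }}</ref>"
                % (url, fetch_title(url), today))
    return re.sub(r"<ref>\s*(https?://\S+)\s*</ref>", replace, wikitext)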
- What's wrong with the reflinks tool that is linked from that template? — Dispenser 20:00, 29 October 2010 (UTC)
- Never heard of that. I'll take a look at it. —bender235 (talk) 21:07, 29 October 2010 (UTC)
- Just tried it. Very, very nice. Just what I was looking for. Thanks. —bender235 (talk) 21:13, 29 October 2010 (UTC)
WildBot dab tag remover
WildBot had tagged talk pages of articles which had disambiguation links. However, its operator is busy in real life and these notices have become dated. Using the Toolserver, I have created a list of pages where these tags can be safely removed. We now need a bot to remove those tags (up to 14,000). Any takers? — Dispenser 05:14, 30 October 2010 (UTC)
En icon in references
On occasion, I see articles with English-language references which contain {{en icon}}; see, for example, Task Force Danbi. It can be assumed that English-language sources are the default on the English-language Wikipedia, and so it seems unnecessary to identify a source as being in English. Assuming that large-scale removal of {{en icon}} in these situations would not be controversial, could a bot check for and remove instances of {{en icon}}
contained in <ref>...</ref>
tags? -- Black Falcon (talk) 06:24, 31 October 2010 (UTC)
- Well, there are circumstances where the language of a source could be ambiguous and not obvious just from the link/title. Otherwise, why would {{en icon}} exist at all? --Cybercobra (talk) 07:01, 31 October 2010 (UTC)
- Not a good task for a bot. The 17,428 articles using the template quite likely include far too many unnecessary uses, but I don't think a bot could remove them without excessive false positives. According to the template documentation, the template should be used when "there is a reason why the reader would assume that a source is not in English", for example "a reference had a foreign-language title, but English contents". While a bot might be able to handle the example (e.g. I hear Google released the code they use in their translator for detecting languages), it could never figure out the unspecified other reasons for using the template.
- If you can get strong community approval for some "nuke them all and let users revert if necessary" plan, a bot might be able to go ahead. Or you could see if someone will write a user script to assist in doing them manually, but be aware that a "meat bot" (i.e. just clicking ok without really examining the edit) is not allowed any more than an unauthorized "true" bot. Anomie⚔ 14:33, 31 October 2010 (UTC)
- OK, thank you. I think I'll check a bigger sample of transclusions to get a better idea of how {{en icon}} is used, and then decide how to proceed based on the results. Thanks, -- Black Falcon (talk) 00:21, 1 November 2010 (UTC)
Deletion bot?
I'm looking for a very simple bot that searches for a word within pages and deletes it. Can you make it with the word "test"? I can't seem to find a simple program that just does that. Thanks! →ΑΧΧΟΝΝfire 21:47, 1 November 2010 (UTC)
- Such a bot would not be particularly useful, as it would be very prone to false positives. Anomie⚔ 00:14, 2 November 2010 (UTC)
- If we can get Botlaf restarted it will give you lists of articles containing a word and you can then manually correct them or add them to a safe page list. It has worked well to sort out posses, pubic and other easily confused words, and would probably work to cull peacock terms. ϢereSpielChequers 08:06, 2 November 2010 (UTC)
- I'm not sure if this request is for a different wiki? If so, take a look at WP:AWB which can do find and replace jobs like finding "test" and replacing it with "" (in effect, deleting the word). However, I agree that such a bot would not be approved for use on en.wiki. –xenotalk 13:48, 2 November 2010 (UTC)
CitationTool
user:CitationTool was a very useful bot that isn't running anymore. It automated the job of archiving external links in citations, among other tasks. The source code has been released by its owner, User:Lulu of the Lotus-Eaters. See User talk:CitationTool#How do I make it go? Is anyone willing to take over this bot and get it going again? A standalone tool with the same functions would also be useful. Will Beback talk 00:07, 4 November 2010 (UTC)
Talk header
The usage instructions for Template:Talk header indicate that the template "should not be added to otherwise empty talk pages". Could a bot check transclusions of the template (and its redirects, see here) and generate two lists:
- All pages which contain no content other than the template and have only one revision in the page history; and
- All pages which contain no content other than the template and have two or more revisions in the page history.
Furthermore, could the bot delete all pages in the first list (i.e., where the only edit was to add the template to an otherwise empty page)?
Thank you, -- Black Falcon (talk) 03:38, 4 November 2010 (UTC)
SELECT IF(rev_parent_id=0, "Single", "Many") AS "revs",
CONCAT('[[Talk:', page_title, ']]')
FROM page
JOIN templatelinks ON tl_from=page_id
JOIN revision ON rev_id=page_latest
WHERE tl_namespace=10 /*Template:*/ AND tl_title="Talk_header"
AND page_namespace= 1 /*Talk: */ AND page_len < 30;
- The above query will list the single and multiple revision pages you're asking for. I can provide the results from the finished query too. Somewhat similar results can be obtained by using Catscan.
- I should mention that the general usage is quite inconsistent with the instructions. Often this template is added when people add project banners. Other times it is used as a substitute for {{archives}}/{{archive box}}. — Dispenser 04:38, 5 November 2010 (UTC)
- Wow, I didn't know Catscan could do that... If you could provide the results for all talk namespaces (except User talk:), that would be really helpful. Or, I could just use the Catscan results if they are fairly complete... are they?
- I know that the template is often added to pages which don't "need" it (as defined by the usage instructions), and it is probably not worth removing it from talk pages which contain additional content (except on a case-by-case basis). Thanks, -- Black Falcon (talk) 05:28, 5 November 2010 (UTC)
Cedilla to Comma below bot for articles under Romanian place names and people
The Romanian Wikipedia has implemented a new policy this year of transitioning to the correct versions of ș and ț. The transition has practically been completed on ro.wiki. I have started my own miniproject to modify the diacritics manually whenever I find them, but it's far too tedious for articles belonging to categories like Romanian place names, personalities, etc., where there are a lot of characters to change and pages to move. I recommend building a bot based on Strainu's bot for the en.wiki (and possibly extended to other wikis) that will correct the diacritics in articles within certain categories and move them if necessary. Ayceman (talk)
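For reference, the character substitution itself is trivial; a minimal Python sketch follows (deciding which articles are safe to touch is the hard part discussed below):

# Map s/t with cedilla (U+015E/U+015F, U+0162/U+0163) to the Romanian
# comma-below forms (U+0218/U+0219, U+021A/U+021B).
CEDILLA_TO_COMMA = {
    "\u015E": "\u0218", "\u015F": "\u0219",
    "\u0162": "\u021A", "\u0163": "\u021B",
}

def to_comma_below(text):
    for cedilla, comma in CEDILLA_TO_COMMA.items():
        text = text.replace(cedilla, comma)
    return text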
- You may want to review WP:Bot requests/Archive 36#Make redirects from titles with correct Romanian diacritics to the currently used diacritics for some of the issues encountered in just creating redirects for the S/T-comma names. The major problem is determining if each word or name is Romanian rather than Turkish, Kurdish, Zazaki, Azerbaijani, Crimean Tatar, Gagauz, Tatar, or Turkmen (based on the list at Cedilla#Use of the cedilla with the letter S); a wholesale replacement is likely to have too many false positives. Anomie⚔ 20:27, 5 November 2010 (UTC)
- The chances of any Turkic words being present in articles found within the category Romania and its subcategories are extremely low. A bot that only modifies these articles (moving and correcting) can be deactivated after the operation is complete. Other articles containing ș and ț can be slowly modified manually, as can any corrections regarding false positives in articles modified by the bot. --Ayceman (talk) 20:21, 6 November 2010 (UTC)
- Did you know that you can get from Category:Romania to Category:Turkey following subcategory links in 5 steps? Category:Romania → Category:Geography of Romania → Category:Landforms of Romania → Category:Black Sea → Category:Black Sea countries → Category:Turkey. More on-topic, on what do you base your assertion that those chances are "extremely low"? Anomie⚔ 05:09, 7 November 2010 (UTC)
- For example, within Geography of Romania (Towns, Lakes, etc.), Turkic names are likely to be mentioned in places where there has been a historical Turk/Tartar presence. These are mostly found in Dobrogea, where the place names inherited from Turkish are almost always (if not always) based on Turkish words that contain no s-cedilla. Any names of ethnic Turks with Romanian citizenship that may be present in an article as mayors or heads of local councils and have an Ș in their name will also have their name written with the Romanian s-comma. Large articles regarding cities (municipalities) do have a higher chance of having genuine names with s-cedilla in them (history, sports, etc.), but they are not a problem, as they can be modified manually. The main issue is with the myriad of small articles. --Ayceman (talk) 12:51, 7 November 2010 (UTC)
Infobox power station
Hi. The mentioned infobox went through a long history of renamings and mergers. All articles using the old names and boxes use old fields which need to be updated. There are five old templates (whose links have now been replaced):
- Infobox nuclear power station
- Infobox nuclear power plant
- Infobox Nuclear power plant
- Infobox NPP
- Infobox UK power station
These templates use old parameters which are now included in {{Infobox power station}} as DEPRECATED (in edit mode). These parameters should be find-and-replaced with the given parameters. Could this be done?
If so, could someone join Template talk:Infobox power station to help discuss the updates required? Kind regards. Rehman 13:05, 7 November 2010 (UTC)
- Done. Plastikspork is handling the issue. Rehman 09:19, 8 November 2010 (UTC)
Croatian road article titles
Per Talk:Highways in Croatia#Recent title changes, a mess has been created by Nono64 (talk · contribs), which is a huge pain to undo manually. For example, A1 (Croatia) was moved to A1 road (Croatia) and then to A1 autocesta (Croatia). Now that I have moved it back to A1 (Croatia), the talk page move also needs to be reverted manually because the destination page already exists, doubling the amount of work. I'd appreciate it if someone would make a bot that would just roll back all moves done by Nono64 to articles named "[A-Z][0-9]+ (Croatia)". Afterwards we might actually use the same kind of bot to pursue a similar move, but with the distinct difference that it would be a product of consensus and not a random user's whim. --Joy [shallot] (talk) 16:17, 20 November 2010 (UTC)
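For illustration, a minimal pywikibot sketch of the move-back step, assuming the list of affected titles has already been compiled (e.g. from the move log); error handling and the talk-page cleanup mentioned above are omitted:

import pywikibot

site = pywikibot.Site("en", "wikipedia")

# Pairs of (current title, title to restore); building this list from
# the move log is left out here.
MOVES_TO_UNDO = [
    ("A1 autocesta (Croatia)", "A1 (Croatia)"),
]

for current, original in MOVES_TO_UNDO:
    page = pywikibot.Page(site, current)
    if page.exists():
        page.move(original, reason="Reverting undiscussed renames, see Talk:Highways in Croatia")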
Replacement for RSElectionBot
User:RSElectionBot was employed for the purposes of maintaining this voter log for last year's ArbCom elections. We'd like to have the same thing running this year, but the bot's operator, User:Rspeer, seems to have taken a leave of absence. I imagine it's a relatively simple bot to run; would someone here be able to volunteer? Skomorokh 14:00, 15 November 2010 (UTC)
- I will see if I can load the right modules and get it to work. -- DQ (t) (e) 12:21, 18 November 2010 (UTC)
- Thanks, DQ! Skomorokh 12:22, 18 November 2010 (UTC)
- Approved by BAG, just setting up now. -- DQ (t) (e) 21:35, 22 November 2010 (UTC)
Newsletter delivery
On behalf of the Indian WikiProject, I'd like to ask whether a bot could be made available to deliver the project's newsletter to its participants. The newsletter is located here and the list of participants on whose talk pages it is to be delivered is here. Any help would be appreciated, since our regular bot is down for a while. Regards, SBC-YPR (talk) 13:07, 22 November 2010 (UTC)
- Done. עוד מישהו Od Mishehu 10:48, 24 November 2010 (UTC)
- Thanks! :-) Regards, SBC-YPR (talk) 14:08, 24 November 2010 (UTC)
NameWatcherBot
Hello. Some time ago, SoxBot stopped monitoring new usernames for potential username violations. The bot-op has been pinged a couple of times (here), but there hasn't been a response. Is it possible to have a new bot monitor usernames? Thanks! TNXMan 15:59, 17 November 2010 (UTC)
- I've been running an IRC-based bot for just this purpose for years. ΔT The only constant 03:16, 19 November 2010 (UTC)
- Would it be difficult to post the output to WP:UAA/BOT? Thanks for looking at this. TNXMan 12:41, 19 November 2010 (UTC)
- Hey Tnx, I would be willing to build one for this, just need a blacklist to use. -- DQ (t) (e) 21:51, 22 November 2010 (UTC)
- There's already a list that SoxBot used- it's probably still up to date. TNXMan 21:59, 22 November 2010 (UTC)
- Ok I will throw that into my userspace and start coding. :) -- DQ (t) (e) 12:26, 24 November 2010 (UTC)
- Please note the related pages - the whitelist (so that names with words like "grape", "brass" and "Dumbledore" don't get reported) and the homoglyph list.
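For illustration, a minimal Python sketch of the blacklist/whitelist check described above; the word lists are illustrative stand-ins for the on-wiki blacklist, whitelist and homoglyph pages, and a real bot would need much more careful matching:

import re

BLACKLIST = [r"rape", r"ass", r"dumb"]            # illustrative only
WHITELIST = [r"grape", r"brass", r"dumbledore"]   # overrides the blacklist

def should_report(username):
    # Return True if the new username should be reported to WP:UAA/BOT.
    lowered = username.lower()
    if any(re.search(word, lowered) for word in WHITELIST):
        return False
    return any(re.search(word, lowered) for word in BLACKLIST)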
Unlinking of pages via XfD
Would it be possible to create a bot that can unlink pages that have been deleted via an XfD discussion without having to resort to using Twinkle to do it instead? :| TelCoNaSpVe :| 00:52, 21 November 2010 (UTC)
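For illustration, the core unlinking step is simple; a minimal Python sketch that replaces wikilinks to a deleted title with plain text (piped links keep their label):

import re

def unlink(wikitext, title):
    escaped = re.escape(title)
    # [[Title|label]] -> label
    wikitext = re.sub(r"\[\[%s\|([^\]]+)\]\]" % escaped, r"\1", wikitext)
    # [[Title]] -> Title
    wikitext = re.sub(r"\[\[%s\]\]" % escaped, lambda m: title, wikitext)
    return wikitext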
- Is there some specific XfD discussion you're referring to? עוד מישהו Od Mishehu 14:27, 21 November 2010 (UTC)
- No, not really, just as a general XfD unlinking bot. Ayna Corporation, for example would be automatically unlinked after Wikipedia:Articles for deletion/Ayna Corporation was closed as delete. :| TelCoNaSpVe :| 01:32, 26 November 2010 (UTC)
- For TFD, we have a couple bot-ops who patrol WP:TFD/H. Plastikspork ―Œ(talk) 16:14, 26 November 2010 (UTC)
- And for CFD, we have Cydebot, which handles requests at WP:Categories for discussion/Working. עוד מישהו Od Mishehu 09:30, 28 November 2010 (UTC)
RFC input request bot
Per the discussion at Wikipedia:VPR#Input_randomisation (and others in the past which were rather more enthusiastic, but I forget where), can we please have a bot which will, in essence, randomly assign RFC input requests to members of a list of people who want such requests.
Some elements of how this will work:
- a category of Editors Willing To Be Asked For Input (subdivided by various topics, probably using the current RFC subdivisions)
- a bot which monitors the RFC feeds, and when a new one comes up, randomly picks X editors from the relevant category (let's say 5) and after checking for eligibility (let's say > 100 Talkspace edits, and account created at least 7 days before RFC started) drops messages on those users' talk pages requesting their input at that RFC.
- the bot's message would use an onwiki template, like Cluebot does, so it can be easily developed.
- If possible (perhaps later on) a nice enhancement would be for the bot to use an (onwiki) list of standard talkpage templates, and when they're found on the RFC page, sets parameters when it writes the template onto the user talk page, enabling the template to customise its output appropriately. The relevant templates would be things like Arbitration Enforcement, {{controversial}}, etc.
- If editors could have the option to be notified by email in addition to user talk, that would be fabulous for bringing in the less active editors who would otherwise miss the notice until it's too late.
Any takers? Rd232 talk 01:17, 27 November 2010 (UTC)
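For illustration, a minimal Python sketch of the random selection step; the volunteer records and eligibility fields are assumptions about how the data might be represented, not an existing API:

import random

def pick_commenters(volunteers, count=5):
    # volunteers: list of dicts like {"name": ..., "talk_edits": ..., "age_days": ...}
    eligible = [v for v in volunteers
                if v["talk_edits"] > 100 and v["age_days"] >= 7]
    chosen = random.sample(eligible, min(count, len(eligible)))
    return [v["name"] for v in chosen]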
- I already run the RFC bot, so I think I could add this to the bot's duties. harej 01:54, 27 November 2010 (UTC)
- Makes sense, so if you can do this, many thanks! Rd232 talk 02:01, 27 November 2010 (UTC)
The service will manifest itself in the form of Wikipedia:Requests for comment/Comment duty. harej 03:28, 27 November 2010 (UTC)
- Coding done, initial tests successful. Now I just want to test it for a little while longer, and then I will request approval. harej 09:29, 27 November 2010 (UTC)
Convert release, start and similar dates to templates
This bot request is still outstanding, the original offer to complete it having lapsed. If it's too big a job for one person, perhaps it could be sub-divided? Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 12:08, 28 November 2010 (UTC)
- Since you have previously stressed that the main goal of these templates is microformat emission, then per Wikipedia:Requests_for_comment/Microformats these have to be reviewed on a case-by-case basis. I believe there has not been any expansion of the WP:uF or WP:Microformats to "provide guidelines for the examination of appropriate use and deployment of microformats"? — HELLKNOWZ ▎TALK 12:20, 28 November 2010 (UTC)
- There is no such requirement. Microformats are already emitted, this is to make the data they emit more complete. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 21:34, 28 November 2010 (UTC)
- Microformats already being fait accompli does not supersede the RfC future action consensus of "looking at the use of microformats on a case by case basis". Please provide a specific field to be changed to a specific uF emitting template so this can be discussed case-by-case. — HELLKNOWZ ▎TALK 21:50, 28 November 2010 (UTC)
- Your comment appears to bear no relation to my preceding statement. Andy Mabbett (User:Pigsonthewing); Andy's talk; Andy's edits 21:51, 29 November 2010 (UTC)
Start box without End box
Okay, so some of these are false positives, but I have been finding many uses of {{start box}}
without the corresponding {{end box}}
or {{end}} or whatever. This often goes unnoticed, since it is at the end of the page, but it is a real problem if it happens before the PERSONDATA template, since it then makes the "person data" information visible. Now, there are many different "end" templates which can effectively close the {{start box}}
, which is why finding the open ones is a bit difficult. However, I was able to come up with this as a first pass at a list. Perhaps someone can refine it. I would imagine with such a list, it would be a relatively straightforward task with AWB. Plastikspork ―Œ(talk) 04:50, 2 December 2010 (UTC)
- FYI, here is an example of what I am talking about Goldman Sachs Tower (diff). Plastikspork ―Œ(talk) 04:54, 2 December 2010 (UTC)
- Whoever does this, could you also take a look at Wikipedia:Bot requests/Archive 38#Succession tables cleanup ? Headbomb {talk / contribs / physics / books} 05:15, 2 December 2010 (UTC)
- Sounds like something for Wikipedia:WikiProject Check Wikipedia. — Dispenser 05:22, 2 December 2010 (UTC)
- Okay, I completed the first pass, which cleaned up about 2500 articles. If anyone can come up with a list of articles with other box problems, I can run the bot over those as well. Plastikspork ―Œ(talk) 09:24, 5 December 2010 (UTC)
Template:FeaturedPicture update
I just updated Template:FeaturedPicture to include a {{{type}}}
parameter to distinguish animations and videos from pictures, as they are all nominated via WP:FPC but there are separate categories: Category:Featured videos, Category:Featured animations, and Category:Featured pictures. Could a bot go through Category:Featured videos and Category:Featured animations and add |type=video or |type=animation to Template:FeaturedPicture? /ƒETCHCOMMS/ 03:37, 3 December 2010 (UTC)
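For illustration, a minimal Python sketch of the edit itself, assuming the {{FeaturedPicture}} transclusion contains no nested templates:

import re

TEMPLATE = re.compile(r"(\{\{\s*FeaturedPicture\b[^{}]*)\}\}", re.IGNORECASE)
HAS_TYPE = re.compile(r"\{\{\s*FeaturedPicture\b[^{}]*\|\s*type\s*=", re.IGNORECASE)

def add_type(wikitext, media_type):
    # media_type is "video" or "animation" depending on the source category.
    if HAS_TYPE.search(wikitext):
        return wikitext
    return TEMPLATE.sub(lambda m: m.group(1).rstrip() + " |type=%s}}" % media_type,
                        wikitext, count=1)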
- Should the bot also remove the category? Will the template be adding categories automatically based on the
|type=
parameter? - EdoDodo talk 07:58, 4 December 2010 (UTC)
- No, categories should not change. I didn't add them to the template when I changed it, as the files already have the categories. Although that probably would have been smart. /ƒETCHCOMMS/ 03:21, 5 December 2010 (UTC)
Soxbot and edit filters
Is there a way to possibly modify User:SoxBot so that it catches and reports inappropriate usernames that trip one or more of the edit filters listed near the top of WP:UAA, and makes a note of it in the bot's report instead of waiting till the user(s) edit? :| TelCoNaSpVe :| 02:30, 5 December 2010 (UTC)
- I am, with a slight delay, coding a replacement for the bot (see above) as SoxBot is dead at the moment. I think I will do one or the other; I will see which I can code better. -- DQ (t) (e) 13:12, 6 December 2010 (UTC)
Dab and taxa update bots
Is there a bot that can tag the talk page of the next higher taxon when a new taxon article is added? It would have to scan for new taxa articles every week or so. Say I add the Senecio biglovii article to Wikipedia one day. Later that week, the bot finds that article, scans its taxobox, finds out the article I added was a species article, then tags the genus article's talk page (Talk:Senecio) with a simple one-line alert, with the proper edit summary "New species article," in a new section:
New Senecio species article
"The species Senecio biglovii article was created on 2010-12-03."
For a new family article, the bot would tag the order page if it exists; if not, the class page; if not, the phylum page; if not, the kingdom page.
The same with dabs. An article with a word in the name is created, say Manija Fairbanks, and the bot adds the line to either Talk:Fairbanks (disambiguation) or Talk:Fairbanks (surname) (creating the page in the latter case) if the article is included in a biography category and the surname dab exists:
New Fairbanks surname article
"The Manija Fairbanks article was created on 2010-12-01."
It's not necessary that it be the same bot for both tasks, and maybe these bots exist already? --Kleopatra (talk) 18:58, 5 December 2010 (UTC)
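For illustration, a minimal Python sketch of the species-to-genus step, reading the genus from the taxobox with a deliberately naive regex; the message wording follows the example above:

import re

def genus_talk_notice(article_title, wikitext, created):
    # Returns (talk page title, section heading, message) or None.
    m = re.search(r"\|\s*genus\s*=\s*'*\[*\s*([A-Z][a-z]+)", wikitext)
    if not m:
        return None
    genus = m.group(1)
    heading = "New %s species article" % genus
    message = "The species [[%s]] article was created on %s." % (article_title, created)
    return "Talk:%s" % genus, heading, message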
reformatting 7,610 reference URLs
I don't know whether this task would be possible for a bot & I've really come to ask the experts before developing a full request. Today English Heritage appears to have withdrawn the Images of England web site - see discussion at Wikipedia talk:WikiProject UK geography#www.imagesofengland.org.uk. A quick Google search for http://www.imagesofengland.org.uk within the domain http://en.wikipedia.org/ finds 7,610 Wikipedia articles (some of which, e.g. lists, may link to more than one Images of England page). The format of the new URL is very different, but all URLs keep the same reference number and add resourceID=5 on the end. Example:
- From http://www.imagesofengland.org.uk/Details/Default.aspx?id=302261
- To http://www.heritagegateway.org.uk/Gateway/Results_Single.aspx?uid=302261&resourceID=5
Is this the sort of task a bot could handle?— Rod talk 21:09, 3 December 2010 (UTC)
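For illustration, the rewrite itself is a one-line regex substitution in Python (whether each new URL actually resolves would still need checking):

import re

IOE = re.compile(r"http://www\.imagesofengland\.org\.uk/Details/Default\.aspx\?id=(\d+)")

def rewrite_ioe(wikitext):
    # Keep the record number, switch the host/path, append resourceID=5.
    return IOE.sub(
        r"http://www.heritagegateway.org.uk/Gateway/Results_Single.aspx?uid=\1&resourceID=5",
        wikitext)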
- Fairly trivial; there are not 7k pages, Google likes to lie. *.imagesofengland.org.uk has 380 links on Wikipedia (330 mainspace). ΔT The only constant 21:15, 3 December 2010 (UTC)
- (edit conflict) Wikipedia:Bots/Requests for approval/H3llBot 5. Though AWB can probably handle it since it's only 380. — HELLKNOWZ ▎TALK 21:17, 3 December 2010 (UTC)
- I don't understand the different number of search results on Google, but whatever the number, could someone do it with whichever tool is best. I've just thought that "| work=Images of England" would also need to be changed to "| work=Heritage Gateway" but the publisher should stay = English Heritage. What is the next stage of requesting the work?— Rod talk 21:41, 3 December 2010 (UTC)
- Could it be that a number of these come through templates? Keith D (talk) 19:39, 4 December 2010 (UTC)
- I don't know if those using {{IoE}} get included in the numbers & don't know how to find out. I personally don't use it because it doesn't include accessed date, publisher etc required for GA, FA, FL etc.— Rod talk 19:49, 4 December 2010 (UTC)
- This request may have become unnecessary as the links now work correctly. Peter E. James (talk) 10:15, 7 December 2010 (UTC)
All daviscup.com links need to be replaced
As per a discussion at WP:Tennis, I would like your help in fixing all the daviscup.com links on the English Wikipedia. The official Davis Cup site got revamped, causing hundreds of referring links to go dead. Fortunately most of these links have an equivalent on the new site. We've already put together a list of these links, and I filtered them into two major categories that encompass 75% of the aforementioned links and are still repairable.
- 59 dead links starting with "www.daviscup.com/results" [1].
- 90 dead links starting with "www.daviscup.com/ties" [2]
- What the bot should do
- Replace the [results/tieresult.asp?tie=] part within the URL with [en/results/tie/details.aspx?tieId=] (e.g. the obsolete http://www.daviscup.com/results/tieresult.asp?tie=10000503 to the new http://www.daviscup.com/en/draws-results/tie/details.aspx?tieId=10000503)
- Replace the [ties/tie.asp?tie=] part within the URL with [en/results/tie/details.aspx?tieId=] (http://www.daviscup.com/ties/tie.asp?tie=100014710 to http://www.daviscup.com/en/draws-results/tie/details.aspx?tieId=100014710 for example)
- Replace http://www.daviscup.com/results/index.asp to http://www.daviscup.com/en/draws-results/world-group.aspx
Lajbi Holla @ me • CP 23:24, 5 December 2010 (UTC)
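For illustration, a minimal Python sketch of the three replacements, following the example URLs given above; /results/index.asp?lstYear=... links are deliberately left alone since no direct equivalent is known:

import re

DAVISCUP_REWRITES = [
    (re.compile(r"www\.daviscup\.com/results/tieresult\.asp\?tie="),
     "www.daviscup.com/en/draws-results/tie/details.aspx?tieId="),
    (re.compile(r"www\.daviscup\.com/ties/tie\.asp\?tie="),
     "www.daviscup.com/en/draws-results/tie/details.aspx?tieId="),
    (re.compile(r"www\.daviscup\.com/results/index\.asp(?!\?)"),
     "www.daviscup.com/en/draws-results/world-group.aspx"),
]

def rewrite_daviscup(wikitext):
    for pattern, replacement in DAVISCUP_REWRITES:
        wikitext = pattern.sub(replacement, wikitext)
    return wikitext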
- Okay, I am working on it. I made a first pass, but am now looking through the ones that were missed. Plastikspork (talk) 02:19, 6 December 2010 (UTC)
- Thank you. Appreciate it. A question : how did you find the www.daviscup.com/news category because I couldn't draw up a replace-plan for the bot? (full list of problematic wikipages at WP:Tennis) Lajbi Holla @ me • CP 10:05, 6 December 2010 (UTC)
- Wow, they have a great webmaster: if you request a non-existent id [3], a 500 status code is returned because of an IndexOutOfRangeException instead of catching it and returning a 404.
- @Lajbi: are there other paths you already replaced manually on enwiki that I could replace on other projects using the bot? How can I rewrite http://www.daviscup.com/results/index.asp?lstYear=1984 ?
- I'll run the job globally:
- /results/tieresult.asp?tie=* (87 links) or /ties/tie.asp?tie=* (123 links) to /en/results/tie/details.aspx?tieId= or /es/resultados/serie/detalles.aspx?tieId= (depends on the wiki language).
- /teams/team.asp?team=* (319 links ) to /en/teams/team/profile.aspx?id= or /es/equipos/equipo/perfil.aspx?id=
- /results/index.asp (33 links) to /en/results/world-group.aspx or /es/resultados/grupo-mundial.aspx.
- /teams/player.asp?player= (2791 links) to /en/players/player/profile.aspx?playerid= or /es/jugadores/jugador/perfil.aspx?playerid=
- Merlissimo 15:26, 6 December 2010 (UTC)
- Nice work. I noticed the same thing about http://www.daviscup.com/results/index.asp?lstYear=1984. The best solution that I came up with was to replace it with http://www.daviscup.com/en/draws-results/world-group.aspx , which has a pull down menu where you can select 1984. Unfortunately, I was not able to figure out how to make http://www.daviscup.com/en/draws-results/world-group.aspx?year=1984 work. I tried looking through the page source, but could not see an easy way to pass the year. Plastikspork ―Œ(talk) 04:02, 7 December 2010 (UTC)
- Yes, that was the third suggestion I came up with. As for the year variable in the URL, I noticed the same thing: everybody has to do the search on their own from the main World Group page. There is also a bigger database available on the new site. The old one had searchable draws back to 1995, while the new one goes back to 1981 or so (which also raises the question of what backed up the Davis Cup wiki articles before 1995). The draw PDFs, which could make some draws referenceable, are missing as well at daviscup.com (or have just been hidden very well). It's a bit confusing. Lajbi Holla @ me • CP 09:30, 7 December 2010 (UTC)
MerlLinkBot: als-10: 1 Page; fr-0: 3 Pages; it-0: 92 Pages; pt-0: 108 Pages; es-0: 93 Pages; he-0: 38 Pages; cs-0: 191 Pages; fi-0: 68 Pages; en-0: 29 Pages; nl-0: 15 Pages; de-0: 20 Pages; th-0: 2 Pages; ja-0: 167 Pages; bg-0: 115 Pages; vi-0: 1 Page; ast-0: 3 Pages; ka-0: 1 Page; no-0: 3 Pages; sv-0: 60 Pages; ru-0: 4 Pages; es-0: 4 Pages; ar-0: 15 Pages; ko-0: 5 Pages; kn-0: 2 Pages; ca-0: 5 Pages; sr-0: 23 Pages; tr-0: 4 Pages; ro-0: 12 Pages; zh-0: 7 Pages; pl-0: 36 Pages; ml-10: 1 Page; es-10: 1 Page; nl-10: 1 Page; ca-10: 1 Page; eo-10: 1 Page; fr-0: 71 Pages; pt-0: 24 Pages; eo-0: 1 Page; hr-0: 9 Pages; it-0: 18 Pages; nds-0: 9 Pages; en-0: 145 Pages; id-0: 8 Pages; lv-0: 7 Pages; eu-0: 6 Pages; cv-0: 5 Pages; pl-0: 1 Page; sl-0: 5 Pages; da-0: 4 Pages; uk-0: 3 Pages; mk-0: 2 Pages; ta-0: 2 Pages; te-0: 2 Pages; sh-0: 2 Pages; el-0: 2 Pages; hu-0: 1 Page; gu-0: 1 Page; cy-0: 1 Page; lb-0: 1 Page; hi-0: 1 Page; jv-0: 1 Page; also fixed several templates on other wikis. Merlissimo 05:08, 7 December 2010 (UTC)
Adopt-a-user bot
Per the discussion at Wikipedia talk:Adopt-a-user#Bot to track progress and pairs, would it be possible to have a bot like that created? Note that there really aren't that many people active in the behind the scenes tasks at adopt-a-user, so those comments are pretty much all we're going to get from the program. The bot's task would be to periodically check a user's contributions for the latest timestamp and report it to a centralized tracking page at adopt-a-user. The centralized page would include every involved user's last edit information (date) and ideally, whether or not they are blocked. Users in the category Category:Wikipedians adopted in Adopt-a-user and other critical categories would be checked for activity so we can know whether or not the pairs are active. Thanks. Netalarmtalk 04:12, 8 December 2010 (UTC)
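For illustration, a minimal Python sketch of the per-user check using the standard MediaWiki API (list=usercontribs and list=blocks); looping over the category members and writing the tracking page are omitted:

import requests

API = "https://en.wikipedia.org/w/api.php"

def last_edit_and_block(username):
    # Latest contribution timestamp (newest first by default), or None.
    contribs = requests.get(API, params={
        "action": "query", "list": "usercontribs", "ucuser": username,
        "uclimit": 1, "format": "json"}).json()["query"]["usercontribs"]
    last_edit = contribs[0]["timestamp"] if contribs else None

    # Any active block on the account.
    blocks = requests.get(API, params={
        "action": "query", "list": "blocks", "bkusers": username,
        "format": "json"}).json()["query"]["blocks"]
    return last_edit, bool(blocks)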
- I am familiar with the Adoption program and have some time to look into coding a bot for you. I'll discuss any further questions with you on your/the project's talk page. Hopefully I will have something finished this weekend! Thanks, Matthew Yeager (talk) 04:53, 8 December 2010 (UTC)
Canada Roads WikiProject
Currently, CRWP articles are being tagged with {{WikiProject Canada}} with the |roads=yes
parameter. We've resurrected and expanded {{Canada Roads WikiProject}}. We could use a bot to run through the articles and add the CRWP-specific template using the province, class and importance information from the Canada template. If an article is tagged with multiple provinces, the CRWP template can handle them as province1=XX through province10=XX, where the input is the postal abbreviation. The Ontario articles are tagged with {{WikiProject Ontario Roads}} as well, and those tags can be removed. (That template is being retired in this update.) Any assistance is appreciated. Imzadi 1979 → 01:14, 3 December 2010 (UTC)
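For illustration, a minimal Python sketch of the tagging step; parameter handling is deliberately simplified (no nested templates, single province) and the parameter names are taken from this request rather than the live template documentation:

import re

def add_crwp_banner(talk_text):
    banner = re.search(r"\{\{\s*WikiProject Canada\b([^{}]*)\}\}", talk_text, re.IGNORECASE)
    if not banner or not re.search(r"\|\s*roads\s*=\s*yes", banner.group(1), re.IGNORECASE):
        return talk_text
    def param(name):
        m = re.search(r"\|\s*%s\s*=\s*([^|\}\n]+)" % name, banner.group(1), re.IGNORECASE)
        return m.group(1).strip() if m else ""
    crwp = "{{Canada Roads WikiProject|class=%s|importance=%s}}" % (
        param("class"), param("importance"))
    return talk_text.replace(banner.group(0), banner.group(0) + "\n" + crwp, 1)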
Undated Articles
How about a bot that goes through all of the maintenance categories that have "undated articles/files" sections (e.g. Category:Articles lacking sources), finds the relevant tag in all of the undated articles, and adds the current month to each of them. --vgmddg (look | talk | do) 18:41, 2 November 2010 (UTC)
- If I understand you correctly, SmackBot already does this. Anomie⚔ 18:58, 2 November 2010 (UTC)
- It doesn't seem to be doing it right now. See its talk page. Either way, based on the description I think it just fixes ones that it happens to find while scanning through random articles looking for other stuff. I want something that actively goes into the categories with a purpose and adds dates to all of the undated pages in the list.
It would also be a good idea for it to move most boilerplate tags to the top of the article rather than having them strewn throughout the article. (i.e. It moves all {{Unreferenced}} tags to the top.)--vgmddg (look | talk | do) 23:01, 4 November 2010 (UTC)
Hello? Anybody there? --vgmddg (look | talk | do) 20:30, 9 November 2010 (UTC)
- Well, as said above, SmackBot already does this. Are you looking for another bot to do the same and focus especially on this? — HELLKNOWZ ▎TALK 20:42, 9 November 2010 (UTC)
- Yes. It can be a part of another bot, but it should use the category listing as its source (rather than the equivalent of clicking "Random Article"), and be focused on maintaining the category instead of just fixing ones it happens to find while doing other things. Ideally there should be no pages directly in the parent category Category:Articles lacking sources etc. --vgmddg (look | talk | do) 01:19, 14 November 2010 (UTC)
Hello? (again) --vgmddg (look | talk | do) 20:32, 19 November 2010 (UTC)
- It appears that user:SmackBot is temporarily blocked from making some of these edits. LeadSongDog come howl! 22:53, 19 November 2010 (UTC)
- Yobot doing Category:Articles lacking sources right now. -- Magioladitis (talk) 00:21, 20 November 2010 (UTC)
- That's what I was trying to say! Either way though, what I get from its user page is that it seems to only fix templates that it happens to find while searching random pages throughout the whole of Wikipedia. I want something that goes straight to Category:Articles lacking sources, Category:Articles that need to be wikified, etc. and adds dates to the templates of all of the pages directly in the category. Whenever it detects that a new page has been added to the category, it immediately pounces on it and adds a date to the template. Below is a sample workflow of what I basically want it to do:
- Go to category in question. (Category:Articles lacking sources, Category:Articles with topics of unclear notability, etc.)
- Get every list item under section "Pages in category '{{PAGENAME}}'"
- Weed out whitelisted pages that are meant to be at the top level. (For example, the page Wikipedia:Requests for expansion is supposed to be in Category:Articles to be expanded)
- Repeat with every article in the remaining list:
- Go to article in question.
- Scan through entire article for template(s) that don't have dates on them that are supposed to have them. (You could probably talk to User:Rich Farmbrough (maker of SmackBot) for a list of the templates to look for.)
- Add date parameter to offending template(s).
- Return to category and repeat.
Once all of the categories are cleared, the bot will run in the background and wait for another article to be added to one of the categories on its list. The bot would be used for pretty much any category in Category:Wikipedia maintenance categories sorted by month that is added to via a template. I have laid out my plans. All that is needed is for someone to convert it into executable code. --vgmddg (look | talk | do) 01:09, 20 November 2010 (UTC)
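For illustration, a minimal Python sketch of the dating step for the simplest case (a bare, parameterless tag); the template list is an illustrative subset, and tags that already have other parameters need more careful handling:

import re
from datetime import date

DATED_TAGS = ["Unreferenced", "Wikify", "Notability", "Refimprove"]  # illustrative subset

def add_missing_dates(wikitext, today=None):
    today = today or date.today().strftime("%B %Y")
    for tag in DATED_TAGS:
        bare = re.compile(r"\{\{\s*(%s)\s*\}\}" % tag, re.IGNORECASE)
        wikitext = bare.sub(r"{{\1|date=%s}}" % today, wikitext)
    return wikitext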
- I am not planning to take over SmackBot's job. SmackBot works many hours per day to catch up with the backlog. I am already busy doing other stuff for Wikipedia. We have to wait for SmackBot to be unblocked or for another person to take over. -- Magioladitis (talk) 01:15, 20 November 2010 (UTC)
- So it is. I really like what you did with it though. You did exactly what I wanted, and very quickly too. If you could do this for all of the relevant categories it would make things a lot more streamlined. I wanted something that went straight to the category, and User:Yobot did exactly that. SmackBot seems to only fix a template if it passes by one while going through random articles. SmackBot currently has 42 tasks on its list. I'm just asking you to do one of them, and using a different method than SmackBot. This is exactly what I wanted! --vgmddg (look | talk | do) 01:44, 20 November 2010 (UTC)
- I guess I could do the easy part then. At least until SmackBot runs again. SmackBot does much more, fixing broken dates too. -- Magioladitis (talk) 08:15, 21 November 2010 (UTC)
- More than 1,000 undated dead links are caused by {{lfpfr}} which doesn't work anymore. Check Template talk:Lfpfr. -- Magioladitis (talk) 11:28, 21 November 2010 (UTC)
- As of rev 7419, AWB can also fix |date=month XXXX to |date=Month XXXX and |date=Day Month XXXX to |date=Month XXXX, making the method more efficient. -- Magioladitis (talk) 14:32, 22 November 2010 (UTC)
- Would it be possible to make Yobot run automatically, dating articles as they come in? Thanks a lot. --vgmddg (look | talk | do) 23:51, 22 November 2010 (UTC)
- I'll see what I can do. I am working on some changes to WP:AWB to cover as many cases as possible. -- Magioladitis (talk) 18:19, 23 November 2010 (UTC)
- Thanks a lot. --vgmddg (look | talk | do) 23:32, 28 November 2010 (UTC)
Wikipedia:Bots/Requests for approval/KarlsenBot 6 asks to perform the task. AWB already fixes some of the common mistakes found in dated templates, improving the success of this task. -- Magioladitis (talk) 00:00, 29 November 2010 (UTC)
- ArbCom blocked KarlsenBot and SmackBot is blocked again. -- Magioladitis (talk) 14:07, 9 December 2010 (UTC)
- I'll take a look at it. What exactly needs to be done? Go through articles in the direct subcategories of Category:Wikipedia maintenance categories sorted by month and check uses of the templates listed here, correcting where people put the date in an unnamed parameter 1 or put in a dmy or mdy or ymd format date and if all else fails adding
|date=
for the current date? Anomie⚔ 21:31, 9 December 2010 (UTC)
- Yes. The latest version of AWB does 99% of them. I am working in cooperation with Rjw on extending TagUpdater's logic to handle more cases found in Category:Articles with invalid date parameter in template. Running AWB with general fixes should work. I can do the task but only occasionally since it really takes time. I suggest that we have 2 or 3 bots doing it. -- Magioladitis (talk) 10:04, 12 December 2010 (UTC)
- I'll take a look at it. What exactly needs to be done? Go through articles in the direct subcategories of Category:Wikipedia maintenance categories sorted by month and check uses of the templates listed here, correcting where people put the date in an unnamed parameter 1 or put in a dmy or mdy or ymd format date and if all else fails adding
I spent my entire day adding more dated templates and more redirects, and fixing many templates, like this one, which were supposed to be dated but weren't. I even fixed Multiple issues a bit. The bot part isn't the only part. We need someone to check all the templates and see if they work fine. -- Magioladitis (talk) 20:14, 12 December 2010 (UTC)
- Me too. BRFA filed Anomie⚔ 00:39, 13 December 2010 (UTC)
Year category request
Per Wikipedia:Categories for discussion/Log/2010 November 29#Category:2100, I am formally requesting that categories Category:2031 through Category:2098 be created as follows: Category:20''ab'' should have:
{{portal|History}}
{{yearcat}}
{{Decade category header|decade=20a0}}
[[Category:Years in the future]]
[[Category:Years]]
I argued for deletion of Category:2100, but if it is to be there, let's make it part of a pattern: Category:2099 was created (with incorrect sort tags) during the process, and I corrected the sort tags. I have been unable to get AWB to work for some time, or I'd attempt to do this myself.
Once this is done, Category:Years in the future should be removed from the decade categories Category:2030s through Category:2090s. — Arthur Rubin (talk) 16:41, 13 December 2010 (UTC)
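For illustration, a minimal pywikibot sketch of the creation run, using the page content listed above; sort keys and the edit summary would need whatever the CFD closure specifies:

import pywikibot

site = pywikibot.Site("en", "wikipedia")

for year in range(2031, 2099):          # 2031 through 2098 inclusive
    cat = pywikibot.Page(site, "Category:%d" % year)
    if cat.exists():
        continue
    decade = year - year % 10
    cat.text = ("{{portal|History}}\n"
                "{{yearcat}}\n"
                "{{Decade category header|decade=%d}}\n"
                "[[Category:Years in the future]]\n"
                "[[Category:Years]]" % decade)
    cat.save(summary="Creating future year category per CFD")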
- I created the categories. Will look at the other parts if someone else hasn't already done it. Plastikspork ―Œ(talk) 01:19, 14 December 2010 (UTC)
Removing obsolete Wildbot tags
I have generated a list of talk pages with obsolete WildBot tags. They can be safely removed with the following regex:
\{\{(User:WildBot/m01|User:WildBot/msg)\|([^{}]|\{\{User:WildBot/[^{}]*\}\})*\}\}\n?
If nothing is left you may wish to speedy delete the page (WildBot used G7-author). — Dispenser 01:21, 24 November 2010 (UTC)
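For illustration, a minimal Python sketch applying the regex above to one talk page; it also reports whether anything other than whitespace is left, so empty pages can be queued for deletion:

import re

WILDBOT_TAG = re.compile(
    r"\{\{(User:WildBot/m01|User:WildBot/msg)\|"
    r"([^{}]|\{\{User:WildBot/[^{}]*\}\})*\}\}\n?")

def strip_wildbot_tags(talk_text):
    cleaned = WILDBOT_TAG.sub("", talk_text)
    page_is_empty = not cleaned.strip()
    return cleaned, page_is_empty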
- I can do it in 8 hours or so. -- Magioladitis (talk) 01:44, 24 November 2010 (UTC)
- I am actively trying to bring WildBot up, but its internal database is quite complex and not documented at all so I'm having to rebuild it by hand. I also can't guarantee that it will try to correct past-found dabs so it may be best to delete these, assuming that the list is actually accurate (has this been verified?) --Shirik (Questions or Comments?) 01:46, 24 November 2010 (UTC)
- I believe Josh missed out on a key SQL optimization. My pure SQL query took ~88 minutes for 13K articles. The results should be correct; it accounts for non-redirects and redirects to dabs. I have verified a small sample with the Dablinks tool on the template. — Dispenser 04:17, 24 November 2010 (UTC)
- I am starting in a few hours unless there are disagreements. -- Magioladitis (talk) 08:15, 24 November 2010 (UTC)
- Doing... Magioladitis (talk) 14:22, 24 November 2010 (UTC)
- Done 3,618 pages. -- Magioladitis (talk) 01:14, 25 November 2010 (UTC)
- I've updated the report (in 26 minutes). Should I make this into a weekly report or should I turn my code over to Shirik? — Dispenser 05:37, 25 November 2010 (UTC)
- Where do I get the new list? -- Magioladitis (talk) 15:26, 25 November 2010 (UTC)
- Same place as before, though your browser may have cached it. — Dispenser 20:30, 25 November 2010 (UTC)
- Done 164 pages. -- Magioladitis (talk) 17:57, 26 November 2010 (UTC)
- I've setup a script to update the list daily. — Dispenser 22:32, 5 December 2010 (UTC)
- Maybe User:Anomie or User:Rich Farmbrough could do this task on a daily basis? -- Magioladitis (talk) 12:43, 13 December 2010 (UTC)
- Done 264 pages. I recall there was a bot request for deleting the empty pages resulting from this process. -- Magioladitis (talk) 12:47, 13 December 2010 (UTC)
Wildbot would be doing this itself, but I think this edit turned it off. Could someone who is bot-clueful turn it back on? (Or explain what's involved in having it get back to work?)--W☯W t/c 20:13, 15 December 2010 (UTC)
Webcite requests for BLPs
In dealing with WP:LINKROT, one of the suggestions is to use the on-demand web archiving service WebCite. A fine idea, but how often does it really happen that people do that? Linkrot is a particularly pressing problem for BLPs. Our article about Chante Jawan Mallard is a good example. Three of the five references have gone bad, leaving a CNN article and a Snopes page to confirm that she was convicted of an unusual murder. Could a script be written to trawl the external links in BLPs and request archiving for all of them at WebCite? A bot could then place a comment or template on the talk page noting that the webpages have been webcited. I'm not sure yet how we would need to coordinate with WebCite, given the demands this might make on their program.--Chaser (talk) 10:44, 12 December 2010 (UTC)
- There was WebCiteBOT, but it's inactive right now. I did manage to find the source code [4] [5] but I don't know how up-to-date it is. LegoKontribsTalkM 22:12, 12 December 2010 (UTC)
- I have a working program; if you want to request that I take a look at an article, see tools:~betacommand/webcite.html. I'm working on integrating {{webCite}}; right now it works only with {{Cite}}-based templates. ΔT The only constant 14:18, 14 December 2010 (UTC)
- WebCiteBOT was what I was thinking. I'll ask around about a bot operator that might be interested in re-starting it. Thanks all.--Chaser (talk) 17:34, 15 December 2010 (UTC)
- That's excellent news, ΔT. I'd just about despaired of seeing WebCiteBot revived. Thank you! LeadSongDog come howl! 22:26, 15 December 2010 (UTC)
- I have a working program, if you want to request that I take a look at an article see: tools:~betacommand/webcite.html. Im working on integrating {{webCite}} right now it works with only {{Cite}} based templates. ΔT The only constant 14:18, 14 December 2010 (UTC)
Infobox cleaner
Amongst my routine cleanup, I often come across poorly declared infoboxes (see example). This makes them very editor-unfriendly (for both new and expert editors). So I was wondering if it would be possible to create an "infobox cleaner bot" that would crawl across infoboxes and do the following fixes
- "{{Infobox Foobar" on its own line
- One parameter = One line
- Final }} on its own line
- Place the pipe on the left of the new parameter, rather than at the end of the previous parameter's input
- Remove stray pipes
- Strip the whitespace to the right of the parameter's input
- Align the "=" of the various parameters
That is, convert
{{Infobox disease | Name = Bob the Magic Disease | Image = FrankAvruchasBozo.JPG | Caption = Magic Caption of Doom | eMedicineSubj = 528 | }} '''Bob the Magic Disease''' is a magic disease named "Bob".
into
{{Infobox disease
 | Name           = Bob the Magic Disease
 | Image          = FrankAvruchasBozo.JPG
 | Caption        = Magic Caption of Doom
 | eMedicineSubj  = med
 | eMedicineTopic = 528
}}
'''Bob the Magic Disease''' is a magic disease named "Bob".
Now most of these fixes could be incorporated into AWB for general cleanup, and would not warrant an edit on their own, but I think a bot restricting itself to articles which don't have pipes on the left, and/or which have more than one parameter per line, goes beyond general cleanup. A bot editing them would make the infoboxes much more editor-friendly and much MUCH less intimidating to newcomers, and would go a long way in preventing the propagation of horribleness through copy-pasting across different articles. Headbomb {talk / contribs / physics / books} 15:47, 13 December 2010 (UTC)
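For illustration, a minimal Python sketch of the reformatting itself, assuming the text passed in is just the infobox and that parameter values contain no pipes or nested templates (a real bot would need a proper template parser):

def tidy_infobox(infobox_text):
    inner = infobox_text.strip()
    if not (inner.startswith("{{") and inner.endswith("}}")):
        return infobox_text
    parts = [p.strip() for p in inner[2:-2].split("|")]
    name = parts[0]
    pairs = [p.split("=", 1) for p in parts[1:] if "=" in p]   # drops stray empty pipes
    width = max(len(key.strip()) for key, value in pairs) if pairs else 0
    lines = ["{{" + name]
    for key, value in pairs:
        lines.append(" | %-*s = %s" % (width, key.strip(), value.strip()))
    lines.append("}}")
    return "\n".join(lines)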
- Bots making minor formatting changes often turn out to be controversial, as some complain that it clogs their watchlists and the article history for no reader-visible purpose. Before such a bot could be approved, there would have to be a strong community consensus that it was desired. Take it to WP:VPR and spam it on WP:CENT for a start. Anomie⚔ 17:12, 13 December 2010 (UTC)
- Each editor indeed has a certain preference of how to style the infoboxes (or any template for that matter). There is no single guideline or policy about this. While an editor could manually guess which infoboxes can be gussied up without anyone objecting, a bot would not be able to make such decisions and will run into angry editors very soon. In addition to this, as Anomie pointed out already, bots should not be making minor edits that do not change the rendered output of the page. — HELLKNOWZ ▎TALK 17:19, 13 December 2010 (UTC)
- Agree with H3ll and Anomie. It would possibly be an idea to create a js script to do this. This would only run for the editors who opt-in, and only when they were editing that article anyway. Like I say, just an idea, I haven't thought about it in-depth. - Kingpin13 (talk) 17:22, 13 December 2010 (UTC)
- OK, then perhaps a general framework could be designed, and then the scope of the bot limited to a selection of infoboxes? I'm currently working my way through {{infobox disease}} (manually) where this does a tremendous amount of cleanup (~750 cases out of ~5500 transclusions), and I know {{Infobox album}}/{{infobox single}} would immensely benefit from this. Maybe this is not so much of a problem with other infoboxes, but I know of none which have the pipes on the right of the parameters. Also, as previously mentioned, these are not trivial edits, and would only affect very badly formatted pages. Headbomb {talk / contribs / physics / books} 17:36, 13 December 2010 (UTC)
- I'm willing to take a stab at it. But I will try a more general approach: have a stencil for how to lay out the template and a mapping component for renaming parameters. With this approach we can add and update parameters to the template whilst keeping consistent formatting across pages. — Dispenser 21:41, 13 December 2010 (UTC)
I have a framework for this somewhere that I used for Infobox Album and Infobox French commune. However I was looking for somewhat more thorough clean up. I will keep a watching brief on this while I do other stuff. Rich Farmbrough, 21:20, 15 December 2010 (UTC).
- Two thoughts:
- For Infobox disease, perhaps the place to get a consensus is WP:MED, not VPP. WPMED folks are the ones most likely to be affected, after all.
- If the watchlist effects are serious problems, then why don't we just do this very slowly? One edit an hour can't possibly overwhelm anyone's watchlist, and while the bot would need to keep running for a month to fix 750 templates, it would eventually get done. WhatamIdoing (talk) 22:30, 15 December 2010 (UTC)
I'm going to have to put this off until sometime in January, there's too much I need to still get done. There are some rules already implemented in the commonfixes library (used by reflinks) to move pipes, but I intend to finish this up eventually. — Dispenser 12:58, 17 December 2010 (UTC)
Taxonomy of gastropods update
2010 taxonomy of Heterobranchia update
Update articles in a certain category and its subcategories. Example of the work that needs to be done: [6]. Instructions: Wikipedia_talk:WikiProject_Gastropods#Taxonomy_of_Heterobranchia. --Snek01 (talk) 12:10, 16 December 2010 (UTC)
Error in Littorinimorpha articles by Ganeshbot
Add missing taxon. Follow the link: Wikipedia_talk:WikiProject_Gastropods#Error in Littorinimorpha articles by Ganeshbot. --Snek01 (talk) 20:00, 20 December 2010 (UTC)
Messages for retired users
We're not going to require reconfirmation edits. This was a silly idea. Moving on. --MZMcBride (talk) 19:01, 21 December 2010 (UTC)
The following discussion has been closed. Please do not modify it.
Can we construct some way for messagebots to stop sending messages to retired users? For example, if there was some way to turn off EdwardsBot so that it could just stop sending useless Signpost editions to some retired users' talkpage and clutter the entire page with useless threads? I find it wastes a lot of resources and the talkpage would then become some dumping ground for these editions, and furthermore, it hinders someone's navigation when they are looking for a particular thread related to the users' histories (e.g. an ANI incident) and have to shuffle through the multiple Signpost articles to look for the right thread to link to ANI. To compensate for these, I would prefer that we use some sort of confirmation page to ensure that the recipients of the message bots are still active enough to respond to them. :| TelCoNaSpVe :| 16:16, 6 December 2010 (UTC)
Uh, the bot is definitely sending this message to active users, ex. Brad101 and Catalan. May want to tweak your coding. Ed [talk] [majestic titan] 10:13, 21 December 2010 (UTC)
Stale tag removal
The WPMED project has a list of ~1400 articles currently tagged with {{unref}} (including redirects and the multiple issues template). Recent experience indicates that about 20% of these actually contain a reference, but nobody bothered to remove the unref tag. Per Template:Unreferenced/doc, the template should not be placed on any page containing any sort of citation.
Would it be easy to have a bot remove the unref template from any article in my list that (1) contains a <ref> tag or (2) contains any URL (http://)? WhatamIdoing (talk) 23:53, 7 December 2010 (UTC)
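For what it's worth, a minimal sketch of the detection step being requested (Python; the regexes are simplified placeholders, not an approved bot, and the actual tag removal and {{Multiple issues}} handling is not shown):

```python
# Rough sketch: treat wikitext as "possibly referenced" if it contains a
# non-empty <ref>...</ref> pair or any http(s) URL. Tag removal is not shown.
import re

REF_RE = re.compile(r'<ref[^>/]*>\s*\S.*?</ref>', re.IGNORECASE | re.DOTALL)
URL_RE = re.compile(r'https?://\S+', re.IGNORECASE)

def looks_referenced(wikitext):
    """True if the text contains a non-empty ref tag or a bare URL."""
    return bool(REF_RE.search(wikitext)) or bool(URL_RE.search(wikitext))

if __name__ == '__main__':
    print(looks_referenced('Claim.<ref>{{cite web|url=http://example.org|title=X}}</ref>'))  # True
    print(looks_referenced('No sources here.'))  # False
```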
- So just any url of any sort OR any citation tag? Should there be any verification that the website actually exists? LegoKontribsTalkM 04:56, 8 December 2010 (UTC)
- If you are referring to any article, then having a URL is not indicative of being referenced. It can be in the external links section, can be an official website in the infobox, can be an inline link to a video, etc. Having a non-empty
<ref></ref>
, however, is most likely indicative of the article having a real reference. There was a bot proposal some time ago about this, cannot find it now. — HELLKNOWZ ▎TALK 13:02, 8 December 2010 (UTC)
- Legoktm, verifying that the website exists would be nice, but on the ones I checked manually, I don't remember encountering any dead links. And, yes, I think that articles containing any ref tag or any URL can be safely de-tagged.
- Hellknowz, the presence of any URL in an article does indicate a citation, to a first and even second approximation. It doesn't indicate that the citation is to a reliable, independent, third-party source—but, then, the presence of ref tags doesn't prove that the "citation" is a citation to a real source (I've pulled "citations" to 'editor's personal experience' before), or that it is a citation rather than an explanatory footnote. Official websites, for example, tend to do an excellent job of verifying basic information about a company. URLs currently listed under ==External links== might well be WP:General references that deserve a {{no footnotes}} tag.
- I've already crawled through about 10% of the list by hand, so I've got a good idea of what's going on with this batch. If we were talking primarily about a batch of BLPs, I'd be more conservative, but I think this is reasonably safe. I'm looking for a one-time run, with no reverting if someone disagrees. If it made you happier, though, I'd be happy to get a list from the bot about which articles were changed, and to double-check them all by hand afterwards. WhatamIdoing (talk) 23:53, 10 December 2010 (UTC)
- Coding... Is there a list specifically for WPMED? Or was the request just in general tagging? LegoKontribsTalkM 02:20, 11 December 2010 (UTC)
- Femto Bot puts up a list at lacking sources (not a small page; today's list has 1,432 articles in the targeted category). I'm willing to turn that into any format that happens to work for you. WhatamIdoing (talk) 05:48, 11 December 2010 (UTC)
- I'd prefer the description "incorrect or outdated tags" over "stale tags" - if anything the issue doesn't get less serious with time, and with some of these backlogs it's the oldest tags that are most likely to have someone work on them. {{unreferencedBLP}} currently goes back to Dec 2008 but not that long ago went back to early 2007. Also, some of the URLs cannot be trusted; I've seen refs to other Wikipedia articles before now - and those should be turned into internal links, not used as an excuse to remove an unreferenced tag. I'm loath to unleash a bot on the unreferenced tags as I believe the error rate would be excessive, but there are some prospect lists that would be useful if there are volunteers to go through them. In particular, unreferenced articles where a subsequent edit has added a citation, and unreferenced articles which contain a link to one of the more common reliable sources such as the BBC website. ϢereSpielChequers 10:02, 11 December 2010 (UTC)
- Seeing as there is controversy over such a task (which makes sense imo), it might be a better idea to run a bot which just outputs a list of pages for human review. Maybe as a database report? (Don't worry, I'm still working on coding the first task right now. xP) LegoKontribsTalkM
- As I said above, I'm perfectly happy to review changes; whether that review happens before or after the tags are removed is immaterial to me. I would, however, like to avoid the RSI issues involved in manually removing the tag from a couple hundred articles. WhatamIdoing (talk) 20:11, 13 December 2010 (UTC)
- My proposal: I can write a toolserver tool which would list articles fitting the criteria, and set up the edit for the user to make, but then it would have to be confirmed by the user. (Similar to what Dispenser uses for checklinks). This would enforce human oversight but still making it much easier to identify stale tags and remove them. Opinions? LegoKontribsTalkM 22:47, 16 December 2010 (UTC)
- This sounds sensible, but my ultimate goal is to reduce the carpal-tunnel-inducing repetitive motions involved in manually removing the tags, so it's not solving my actual problem. What you're proposing works like this: Look at the first article in the list (one click to open all articles in tabs using Linky), see that the tag is stale, click (ouch) the edit button, triple-click the tag (ouch), or click and drag to select (ouch) the relevant parameter for the multiple issues template, and save the page—and then repeat, times a couple of hundred articles.
- Basically, you're proposing "five hundred ouch per hour" as an alleged improvement over "five hundred ouch per three hours".
- I want "zero ouch". How can we get to zero, or at least close to it? WhatamIdoing (talk) 20:29, 18 December 2010 (UTC)
- It would have been trivial to add this to RefBot, but that was blocked by a bunch of Arbitrators who wouldn't follow their rules[7], didn't define what the issues were, didn't know a bot was involved[8], and didn't discuss their decisions (or they would have found that the bot was not relevant). Good luck solving this without a bot that understands citations well. -- SEWilco (talk) 00:47, 19 December 2010 (UTC)
- Determining reference status is not a good task for a bot. While the presence of a ref tag is, in probably 99% of cases, an indication that the article has some reference(s), ref tags can also be used for explanatory footnotes. Further, one needs to verify that the source is a usable reference, not a broken link, a reference to another Wikipedia article, the official site of the subject, something totally irrelevant, etc. Even then, there's no way for a bot to determine whether the tag should be removed outright, or replaced with a different tag like {{refimprove}}. Mr.Z-man 19:28, 21 December 2010 (UTC)
Remove |article=yes from {{Multiple issues}}
Similar to the previous request, could someone please have a bot remove the deprecated |article=yes
from {{Multiple issues}}? Thanks! GoingBatty (talk) 01:32, 20 December 2010 (UTC)
I'll need a list of all unknown parameters. I worked a bit with the list I got on August 24. Check also User_talk:Magioladitis#FR:_.23New_alert:_Unknown_parameters_to_Multiple_issues. I think we need to remove the |article=
by bot (not only with value yes) and fix the rest manually. I'll start doing it. -- Magioladitis (talk) 10:28, 20 December 2010 (UTC)
- See [9] :) -- Magioladitis (talk) 10:34, 20 December 2010 (UTC)
- I am working on it. -- Magioladitis (talk) 08:57, 21 December 2010 (UTC)
Doing... Removing article, do-attempt. Renaming OR and or. -- Magioladitis (talk) 13:40, 21 December 2010 (UTC) Done Magioladitis (talk) 01:05, 22 December 2010 (UTC)
Remove |bot=yes
from {{BLP unsourced}}
Could someone please have a bot remove the deprecated |bot=yes
from {{BLP unsourced}}? This would allow AWB to make fixes to articles with this template, including merging it into {{Multiple issues}}. Thanks! GoingBatty (talk) 22:31, 19 December 2010 (UTC)
- This should not be done. These all need to be reviewed by hand to see whether the bot was correct. That's why bot=yes was added in the first place. — Carl (CBM · talk) 22:37, 19 December 2010 (UTC)
Do we have an estimate of how many pages have this parameter? Is there some tracking category? -- Magioladitis (talk) 22:41, 19 December 2010 (UTC)
- Apparently it was removed. Adding it again to get a count of the articles affected would be reasonable. — Carl (CBM · talk) 22:45, 19 December 2010 (UTC)
- It would indeed. But it is not worth a bot edit to remove the parameter; AWB can do that as a general fix (as GB initially requested) - and since AWB is doing the merge with MI, it will get all those that matter (and more). To remove it on articles with BLP unsourced AND Multiple issues at this stage would be fine, with an AWB run (I'll check out how big that would be). Rich Farmbrough, 00:59, 22 December 2010 (UTC).
- Hi Rich - thanks for working on this. AFAIK, this is not one of the AWB general fixes, which is why I submitted the AWB feature request before being directed here. Thanks! GoingBatty (talk) 19:00, 22 December 2010 (UTC)
Replacement for the late Article Alert Bot
Could someone make another bot to replace the late Article Alert Bot? It would be extremely helpful. Wikipedia:Article alerts/Specification looks like a good reference. Arlen22 (talk) 20:12, 31 August 2010 (UTC)
- I have the source code! Now, who will volunteer to take over the bot? Arlen22 (talk) 23:32, 31 August 2010 (UTC)
- Could you post a link to the source code? Smallman12q (talk) 00:59, 1 September 2010 (UTC)
- I don't have anywhere to post it. It was emailed to me. Arlen22 (talk) 01:40, 1 September 2010 (UTC)
- OMGOMGOMGOMGOMG!!11!! Headbomb {talk / contribs / physics /books} 09:23, 1 September 2010 (UTC)
- Is someone working on it? Arlen22 (talk) 11:30, 1 September 2010 (UTC)
- I'm looking into the issue. Let you know what I find. -- Crazysane(T/C\D) 14:04, 1 September 2010 (UTC)
- All I have to say is, FINALLY. ResMar 17:09, 1 September 2010 (UTC)
Oh, and to give a hint at what needs fixing, it was due to an API change back in April. User:Legoktm may know more about that. Headbomb {talk /contribs / physics / books} 03:35, 2 September 2010 (UTC)
If I could get the source emailed to me via the link on my userpage or talk page, I would gladly fix the issues mentioned above and take over the bot, if someone has not already volunteered. Joe Gazz84user•talk•contribs•Editor Review 15:31, 4 September 2010 (UTC)
- I have talked to Crazysane, and they said that since they have not looked into it and I know how to fix the issue, I could take over the bot. So I will be taking over the bot and need the source code emailed to me. Joe Gazz84user•talk•contribs•Editor Review 19:14, 6 September 2010 (UTC)
- That is just great! After the bot is up and running [priority #1], would you have time to do some other updates? Headbomb {talk /contribs / physics / books} 21:38, 7 September 2010 (UTC)
- Please list the updates, since I cannot locate the API and am awaiting a response as to where it is; I can possibly do those now. Joe Gazz84user•talk•contribs•Editor Review 20:35, 11 September 2010 (UTC)
- Check out... WP:AAlerts/Bugs and WP:AAlerts/FR. Drop me a message if you need information on those. Headbomb {talk / contribs / physics /books} 23:56, 11 September 2010 (UTC)
I am sorry to burden you, but please create a ticket for each one that has not been solved here so I can take care of them in an orderly fashion. Joe Gazz84user•talk•contribs•Editor Review 17:13, 14 September 2010 (UTC)
- Gettin' there. Arlen22 (talk) 18:55, 17 September 2010 (UTC)
- Nudge =) ResMar 01:25, 24 September 2010 (UTC)
- Hopefully by end of this week or beginning of next. Arlen22 (talk) 20:41, 30 September 2010 (UTC)
- A problem with toolserver is waiting for fixing. Arlen22 (talk) 17:23, 6 October 2010 (UTC)
- Still looking forward to the bot being fixed. Gage (talk) 04:45, 16 October 2010 (UTC)
- It seems we've stalled out? What's the status? ResMar 15:01, 16 October 2010 (UTC)
- Go to https://jira.toolserver.org/browse/DBQ-106. Any toolserver experts are begged to check it out. Arlen22 (talk) 20:33, 16 October 2010 (UTC)
I don't know, this may take a while. Some other WikiProjects have removed Article Alerts because they feel that ArticleAlertBot is no longer active because of Wikipedia API changes. JJ98 (Talk) 04:02, 18 October 2010 (UTC)
- And Joe Gazz is planning on retiring, so we might need another coder/operator. Headbomb {talk / contribs / physics /books} 04:16, 18 October 2010 (UTC)
- You could always ask User:Anomie. She has two bots at the moment, so she is experienced in coding and running a bot. - Neutralhomer • Talk • 10:26, 18 October 2010 (UTC)
- Always something...grrr. Maybe the Foundation should dole out a little bit of money and hire someone to fix it. Lately, they've been spending on some weird stuff, for those of us who read the post. =p ResMar 02:01, 20 October 2010 (UTC)
- And where, pray tell, are these posts? Arlen22 (talk) 11:38, 20 October 2010 (UTC)
Almost ready! Drummers, get ready! Arlen22 (talk) 23:16, 22 October 2010 (UTC)
- Studies on controversial content etc. Post meaning the Signpost, where I am a contributing writer. Once the bot is back up can you talk to Jarry? We can celebrate with a nice story in the weekly tech =) ResMar 22:52, 24 October 2010 (UTC)
- Don't forget the WikiProject Report! We'd love to share the announcement that ArticleAlertBot is back. -Mabeenot (talk) 15:58, 29 October 2010 (UTC)
- Held up by a few java related problems. Arlen22 (talk) 14:40, 3 November 2010 (UTC)
- Okay, the Java issues are: the manifest not being recognized when compiling, and minor fixes to already existing code. The users who would like to announce that the bot is up and going should contact me so I know when to alert you. Please note: I will not alert you until it has completed trial and has the bot flag. JoeGazz ▲ 01:50, 9 November 2010 (UTC)
- Are we almost there? D: ResMar 23:57, 15 November 2010 (UTC)
Can we put the code publicly somewhere (meta wiki...?) so that if it goes down again it will be easy to get the source again? --Piotr Konieczny aka Prokonsul Piotrus| talk 20:02, 17 November 2010 (UTC)
- Hmm, wouldn't it belong on the code wiki? ResMar 23:10, 21 November 2010 (UTC)
- If there is such a thing, sure (never heard of it - link?). --Piotr Konieczny aka Prokonsul Piotrus| talk 17:13, 22 November 2010 (UTC)
- We can't. I have been giving out the code to certain people on the condition that it stays private. The code was never released under a license by B. Wolterding, so it would be wrong to release his work under a license he never agreed to. LegoKontribsTalkM 21:11, 28 November 2010 (UTC)
Still working on Java related problems. Arlen22 (talk) 12:41, 24 November 2010 (UTC)
- If you're stuck on Java issues, ping User:Jorm. He may be able to help. Kaldari (talk) 01:41, 1 December 2010 (UTC)
Since it's been three months now, and there's been plenty of time to update the bot, and it still isn't running, H3llkn0wz and I have decided to begin rewriting the bot from scratch (he codes, I comment). If the original bot is back up again before we're done, great. The new bot will be open-sourced and could be deployed on the other Wikis regardless of whether we use it on the English wikipedia. Headbomb {talk /contribs / physics / books} 02:11, 1 December 2010 (UTC)
- <applause><whoops and cheers><applause><flowers thrown><applause> Rd232 talk 10:21, 1 December 2010 (UTC)
- <Refresh> Arlen22 (talk) 03:40, 17 December 2010 (UTC)
We have given up on the old code. All stops pulled, let's go! Arlen22 (talk) 00:41, 23 December 2010 (UTC)
Deleting old Wildbot pages Yobot blanked
See above for initial discussion of this problem. User:Magioladitis's Yobot blanked old WildBot tags from, as you can read above, 3782 talkpages. Many of these talkpages contained only the Wildbot tags, which leaves them completely blanked now. Some examples of this behavior from my watchlist:[10] [11] [12] [13]. I originally brought this to M's talkpage, and he asked me to post this here. If the only edits are by Wildbot and then by Yobot, and the page is blanked, the page should be deleted (There is no reason to have a page if there are no tags or discussion headings). If there are intermediate edits, but the page is still blanked, (Like this) then maybe the pages could be listed in a holding tank for review (Which I would be happy to do). I would estimate that a little less than 2000 pages would be affected, but my estimate could be far off. --Fiftytwo thirty (talk) 22:00, 1 December 2010 (UTC)
- I agree that these pages should be deleted. In fact, we can check if only bots edited the page. Then we can delete them. Another way, would be to check if the history has only 2 edits. Or we could just delete all empty pages. -- Magioladitis (talk) 22:18, 1 December 2010 (UTC)
- Note: User:7SeriesBOT used to do this when WildBot blanked (it would only work if only WildBot had edited the page, so wouldn't in these cases as it is currently set up, but I suppose it could be easily made to pretend Yobot is WildBot). - Kingpin13 (talk) 22:19, 1 December 2010 (UTC)
/* SLOW_OK - Empty talk pages with mostly bot edits */
SELECT CONCAT('[[Talk:',page_title,']]'), COUNT(*) AS Edits, COUNT(ug_group) AS BotEdits
FROM page
JOIN revision ON rev_page=page_id
LEFT JOIN user_groups ON ug_user=rev_user AND ug_group="bot"
WHERE page_namespace = 1
AND page_is_redirect = 0
AND page_len IS NOT NULL
AND page_len<5
GROUP BY page_id
HAVING BotEdits > 0 AND BotEdits + 1 >= Edits
- That query finds currently nearly empty talk-pages with at most 1 non-bot edit (or page move). Note: Deflagged bots are treated as regular users. — Dispenser 03:54, 2 December 2010 (UTC)
- 5351 talk pages with some additional extra info. Most, if not all, should be safe to delete. — Dispenser 04:31, 3 December 2010 (UTC)
- What exactly should I delete? All pages in this list? -- Magioladitis (talk) 00:17, 14 December 2010 (UTC)
Many of these are talk pages of redirects. A simple (and sane) option is to make them redirect to the appropriate talk page. Rich Farmbrough, 21:15, 15 December 2010 (UTC).
- Another, more time consuming option is to tag these pages with project banners. — Dispenser 05:59, 17 December 2010 (UTC)
- We usually just delete the talk page of the redirect. -- Magioladitis (talk) 13:18, 17 December 2010 (UTC)
match {{collapsible option}} with {{navbox}}
Recently I came across a template where {{collapsible option}} was included in the documentation, but the template was not relaying its {{{state}}} parameter over to {{navbox}}. It would be a good idea for a bot to monitor and auto-fix or untag these. --Joy [shallot] (talk) 10:40, 24 December 2010 (UTC)
How many categories have an endash in their title?
There is a discussion at CFD on renaming a large group of categories to use endashes in their titles per WP:DASH.
To put this change in perspective, it would be useful to know how many category titles already use an endash. I tried using a search to get a figure, but it finds nothing at all.
Can anyone with access to bot-like tools do a count, without too much work? --BrownHairedGirl (talk) • (contribs) 15:35, 23 December 2010 (UTC)
- This would probably be best done by someone with toolserver access. Anomie⚔ 16:14, 23 December 2010 (UTC)
- I think you're right. I was hoping that a request here might be spotted by some kind person who has such access :) --BrownHairedGirl (talk) • (contribs) 16:27, 23 December 2010 (UTC)
- There's a good chance of that. Anomie⚔ 17:32, 23 December 2010 (UTC)
- A quick sampling from the Oct 11, 2010 SQL category dump suggests there are around 3000 of those categories. Processing them all to get an exact count in the crappy way I'm doing it would be pretty slow but I guess I can do that if you want. 67.117.130.143 (talk) 06:33, 24 December 2010 (UTC)
- I did it a different way and got about 5100, which may be a little high. I just counted en-dashes in the file, but some catnames will have multiple dashes. The earlier method probably missed stuff. 67.117.130.143 (talk) 07:28, 24 December 2010 (UTC)
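For illustration, a rough sketch of the file-based counting just described (assuming the category titles have been dumped to a plain text file, one per line; the filename is a placeholder, and counting titles rather than dashes avoids the multiple-dash overcount):

```python
# Count how many category titles contain at least one en dash (U+2013),
# given a plain text file with one title per line.
def count_endash_titles(path):
    count = 0
    with open(path, encoding='utf-8') as f:
        for line in f:
            if '\u2013' in line:  # U+2013 EN DASH
                count += 1
    return count

if __name__ == '__main__':
    print(count_endash_titles('category_titles.txt'))
```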
- Using Nikola's grep and counting </a> (minus 3) it is 3,794. If you want to exclude date ranges:
[^0-9 ] *– * [^0-9 ]
. — Dispenser 13:23, 24 December 2010 (UTC)
- Thanks, folks. That's very helpful! Nikola's grep is a particularly useful tool. --BrownHairedGirl (talk) • (contribs) 22:26, 24 December 2010 (UTC)
Could someone write, set up, and run a bot equivalent to the one that generated WP:AFDSUM? There have been several requests on the related talk pages for AFDSUM to be restarted since it stopped running. 65.95.13.158 (talk) 07:51, 21 December 2010 (UTC)
- tools:~betacommand/AFD.html has been up for quite a while. ΔT The only constant 15:13, 21 December 2010 (UTC)
- I've added a hatnote for it, for the perennial requestors of AFDSUM to see it at the WP:AFDSUM location.
- Though... it's not date ordered. Perhaps a sortability function could be included?
- Just click on any header and it will be sorted via that column. ΔT The only constant 11:10, 22 December 2010 (UTC)
- I clicked on each of the headers
AfD Keep # Delete # Merge # Redirect # Neutral # Expired Filed
for tools:~betacommand/AFD.html and nothing happened (no sort). Can that sort feature be added to/fixed for AFD.html? -- Uzma Gamal (talk) 01:42, 26 December 2010 (UTC)
- It appears that IE does not like my JavaScript that sorts the table; try using Firefox. ΔT The only constant 01:59, 26 December 2010 (UTC)
- I clicked on each of the headers
Add ISO 639-3 codes to disambiguation pages
User:Jasy jatere has requested to add ISO 639-3 language codes to three-letter disambiguation pages, e.g. a link to Amal language from AAD (aad
is the ISO code for this language). Here is a list of dabs missing such entries. Lines with no link in front of the colon mean that my bot didn't find a dab; they can be ignored (maybe I'll create redirects for these).
My problem is that a bot (as far as I can see) cannot find out which section in a given dab is the right one to add the link to. Any ideas how this could be done (semi)automatically? Thanks, ἀνυπόδητος (talk) 09:04, 20 December 2010 (UTC)
- The approach I was planning on using in dabfix was to enumerate, for each linked page, its categories and their parents. It then should be possible to find the largest overlap group and insert the link there. The hardest part is finding good weights and cutoff points. If you get it working, send me a copy to integrate with dabfix. — Dispenser 01:03, 25 December 2010 (UTC)
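Not dabfix's actual code, but a toy sketch of the overlap heuristic described above (category sets here are invented, and the weighting and cutoffs would still need tuning as noted):

```python
# Choose the dab-page section whose existing entries share the most
# categories with the article to be added. Category data would come from the
# API or database in practice; these names are made up for illustration.
def best_section(section_cats, new_page_cats):
    """section_cats: {section heading: set of categories of its entries}."""
    best, best_overlap = None, 0
    for section, cats in section_cats.items():
        overlap = len(cats & new_page_cats)
        if overlap > best_overlap:
            best, best_overlap = section, overlap
    return best  # None means nothing overlapped; leave for human review

if __name__ == '__main__':
    sections = {
        'Languages': {'Languages of Papua New Guinea', 'Sepik languages'},
        'Other uses': {'Airport codes', 'Transport'},
    }
    print(best_section(sections, {'Sepik languages', 'Torricelli languages'}))  # 'Languages'
```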
- That might be a bit too big for me. Perhaps I'll try it when I've got a bit more confidence with bot programming. Thanks for the idea anyway. --ἀνυπόδητος (talk) 07:41, 26 December 2010 (UTC)
Bot to help generate List of featured articles by category and significant contributor
I was thinking of writing a film article and thought of asking a Wikipedian for help. The best Wikipedians to ask would seem to be the ones who have contributed to the most articles in Category:FA-Class film articles. There are a bunch of articles in the FA-Class film article category, and I wasn't sure of the easiest way to find those Wikipedians who have contributed to the most FA-Class film articles. Then I thought that it would be great to have such a list. We do have Wikipedia:List of Wikipedians by featured article nominations, but that doesn't associate the editor with the topic. Here is where you guys come in, as I think a bot can help generate such a table. The bot would:
- Go through each article in the subcategories listed in Category:FA-Class articles (e.g., 300 (film) in Category:FA-Class film articles)
- From the article history page (e.g., 300 (film) history), open the External tools: Revision history statistics (e.g. Contributors)
- Grab the top one or two names of the Wikipedians with the most edits (e.g., 507 (491/16) Arcayne and 360 (347/13) Alientraveller)[14] if their edit numbers are significantly more than other editors'. Feel free to use your own judgement in this.
- Add the info to a table (see below)
- Repeat until done.
There might be an easier way to do this, so please feel free to take that route. Here is how I see the table being formatted from the above 300 (film) example (if you can think of other stats to add to the table, please do so).
Featured Article | FA-Class film category | Significant contributor to the article | Notes |
---|---|---|---|
300 (film) | FA-Class film articles | Arcayne | |
300 (film) | FA-Class film articles | Alientraveller | |
Please place the results in a new subpage in my user space. I'll add it to project space once it is ready. Also, my request at Tips for writing filmologies inspired this bot request as well. -- Uzma Gamal (talk) 13:26, 24 December 2010 (UTC)
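A rough sketch of the per-article contributor count behind steps 2-3 above, using only the MediaWiki API and the Python standard library (this is illustrative, not the external Contributors tool linked above; only the newest batch of revisions is fetched):

```python
# Count edits per user for one article via the MediaWiki API. A real run
# would follow rvcontinue to cover the full history and respect API etiquette.
import json
import urllib.parse
import urllib.request
from collections import Counter

API = 'https://en.wikipedia.org/w/api.php'

def top_editors(title, how_many=2):
    params = {
        'action': 'query', 'format': 'json', 'prop': 'revisions',
        'titles': title, 'rvprop': 'user', 'rvlimit': 'max',
    }
    with urllib.request.urlopen(API + '?' + urllib.parse.urlencode(params)) as resp:
        data = json.load(resp)
    counts = Counter()
    for page in data['query']['pages'].values():
        for rev in page.get('revisions', []):
            counts[rev.get('user', '(hidden)')] += 1
    return counts.most_common(how_many)

if __name__ == '__main__':
    print(top_editors('300 (film)'))
```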
- This is certainly doable, though it may at times be misguiding. If anyone else supports, I could code this for next year.Smallman12q (talk) 14:57, 25 December 2010 (UTC)
- Thanks Smallman12q. Behind each FA article is at least one or two editors who had the skills to bring that article to FA status. Usually, that will be the editors with the most edits to the article. There might be situations where an editor with the most edits to an article has a special interest in the topic but is not the editor who revised the article to meet Wikipedia:Featured article criteria. Once the bot generates the list, I can look at each article to see whether the editor(s) with the top number of edits are the ones who revised the article to meet Wikipedia:Featured article criteria. If they are, I can add a note in the note column and, if not, figure out who is and replace the listed name. Perhaps the bot can compare each of the two editors having the overall top number of edits to the article with the top two editors to the article between when the article was promoted to GA status and when the FA status promotion discussion was closed, and add a note in the fourth column of the table where there is a match. Per Talk:300 (film), that period would be from May 19, 2007 through August 3, 2007 for 300 (film). Feel free to adjust my proposed bot search return criteria to minimize any misguiding results. I'd be happy to take whatever output the bot provides and manually go through each entry to refine the results. -- Uzma Gamal (talk) 22:53, 25 December 2010 (UTC)
- BRFA filed. Smallman12q (talk) 13:09, 26 December 2010 (UTC)
- Thanks Smallman12q. Behind each FA article is at least one to two editors who had the skills to bring that article to FA status. Usually, that will be the editors with the most edits to the article. There might be situations where an editor with the top edits to an article has a special interest in a topic but is not the editor who revised the article to meet Wikipedia:Featured article criteria. Once the bot generates the list, I can look at the each article to see whether the top number of edits editors is/are the editors who revised the article to meet Wikipedia:Featured article criteria. If they are, I can add a note in the note column and, if not, figure out who is and replace the listed name. Perhaps the bot can compare each of the two editors having the overall top number of edits to the article with the top two editors to the article between the period of when the article was promoted to GA status and when the FA status promotion discussion was closed and add a note in the fourth column of the table where there is a match. Per Talk:300 (film), that period would be from May 19, 2007 through August 3, 2007 for 300 (film). Feel free to adjust my proposed bot search return criteria to minimize any misguiding results. I'd be happ to take what every output he bot provides and manually go through each entry to refine the results. -- Uzma Gamal (talk) 22:53, 25 December 2010 (UTC)
WildBot 2
Could someone create a new version of WildBot? The bot hasn't run since September, and Josh Parris, who maintained it, hasn't been around since July. The source code is available at the Toolserver, so you wouldn't need to write everything anew. Nyttend (talk) 01:53, 17 December 2010 (UTC)
- What is the point of resurrecting this bot? To promote my tools? As it currently stands, over 9000 pages it has tagged still need disambiguation (#Removing obsolete Wildbot tags). Aside from clogging up recent changes, the core functionality of the bot is better implemented as a tool. — Dispenser 05:13, 17 December 2010 (UTC)
- To promote fixing dabs, and to review Wikipedia books for syntax errors and problems. It would still be useful without your tools, they just make the job easier (much easier). Last I heard, Shirik was handling a new version of WildBot, but I haven't heard from him in a while. He's graduating this week and will presumably be less busy after that. Headbomb {talk / contribs / physics / books} 05:16, 17 December 2010 (UTC)
- Why not create a centralized page with links to the tools and link that from MediaWiki:Histlegend? It seems like a more effective way to get it done. — Dispenser 07:05, 28 December 2010 (UTC)
- But Dispenser, according to Wildbot's documentation, Wildbot removes any obsolete tags it's added; I believe this is true, but the bot has been disabled, and that's the only reason there are obsolete tags. (If someone's creating a new version, great; RF seems uninterested: he didn't respond to a note on his talk page, but responded to other notes there.)--W☯W t/c 07:24, 21 December 2010 (UTC)
- Josh last logged in to the Toolserver on June 3 and failed to renew his account recently. According to his pywikibot logs, WildBot last ran on Dec 1. While it is possible that he's been operating WildBot from home, the inactivity since September suggests otherwise. — Dispenser 07:05, 28 December 2010 (UTC)
- To promote fixing dabs, and to review Wikipedia books for syntax errors and problems. It would still be useful without your tools, they just make the job easier (much easier). Last I hear Shirik was handling a new version of WildBot, but I haven't heard of him in a while. He's graduating this week and and will presumably be less busy after that.Headbomb {talk / contribs / physics / books} 05:16, 17 December 2010 (UTC)
USGS publications mass upload
Would a mass upload to Commons of USGS publications which are PD be feasible? (I'm asking here since the material is primarily English.) Smallman12q (talk) 16:15, 25 December 2010 (UTC)
- IMO that wouldn't be appropriate per NOTHOST even if it's technically feasible. 67.117.130.143 (talk) 23:19, 25 December 2010 (UTC)
- Would WP:NOTHOST apply? It states "Please upload only files that are used (or will be used) in encyclopedia articles or project pages; anything else will be deleted." These are reliable, sourced, government publications that could be incorporated into articles. Smallman12q (talk) 03:51, 26 December 2010 (UTC)
- It doesn't apply, because WP:NOT doesn't apply to commons, which is where this discussion should be. Mr.Z-man 06:21, 26 December 2010 (UTC)
- What he said. Kaldari (talk) 00:52, 29 December 2010 (UTC)
Local file descriptions for no/Commons files
Would it be possible to have a bot go through Filespace and identify local file description pages for a) Commons files and b) non-existent files? The pages should then be added to Category:Description pages missing files. The vast majority of these will be eligible for speedy deletion under WP:CSD#F2 - however, they do need to be reviewed by a human, as there are some legit uses ({{FeaturedPicture}} and so forth). Most, though, are either created in error or are simple vandalism. Kelly hi! 18:18, 29 December 2010 (UTC)
Remove {{coord missing}} if {{coord}} exists
Per Wikipedia_talk:AutoWikiBrowser/Feature_requests#Remove_.7B.7Bcoord_missing.7D.7D_if_.7B.7Bcoord.7D.7D_exists, "Speaking of Bharati Bhavan Library, could a feature be added to AWB to remove {{coord missing}} from an article if the article also contains {{coord}}? GoingBatty (talk) 03:34, 29 December 2010 (UTC)"
Since this is a trivial bot job, I made a list of pages having both {{coord}} and {{coord missing}}. I found 2,569 pages and I am fixing them right away. -- Magioladitis (talk) 19:29, 29 December 2010 (UTC)
Done Magioladitis (talk) 23:56, 29 December 2010 (UTC)
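For reference, a simplified sketch of the kind of check-and-strip such a run performs (not the actual code used; the patterns ignore template redirects and nested templates):

```python
# Remove {{coord missing|...}} only when a {{coord|...}} call is also present.
import re

COORD_RE = re.compile(r'\{\{\s*coord\s*[|}]', re.IGNORECASE)
COORD_MISSING_RE = re.compile(r'\{\{\s*coord[ _]missing\s*(\|[^{}]*)?\}\}\n?', re.IGNORECASE)

def strip_coord_missing(wikitext):
    if COORD_RE.search(wikitext) and COORD_MISSING_RE.search(wikitext):
        return COORD_MISSING_RE.sub('', wikitext)
    return wikitext
```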
Redirect Bot
I have no idea how to create or use bots, but after receiving a comment at a Peer Review I wanted to ask if it would be possible to engineer a bot that could locate and fix links that go to a redirect page, so that when users/visitors click on a link in a Wikipedia article they go to the correct page the first time. Since this would only affect links in article pages, it would (in theory, anyway) not affect pages created as redirects, since those would still need to exist to make sure that terms entered into the search box go to the right page. I am aware of WP:NOTBROKEN, but think this could help Wikipedia in the long run by providing a degree of spell checking and by providing increased accuracy to make sure that article links do not encounter double redirects or redirects that simply return a user or visitor back to the page in the first place. In the interest of fairness I will concede that the bot does not have to be automatic; if at all possible it could be designed to run only on request pages, but I think that it could be helpful to Wikipedia as a whole. TomStar81 (Talk) 00:01, 30 December 2010 (UTC)
- Here is some background for you: WildBot's BRfA 2 was for bypassing unprintworthy redirects, e.g. tagged with {{R from misspelling}}. wikEd has a bypass/fix redirects button and can be limited to a selection of text. I've written a program that bypasses redirects if they are similar enough. Another takes a link to a section and finds a redirect that points to the same section, since it is easier to find a broken redirect than a broken section link. — Dispenser 00:48, 30 December 2010 (UTC)
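To give a feel for the "similar enough" idea, a toy sketch using a plain string ratio (this is only a guess at one possible approach, not the program mentioned above; the threshold is arbitrary):

```python
# Test whether a redirect title is close enough to its target title that
# bypassing it would not lose information.
from difflib import SequenceMatcher

def similar_enough(redirect_title, target_title, threshold=0.8):
    ratio = SequenceMatcher(None, redirect_title.lower(), target_title.lower()).ratio()
    return ratio >= threshold

if __name__ == '__main__':
    print(similar_enough('Colour', 'Color'))       # True: trivial spelling variant
    print(similar_enough('NYC', 'New York City'))  # False: bypassing would lose the short form
```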
Hi. I recently learned of the existence of this log. Alison made a request on AN that any time IPBE is granted, it be logged. Easy enough, but I was wondering if there's a way to either a) have a bot get the list up-to-date, b) update the list as the right is granted, or c) both. Thanks! TNXMan 16:28, 24 December 2010 (UTC)
- I can take care of this easily enough when I get back on Tuesday, and I will try to get a manual update for it before I go. -- DQ (t) Merry Chrismas and a Happy New Year! 16:38, 24 December 2010 (UTC)
- Thanks to Chzz (talk · contribs) for getting this up before I run out the door. User:DeltaQuad/Lists/IPBE. -- DQ (t) Merry Chrismas and a Happy New Year! 17:06, 24 December 2010 (UTC)
- Using a bot for that seems clunky. Better to open a bugzilla request to have the server log this stuff automatically, with a Special: page similar to the global block log etc. 67.117.130.143 (talk) 23:19, 24 December 2010 (UTC)
- The best idea would be to add to the Special:Log functionality, and be able to sort the log by added rights. (X! · talk) · @171 · 03:06, 25 December 2010 (UTC)
- Well, until this is accomplished, it's a pretty good idea. I will have something done soon enough. -- DQ (t) Merry Chrismas and a Happy New Year! 01:52, 29 December 2010 (UTC)
- Also, one last note: this was for CUs to comment about the grants. Almost done coding, FYI. -- DQ (t) Merry Chrismas and a Happy New Year! 04:30, 30 December 2010 (UTC)
- Done With a Request for Approval. DQ.alt (t) (e) 19:48, 4 January 2011 (UTC)
"Hot articles" bot for Wikiprojects
I need a bot that will automatically update a "Hot articles" list for a Wikiproject once a day. The bot should look through all of the articles under the purview of the project and compile a list of the 5 articles receiving the most edits in the last 3 days. For an example of a manually-updated version see Wikipedia:WikiProject_United_States_Public_Policy/Leaderboard/What's_hot. This will more than likely be run on the toolserver. It would also be nice (but not required) if the bot could be configured per project. For example, to show the top 10 articles receiving the most edits in the past 14 days. Kaldari (talk) 22:42, 28 December 2010 (UTC)
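A rough sketch of the counting behind such a list, fetching recent-edit counts via the API (deliberately naive; a production job, like the toolserver tool discussed below, would read the project's article list and query the revision table directly):

```python
# For each article in a project's list, count edits in the last N days and
# keep the top few. Pagination and error handling are omitted.
import json
import urllib.parse
import urllib.request
from datetime import datetime, timedelta, timezone

API = 'https://en.wikipedia.org/w/api.php'

def recent_edit_count(title, days=3):
    cutoff = (datetime.now(timezone.utc) - timedelta(days=days)).strftime('%Y-%m-%dT%H:%M:%SZ')
    params = {
        'action': 'query', 'format': 'json', 'prop': 'revisions',
        'titles': title, 'rvprop': 'timestamp', 'rvlimit': 'max', 'rvend': cutoff,
    }
    with urllib.request.urlopen(API + '?' + urllib.parse.urlencode(params)) as resp:
        data = json.load(resp)
    page = next(iter(data['query']['pages'].values()))
    return len(page.get('revisions', []))

def hottest(titles, days=3, top=5):
    return sorted(((recent_edit_count(t, days), t) for t in titles), reverse=True)[:top]
```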
- I made a webtool for this, I got all the inner bits working, now I just need to format it for the web. The nice thing is that it updates dynamically, instead of once per 24 hours. Tim1357 talk 06:41, 1 January 2011 (UTC)
- Done. Tim1357 talk 22:01, 1 January 2011 (UTC)
- I'm looking for a push service (to be used on WikiProject pages) rather than a pull service. Would it be possible to set up an interface for your webtool that returns results in XML rather than HTML? That would make setting up a push version trivial. Kaldari (talk) 22:44, 1 January 2011 (UTC)
- Also is there any chance that you could post your source code somewhere? (So that what happened with Article Alert bot won't happen with this tool as well). Thanks! Kaldari (talk) 22:46, 1 January 2011 (UTC)
- Well, if you want to use it on a Wikiproject page, then I can output it in wikitext. I could do it in xml too, if you want. Tim1357 talk 00:31, 2 January 2011 (UTC)
- Update: you can get xml results by appending
&xml=1
to the end of the url. Tim1357 talk 02:12, 2 January 2011 (UTC)
- That is excellent! Thank you! Kaldari (talk) 02:23, 2 January 2011 (UTC)
Just be warned, the longer the timespan and the bigger the wikiproject, the longer the tool will take. Tim1357 talk 03:06, 2 January 2011 (UTC)
- So where's that tool and how does it work? Headbomb {talk / contribs / physics / books} 21:20, 4 January 2011 (UTC)
- Right now, it's just a simple php script and cron job on the toolserver. I have it generating the Hot Articles section of Wikipedia:WikiProject Feminism right now as a proof of concept. I'm waiting to hear back from Frank Schulenburg on some ideas for expanding this to be available to all WikiProjects. If you have any thoughts for expanding it, let me know. Kaldari (talk) 21:26, 4 January 2011 (UTC)
- Aight, well when it's ready for deploying, let me know. I'll spam it in WP:AALERTS/NEWS and mention it in WP:AALERTS#Tips and tricks. Headbomb {talk / contribs / physics / books} 21:50, 4 January 2011 (UTC)
- Will do. Stay tuned. Kaldari (talk) 01:07, 5 January 2011 (UTC)
"not be able" -> be unable
I'm not sure where to post this suggestion, so please move if it isn't fit for bots. I'd like to suggest a bot run to change "not be able" to "be unable" in articles.Smallman12q (talk) 00:10, 2 January 2011 (UTC)
- If this was a typo, I'd suggest Wikipedia:AutoWikiBrowser/Typos. But what's wrong with "not be able", as in "Bots should not be able to change correct wording"? GoingBatty (talk) 00:39, 2 January 2011 (UTC)
- Not a good task for a bot, for the same reason spelling- and grammar-checking bots are not allowed. And I would recommend not doing this manually as some sort of drive-by change, either (changing it as part of a full copyedit is a different matter), as both are correct but have subtly different emphasis. Anomie⚔ 01:47, 2 January 2011 (UTC)
Bot to create pages for individual USG diplomatic cables published by Wikileaks
- The following discussion is closed. Please do not modify it. Subsequent comments should be made in a new section. A summary of the conclusions reached follows.
- The discussion has been moved to Wikipedia:Village pump (proposals)#Create articles about individual cables published by WikiLeaks, please comment there.
250,000 leaked "cables" are expected to be published in the next few years by WikiLeaks. Main story: United States diplomatic cables leak. I checked the whole discussion about the policy for linking to the leaks, discussing their contents etc. There's a huge work going on to keep Contents of the United States diplomatic cables leak updated with summaries of the most relevant cables.
In my opinion, each of these cables is going to trigger important discussions worldwide and affect the future of world diplomacy. All mainstream media are giving frequent reports on the latest releases referring to individual items of the list.
Each cable has a unique ID (e.g. "10MADRID86") that follows a well-defined grammar (see Template:Cablegate). This is the way those cables are referred to in citations by secondary and tertiary sources. In my opinion, it makes sense to create an article for each cable with the ID as the article name. All such articles would contain basic information about the leak, like its origin, the date it was sent and the date it was leaked, its secrecy classification, a link to the WikiLeaks page and a few wikilinks (not the cable content). This is information a bot can fetch from WikiLeaks. Humans can add more information like a summary, background data and reactions to the leak, for those cables that end up impacting more on the public opinion.
I can also take care of creating the bot if there is consensus. I also have a few ideas about the details but let's discuss them only if the community thinks that this is a good idea. It is pretty unusual for me to google for an individual cable on the basis of its "official name" and not find a Wikipedia article explaining what it is. --MauroVan (talk) 16:19, 3 January 2011 (UTC)
- I seriously doubt this will ever get consensus, judging by some of the views at Wikipedia:Requests for comment/Use of classified documents. But if you really want to try, take the discussion to WP:VPR and list it at WP:CENT; come back here if you actually get consensus. Anomie⚔ 18:00, 3 January 2011 (UTC)
- Hi! I had read Wikipedia:Requests for comment/Use of classified documents before writing this, and I drew completely opposite conclusions. :-) Consider that I'm not proposing to cite any part of the leaks here, nor am I suggesting that we should state that those documents are genuine or that their content is a correct depiction of reality. I just believe that those documents exist, because I've seen them, and that they are encyclopedic enough because a lot of people are talking about them; therefore I think that we should mention them and say what each of them is and provide a link that proves our statements. The best source for the statement "10MADRID86 is a document published by WikiLeaks; WikiLeaks states that it is the leak of a classified US embassy cable" is a link to the WikiLeaks page for 10MADRID86. Notice that the legal team of WMF has declared that linking to WikiLeaks is not illegal in Florida and that there are already loads of links to WikiLeaks in several articles of the English Wikipedia.
- In my opinion, my proposal is consistent with the consensus emerging from the discussion you linked. The issues on which I am looking for consensus are different, i.e. (1) "Do we need individual articles for individual leaks?" and (2) "Is it OK to use a bot to create them?". But I'd like to listen to the opinion of the people who've been discussing this subject, since you seem to have interpreted it in a completely different way; do you suggest that we link this page from there? Thanks for your attention. --MauroVan (talk) 20:40, 3 January 2011 (UTC)
- No, I suggest you start a discussion at WP:VPR and link that discussion from there. Consider first how you will convince the community that each individual cable passes WP:N. Anomie⚔ 21:38, 3 January 2011 (UTC)
- Correct, I did it here. I realise that the "each" part is the most objectionable one. In my opinion, it makes more sense to include them all, but I want to read other people's points of view before making up my own mind for good. Please give your opinion there if you have time. --MauroVan (talk) 22:03, 3 January 2011 (UTC)
I oppose the use of bots to create stubs. It's created a quagmire of non-notable specks from atlases, and there's no need to create another set of articles about memos. After each memo has been discussed in multiple independent sources, human editors can weigh those sources and create meaningful articles about them.—Kww(talk) 06:54, 5 January 2011 (UTC)
NRIS references replacement request
As discussed at Wikipedia talk:WikiProject National Register of Historic Places#Please change the standard citation to omit the link, there's a need to replace several variations of references to the National Register of Historic Places (NRHP)'s NRIS database with a call to template:NRISref instead. All background work is done, and consensus on the change has been reached. Draft articles being started using two outside systems (Elkman's NRHP infobox generator and my own batches-of-NRHP-articles-generator) are coming in with the template calls. User:Kumioko was helpful in the discussion and might comment here.
What's needed in general is to address 20,000-30,000 instances in wikipedia of references like:
<ref name="nris"> {{cite web|url=http://www.nr.nps.gov/ |title=National Register Information System |date=2009-03-13 |work=National Register of Historic Places |publisher=National Park Service}} </ref>
or in most cases all in one line:
<ref name="nris">{{cite web|url=http://www.nr.nps.gov/ |title=National Register Information System |date=2009-03-13|work=National Register of Historic Places |publisher=National Park Service}}</ref>
and having various dates. Half or more will have 2009-03-13 date. Most of the rest will have 2008-04-14 or 2007-01-23.
These are to be replaced by:
- <ref name=nris>{{NRISref|2009a}}</ref> if date=2009-03-13 appears in what is being replaced, or
- <ref name=nris>{{NRISref|2008a}}</ref> if date=2008-04-15 appears in what is being replaced, or
- <ref name=nris>{{NRISref|2007a}}</ref> if date=2007-01-23 appears in what is being replaced, or
- <ref name=nris>{{NRISref|error}}</ref> otherwise
where 2009a, 2008a, 2007a are versions specifically programmed in the template:NRISref. If the bot operator could identify any other commonly used dates, then versions for those could be added to the template, but this is most of them.
In many of the articles, there are later invocations of the reference by <ref name=nris/> or by <ref name="nris"/>, which should be left unchanged. If there are multiple outright definitions of the NRIS reference in one article, that's an error to be noted (listed in an errors page?) and fixed manually.
Also there are some cases where a URL other than url=http://www.nr.nps.gov/ is provided. These should be treated as errors and listed or put into a category somehow, too.
Can i provide any more info? Your consideration is appreciated. --Doncram (talk) 16:45, 30 December 2010 (UTC)
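To make the pattern concrete, a rough sketch of the text substitution being requested (simplified, not an actual bot implementation; only the three most common dates are mapped, and the whitespace/parameter-order variants and error cases described above would need more careful handling):

```python
# Replace the standard NRIS citation with a {{NRISref}} call, mapping the
# |date= value to the corresponding version code, or "error" if unknown.
import re

VERSIONS = {'2009-03-13': '2009a', '2008-04-15': '2008a', '2007-01-23': '2007a'}

NRIS_REF_RE = re.compile(
    r'<ref name="?nris"?>\s*\{\{cite web\|url=http://www\.nr\.nps\.gov/\s*'
    r'\|title=National Register Information System\s*\|date=([0-9-]+)\s*'
    r'\|work=National Register of Historic Places\s*'
    r'\|publisher=National Park Service\}\}\s*</ref>',
    re.IGNORECASE)

def replace_nris(wikitext):
    def repl(match):
        version = VERSIONS.get(match.group(1), 'error')
        return '<ref name=nris>{{NRISref|%s}}</ref>' % version
    return NRIS_REF_RE.sub(repl, wikitext)
```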
- FYI, the link http://www.nr.nps.gov in all these is bogus now; it yields an error eventually if a reader clicks on it. So there are 20,000 to 30,000 bad links in wikipedia that would be fixed by this bot. It would be a great service if this could be done! :) --Doncram (talk) 16:32, 6 January 2011 (UTC)
- Coding... Anomie⚔ 23:04, 6 January 2011 (UTC)
- It seems there are about 27285 links to http://www.nr.nps.gov/ in all namespaces and about 1820 links to other addresses under that domain; it would probably be best, though, for the bot to run through mainspace only, as there are only 823 total links in other namespaces and that way we don't have to worry about talk archives and such. Dates 2008-04-24, 2007-06-30, and 2006-03-15 seem to have a fair degree of use, as does no date; a full list is at [15].
- I would also suggest <ref name=nris>{{NRISref|error (date)}}</ref> or the like so people don't have to check the history to find the old date, and perhaps a template like {{NRIS dead link}} to mark bare links and such to make it easier for people to find them. Anomie⚔ 03:05, 7 January 2011 (UTC)
- Thanks! I updated template:NRISref to now include versions covering the 3 other dates you found that had over 250 usages. The new version list stated there now includes:
- version 2010a: July 9, 2010 - Version released December, 2010, with database changes up to just July 9.
- version 2009a: March 13, 2009 – Used in probably 10,000+ wikipedia articles. This version was available for download from the National Park Service until about July 2010. It was soon reuploaded on the NPS Focus website at http://nrhp.focus.nps.gov/natreg/docs/All_Data.html.
- version 2008b: April 24, 2008 - Used in 1,206 articles that still included this as of January 2011 (according to this report)
- version 2008a: April 15, 2008 - Used in several thousand wikipedia articles, some later updated to use a later version.
- version 2007b: June 30, 2007 - Used in 296 wikipedia articles that still included this as of January 2011
- version 2007a: January 23, 2007 - Used in several thousand wikipedia articles, some later updated to use a later version.
- version 2006a: March 15, 2006 - Used in 315 wikipedia articles that still included this as of January 2011
- Various scattered dates given which reflect no actual NRIS version, but rather an editor's usage of the date field as an accessdate rather than an NRIS date. To be coded as version=Error and to be considered errors, and to display "Error" rather than a date, and to be later addressed by updating the NRIS information in those articles.
- Regarding the no-date case: that should be put in to show as an error, perhaps by
- <ref name=nris>{{NRISref|Error}}</ref>
which i have programmed to be displayed there, but i see your suggestion of <ref name=nris>{{NRISref|error (date)}}</ref> "so people don't have to check the history to find the old date". However i don't immediately see how to program the NRISref template to display that. I'll ask Dudemanfellabra to comment here. But if you see how this can be done, please just go ahead (including feel free to change template:NRISref). Actually, though, i don't think that's necessary: we will simply visit and fix all of these articles to use current NRIS data. It's not important what date was once stated.
- About {{NRIS dead link}} to mark bare links and such, i am also not sure, or rather i think that anything complicated is not needed. Just using "version=Error" should suffice, and we will revise the template to put in a category for these articles and then go update the NRIS info in all of them. Thanks! --Doncram (talk) 13:53, 7 January 2011 (UTC)
- I just updated the NRISref template to allow for any input. If you put in
|version=Error (date)
or anything else besides the specific dates we have identified, the template will spit out exactly what you put in. If we want to categorize these later, we can just tack a category onto the end of the code where I made the note. Really, they don't even have to be tagged with an error at all.. if they don't have one of the prescribed dates, they will trigger an error anyway.--Dudemanfellabra (talk) 17:16, 7 January 2011 (UTC)- Dudemanfellabra meant or anything besides the specific version codes like 2009a, 2007b, etc. that we have identified..."
- That works for me, thanks. Category:NRISref errors now categorizes articles having no valid version. These articles can/will all be updated to use recent NRIS info, even if a thousand or two are put into it by the bot run. Thanks! --Doncram (talk) 17:35, 7 January 2011 (UTC)
- The idea behind {{NRIS dead link}} is to give some indicator that the link needs to be checked (like {{dead link}}), for placing after links that aren't in a {{cite web}} at all and after cite web templates that link to any other page on the NRIS website. A list of those other links (which may or may not be in {{cite web}}) is now here, if you want to review them. Anomie⚔ 18:47, 7 January 2011 (UTC)
- Thanks, Anomie, that's helpful. It shows about 1,000 other bad links, a third of which are links for multiple property submission (MPS) items, whose documents were all moved by the National Park Service from URLs like http://www.nr.nps.gov/multiples/64000086.pdf to URLs like https://npgallery.nps.gov/NRHP/GetAsset/NRHP/64000086_text. The related documents are listed at wp:MPS. I find i had noted there that "a bot run went through seemingly all NRHP articles to update MPS links' URLs during September, 2010. The URL problem should appear in only newly generated infoboxes." That's too many to be all new articles; i think the previous bot run (which i didn't request myself, don't have a handy link for) must have missed some. Others among the 1,000 are due to other changes; all those articles oughta be revisited and cleaned up.
- I'd be happy if anyone would set up a "NRIS dead link" properly and put these all into it. The ones where MPS URL is now wrong could easily then be fixed by AWB i suppose, and some other subtypes could also be AWB'd perhaps. I'll try editing at template:NRIS dead link, trying to adapt the template:dead link template. Any fixes to what i try would be appreciated. Thanks! --Doncram (talk) 15:45, 8 January 2011 (UTC)
- P.S. I created that template and then the associated category Category:All NRHP articles with dead external links, currently having 0 members. I chose a slightly broader category name to allow for other NRHP-related dead links that might get added. I added that article counter to the TO DO checklist posted at the WikiProject NRHP main talk page. So any articles given "NRIS dead link" by this bot run should eventually get some attention. Thanks! --Doncram (talk) 16:02, 8 January 2011 (UTC)
- So can the bot replace any link starting "http://www.nr.nps.gov/multiples/" with the corresponding link starting "http://pdfhost.focus.nps.gov/docs/NRHP/Text/", or is it more complicated? Anomie⚔ 19:46, 8 January 2011 (UTC)
- Simple answer: yes, that would be great. There are a few known examples where the document used to be available at the old-style URL, where it did not get transferred to the new-style URL, so the change won't help as much as one would hope. But all the old-style URLs no longer work. Changing from old-style to new-style will always be okay/good. --Doncram (talk) 21:16, 8 January 2011 (UTC)
- So can the bot replace any link starting "http://www.nr.nps.gov/multiples/" with the corresponding link starting "http://pdfhost.focus.nps.gov/docs/NRHP/Text/", or is it more complicated? Anomie⚔ 19:46, 8 January 2011 (UTC)