User talk:Cyberpower678/Archive 14
This is an archive of past discussions about User:Cyberpower678. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Accident
Earlier today, I took a blow to my head from a collapsed roof gutter. Since I likely won't be able to think straight for a while, I will be drastically reducing my activity.
- (The above is an unsigned edit made a few hours ago by Cyberpower678) Hope you have a full and speedy recovery … you've been doing good work and we need you, Cyberpower678! :-) --Elvey (talk) 07:00, 26 September 2013 (UTC)
- Hope you get better soon! Legoktm (talk) 17:43, 26 September 2013 (UTC)
- Yeah - look after that head Cyber, it's valuable to you, and to us. Seriously hope you're ok - and hope to see you back soon. Begoon talk 17:47, 26 September 2013 (UTC)
- Agreed. Take it easy and hopefully you'll be "back on the horse" soon. Walter Görlitz (talk) 20:04, 26 September 2013 (UTC)
- *grunting* Thanks guys. I'm starting to feel better. I still have a massive headache, though, and my ears are still ringing, but I'm a little less hazy in my mind at least, so I think I'll return to Wikipedia now. I will apologize in advance if I lash out at someone, since I'm still very groggy and grumpy. To clarify, I'm still leaving my activity reduced on Wikipedia.—cyberpower ChatOnline 22:58, 26 September 2013 (UTC)
- Sleep it off, seriously. I hope very much you have taken medical advice re concussion. If not, better late than never. Fiddle Faddle 23:02, 26 September 2013 (UTC)
- My mom is a doctor and she is monitoring me. She said that, by the looks of it, it's not a severe concussion and should go away in a week or two.—cyberpower ChatOnline 23:03, 26 September 2013 (UTC)
- That, at least, is good news. Please take it easy. Fiddle Faddle 23:07, 26 September 2013 (UTC)
- Hi guys, just an update. I'm starting to feel better, headache is getting better and my ears have stopped ringing. :-) I sometimes still think 2+2 is 5 though, so hopefully that will clear up sometime soon. :/—cyberpower ChatOnline 23:07, 28 September 2013 (UTC)
CyberBot II PCBot task
Hey, is there any way you could slow down this task to where the bot waits a period of time, say three minutes, before assuming the admin forgot to add the pending changes template? I very rarely forget to add it and it's somewhat of an inconvenience for the bot to try adding it for me so quickly when I'm already in the process of doing so. Ks0stm (T•C•G•E) 20:01, 2 October 2013 (UTC)
- I really ought to fix that. It keeps slipping my mind.—cyberpower ChatOnline 20:36, 2 October 2013 (UTC)
FYI your bot is editing logged out
[1] and other contribs on the IP. See m:October 2013 private data security issue. Please use &assert=user in the future. Legoktm (talk) 06:07, 3 October 2013 (UTC)
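For reference, a minimal sketch of what that looks like in a raw API call (PHP with cURL; the edit token is assumed to have been fetched earlier in the session):

```php
<?php
// Sketch only: pass assert=user with an edit so the API rejects the
// request with an "assertuserfailed" error instead of saving the edit
// logged out. $editToken is assumed to come from earlier in the session.
$params = array(
    'action' => 'edit',
    'title'  => 'Wikipedia:Sandbox',
    'text'   => 'test',
    'token'  => $editToken,
    'assert' => 'user',   // fail rather than edit as an IP
    'format' => 'json',
);

$ch = curl_init('https://en.wikipedia.org/w/api.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($params));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = json_decode(curl_exec($ch), true);
curl_close($ch);

if (isset($response['error']) && $response['error']['code'] === 'assertuserfailed') {
    // session expired: log back in and retry instead of editing anonymously
}
```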
- I'm aware. These are non-Peachy-based scripts; Peachy didn't appear to have this problem. I've reinstated my bot's password, and it looks like Pywikipedia has assert issues of its own. :p—cyberpower ChatOnline 10:49, 3 October 2013 (UTC)
- No it doesn't. pywikibot-core requires you to be logged in to edit. Legoktm (talk) 14:15, 3 October 2013 (UTC)
- Obviously not, since an edit went through as an IP. But I hardly know anything about Pywikibot, so I'm not going to make an argument of it.—cyberpower ChatOnline 14:19, 3 October 2013 (UTC)
- The issue is that you're using pywikibot-compat instead of pywikibot-core. If/when you switch to core, it forces the account to be logged in and other improvements. Legoktm (talk) 18:39, 3 October 2013 (UTC)
Blacklisted url
Hi, Cyberbot II, I was just wondering if quote=k-kristie.quazen.com/arts/architecture/the-largest-buildings-in-the-world on Kennedy Space Center is still on your blacklist. Lotje (talk) 16:14, 3 October 2013 (UTC)
- Sorry. I don't control the blacklist. Please see Template:Blacklisted-links for more information on blacklists.—cyberpower ChatOnline 16:18, 3 October 2013 (UTC)
Penang
Hey Cyber - how's the head...? Better I hope...
This just came up on my watchlist: https://en.wikipedia.org/w/index.php?title=Penang&diff=575609318&oldid=575217841
Something seems to make the tag not include the link, and weirdly truncate stuff.
Just a heads up. Begoon talk 17:51, 3 October 2013 (UTC)
- It's the pipe that's causing it. I need to convert it to an HTML character to fix that. And thank you, I'm 95% better. Just a mild headache at this point.
- Cool - I knew it would be something simple. But more importantly, glad you're ok. Still take care though...Begoon talk 19:13, 3 October 2013 (UTC)
Blacklist - bot request
First, I hope you're all fine now.
I hope in any case that this is not giving you a headache. In the tag that you leave, would you mind adding two (unused) parameters, merely for our convenience:
- the actual rule(s) that block the link
- which blacklist it is
.. that is, if they are coming out of the db-requests. That would greatly help with finding the reason for the blacklisting for those who are considering de-blacklisting/whitelisting. Some of the links are hitting on strange rules which are not easily found. Thanks. --Dirk Beetstra T C 08:27, 29 September 2013 (UTC)
Two related requests:
- Seen the AN/I .. could you program the bot so that it does not add the template more than once every ## hours (I would suggest 48 or 72)? Some persistence is good, but we don't need to push it too hard either (a sketch of one possible throttle follows after this comment). If editors then indicate that the bot is re-tagging while they are waiting for the whitelist to come through, please set the bot to ignore that page (but preferably not earlier; it is better that the issue is resolved at some point).
- Related to the bot-ignore: maybe that should have a 'deadline', like 3 months. If whitelisting is not done within, e.g., three months, then another solution needs to be found. You could add a date-tag to the ignore and have a deadline on that, or just do it manually, with a remark line every month killing everything from 3 months ago.
Thanks for the hard work .. now hope that the issue gets resolved at some point. This is also good for finding the too-wide-net cases. --Dirk Beetstra T C 12:51, 29 September 2013 (UTC)
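A minimal sketch of the once-every-##-hours throttle suggested above, assuming the bot keeps a small local table of when it last tagged each page (the table name and schema are illustrative, not the bot's actual storage):

```php
<?php
// Sketch: skip re-tagging a page unless the cooldown has elapsed.
// Assumes a local SQLite table tag_log(page_id INTEGER PRIMARY KEY,
// last_tagged INTEGER) -- an illustrative schema, not the bot's own.
const COOLDOWN = 48 * 3600; // 48 hours, per the suggestion above

function mayTagPage(PDO $db, int $pageId): bool {
    $stmt = $db->prepare('SELECT last_tagged FROM tag_log WHERE page_id = ?');
    $stmt->execute(array($pageId));
    $last = $stmt->fetchColumn();
    return $last === false || (time() - (int)$last) >= COOLDOWN;
}

function recordTagging(PDO $db, int $pageId): void {
    $stmt = $db->prepare('REPLACE INTO tag_log (page_id, last_tagged) VALUES (?, ?)');
    $stmt->execute(array($pageId, time()));
}
```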
- (talk page stalker)I agree with the unused parameters, but I disagree that we should ever ignore blacklisted links long-term. They need to be either whitelisted or removed. Jackmcbarn (talk) 15:28, 29 September 2013 (UTC)
- I don't think I suggested to ever ignore blacklisted links long-term. --Dirk Beetstra T C 15:44, 29 September 2013 (UTC)
- 3 months would be way too long a figure; if whitelisting hasn't happened in that length of time, it ain't ever going to happen. Again though, editors equally shouldn't be edit warring with the bot, so would an automatic talk page message be worthwhile if the bot is reverted? Blethering Scot 17:13, 29 September 2013 (UTC)
- @Beetstra: I think it's a great idea, and would love to add those additional bells and whistles. Now for the bad news. This script already eats 1.1 GB of my allotted 2 GB on labs and constantly pushes the CPU usage beyond 100%. I wouldn't mind adding it, but I'm concerned about the additional resources it's going to consume. I'm already at 1.5 GB for all continuous tasks. I also have scripts that are executed by the cron. I will certainly look into it, but I can say for certain it will slow down the bot and eat more memory.—cyberpower ChatOnline 00:20, 30 September 2013 (UTC)
- Most are just features; the once-every-##-hours editing, however, should not add too much overhead. I'd consider that the more important one. --Dirk Beetstra T C 04:05, 30 September 2013 (UTC)
- No, it shouldn't, but it's really finicky. The bot's run consists of: scanning the local database (about 3-4 hours) and removing any links that aren't blacklisted; checking for new regex additions to the blacklist or removals from the whitelist; if there are none, proceeding to the tagging phase, and if there are, scanning the 60,000,000+ links on Wikipedia (about 16-20 hours); tagging pages (about 3 hours); removing invalid tags (about 30 minutes); and then sleeping for 15 minutes. Once every ## hours would be very difficult to enforce in this case, as the time needed to execute each part of the script constantly changes.—cyberpower ChatOnline 11:40, 30 September 2013 (UTC)
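In outline, the run described above is a single loop whose phases vary wildly in length, which is why a fixed schedule is hard to enforce. A rough sketch, with placeholder function names rather than the actual script:

```php
<?php
// Rough outline of the run loop described above; every phase's duration
// varies from run to run. All function names are illustrative placeholders.
while (true) {
    scanLocalDatabase();              // ~3-4 hours
    removeDelistedLinks();            // drop links that are no longer blacklisted
    if (blacklistOrWhitelistChanged()) {
        scanAllExternalLinks();       // 60,000,000+ links, ~16-20 hours
    }
    tagPages();                       // ~3 hours
    removeInvalidTags();              // ~30 minutes
    sleep(15 * 60);                   // then sleep for 15 minutes
}
```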
- Surely the option in this case is just to run the bot once every three days. Liamdavies (talk) 15:06, 30 September 2013 (UTC)
- Huh? Where did you pull three days from? That means the tag removal part of the script will be running once every three days as well, which is also an excessive time for the bot to just sit there and wait. Not to mention that when the bot runs a scan of the database, it will be more than 3 days before the bot does anything again. If anything, the bot should have no more than a 48-hour gap in editing.—cyberpower ChatOnline 15:11, 30 September 2013 (UTC)
- Fine, every two days; the point is that limiting how often the bot runs - rather than running it in a loop - is a low-tech way of solving the problem in the interim. Liamdavies (talk) 15:37, 30 September 2013 (UTC)
- @Beetstra: As you can see, I've managed to come up with a resource-conserving method of identifying the rule. But due to its design, I'd like to see the log output first before I unleash the bot to update all the tags.—cyberpower ChatOnline 21:13, 2 October 2013 (UTC)
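For readers wondering what the rule finder does: in essence it walks the parsed blacklist entries and reports which rule, on which list, matched a given link. A minimal sketch, assuming the rule arrays are built elsewhere:

```php
<?php
// Sketch of a rule finder: return the first blacklist rule matching the
// URL, together with the list it came from, or null if nothing matches.
function findMatchingRule(string $url, array $localRules, array $globalRules): ?array {
    foreach (array('local' => $localRules, 'global' => $globalRules) as $list => $rules) {
        foreach ($rules as $rule) {
            if (preg_match('/' . str_replace('/', '\/', $rule) . '/i', $url)) {
                return array('rule' => $rule, 'list' => $list);
            }
        }
    }
    return null; // link is not (or no longer) blacklisted
}
```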
- Bot war: Cyberbot insists on the tag being at the very beginning ([2], [3]), while other bots put it beneath hatnotes with the other maintenance templates ([4]).
I would recommend a) replacing an existing tag in-place if it's already on the page, which should be very simple and the least objectionable to other bots and editors, and b) adding new tags beneath hatnotes, to comply with guidelines (WP:HNP etc.). I would recommend using the Twinkle regex ([5]) as a guideline on how to recognize them.
Amalthea 16:55, 4 October 2013 (UTC)
- I'll look into fixing that.—cyberpower ChatLimited Access 17:04, 4 October 2013 (UTC)
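For illustration, the in-place replacement and hatnote-aware insertion Amalthea describes might look like this sketch; the hatnote pattern here is a rough stand-in for the fuller Twinkle regex linked above:

```php
<?php
// Sketch: in-place replacement, else insertion beneath leading hatnotes.
function placeTag(string $wikitext, string $tag): string {
    // a) replace an existing {{Blacklisted-links}} tag in place
    $existing = '/\{\{\s*[Bb]lacklisted-links\s*(?:\|(?:[^{}]|\{\{[^{}]*\}\})*)?\}\}/';
    if (preg_match($existing, $wikitext)) {
        return preg_replace($existing, $tag, $wikitext, 1);
    }
    // b) skip past leading hatnotes; simplified stand-in for the Twinkle regex
    $hatnotes = '/^(?:\{\{\s*(?:[Aa]bout|[Ff]or|[Oo]ther uses|[Rr]edirect|[Hh]atnote)[^{}]*\}\}\s*)*/';
    preg_match($hatnotes, $wikitext, $m);
    $offset = strlen($m[0]);
    return substr($wikitext, 0, $offset) . $tag . "\n" . substr($wikitext, $offset);
}
```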
What is the point in having the bot tag a page which discusses the link?--Launchballer 21:43, 4 October 2013 (UTC)
- The link is blacklisted regardless. Submit a whitelisting request for that page if the link belongs. Jackmcbarn (talk) 22:34, 4 October 2013 (UTC)
facts is still a trigger
Thanks for updating the bot; it's now clear that this is a false positive on 2nd Chapter of Acts. Walter Görlitz (talk) 18:47, 3 October 2013 (UTC)
- I see there is an open request on the whitelist concerning this link. I have added this to the exceptions list for the duration of the request. The page should no longer be tagged for the time being.—cyberpower ChatOffline 18:55, 3 October 2013 (UTC)
- Thanks!
- As a personal note, it's good to see you're feeling well enough to be on Wikipedia. Hoping for continued recovery and a speedy one at that. You do great work for the project. Walter Görlitz (talk) 19:04, 3 October 2013 (UTC)
- Yep. 95% there. Nothing an Aspirin can't handle.—cyberpower ChatOffline 19:08, 3 October 2013 (UTC)
- It seems that 2ndchapterofacts.com is still triggering the blacklist today, and canada-soccer.com is as well. Walter Görlitz (talk) 13:31, 4 October 2013 (UTC)
- I can't de-blacklist or whitelist links. I can only tell the bot to ignore blacklisted links while a pending request exists. Sorry.—cyberpower ChatOffline 13:47, 4 October 2013 (UTC)
- The issue is that there are links from this domain on all these pages and the bot has been tagging many of them; the exception list is either incomplete or not working. Is there no way to make the bot ignore all links from a domain globally? Liamdavies (talk) 14:32, 4 October 2013 (UTC)
- That would require a regex setup. I'll add the remaining links you provided to the exceptions list. It usually helps if you provide the links that need ignoring.—cyberpower ChatOnline 14:39, 4 October 2013 (UTC)
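Such a regex setup could be as simple as one pattern per domain instead of one entry per URL. A minimal sketch, with a hypothetical pattern rather than the bot's actual exceptions syntax:

```php
<?php
// Sketch: a single domain-wide exception pattern covers every URL on
// the domain while its whitelist request is pending.
$exceptions = array('/https?:\/\/(?:www\.)?canadian-soccer\.com\//i'); // hypothetical entry

function isExcepted(string $link, array $exceptions): bool {
    foreach ($exceptions as $pattern) {
        if (preg_match($pattern, $link)) {
            return true; // covered by a pending whitelist/de-blacklist request
        }
    }
    return false;
}
```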
- Links added.—cyberpower ChatOnline 14:49, 4 October 2013 (UTC)
- It was just triggered again here.
- The site for the first is www.2ndchapterofacts.com. The site for the second is www.canadian-soccer.com as seen here and here. Walter Görlitz (talk) 19:23, 4 October 2013 (UTC)
- That's a normal and currently unavoidable side-effect; it takes some time before the exceptions go into effect. It shouldn't tag the first again. As for the latter, can you point me to the whitelist or de-blacklist request? Simply ignoring the blacklisted links isn't the solution.—cyberpower ChatOnline 20:25, 4 October 2013 (UTC)
- I've whitelisted 2ndchapterofacts.com; that was clear collateral damage from a global blacklist entry. I'll check out the second one; that's almost certainly a mistake in the blacklist entry -- I'll get back here. Amalthea 22:06, 4 October 2013 (UTC)
- canadian-soccer.com should also cause no further problems, tags can be removed manually or will be removed automatically in the near future, thanks for the report -- but please request de-blacklisting (if you think the blacklist entry is wrong) or whitelisting (if you think the blacklist entry is right but should have a specific exemption) next time. Amalthea 22:38, 4 October 2013 (UTC)
- ... however, these are both links to forum entries (which are not reachable ATM): I strongly doubt those are sources we want to use for anything! Amalthea 22:55, 4 October 2013 (UTC)
- And forums aren't RSes, which is what I thought was triggering the issue. When it started explaining that soccer.+\.com was the trigger, I sat up and took notice. There were a few flagged earlier on canada-soccer that weren't in the forum section. I'll have to find them.
- Thanks again. Walter Görlitz (talk) 23:22, 4 October 2013 (UTC)
- But be warned, like I mentioned above, the bot may come back to tag it once more before it realizes it's not blacklisted.—cyberpower ChatOffline 10:38, 5 October 2013 (UTC)
- This is completely unacceptable; under no circumstances should the bot tag a page that isn't blacklisted. I would also draw your attention to a discussion at Wikipedia talk:Bots/Requests for approval#Cyberbot II concerning the bot; someone has requested approval be removed (someone really should have told you earlier, but better late than never). Liamdavies (talk) 12:00, 6 October 2013 (UTC)
Admin stats
Hi Cyberpower. I just noticed that the {{adminstats}} transclusion on my user page now includes the message "Bot will die in 15 days. Please contact the operator." That would be you. :) Any ideas what might be causing the bot's impending demise? — Mr. Stradivarius ♪ talk ♪ 00:56, 6 October 2013 (UTC)
- It won't die. That's just a leftover from Toolserver, where accounts expire every six months. Since they don't on Labs, that message is no longer necessary. I've removed it.—cyberpower ChatOnline 01:27, 6 October 2013 (UTC)
- Good to know. Thanks! — Mr. Stradivarius ♪ talk ♪ 01:29, 6 October 2013 (UTC)
Cyberbot I isn't up to something
Cyberbot I isn't clerking RFPP currently; I've not seen anywhere that it's been mentioned, so is it a known issue? tutterMouse (talk) 07:27, 4 October 2013 (UTC)
- Yes, seems to have stopped for at least a day. I manually archived a big chunk yesterday, and it's starting to stack up again. GedUK 11:24, 4 October 2013 (UTC)
- I'm really starting to dislike Pywikipedia.—cyberpower ChatLimited Access 12:49, 4 October 2013 (UTC)
- It's been four days or so now without the bot, is there an issue we should be aware of and how long might it be until it resumes clerking? tutterMouse (talk) 07:06, 7 October 2013 (UTC)
- What issues with Pywikibot are you having? Legoktm (talk) 07:33, 7 October 2013 (UTC)
- I'm guessing it's not logged in anymore. How do I fix that?—cyberpower ChatLimited Access 12:21, 7 October 2013 (UTC)
- I tried logging Pywikipedia back in. Let's see if that fixes it.—cyberpower ChatLimited Access 12:38, 7 October 2013 (UTC)
- Err. I forgot to reboot the scripts after re-logging Pywikipedia in. But I'm still going to convert the scripts.—cyberpower ChatOffline 13:48, 7 October 2013 (UTC)
- Grr. It doesn't seem to be doing anything. I just moved "convert Python scripts to PHP" to the top of my todo list.—cyberpower ChatOffline 13:45, 7 October 2013 (UTC)
- Probably should, if it's more stable than Pywikipedia; I know Scottywong had issues when he was responsible for it. tutterMouse (talk) 13:58, 7 October 2013 (UTC)
- My experiences with Pywikipedia have been less than pleasant. I kicked the scripts in the butt so they are now working again. But I feel my Peachy framework is much easier to use, more stable, and less cluttered with unnecessary junk.—cyberpower ChatOnline 14:03, 7 October 2013 (UTC)
- Port away then; if Peachy is better in most circumstances, then we shouldn't be leaving it to the flaky whims of a bad infrastructure. Might even stop me thinking we need a second bot. tutterMouse (talk) 18:20, 7 October 2013 (UTC)
Spam blacklist false positive
An actual false positive: MediaWiki talk:Spam-whitelist/Archives/2023/05#examiner.com.au. Tagged a few hours ago, "Triggered by unknown on the unknown blacklist". Amalthea 23:38, 8 October 2013 (UTC)
- Wait for it... The bot's making another run.—cyberpower ChatOnline 23:39, 8 October 2013 (UTC)
- Ok, but why did that happen? Amalthea 23:42, 8 October 2013 (UTC)
- No clue honestly. The rule finder is new, so it might be buggy somewhere. I'll have to take a look to see what happened.—cyberpower ChatOnline 23:49, 8 October 2013 (UTC)
- Ok. I think the rule finder had the right of it, since it isn't blacklisted (see the thread I linked to). I did tweak that specific blacklist entry recently (removed the examiner.com.au whitelist entry and fixed the blacklist entry to no longer match that), so my first impression was that it ran with an outdated blacklist, but per this diff it did have the current examiner.com blacklist entry. Could it have crashed and tried to recover with outdated data? Amalthea 23:55, 8 October 2013 (UTC)
- It could be a flaw in the crash recovery system, but it's supposed to restore the bot to its exact previous state. It's possible the database fell asleep and didn't return the existing entries. Let's see what happens on the next run. It could just be a glitch in the program that might fix itself.—cyberpower ChatOnline 23:59, 8 October 2013 (UTC)
- @Amalthea: You can see what the bot is doing here.—cyberpower ChatOnline 01:05, 9 October 2013 (UTC)
Not sure what to make of this. Judging by how the run progresses, we should know by tomorrow evening or so whether the mistaken tags are removed? It seems to have recovered during the current run, so I assume you should be able to look into the transitional files you place on disk to check for any problems.
From the experiences of the last few days, and the problems with resource usage and response time on a wiki as huge as this one, I'd actually recommend making changes to the bot to specifically improve it in those areas. If I may suggest, here's an algorithm that I could see working better on a huge wiki like this one:
- Get largest current page ID
- Iterate through all page IDs in chunks (e.g. 0-9999, 10000-19999, ...). For each of those page ID chunks:
- Get all external links of those pages from the externallinks table in one query, ordered by page ID (it's indexed by page ID, so that's a fast and cheap query).
- Get all transclusions of the "blacklisted links" template of those pages from the templatelinks table, in one query, ordered by page ID (should again be indexed).
- If the blacklist regexes have not been built yet, or were built more than e.g. 5 minutes ago, query the blacklist and whitelist pages and build the regular expressions
- Iterate through the database results. For each page ID with data:
- Match the external links against the regexes. If a blacklisted link is found, get the page content, add/replace the blacklisted-links template, and if the page content has changed save the page.
- If no external links are blacklisted but it currently transcludes the blacklisted-links template, get the page content, remove the template, and if the page content has changed save the page.
This should require few resources, since you only ever hold the data of a single page; it will be responsive to changes, since you never operate on days-old data; and speed and resource usage can be tweaked by changing the size of the page ID chunks processed at a time. A full run will probably still take some time, but the logic can easily be made interruptible/resumable by storing the start of the currently processed chunk and initializing the chunk loop with that ID during restart.
It'd also have the advantage that you could quickly react to mistakenly blacklisted links: if the chunk-processing routine were written as a function that gets the page IDs as a parameter, you would normally pass in something like "BETWEEN 10000 AND 19999" to process a page ID range. If you want to re-evaluate e.g. all pages containing an examiner.com.au link, you could get those page IDs from the table and pass them to the processing function as "IN (1038255, 6893822, 17769549, ...)" and get the tags re-evaluated. A sketch of this approach follows below.
Amalthea 10:19, 9 October 2013 (UTC)
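A sketch of the chunked scan proposed above, run against the database replica. Table and column names follow the MediaWiki schema (page, externallinks, templatelinks); the connection, the regex builder, and the per-page tagging logic are assumed:

```php
<?php
// Sketch: iterate all page IDs in fixed-size chunks, pulling external
// links and current tag transclusions with one indexed query each.
const CHUNK = 10000;

$maxId = (int)$db->query('SELECT MAX(page_id) FROM page')->fetchColumn();
for ($start = 0; $start <= $maxId; $start += CHUNK) {
    processChunk($db, $start, $start + CHUNK - 1); // resumable: persist $start
}

function processChunk(PDO $db, int $from, int $to): void {
    $links = $db->prepare(
        'SELECT el_from, el_to FROM externallinks
          WHERE el_from BETWEEN ? AND ? ORDER BY el_from');
    $links->execute(array($from, $to));

    $tagged = $db->prepare(
        'SELECT tl_from FROM templatelinks
          WHERE tl_title = ? AND tl_from BETWEEN ? AND ?');
    $tagged->execute(array('Blacklisted-links', $from, $to));

    refreshRegexesIfStale(); // assumed: rebuild blacklist regexes if older than ~5 minutes

    // For each page ID in the results: match its links against the
    // regexes, then add/replace or remove the template as described
    // above (per-page logic omitted in this sketch).
}
```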
- Hmmm. This looks like it will run OOM. It will also take longer, because now it's processing pages and then links, rather than just processing the full table at once and sorting out the data by page ID. I'm somewhat confused about what you're getting at here.—cyberpower ChatOnline 11:37, 9 October 2013 (UTC)
- Well, I think I've listed aspects where the bot could be improved above. My goal is always and only to improve Wikipedia, and in the case at hand I think this would help on several levels. :| Amalthea 14:16, 9 October 2013 (UTC)
- I'm always open to improvements, but I'm honestly confused by this approach.—cyberpower ChatOnline 14:43, 9 October 2013 (UTC)
Discussion on template
Please be careful about WP:EW edit warring and ownership concerns over the template you are adding, which your bot is edit warring to keep adding to articles containing links on the spam blacklist. I will start a discussion on the template page, but it is not proper for somebody running a disputed and problematic bot function to be edit warring personally against editors affected by the bot. Your approach here, and in discussions over the bot, has been less than collaborative. If you wish to keep running this function, you are going to have to work with the community of editors and not use your bot authorization to ignore and override them. - Wikidemon (talk) 00:25, 9 October 2013 (UTC)
- If I had ownership issues with that template, I would've been reverting everyone, not just you (and likely gotten myself blocked in the process). I know when I'm discussing a lost cause. If I had ownership issues, I would've insisted on leaving that big bulky red box, because I like that version better, but I didn't. I am very careful with edit warring, hence the discussion I opened on the talk page of the template after I reverted for the second time, per WP:BRD. You are one of the few seeing the bot as problematic. The reason why I haven't participated in those threads a lot is because other users essentially said what I wanted to say. No need to repeat what's already been said. I do work with the community to the best of my abilities, and try to be civil at all times. The bot is functioning the way it was approved to. The script was reviewed by several BAGgers and uninvolved editors. They have all contributed to perfecting the code. That lashing out at you was a result of a medical issue. I am happy to discuss the template.—cyberpower ChatOnline 00:58, 9 October 2013 (UTC)
- Hey, I wrote this before I noticed you had started the discussion so better discuss there. Thanks for the reply — I was pretty stern there but looking forward to keeping things civil and constructive. The "nobody but you is complaining" argument doesn't do much for me, btw. The sample is not representative and the population distribution is anything but uniform. I admire your other bot functions :) - Wikidemon (talk) 02:08, 9 October 2013 (UTC)
- Same here. Have a great evening.—cyberpower ChatOnline 02:10, 9 October 2013 (UTC)
Misclick?
What? Was that a misclick? Don't worry, I've reverted it. Bishonen | talk 11:17, 10 October 2013 (UTC).
- Oopsies.—cyberpower ChatLimited Access 11:21, 10 October 2013 (UTC)
- Cyberpower must hate Canadians, eh? It's understandable, but it's not according to procedure: you are to make fun of them using a South Park-inspired voice. Drmies (talk) 14:02, 10 October 2013 (UTC)
- Can I use Simpsons instead?—cyberpower ChatOnline 14:04, 10 October 2013 (UTC)
- Simpsons never did "Blame Canada". — Crisco 1492 (talk) 14:06, 10 October 2013 (UTC)
- What about Barney Stinson?—cyberpower ChatOnline 14:07, 10 October 2013 (UTC)
- Me: Total deer in headlights moment. — Crisco 1492 (talk) 14:10, 10 October 2013 (UTC)
- Huh?—cyberpower ChatOnline 14:12, 10 October 2013 (UTC)
- Meant the reference flew over my head. Haven't really paid attention to that show. Anyways, no worries! — Crisco 1492 (talk) 14:15, 10 October 2013 (UTC)
- OkaysoBarneythinksCanadasucksbutthenhefoundhe'sonequarterCanadianbuttotallychosetoignoreituntilhedecidedtomarryRobin. *gasps*—cyberpower ChatOnline 14:20, 10 October 2013 (UTC)
- (After finding several related articles). Ah! (disappears into 1960s Indonesian pop songs) — Crisco 1492 (talk) 14:22, 10 October 2013 (UTC)
- You all have time for TV? And stop dropping your bullets all over the floor. Drmies (talk) 17:49, 10 October 2013 (UTC)
- What bullets?—cyberpower ChatLimited Access 17:57, 10 October 2013 (UTC)
- Look carefully for the stray bullet. Drmies (talk) 18:04, 10 October 2013 (UTC)
- Not my fault. :p—cyberpower ChatOffline 18:18, 10 October 2013 (UTC)
- Absolutely. Bullets, though, even stray ones, are notoriously immune to that kind of unassailable logic. Just sayin'... Begoon talk 19:09, 10 October 2013 (UTC)
Bot down again
The bot maintaining the summary at User:Snotbot/Requests for unblock report appears to be down again. Kuru (talk) 11:39, 10 October 2013 (UTC)
- The more I use Pywikipedia, the more I hate it. I'm in the process of converting the scripts to Peachy. In the meantime, I'll give it another kick in the butt.—cyberpower ChatOffline 12:43, 10 October 2013 (UTC)
- SQL server fell asleep.—cyberpower ChatOffline 13:03, 10 October 2013 (UTC)
Suggestion: that template should be placed below {{for}} and similar templates. (AWB always moves those templates to the very top, and I agree that they should be above the tag.) Take a look at Dog park where it looks screwy. Werieth (talk) 13:19, 10 October 2013 (UTC)
- I've been working on an update.—cyberpower ChatOffline 13:20, 10 October 2013 (UTC)
SPAMBot issues
- diff 1: Matching rule is missing from tag. Any updated tag seems to look like this.
- diff 2: Tag content is missing completely. Any newly tagged page seems to look like this.
- diff 3: False negative, link still exists and still is blacklisted.
Since every edit today seems to show one of the first two issues, and I don't know what kind of schedule it's on, I've disabled the bot until you've had a chance to fix those.
Cheers, Amalthea 09:17, 11 October 2013 (UTC)
- Labs went down for maintenance. The bot no longer has access to vital files. It may have become screwy.—cyberpower ChatOnline 11:08, 11 October 2013 (UTC)
- By the looks of it, that's what happened. Disabling the bot was a good call. I'll reboot the bot without the recovery file to make it do a fresh run.—cyberpower ChatOnline 11:13, 11 October 2013 (UTC)
- I've installed an update and clean rebooted the bot. Labs' maintenance corrupted the bot and the data files. A clean run should purge the bad files. The update should allow the bot to update tags in place and tags underneath other tags based on the regex you provided me. I will monitor to make sure the bot is doing what it's supposed to. You can see what the bot is doing at the moment by looking here.—cyberpower ChatOnline 11:44, 11 October 2013 (UTC)
- If host maintenance can cause this, then please put in sufficient assertions so that the bot halts itself in such circumstances from now on (database access failed, an unknown/missing rule, no blacklisted links for a page, ...). I think the third diff listed above is a separate problem. Amalthea 12:12, 11 October 2013 (UTC)
- I think this is an exceptional case. The bot appears to have been running normally during maintenance. The bot was then restarted once labs restarted and old data files were replaced. With the mix of data the bot became confused. --Cyberpower678 (talk • contribs) 13:14, 11 October 2013 (UTC)
- Yes, that is a core principle of defensive programming: exceptional cases happen (remember the last one was only three days ago), and any program of complexity contains bugs. By adding assertions you can abort early and usually prevent data damage, reduce time spent in recovery, and just as importantly recognize a problem early instead of having it propagate through the logic where it usually ends up disguised as an unexplained "glitch".
Just a suggestion, but doing so would make the bot appear more robust and reliable in my book.
Amalthea 15:11, 11 October 2013 (UTC)
- Very well. I will work on a way to improve reliability.—cyberpower ChatOnline 15:14, 11 October 2013 (UTC)
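As an illustration of the kind of assertions meant above, a minimal sketch (the helper and the checked variables are assumptions, not the bot's actual code):

```php
<?php
// Sketch: halt early on corrupt inputs rather than editing with bad data.
function assertSane(bool $condition, string $message): void {
    if (!$condition) {
        error_log("Cyberbot II assertion failed: $message");
        exit(1); // stop the run before any damage is done
    }
}

// Example guards matching the failure modes listed above:
assertSane($dbHandle !== false, 'database access failed');
assertSane(count($blacklistRegexes) > 0, 'parsed blacklist is empty');
assertSane($matchedRule !== null, 'tagging a page with an unknown/missing rule');
assertSane(count($blacklistedLinks) > 0, 'tagging a page with no blacklisted links');
```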
A few questions / issues
- First, can you tell what's up here?[6] It appears that Cyberbot II re-added the blacklist tag despite a {{nobots}} tag. My understanding is that Cyberbot II is supposed to be compliant with the nobots tag. Is there some kind of malfunction or is the nobots tag inapt?
- Second, you have argued recently that the bot is authorized to restore tags (repeatedly, if necessary) after a human editor has removed them. Can you please point to the basis for that claim? I don't see that in the bot approval history, and as far as I know it is beyond BRFA's power to authorize a bot to do that.
- Finally, can you please confirm that Cyberbot II is not making administrative edits itself? A few times recently, in the process of restoring older versions of articles, it appeared that I could not re-add the {{Blacklisted-links}} template because the blacklisted link contained in the template was itself caught on the blacklist. I'm not sure what's going on, but please confirm that you're not circumventing the blacklist in the process of adding the blacklist notice.
Thanks, - Wikidemon (talk) 08:52, 13 October 2013 (UTC)
- It looks like a bug.
- I said maintain the tags.
- It can't do that. No user on Wikipedia can circumvent the blacklist.—cyberpower ChatOffline 10:19, 13 October 2013 (UTC)
- Cheers.—cyberpower ChatOffline 10:19, 13 October 2013 (UTC)
- Thanks for the response, noted. - Wikidemon (talk) 18:13, 13 October 2013 (UTC)
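For reference, the exclusion compliance asked about in the first question is conventionally a pre-edit check on the page text. A simplified sketch (the real {{bots}}/{{nobots}} templates accept more parameter forms than shown here):

```php
<?php
// Sketch: skip a page that opts out of bot edits via {{nobots}} or a
// {{bots|deny=...}} list naming this bot (or "all"). Simplified.
function allowedToEdit(string $wikitext, string $botName): bool {
    if (preg_match('/\{\{\s*nobots\s*\}\}/i', $wikitext)) {
        return false;
    }
    if (preg_match('/\{\{\s*bots\s*\|\s*deny\s*=\s*([^}]*)\}\}/i', $wikitext, $m)) {
        $denied = array_map('trim', explode(',', strtolower($m[1])));
        return !in_array('all', $denied, true)
            && !in_array(strtolower($botName), $denied, true);
    }
    return true; // no exclusion template found
}
```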
False Positive Blacklist links
Hi, your bot appears to be classifying links to del.ico.us as blacklisted:
https://en.wikipedia.org/w/index.php?title=User:Mecredis&diff=576989172&oldid=318325871
Seems like a regex might need a tweak? Fred Benenson (talk) 14:04, 13 October 2013 (UTC)
- icio.us has been blacklisted globally since 2009, but user pages aren't supposed to be tagged? Amalthea 14:28, 13 October 2013 (UTC)
- Oy, see recent contributions: it tags all namespaces now, apparently. Disabled. Amalthea 14:30, 13 October 2013 (UTC)
- Aww... That's a shame. I got excited when I saw it tagging article talkpages, thinking it was sticking the tag there for links in the article, which a few people, including me, have mentioned they think would be better. Turns out it was tagging for links on the talkpage itself - now I'm just disappointed... sniff... Begoon talk 17:38, 13 October 2013 (UTC)
- Apparently, several API bugs somehow managed to slip through the API query checks. It's been returning bad data that PHP can't read. As a result, the query was interpreted to be an empty string. So instead of any content being returned from the exceptions page, "" was received and stored, and interpreted as an exceptionless page. I'll need to add some more filters to Peachy to prevent these bugs from happening again. Peachy already deals with lots of API issues/failures; apparently it completely missed these bugs.—cyberpower ChatOnline 18:26, 13 October 2013 (UTC)
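The missing guard cyberpower describes amounts to something like the following, where fetchPageContent() is an assumed wrapper around the API query and the page name is illustrative:

```php
<?php
// Sketch: never mistake a failed or empty API response for an empty
// exceptions page. fetchPageContent() is an assumed wrapper that
// returns the page text, or null when the response could not be parsed.
$content = fetchPageContent('User:Cyberbot II/exceptions'); // hypothetical page name

if ($content === null || $content === '') {
    // Bad data from the API is indistinguishable from an empty page
    // here, so back off and retry rather than running with no exceptions.
    throw new RuntimeException('unusable API response for exceptions page');
}
```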
A couple questions:
- What happens if a site to which we normally would not link (for instance, Fucked Company) is the subject of an article?
- Does it make sense to stick these templates on pages which are not articles (talk, user, user talk...)? K7L (talk) 15:58, 13 October 2013 (UTC)
- I don't get this question.
- No it does not.—cyberpower ChatOnline 17:56, 13 October 2013 (UTC)
- To answer the first question: We would whitelist the home page of the company to allow usage of that particular link. Amalthea 08:50, 14 October 2013 (UTC)
Format of Blacklisted-links template
Hi Cyberpower678! Some of the {{blacklisted-links}} templates your bot is adding do not display properly, such as those on Jesse Levine and Linyphiidae. I'm not sure if this is an issue with the template or the parameters you're using. Could you please look into this? Thanks! GoingBatty (talk) 16:10, 13 October 2013 (UTC)
- Could you check if this is resolved for you? I made a change to work around an apparent rendering issue in Firefox which non-deterministically had the show/hide link overlap the text (re-laying out the page, e.g. by scaling, made the error disappear every second change). Amalthea 17:15, 13 October 2013 (UTC)
- @Amalthea: - Looks great now in Firefox - thanks! GoingBatty (talk) 17:44, 13 October 2013 (UTC)
Talk pages
Back in August, you said that you'd added Talk space to the bot's exceptions list but Talk:BDSM just got flagged again. – RobinHood70 talk 22:16, 13 October 2013 (UTC)
- I haven't removed it. There's a bug that's preventing the bot from properly reading the exceptions list. There seems to be an issue with the API and the way the bot is communicating with it.—cyberpower ChatOnline 23:25, 13 October 2013 (UTC)
AN/I
Just letting you know, I've asked at AN/I that the blacklist function be shut down again for now. We'll deal with the larger problem separately. Thanks, - Wikidemon (talk) 08:28, 14 October 2013 (UTC)
Stomp
Hi Cyberpower, it looks like you stomped on a bunch of other replies with your edit here, could you please fix that? Edit conflicts, ugh... Zad68 15:26, 14 October 2013 (UTC)
Please don't be discouraged
Hi,
please try not to be discouraged by the current escalation. I (and everyone else) realize how much energy and dedication you put into that task (and all your bot tasks). If you plan to work as a programmer, that's really something you'll have to get used to; I throw away far more code than ever goes productive, simply due to a change in specifications once the result is visible. :) That's not something you should take personally; iteration is often part of the workflow.
The same is true here: any bot's only task is to improve Wikipedia, but what that means is of course open to interpretation and iteration. Since this task got that much community pushback despite BAG approval it is certainly prudent to take a while and reevaluate the assumptions that went into formulating the bot task. Independent of any unintended edits, there are valid and reasoned concerns about whether tagging in article space isn't more damaging than was foreseen when the task was first put to paper. This is not your fault as the bot operator, but you need to be open to discussion and ready to stop and talk about it until consensus is reestablished.
As I've said elsewhere, if you are interested I would be available to work with you on this task (even though my experience with PHP is limited and I hate it with a passion). I'm sure with the experience from the last weeks and the improvements proposed we can get it to a point where everyone is happy with the task, and I believe we can both learn from it and become better programmers.
Cheers, Amalthea 21:42, 14 October 2013 (UTC)
- I'm not down. I'm a little frustrated, yes, and scared that I may be walking into a block, but I'm certainly not giving up on this script. I've managed to get it this far. I'm willing to take it the rest of the way. I'm not usually open to publishing source code, but in this case I will post the script onto GitHub; you can then fork it, make modifications on the spot, and request a merge.—cyberpower ChatOffline 22:07, 14 October 2013 (UTC)
- Nobody is even considering a block -- you should of course not re-enable the task, at this point the discussion is likely going to result in a changed specification of the task.
As I also said though, "not giving up on code" will really not make you a happy programmer or bot operator: a script is not a goal, it's a tool to make the encyclopedia better. You should never give up on improving the project, but the way to do that in this community project is through consensus; if a task is not backed by (implicit or explicit) consensus it needs to be changed or given up. That goes for any work, gnoming, clerking, and most of all article writing: article writing would appear to be the most constructive and straightforward form of improving the encyclopedia we have, but writers' work gets rewritten and deleted all the time, and they have no option but to give up on it if consensus points the other way. Amalthea 22:41, 14 October 2013 (UTC)
- I meant the task in general. Not the specific code. :p If consensus wants me to change it, I will, but so far, consensus for that hasn't been established, but consensus was established in the last discussion to run the bot.—cyberpower ChatOnline 23:09, 14 October 2013 (UTC)
- "[…] so far, consensus for that hasn't been established, but consensus was established in the last discussion to run the bot": I disagree. Consensus on Wikipedia is not a passed law, consensus can and does change, implicitly and explicitly. This is a central part of the collaborative editing we do here.
Here is my summary of the current situation: In light of objective, nameable problems that were not foreseen in the BRFA, the task as specified back then no longer has sufficient support to be run. Consensus there was based on assumptions, and those assumptions are now in question, all in good faith and with reasonable arguments, and until this can be resolved the consensus is void. In addition to that, the bot made too many faulty edits that need to be prevented, no matter the cause (in the end it's solely the responsibility of the bot to avoid errors, and frankly I did not see you doing your utmost to prevent the problems).
As I interpret it, (I'd have to re-read the recent discussions), these two points (robustness & how to tag/list/notify) must be addressed, thoroughly, and then the task should be re-BRFAed. I expect that to take at least a month.
I say this based on all my experience here on Wikipedia. I obviously have trouble convincing you myself or getting my points across, so I would recommend you ask the two mentors you list at the top of your page, whose judgement I assume you do trust, to comment here.
Amalthea 11:17, 15 October 2013 (UTC) - "[…] consensus was established in the last discussion to run the bot" Where? Point to a discussion that has consensus, one person authorising a bot a consensus does not make. Liamdavies (talk) 13:25, 15 October 2013 (UTC)
- The previous ANI discussion had many in favor of the bot resuming its run.—cyberpower ChatOffline 13:36, 15 October 2013 (UTC)
- I don't think so; this discussion has 12 users calling for changes to be made (some have, some have not; the edit warring has not been fixed), with six calling for quick resumption of operation, while the current ANI has almost no support for a speedy resumption. I hardly call that consensus. I hate to be that guy, but I told you almost two months ago that you should get these problems sorted out, and you dismissed my concerns, pretty much saying there was no problem (discussion here). Liamdavies (talk) 15:00, 15 October 2013 (UTC)
This seems like an entirely fixable problem to me. Cyberpower, I'd also like to encourage you to let the process work. It sounds like further tweaks will help this bot be even more effective. Liz Read! Talk! 13:46, 15 October 2013 (UTC)
- If you are referring to the API issue, yes, it is fixable. The problem is with the framework, and I am working on a fix. But Leaky Caldron's comments are totally uncalled for. As a matter of fact, I tend to be able to find and fix problems very quickly. This one is more of a puzzle, since it started completely at random, and keeps happening completely at random, despite no change having been made to the program itself.—cyberpower ChatOffline 13:50, 15 October 2013 (UTC)
- Even supposing that all of his/her criticism is valid, Leaky's comments were over-the-top. It appeared to me that the goal was to shut down the bot not fix it. I wish there was as much effort given to fixing problems as there is to blaming users.
- Bots make mistakes. But they can also be fixed and mistakes can be reverted (as long as their creators aren't MIA). There is no need to personalize these problems. Liz Read! Talk! 15:30, 15 October 2013 (UTC)
- Leaky was totally out of order, as have been a few others regarding this, and they still are in my view. Let's be honest: the problem isn't with the bot's task, it's with the blacklist in general, and no matter how well the bot worked there would still be issues created. The community needs to get far tougher with it, and if something is on the blacklist that shouldn't be on it, then again it's not the bot's fault and the burden should lie with the user disputing the blacklist. Very tempted to take this to an RFC and put this to bed, as it's getting ridiculous; the blacklist is there for a reason and should be enforced. Blethering Scot 16:13, 15 October 2013 (UTC)
Incoming Break - notice to my stalkers
Starting Saturday, I will be taking a month-long break from Wikipedia. My RL stress has gotten too high to continue to be active on Wikipedia. With the recent incident, my mailbox having been driven over by a bus, as well as college, work, and trying to sell a house, there is simply too much on my plate. SPAMbot will remain suspended for the duration of my break. All the other bots will remain active. If there is a bug, please send me an email. I will be more likely to see it, since I am disabling all notifications from Wikipedia during my break. I have also noticed that my judgement may have become flawed from this stress. Several concerns have been raised, and I have fought back. I also may not be judging consensus correctly, which has resulted in frustration on my end and a death spiral for my judgement. I think the best thing I can do to contribute to the community is to take a break and come back when I am no longer stressed. The reason why I am not starting it now is because of open ends that need closing.—cyberpower ChatOnline 14:03, 16 October 2013 (UTC)
- Don't think you were misjudging consensus at all. It's not about the number of people; it's about strength and validity of argument as well. However, I do think you need to take a break and come back stronger, but I would ask: is there anyone you feel you could work in conjunction with, to give a bit of support in fixing the bot? Two heads are definitely better than one, and maybe that's all you need. Anyway, you take care and don't let others get you down. Blethering Scot 16:11, 16 October 2013 (UTC)
- Feel free to send me an email if you want to vent / get some advice. WormTT(talk) 07:22, 17 October 2013 (UTC)
- See, writing code and bots can be a thankless task. If everything works, hardly anyone notices, because it just works. If any of it breaks, lots of people notice, and lots of them have an opinion about how it should be fixed, or why it shouldn't have broken. Of all the great work you do, most of it just chugs away to everyone's benefit with no incident. Nobody has much idea of the work that has gone into creating or maintaining it, or if they do they don't think about it much. Then this, and you feel the world is at your door telling you how to do your "job".
- This I know, because I've lived through it many times myself in that other, "real" life, and it's never fun.
- Of all the comments on this page, I'd be looking at Amalthea's as the most useful, coming from someone who can genuinely help and is obviously willing to.
- Whatever you decide, come back refreshed, and you have my respect for taking the decision to take time to clear your head. Many people do appreciate the work you do. Perhaps we should say so more often. Begoon talk 07:50, 17 October 2013 (UTC)
- Sorry for any part I have played in you making this decision. I wish you all the best with your schooling during your wikibreak and look forward to ironing out the problems when you get back. Aside from all I have said, I do support this bot's operation; I just think its behaviour needs modification. Yet again I apologise for being less than WP:CIVIL about this matter, and hope you accept my apology. Liamdavies (talk) 07:45, 18 October 2013 (UTC)
- Have a good break, Cyberpower, and come back refreshed. Slp1 (talk) 23:41, 26 October 2013 (UTC)
- Thanks for all your bot work, Cyberpower. Enjoy your break and we'll be glad to have you back. Seraphimblade Talk to me 00:55, 27 October 2013 (UTC)
RFPP bot
In case you'd already not seen it, I'd like to ask that you check the "Template editor" section of Wikipedia talk:Requests for page protection. We're going to need to add an extra kind of protection option to RFPP, so I thought you might want your bot to know about it. Thanks for operating it! Nyttend (talk) 16:22, 17 October 2013 (UTC)
- Of course. I've been meaning to request access to that right myself. Do you think you could grant me access to the template editor right?—cyberpower OnlineTrick or Treat 16:29, 17 October 2013 (UTC)
- Done. I was on the phone for a bit, but I've finished checking your contributions to the point that I'm confident in giving you this right. Nyttend (talk) 16:58, 17 October 2013 (UTC)
- Any full protected templates you want to edit right away? Let me know and I'll downgrade them. Mark Arsten (talk) 18:48, 17 October 2013 (UTC)
- Template:Blacklisted-links is my primary concern right now. That's the only template at the moment. Thanks a lot.—cyberpower OfflineTrick or Treat 18:50, 17 October 2013 (UTC)
- Alright, got it. Mark Arsten (talk) 19:11, 17 October 2013 (UTC)
- Thanks.—cyberpower OfflineTrick or Treat 19:12, 17 October 2013 (UTC)
A kitten for you!
Hi Cyberpower, I saw your wikistress level and thought that I would bring a little kitten to help you out since it's proven that kittens help reduce stress.
— dainomite 21:31, 17 October 2013 (UTC)
- Looks like I'll need a wiki-Zyrtec. :p Thanks for the kitten. :-)—cyberpower OnlineTrick or Treat 00:50, 18 October 2013 (UTC)
- Not a problem. I know we haven't interacted very much on WP but if there's anything I can do to help, please don't hesitate to ask. — dainomite 01:59, 18 October 2013 (UTC)
A kitten for you!
I love kittens; they're cute and they take away the stress of real life. Seriously, have a good wiki break and come back feeling refreshed.
Sportsguy17 (click to talk • contributions) 23:35, 17 October 2013 (UTC)
- Two kittens. Wow. Hopefully they're both male. :-)—cyberpower OnlineTrick or Treat 01:00, 18 October 2013 (UTC)
- All kittens are cute. I have a female cat, a Siberian, who is adorable. Enjoy your WikiKittens... you can hug them, squeeze them, and call them George. Enjoy. Sportsguy17 (click to talk • contributions) 01:07, 18 October 2013 (UTC)