A spam page, naturally, is one created with the sole intention of advertising some business or site. I divide spam pages into three loose categories:
Blatantly promotional spam pages look as though they're a verbatim copy of an "about us" section on a company website. These are relatively easy to spot, as they contain patently unencyclopedic phrases like "our products" or "for more information, visit http://example.com". You can use Google to find them; for example, the query site:en.wikipedia.org "our products" turns up a number of pages that have been edited by spammers.
Exercise 1.1: Try finding a few spam pages for yourself. It shouldn't be that hard.
The second type of spam page is best described as corporate vanity; see this mailing list post for details. Essentially, if an account with a business name as a username creates an article on said business, then everyone knows why they did it. Promotional usernames are against policy. You can easily find the two above types of spam pages through new pages patrol.
The third type of spam page is typically associated with linkspamming campaigns (see below). Here, the page is created for the purpose of housing external links to the spammed site(s). These often have salvageable content and aren't obviously advertising, so they aren't deletable as spam. However, these pages may still be copyvios, have non-notable subjects or suffer from other problems. For some examples, have a look at these article creations, more precisely the contents of the external link section in each: [16][17][18][19][20].
When you find a deletable spam page (i.e. not of the third type), tag it for speedy deletion using {{db-spam}} or (admins only) delete it yourself. Report the creator to the admins or block them indefinitely; they're obviously not here to improve the encyclopedia. For images used exclusively on spam pages, check whether they are copyvios or are missing source/license information. If so, tag the image with {{db-imagevio}}, {{subst:nsd}} or {{subst:nld}} for the respective problems.
There are three things we need to consider in order to determine whether an external link addition is spam:
The "what", or the content of the site. Does the site fall under any of the categories at WP:ELNO (i.e. links to avoid)?
The "how" - how is the link added? Are the links added haphazardly by well-meaning editors or are the additions part of a systematic campaign to insert links to various sites?
The "who" - who is adding the links? Are the links added by established editors or by single purpose accounts? Are the usernames related to the site or its maintainers? Where do the IPs used resolve to?
I will focus on the "how" - identifying systematic campaigns and dealing with them, but the two other aspects will come into play. Firstly, some background reading.
Exercise 2.1: Familiarize yourself with the following policies, guidelines and articles.
It is highly recommended that you have rollback, because spammers will add their links to multiple articles. If you don't, you can request it at Wikipedia:Requests for permissions.
OK, you want to revert some spam. But how do you find it? The most effective way of trawling the wiki is through recent changes, or a filtered version of it. There are two Internet Relay Chat (IRC) channels you can connect to; choose one of them:
It's time to install and set up some software. These channels are somewhat incompatible, which means software built for #en.wikipedia probably won't work on #wikipedia-en-spam and vice versa. For #wikipedia-en-spam, any decent IRC client will do. But do not use Huggle! (Read the tutorial once through and you'll see why.)
Whichever channel you choose, a listing of diff links will dribble in from the tubes. Open the diff in your browser. If it's not a link addition, ignore it. If it is, does the link look like it could be spam? This can usually be determined with a quick glance, by looking at the TLD (see exercise below) or recognizing the reputation of the provider of the service, e.g. bbc.co.uk == British Broadcasting Corporation, nature.com == Nature. (These aren't likely to be spam). If you're not sure, do a quick assessment of the content of the link against WP:ELNO.
Solution: (view source to reveal answer) .com, .org (non-profits are not exempt from our spam rules!), .net, .tv (although this TLD belongs to Tuvalu, it is marketed for television related sites), .info, .biz
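The TLD glance described above can be sketched as a tiny first-pass filter. This is my own illustration, not the tutorial's actual tooling: the TLD list mirrors the solution above, while the function and variable names are made up.

```python
# Rough first-pass heuristic: flag external-link domains whose TLD is
# commonly used by spammers, skipping recognized reputable providers.
SUSPECT_TLDS = {".com", ".org", ".net", ".tv", ".info", ".biz"}
TRUSTED_DOMAINS = {"bbc.co.uk", "nature.com"}  # reputable providers, per the text

def worth_a_look(domain):
    """True if the domain deserves a manual check against WP:ELNO."""
    if domain in TRUSTED_DOMAINS:
        return False
    return any(domain.endswith(tld) for tld in SUSPECT_TLDS)

print(worth_a_look("example.biz"))   # a .biz link merits a closer look
print(worth_a_look("nature.com"))    # recognized provider, skip
```

This is only a triage aid; a match still means "assess against WP:ELNO", never "revert on sight".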
Bring up the user's contributions. Assess these against the following table on the suitability of link additions, which is an excerpt from MediaWiki:Spam-blacklisting.
Since spammers are not welcome here, I initially avoid using any template with the text "Welcome to Wikipedia". {{uw-spam2}} is usually a good start. You can add {{uw-coi}} if you suspect the user has a conflict of interest. If a suspected spammer turns out to be a good faith editor, you can always add a welcome template later.
Include in your warning a tracking URL or {{LinkSummaryLive}}. This allows other spam patrollers to determine when a domain was previously spammed and by whom (it shows up in the linksearch). Personally, I put the URL in the header, but you don't have to. If multiple sites were spammed, you can include multiple URLs or the URL of the company that owns the sites.
If the user ignores your warning and continues to add links, follow up with {{uw-spam4}} and a block or report if necessary. Ignoring warnings makes the edits almost certainly spam, so we move on to the next step.
Now we have to determine whether the spamming campaign is just limited to the user you just caught. There are two angles of attack: one through existing links and the other through the sites themselves. Both need to be examined. You might find it easier if you investigate and prepare the spam report (see Reporting, below) concurrently.
Start by performing linksearches on the spammed domain(s). It's the first item in {{LinkSummaryLive}} or, directly, Special:Linksearch/*.example.com. You should see the tracking URL you placed in your warning - that's why you did it.
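Linksearches can also be driven programmatically through the MediaWiki action API's `exturlusage` list, which backs Special:Linksearch. A minimal sketch, assuming the standard API endpoint; the domain is a placeholder:

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def linksearch_params(domain, limit=50):
    """Parameters for list=exturlusage, the API behind Special:Linksearch."""
    return {
        "action": "query",
        "list": "exturlusage",
        "euquery": "*." + domain,  # the wildcard covers all subdomains
        "eulimit": limit,
        "format": "json",
    }

# Fetch the URL with any HTTP client, e.g. urllib.request.urlopen(url).
url = API + "?" + urlencode(linksearch_params("example.com"))
print(url)
```

Handy when a spammer has hit more domains than you want to paste into Special:Linksearch by hand.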
Now, for each article result, look for the link in the "references" (if there are no inline citations) and/or "external links" sections. If it's there, you can use WikiBlame to find who added it. Whack in the description of the link, enter an appropriate number of revisions (500 will do) and choose interpolated search for the fastest results. Bear in mind it doesn't work all of the time - there's always the page history. If you have access to IRC, go to a channel where COIBot is active. COIBot is an IRC interface to a database of link additions (the same database that is used to create the LinkReports). A good start is #wikipedia-spam-t on freenode. People who hang around there can help you with sourcing the link-addition database.
Exercise 2.4: Who added a link to thejhelum.com to the article Jhelum River? Use WikiBlame.
Solution: (view source to reveal answer) 202.61.63.10 (talk·contribs) did, see [21]. I hope you didn't cheat and look at the page history because I've already removed the link. And besides, directly using the page history is usually much slower.
For each editor who added the links, check their contributions and revert any other spamming. You should warn all IPs and users whose contributions are almost exclusively spam, no matter when they spammed. Check the edits for any other domains spammed and for any bad-faith behaviour, including (but not limited to) block evasion, sock puppetry and the removal of warnings.
Note all spammers in your report under an appropriate heading using {{IPSummary}} or {{UserSummary}}, whichever is appropriate along with any bad-faith behaviour. Also include the sites spammed in a separate section with {{LinkSummaryLive}}.
The linksearch also helps us look for previous incidents through tracking URLs placed by other spam patrollers. By "previous incident" I mean prior WT:WPSPAM reports, blacklistings, deletion of spam pages through AFD or MFD and any other project-space discussion. Here is an egregious example (now blacklisted globally). List these in their own heading in the report.
You might also find the occasional spam page, especially on the user pages of registered spammers. You know how to deal with these. If you're an admin, have a look at the deleted contributions of any registered spammers to find previously deleted spam pages. List spam pages in the report with {{la}} (articles), {{li}} (uploaded images) or {{lu}} (user pages).
It's time to get your hands dirty and visit the sites in question. Every site is different, hence I can only give you a few pointers on what to look for. I suggest you keep your wits about you and your adblocker enabled - you never know what these sites will serve up until it's too late...
Are the spammed sites owned by the same company or person? This is usually obvious from the content of the site, but you can look at WHOIS records to confirm this.
Exercise: Are dailygujrat.com and geokashmir.com owned by the same company? Hint: The Internet Archive may be handy here.
Solution: (view source to reveal answer) Yes, they are. Both sites are "project[s] of JhelumSoft", as stated at the bottom of each page. jhelumsoft.net is offline, so this is where the Internet Archive comes in handy.
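The WHOIS check can be scripted too. The sketch below is mine, not part of this guide: a bare-bones WHOIS client (the protocol is just the query plus CRLF over TCP port 43) and an offline helper that compares registrant organizations from two WHOIS responses. The server name is an assumption (it handles .com/.net); pick the registry appropriate to the TLD, and note that the canned responses below are entirely hypothetical.

```python
import re
import socket

def whois_lookup(domain, server="whois.verisign-grs.com", timeout=10):
    """Raw WHOIS query: send the domain plus CRLF over TCP port 43."""
    with socket.create_connection((server, 43), timeout=timeout) as s:
        s.sendall(domain.encode() + b"\r\n")
        chunks = []
        while True:
            chunk = s.recv(4096)
            if not chunk:
                break
            chunks.append(chunk)
    return b"".join(chunks).decode(errors="replace")

def registrant_org(whois_text):
    """Extract the 'Registrant Organization' field, if present."""
    m = re.search(r"Registrant Organization:\s*(.+)", whois_text)
    return m.group(1).strip() if m else None

# Offline illustration with canned (hypothetical) responses:
a = "Domain Name: DAILYGUJRAT.COM\nRegistrant Organization: JhelumSoft\n"
b = "Domain Name: GEOKASHMIR.COM\nRegistrant Organization: JhelumSoft\n"
print(registrant_org(a) == registrant_org(b))  # same owner -> True
```

Many registries redact registrant fields nowadays, so treat a missing field as "inconclusive", not "unrelated".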
Determine a site's Adsense and Google Analytics IDs. Open the page source and look for a string starting with "pub-" followed by a long run of digits (Adsense), and a string starting with "UA-" followed by digits before the final dash (GA). These are generally specific to the owner.
Exercise: What is the Adsense ID of dailypunjab.com?
Solution: (view source to reveal answer) pub-0371265814726923
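Hunting for these IDs in page source is easy to script. A sketch, where the regexes are my own approximation of the ID formats described above and the sample markup is invented:

```python
import re

ADSENSE_RE = re.compile(r"\bpub-\d{10,16}\b")
ANALYTICS_RE = re.compile(r"\bUA-\d+(?=-\d)")  # ID portion before the dash suffix

def extract_ids(html):
    """Pull Adsense and Google Analytics IDs out of raw page source."""
    return {
        "adsense": sorted(set(ADSENSE_RE.findall(html))),
        "analytics": sorted(set(ANALYTICS_RE.findall(html))),
    }

sample = ('<script>google_ad_client = "pub-0371265814726923";</script>'
          "<script>ga('create', 'UA-1234567-2', 'auto');</script>")
print(extract_ids(sample))
# Matching IDs across two sites suggests (but does not prove) a common owner.
```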
Do the sites look the same?
Poke around the sites for any related domains. Publishers sometimes list their projects on each page for SEO purposes, or have a page detailing their other projects. These should be listed with tracking URLs in the spam report under their own heading. The URL info tool is handy for finding related domains in a link farm. You may want to invest some time in getting a text editor that allows fuzzy find and replace (e.g. by regex) - it'll save you a lot of time when formatting long lists of related domains.
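As an illustration of the regex approach: a few lines of scripting turn a raw list of harvested URLs into template-wrapped lines ready to paste into a report. The input domains here are made up.

```python
import re

raw = """www.example-books.co.uk
http://example-text.co.uk/index.html
example-media.co.uk"""

def template_line(url):
    """Strip scheme, leading 'www.' and any path, then wrap in the template."""
    domain = re.sub(r"^(?:https?://)?(?:www\.)?([^/\s]+).*$", r"\1", url.strip())
    return "*{{LinkSummaryLive|%s}}" % domain

for line in raw.splitlines():
    print(template_line(line))
```

The same substitution works in any regex-capable editor; the script form just makes it repeatable.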
Exercise: List all domains related to (www.)webooks.co.uk.
You didn't gather all that information above for nothing - now it's time to tell people about it. Post a new section on Wikipedia talk:WikiProject Spam, containing the following things (in handy checklist format):
[ ] Previous incidents
[ ] Spam pages
[ ] Sites spammed
[ ] Adsense ID, if appropriate
[ ] Tracking URLs for the sites spammed, if not considered for blacklisting
[ ] Related domains
[ ] Tracking URLs for the related domains, if not considered for blacklisting
[ ] Spammers
[ ] Evidence of bad faith behavior, if appropriate
[ ] Course of action, see "blacklisting" below
Domains should be listed with {{LinkSummaryLive}}, registered users with {{UserSummary}} and IPs with {{IPSummary}}. At the very least, you should list sites spammed and who spammed them. For examples of completed reports, see the archives of WT:WPSPAM.
Now, if spammers have engaged in block evasion or are otherwise uncontrollable, proceed to the blacklisting phase.
Wikipedia maintains two spam blacklists - one on meta and one locally. MediaWiki prevents the addition of new URLs that match either blacklist and are not on the spam whitelist. The meta blacklist affects all Wikimedia projects, all of Wikia and some other sites, while the local blacklist affects the English Wikipedia only. The blacklists are editable only by meta and local administrators, respectively. There is also User:XLinkBot, which is the ClueBot equivalent for spam.
There are five things to examine when considering a site for blacklisting:
Does the content of the site have any use to Wikimedia projects? Sites with no useful content, e.g. gambling sites, can be globally blacklisted despite the spamming being restricted to one project. Compare the website against WP:ELNO. If a site has useful content and other good-faith editors agree (say, by using it for references) consider using XLinkBot.
What is the scale of the spamming? Widespread spamming can lead to immediate blacklisting.
Whether the spamming is spread across multiple Wikimedia projects. You can use the cross-wiki linksearch, which searches up to 57 Wikipedias, and/or the cross-wiki contributions tool as a preliminary check. If there are hits, check the articles to see who added the link and whether they are spammers. If there are any additional spammers, the domain(s) are eligible (and recommended) for global blacklisting. List any additional spammers at WT:WPSPAM with an annotation that they have spammed non-English wikis. If the spam is extremely widespread, you can ask me to run a search of all Wikimedia projects.
Whether the site or related sites have been spammed before. The list of previous incidents you compiled earlier should come in handy.
Whether the site attempts to install malware (these should be blacklisted globally).
To file a blacklisting request, list the domains that were spammed at MediaWiki talk:Spam-blacklist or m:Talk:Spam blacklist under the appropriate section, along with a link to the WikiProject Spam report. If you are seeking global blacklisting, add the cross-wiki spammers to your request. A template, {{WPSPAM}} (exists locally only), allows you to add permanent links to such reports. An administrator will come along and process your request.
Unfortunately, blacklisting doesn't stop all spammers. I won't go into this in detail for a fairly good reason, but it's good to know what to do when these cases crop up.
Firstly, you need to be able to find spam that slips through the cracks. Identify a unique phrase or string (e.g. a company name) that is used almost exclusively by the spammer, then search for it. Wikipedia's own search engine is barely adequate for the purpose, especially in the case of cross-wiki spam. If you must use it, set it to search the article and user namespaces. Otherwise, just use Google. site:wikipedia.org X is a good starting point, where X is the unique string you identified. If multiple projects have been spammed, you can use OR to add projects to the search, e.g. site:wikipedia.org OR site:wiktionary.org X. These Google searches are automatically cross-wiki.
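Assembling those search queries is mechanical enough to script when you track many spammers. A tiny sketch of my own; the function name is invented, and the phrase comes from the earlier exercises:

```python
def crosswiki_query(phrase, projects=("wikipedia.org", "wiktionary.org")):
    """Build a Google query covering several Wikimedia projects at once."""
    sites = " OR ".join("site:" + p for p in projects)
    return '%s "%s"' % (sites, phrase)

print(crosswiki_query("JhelumSoft"))
# site:wikipedia.org OR site:wiktionary.org "JhelumSoft"
```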
Block any new sockpuppets and blacklist any new sites promptly.
If there are large amounts of registered spammers, consider requesting checkuser.
Don't be afraid to ask for or make large and long rangeblocks. In the case of a company spamming, are the IPs registered to the company? Use a WHOIS, reverse DNS or a traceroute to find out. If they are, you can block for longer durations because they are likely static IPs.
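Working out which rangeblock to request is a small exercise in binary arithmetic: find the longest prefix all the offending IPs share. A sketch using the standard library; the IPs are illustrative, and bear in mind MediaWiki typically caps IPv4 rangeblocks at /16.

```python
import ipaddress

def covering_cidr(ips):
    """Smallest single CIDR block containing all the given IPv4 addresses."""
    addrs = [int(ipaddress.IPv4Address(ip)) for ip in ips]
    lo, hi = min(addrs), max(addrs)
    prefix = 32
    # Shorten the prefix until lo and hi fall in the same block.
    while (lo >> (32 - prefix)) != (hi >> (32 - prefix)):
        prefix -= 1
    network = (lo >> (32 - prefix)) << (32 - prefix)  # zero the host bits
    return ipaddress.ip_network((network, prefix))

print(covering_cidr(["202.61.63.10", "202.61.63.200"]))  # 202.61.63.0/24
```

A huge computed range (say, wider than /16) is a sign the IPs aren't really related, so request individual blocks instead.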
Is the spam focused on a couple of pages? Try getting them protected.
Call in the wider community. This approach works best when you have a very widespread problem (hundreds of spam link additions).
How you deal with complaints from suspected spammers about their links being removed or page being deleted is dependent on your personality and your current mood. The only way to find a "strategy" that works for you is through experience, though I can give you a few tips. Successfully dealing with suspected spammers involves using many of these techniques.
State specifically and succinctly what's wrong with the user's edits. (This takes a bit of practice.) Refer to your notes (if applicable) about bad-faith activities. Tell them that systematic additions of external links look like spamming and, if they are a good-faith editor, point them to our tips on not setting off the spam radar. You can also comment on the content of the sites, in which case state specifically which points of WP:ELNO the spammed site(s) fail.
Some complaints are abusive. It's best if you ignore these, unless they contain legal threats or severe personal attacks in which case you should get the user blocked.
Be prepared to be blunt. Some users are utterly clueless. It can take several goes to get the point across that they can contribute to the encyclopedia without adding links to a particular website. I've personally contemplated using big colorful blinking text many times, but they've always got it before I rolled this technique out.
Ask the hard questions, e.g. why this particular (group of) site(s)? Example application (note the spammer hasn't been seen since).
You might want to inform the user about the consequences of spamming Wikipedia and that it can backfire spectacularly:
Our global blacklist affects all Wikimedia projects, all of Wikia and hundreds of third-party websites that use our blacklists for spam filtering. We generally do not remove sites from the blacklist at the requests of their owners, but only when high-volume editors can demonstrate a valuable encyclopedic use for the site. Therefore it is extremely difficult to get sites delisted.
Some search engines treat Wikimedia's blacklists as user-submitted spam reports, which may result in spammed sites being penalized or delisted from search results.
Our records of the spamming may feature prominently in search results for the same reason spammers add links here - high PageRank. This is especially true for persistent spammers. In particular, Wikipedia talk:WikiProject Spam has a PageRank of 6.
Spamming may result in negative PR for your client or employer.
If you deleted someone's spam page and they complain about it, don't waffle on about notability. Explain simply why their page is not permissible in an encyclopedia and appeal to their common sense (not their knowledge of Wikipedia policy), e.g. "Do Encyclopedia Britannica or World Book contain promotional blurbs about companies, written by those companies?"
Spammers often go through a grieving process. It's your job to get them through this as fast as possible.
Tell them it's not worth the effort - it took you several hours to add those links, it took me a minute to remove them.
And that's it. Wikipedia is not a restaurant full of bloody Vikings, so kick them out.