User:Risker/Content moderation

From Wikipedia, the free encyclopedia

Methods of content moderation


There are multiple types of content moderation carried out on Wikipedia, described below:

Editing
  • The most common type of content moderation; it can be used by any editor (whether registered or not) to modify the content of a page.
  • Preferred by some editors when the added content is at least partially appropriate.
  • If the added content is entirely inappropriate (e.g., vandalism, test edits, or poorly sourced factual information such as sports scores), the edit is normally reverted instead. See below.
Revert (undo/rollback) of edit
  • See Help:Reverting
  • The second most common type of content moderation; it can be used by any editor (whether registered or not) to return a page to a previous version.
  • Used for vandalism and certain types of content removal.
  • "Undo" is considered the gentlest form of reversion, and allows the reverting editor to include an edit summary explaining the reason for the reversion. It is the most common method of reverting good-faith edits that are not appropriate to the page on which they were made.
  • Various tools such as Twinkle add tabs or options to roll back an edit, usually with an automatic edit summary or no edit summary; this is used primarily for vandalism.
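The "undo" mechanism described above is also exposed through the MediaWiki action API (`action=edit` with the `undo` parameter). The sketch below only builds the request parameters; the page title, revision ID, edit summary, and token value are hypothetical placeholders, and a real request would POST them to the API endpoint with a CSRF token obtained from `action=query&meta=tokens`.

```python
# Minimal sketch: building an "undo" revert request for the MediaWiki
# action API. Only the parameter dictionary is constructed here; a real
# revert would POST it to https://en.wikipedia.org/w/api.php.

def build_undo_params(title, revid, summary, token):
    """Parameters for action=edit with `undo`, which reverts a single
    revision and attaches an explanatory edit summary (the "gentlest"
    form of reversion described above)."""
    return {
        "action": "edit",
        "title": title,       # page to revert (hypothetical example below)
        "undo": revid,        # ID of the revision to undo
        "summary": summary,   # reason shown in the page history
        "token": token,       # CSRF token (placeholder here)
        "format": "json",
    }

params = build_undo_params("Example article", 123456789,
                           "Reverting unexplained content removal",
                           "dummy-csrf-token")
```

The rollback variant (`action=rollback`) instead reverts all consecutive edits by a page's most recent editor and requires the rollback right, which is why tools such as Twinkle reserve it mainly for vandalism.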
Revision deletion
  • See Wikipedia:Revision deletion
  • Deletion of one or a series of edits (or log entries) to remove specific content/material from public view. Sometimes called "hiding" or "changing visibility".
  • Can be carried out only by administrators or persons with Oversight permission
  • Once content has been revision deleted, the content of the selected edits cannot be seen by anyone without administrator/oversight permissions
    • This can create challenges when reviewing the history of an article edit-by-edit, so its use is restricted. See criteria.
  • Revision deletion is reversible by administrators/oversighters
  • Revision deletion can also be used to redact username or action summary information in logs. Its use for this purpose is less common and is more likely to be subject to review or questions.
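Administrators carry out revision deletion through Special:RevisionDelete, which is also exposed as the API module `action=revisiondelete`. The parameter names below follow that module; the target page, revision IDs, reason, and token are hypothetical placeholders, and only the parameter dictionary is built here.

```python
# Sketch of a revision-deletion request for the MediaWiki action API
# (action=revisiondelete). Parameter names follow the real module; the
# target page, revision IDs, and reason are hypothetical placeholders.

def build_revdel_params(target, ids, hide, reason, token):
    """Hide the chosen parts (content, comment and/or user) of the
    listed revisions from public view."""
    return {
        "action": "revisiondelete",
        "type": "revision",            # "logging" would redact log entries instead
        "target": target,              # page the revisions belong to
        "ids": "|".join(str(i) for i in ids),
        "hide": "|".join(hide),        # any of: content, comment, user
        "reason": reason,
        "token": token,                # CSRF token (placeholder here)
        "format": "json",
    }

params = build_revdel_params("Example article", [123456789, 123456790],
                             ["content"], "RD2: grossly insulting material",
                             "dummy-csrf-token")
```

Passing `"type": "logging"` with log entry IDs corresponds to the less common log-redaction use described in the last bullet above.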
Suppression (Oversight)
  • See Wikipedia:Oversight
  • Deletion of one or a series of edits or log entries to remove specific content/material from the view of both the public and administrators. Called "suppression" and "oversight" interchangeably.
  • Can be carried out only by users with oversight permissions.
    • Oversighters are appointed by the Arbitration Committee (Arbcom), and are (a) current Arbcom members, (b) former Arbcom members, or (c) selected community members.
  • Once content is oversighted, the content of the selected edits cannot be seen by anyone (including administrators) except for other oversighters.
  • Use is regulated and restricted. Oversighters are expected to review each other's activities, and act as a self-regulating group.
  • Used mainly for material that is privacy-violating and/or potentially harmful (to a person or their reputation, to the project, to a person or place off-wiki, etc.), or that could make an individual excessively vulnerable (e.g., self-identification by minors, certain mental health issues).
Deletion
  • See Wikipedia:Deletion
  • Removal (deletion) of the entire article or page.
  • Can only be carried out by administrators. Once deleted, the contents of the page can still be viewed only by administrators/oversighters.
  • Deletion is carried out using established criteria; see proposed deletion, speedy deletion, and the various "XFD" pages (articles, files, miscellany, redirects, etc.) for the applicable criteria.
Suppression Deletion
  • No good documentation for this process, although it is briefly discussed at Wikipedia:Oversight
  • Can only be carried out by oversighters. Once suppress-deleted, the contents of the page can only be viewed by oversighters.
    • Oversighters delete the page while also checking the "suppress data" box.
    • Oversighters may occasionally undelete a previously deleted page and then immediately redelete it using the suppression function, as an alternative to suppressing each deleted edit individually. This most frequently occurs when the page has a significant (25+ revisions) history.
  • Used when every revision (or many discrete revisions) of the page qualifies for suppression.
  • Least common method of content moderation.
Page protection

Interaction of Wikipedia user groups and page protection levels (see also: Wikipedia:Protection policy)

  • No protection: all editors edit normally. The vast majority of pages; this is the default protection level.
  • Pending changes: all users can edit, but edits by unregistered or newly registered editors (and any subsequent edits by anyone) are hidden from readers who are not logged in until reviewed by a pending changes reviewer or administrator; logged-in editors see all edits, whether accepted or not. Appropriate for infrequently edited pages with high levels of vandalism, BLP violations, edit-warring, or other disruption from unregistered and new users.
  • Semi-protection: unregistered and newly registered editors cannot edit; confirmed/autoconfirmed editors and above edit normally. Appropriate for pages that have been persistently vandalized by anonymous and registered users, and for some highly visible templates and modules.
  • Extended confirmed protection: only extended confirmed editors and above can edit. Appropriate for specific topic areas authorized by ArbCom, pages where semi-protection has failed, or high-risk templates where template protection would be too restrictive.
  • Template protection: only template editors, administrators, and interface administrators can edit. Appropriate for high-risk or very frequently used templates and modules, and some high-risk pages outside of template space.
  • Full protection: only administrators can edit. Appropriate for pages with persistent disruption from extended confirmed accounts.
  • Interface protection: only interface administrators can edit. Appropriate for scripts, stylesheets, and similar objects central to the operation of the site or that are in other editors' user spaces.

The table assumes a template editor also has extended confirmed privileges, which is almost always the case in practice.
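The interaction above can be approximated as a simple rank check. This is a simplified model only: the real software compares discrete user rights rather than a linear hierarchy, and it adopts the same assumption as the table's footnote (template editors are also extended confirmed).

```python
# Simplified model of the protection table above. Assumption: a strict
# linear hierarchy of groups, which is close to (but not exactly) how
# MediaWiki's per-right checks behave in practice.

RANK = {
    "unregistered": 0,
    "new": 0,
    "confirmed": 1,            # or autoconfirmed
    "extendedconfirmed": 2,
    "templateeditor": 3,       # assumed to also be extended confirmed
    "sysop": 4,                # administrator
    "interface-admin": 5,
}

REQUIRED = {                   # minimum rank needed to edit at each level
    "none": 0,
    "pending-changes": 0,      # anyone may edit; some edits are held for review
    "semi": 1,
    "extended-confirmed": 2,
    "template": 3,
    "full": 4,
    "interface": 5,
}

def can_edit(group, protection):
    """True if a user in `group` can edit a page at `protection` level."""
    return RANK[group] >= REQUIRED[protection]
```

Note that pending changes is the outlier: the model treats it as editable by everyone, because the restriction applies to when edits become visible, not to who may make them.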
Other modes of protection:



Moderating potentially harmful content


The primary methods of identifying problematic content are:

  • Recent changes (RC) patrol - carried out by all types of editors based on an ongoing log of changes to all pages.
    • Focus is primarily on "article space" - in particular, articles - which is where potentially harmful content is most likely to be widely disseminated; these are the pages that show up in Google searches.
    • RC patrollers do not normally review edits to noticeboards or user talk pages, but may review edits by other editors to user pages, or to templates or rarely edited pages in the Wikipedia space (such as policy pages), particularly if the edit is made by a new or unregistered user.
  • New Page Patrol (NPP) - carried out by administrators and editors with a special "new page patroller" permission
    • Main focus is on newly created articles, which are reviewed for quality but will also flag potentially harmful content.
    • A few editors (mostly administrators) will do periodic sweeps of other new pages, particularly those in "user space" or sometimes "draft space", specifically to identify problematic/inappropriate/potentially harmful content.
      • Most common example is minors writing detailed pages about themselves/their families/their friends, etc. These are normally flagged for oversighter review/attention.
  • User contribution logs - carried out by all types of editors on a case-by-case basis when an account is identified to have made a problematic edit. Particularly focuses on new/unregistered accounts, although "experienced" accounts that have been dormant for a long time and suddenly start making problem edits will also be examined. Often these accounts will be brought to the attention of an administrator, who may well block the account.
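The feeds behind all three methods above are exposed by the MediaWiki action API; recent changes patrol, for instance, is backed by `action=query&list=recentchanges`. The sketch below only builds a query for the slice RC patrollers focus on (article-space edits by unregistered users); the parameter names are the real module's, while the limit and namespace values are illustrative.

```python
# Sketch: building a query against the recent-changes feed that RC
# patrollers work from, via the MediaWiki action API.

def build_rc_query(limit=50):
    """List recent edits in article space (namespace 0) by unregistered
    users -- where potentially harmful content is most likely to be
    widely disseminated."""
    return {
        "action": "query",
        "list": "recentchanges",
        "rcnamespace": 0,                 # article space only
        "rcshow": "anon",                 # unregistered editors
        "rctype": "edit",                 # skip log entries and page creations
        "rcprop": "title|user|comment|timestamp|ids",
        "rclimit": limit,                 # illustrative batch size
        "format": "json",
    }

query = build_rc_query(25)
```

A similar query with `list=usercontribs` would support the contribution-log review described in the last bullet.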

In addition:

  • If a non-admin editor identifies material that should be revision-deleted or suppressed, they can (a) revert and privately notify an admin/oversighter or (b) email Oversight (an OTRS queue where oversighters receive requests for suppression and/or revision deletion).
  • If an admin identifies an edit that should be suppressed, they will normally revision-delete it and (a) privately notify an oversighter or (b) email Oversight.


The chart below lists various types of potentially harmful content and how each is addressed on English Wikipedia. Where one method is preferred, it is marked "(preferred)". Examples are given in quotation marks and are all fictional.

Insulting content directed at a third party or article subject
  • Edit/Revert: "John Smith is a jerk"
  • RevDelete: "John Smith jerks off in the school washroom"
  • Suppression: "John Smith rapes babies"
  • Notes: Depending on circumstances, may also result in a warning or block. If the edit qualifies for suppression, the user will probably be blocked, and may be "Oversight-blocked" if they continue to post the suppressed material after a warning/block.

Insulting content directed at an editor
  • Edit/Revert: "You're a dick"
  • RevDelete: "You're a f**king dick who wouldn't know his ass from a hole in the ground"
  • Suppression: "You're a f**king dick who rapes babies in your spare time"
  • Notes: As above.

Non-public information about an article subject, third party, or editor (e.g., phone numbers, addresses, non-public names/dates of birth of offspring, "outing" of editors)
  • Edit/Revert: revert and report to OS or admin
  • RevDelete: revdelete and report to OS
  • Suppression: suppress (preferred)
  • Notes: As above.

Threat of self-harm ("I am going to kill myself")
  • Edit/Revert: revert and report to OS or admin; can also notify emergency@
  • RevDelete: revdelete and report to OS; can also notify emergency@
  • Suppression: suppress and notify emergency@ (preferred)
  • Notes: emergency@ is an email address monitored 24/7 by WMF Trust & Safety staff, who analyse the edit and contact the appropriate emergency services.

Threat of harm to other individuals/locations ("I am going to shoot John Smith", "XYZ School will be bombed on Thursday", "I will be waiting outside your workplace tonight")
  • Edit/Revert: revert and report to OS or admin; can also notify emergency@
  • RevDelete: revdelete and report to OS; can also notify emergency@
  • Suppression: suppress and notify emergency@ (preferred)
  • Notes: T&S will likely attempt to communicate with the editor if the editor is the subject of the threat; other editors close to the threatened editor may also reach out.
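The escalation logic running through the chart and the reporting workflow above can be summarized as a small triage sketch. This is a hypothetical paraphrase for illustration only, not an official procedure: the severity labels are invented here, and real judgments are contextual.

```python
# Hypothetical triage sketch paraphrasing the chart above. The severity
# labels ("mild", "serious", "suppressible", "threat") are invented for
# illustration; actual handling depends on context and judgment.

def triage(severity, is_admin=False):
    """Return the chart's recommended sequence of actions for an edit
    of the given (invented) severity label."""
    if severity == "mild":
        return ["revert"]              # a plain revert or fixing edit suffices
    # Non-admins can at most revert; admins can revision-delete directly.
    actions = ["revdelete" if is_admin else "revert"]
    if severity == "serious" and not is_admin:
        actions.append("report to admin or oversight")
    if severity in ("suppressible", "threat"):
        actions.append("report to oversight")
    if severity == "threat":
        actions.append("notify emergency@")
    return actions
```

For example, a non-admin who spots a threat of harm would revert, report to Oversight, and notify emergency@; an admin would revision-delete first instead of merely reverting.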