
User:Novem Linguae/Essays/Docker tutorial for Windows (WSL)


My notes on how to get a local development environment up and running for MediaWiki development. Written from a Windows and VS Code perspective.

Local development environments are essential for the patch writing process. They allow you to instantly test your changes before submitting your patch. They are also essential for debugging, since they let you step through the issue in a debugger if it's reproducible.

Some pessimistic advice


Expect to spend more time setting up your dev environment than you do coding, until you've got it set up perfectly on all your computers and you've mastered the ins and outs of this work instruction. It can take months to become fluent. MediaWiki has a complicated toolchain.

Windows Subsystem for Linux (WSL)


This is the Windows Subsystem for Linux (WSL) version of this work instruction. The no-WSL version is located at User:Novem Linguae/Essays/Docker tutorial for Windows (no-WSL), but not using WSL means very slow page load times (around 25 seconds). I recommend this WSL version.

Why use WSL?

  • Advantages
    • Without WSL, some pages can take 25 seconds to load (barely usable). With WSL, that drops to around 3 seconds (normal, much better).
  • Disadvantages
    • Can't keep files in Dropbox anymore.
    • More complicated to set up.

Docker


Docker is like a fancy XAMPP. It lets whatever codebase you're working on pick which OS, which versions of PHP/Python/Node, which database, etc. to use, instead of depending on whatever version of XAMPP you happened to install. It then automates the installation of all of that for you.

If you try to use PHP 8.1 with a repo that uses a bunch of PHP 7.4 dependencies, for example, you may not be able to get a dev environment up and running, even if you run composer update instead of composer install. You'll get a bunch of errors. You'd be forced to uninstall XAMPP 8.1 and install XAMPP 7.4, which is a pain. And maybe you need XAMPP 8.1 for another project, so you'd have to do this all over again every time you switch projects. Docker automates all of this.

Install WSL

  • In PowerShell (run as Administrator)...
  • Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux
  • wsl --install -d ubuntu
    • When prompted, enter a username such as novemlinguae
    • When prompted, enter a password
    • When prompted, retype your password
  • wsl --set-version ubuntu 2
  • install Docker Desktop for Windows
  • Docker -> Settings -> General -> tick "Use the WSL 2 based engine"
  • Docker -> Settings -> Resources -> WSL Integration -> tick "Ubuntu"
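To confirm that Docker's WSL integration works, you can run a quick sanity check from inside the Ubuntu shell (this assumes Docker Desktop is already running):

docker --version
docker compose version
docker run --rm hello-world   # downloads and runs a tiny test container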

Install useful software (composer, git, etc.)

  • Update the operating system
    • sudo apt update && sudo apt upgrade -y
  • Make sure Docker is not running. Otherwise the next step will have trouble modifying the running mysql program.
  • Install common dev programs that aren't already installed such as git-review and composer
    • sudo apt install composer git git-review imagemagick mysql-client mysql-server php php-apcu php-cli php-gd php-intl php-mbstring php-mysql php-xml zip php-curl
  • Configure git and git-review
    • git config --global user.name "Novem Linguae"; git config --global user.email "novemlinguae@gmail.com"; git config --global gitreview.remote origin; git config --global gitreview.username novemlinguae;
  • Install Node.js and npm via nvm (for running unit tests and downloading JS packages). If you don't install them now, WSL will fall back to the Windows version of npm and corrupt a bunch of stuff.
    • curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.0/install.sh | bash
    • restart bash
    • nvm install 18 - installs Node version 18, which is what is currently used by Wikimedia
  • In general, any time you run composer, you'll want to use docker compose exec mediawiki composer ..., to avoid the nasty situations that can arise from using the wrong PHP version. Your Ubuntu PHP version may not be the same PHP version that is running in the Docker container.
  • Most console commands in the rest of this tutorial should be run from within WSL (type ubuntu in PowerShell to get an Ubuntu shell) unless otherwise noted.
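A quick way to confirm that everything installed correctly (exact version numbers will vary):

php -v
composer --version
git --version && git review --version
node -v && npm -v
mysql --version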

Eliminate password prompts

  • Get git and git-review to stop asking you for your SSH key passphrase every time (until you close the window):
  • Add to ~/.profile:
    • if test "$PS1"; then
        if [ -z "$SSH_AUTH_SOCK" ]; then
          eval $(ssh-agent -s)
        fi
      fi
      
  • eval `ssh-agent -s`
  • ssh-add /home/novemlinguae/.ssh/id_ed25519
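To confirm that the agent and key are working, test your SSH connection to Gerrit (replace novemlinguae with your Gerrit username). A successful connection prints a welcome banner and then closes:

ssh -p 29418 novemlinguae@gerrit.wikimedia.org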

Install MediaWiki core


Automatically


Consider wiping out your localhost and installing fresh via a reset script once a week, and/or whenever you get unexpected exceptions (a rough sketch of such a script is below). Unexpected exceptions are often caused by alpha versions getting out of sync. For example, maybe you git pull MediaWiki core to this week's version, but you forget to git pull skins/Vector, leaving you on last week's skins/Vector, which is incompatible.
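A minimal sketch of what such a weekly reset might look like, using only commands that appear elsewhere on this page (assumes the default SQLite setup; install.sh regenerates LocalSettings.php, so you'll need to re-add your wfLoadExtension()/wfLoadSkin() lines before the final update step):

cd ~/mediawiki
git checkout master && git pull
for dir in skins/*/ extensions/*/; do ( cd "$dir" && git checkout master && git pull ); done
rm -rf cache/sqlite LocalSettings.php          # wipe the database and config
docker compose exec mediawiki composer update
docker compose exec mediawiki /bin/bash /docker/install.sh
docker compose exec mediawiki chmod -R o+rwx cache/sqlite
# re-add your wfLoadExtension()/wfLoadSkin() lines to LocalSettings.php, then:
docker compose exec mediawiki php maintenance/run.php update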

Manually


Install MediaWiki core (1)

  • Set up your SSH keys in Ubuntu. You can generate new ones, or copy them over from Windows.
    • If you copy them over from Windows, they need to go from the C:\Users\NovemLinguae\.ssh\ directory to the /home/novemlinguae/.ssh/ directory.
    • You also need to make sure to set the private key file's permissions to 0600. chmod 0600 .ssh/id_ed25519
  • ubuntu
  • git clone "ssh://novemlinguae@gerrit.wikimedia.org:29418/mediawiki/core" mediawiki - replace "novemlinguae" with your Gerrit username[1]
  • create a .env file. This is similar to the .env file provided at https://github.com/wikimedia/mediawiki/blob/master/DEVELOPERS.md, with a couple of tweaks to make XDebug work, set the correct UID/GID for Windows, make PHPUnit emit fewer notices, etc.
MW_SCRIPT_PATH=/w
MW_SERVER=http://localhost:8080
MW_DOCKER_PORT=8080
MEDIAWIKI_USER=Admin
MEDIAWIKI_PASSWORD=dockerpass
XDEBUG_ENABLE=true
XDEBUG_CONFIG='mode=debug start_with_request=yes client_host=host.docker.internal client_port=9003 idekey=VSCODE'
XDEBUG_MODE=debug,coverage
XHPROF_ENABLE=true
PHPUNIT_LOGS=0
PHPUNIT_USE_NORMAL_TABLES=1
MW_DOCKER_UID=
MW_DOCKER_GID=

Start Docker

  • ubuntu
  • cd mediawiki
  • docker compose up -d
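To check that the containers came up, and to watch the web server's output:

docker compose ps
docker compose logs -f mediawiki   # Ctrl+C to stop following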

Install MediaWiki core (2)

  • ubuntu
  • follow the official instructions at https://github.com/wikimedia/mediawiki/blob/master/DEVELOPERS.md
    • docker compose exec mediawiki composer update[2]
    • docker compose exec mediawiki /bin/bash /docker/install.sh - does the initial configuration and database creation. Assumes SQLite. If you already have a LocalSettings.php file and want to install MariaDB, see below.
  • VERY IMPORTANT FOR WINDOWS USERS: docker compose exec mediawiki chmod -R o+rwx cache/sqlite
  • npm ci
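At this point the wiki should respond at http://localhost:8080 (log in as Admin / dockerpass, per the .env file above). A quick check from the shell; the /w script path comes from MW_SCRIPT_PATH, and you should get a 200 or a redirect:

curl -I "http://localhost:8080/w/index.php?title=Main_Page"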

Install MediaWiki extensions and skins


Automatically


Manually

  • ubuntu
  • cd extensions or cd skins
  • foreach (skin/extension) (a scripted version of this loop appears after the list):
    • git clone "ssh://novemlinguae@gerrit.wikimedia.org:29418/mediawiki/extensions/PageTriage" - replace "novemlinguae" with your Gerrit username, and replace "PageTriage" with the extension name[1]
    • docker compose exec mediawiki composer update --working-dir "extensions/PageTriage"
    • cd PageTriage (or whatever the name is)
    • npm ci
    • add wfLoadExtension( 'PageTriage' ); , wfLoadSkin( 'Vector' );, or similar to LocalSettings.php
    • create .vscode/settings.json (and populate it with the text in the section below)
  • docker compose exec mediawiki php maintenance/run.php update - does database updates for skins and extensions
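Since the same steps repeat for every extension or skin, a loop like the following can save time. This is only a sketch: the extension names are just examples, replace novemlinguae with your Gerrit username, and you still need to add the wfLoadExtension()/wfLoadSkin() lines and .vscode/settings.json yourself.

cd ~/mediawiki
for ext in PageTriage Echo WikiLove; do
    git clone "ssh://novemlinguae@gerrit.wikimedia.org:29418/mediawiki/extensions/$ext" "extensions/$ext"
    docker compose exec mediawiki composer update --working-dir "extensions/$ext"
    ( cd "extensions/$ext" && npm ci )
done
docker compose exec mediawiki php maintenance/run.php update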

Installing complicated extensions


Adiutor

  • install Echo
  • install BetaFeatures
  • docker compose exec mediawiki php maintenance/run.php Adiutor:updateConfiguration - this will create 7 .json pages onwiki. Check Special:RecentChanges to see them.
  • Special:Preferences -> beta features -> tick "Adiutor"
  • Special:Preferences -> moderation -> tick all
  • Special:AdiutorSettings
  • When I tried this, I was getting a blank page, with a JS error in the console. Did I not update a dependency? Try again someday.

CentralAuth

  • AntiSpoof
    • install mw:Extension:AntiSpoof. mandatory dependency
      • docker compose exec mediawiki php maintenance/run.php AntiSpoof:batchAntiSpoof.php
    • test here if you want, to make sure AntiSpoof is working: http://localhost:8080/
    • add to config: $wgSharedTables[] = 'spoofuser';
  • install mw:Extension:CentralAuth, including the maintenance/run.php update step
  • log into HeidiSQL as root
    • left tree -> scroll to top -> left click "MediaWiki MariaDB" -> new -> database -> centralauth -> OK
    • Tools -> User manager -> my_user -> Add object -> centralauth
    • Tick the check box, granting access to all
    • Save
    • in the centralauth database...
      • CREATE TABLE IF NOT EXISTS `spoofuser` (
          `su_name` varbinary(255) NOT NULL,
          `su_normalized` varbinary(255) DEFAULT NULL,
          `su_legal` tinyint(1) DEFAULT NULL,
          `su_error` blob DEFAULT NULL,
          PRIMARY KEY (`su_name`),
          KEY `su_normname_idx` (`su_normalized`,`su_name`)
        ) ENGINE=InnoDB DEFAULT CHARSET=binary;
        
  • ubuntu
  • docker compose exec mediawiki php maintenance/run.php sql --wikidb centralauth extensions/CentralAuth/schema/mysql/tables-generated.sql - use mysql for mariadb, sqlite for sqlite
  • INSERT INTO global_group_permissions (ggp_group,ggp_permission) VALUES ('steward','globalgrouppermissions'), ('steward','globalgroupmembership');
  • docker compose exec mediawiki php maintenance/run.php CentralAuth:migratePass0.php
  • docker compose exec mediawiki php maintenance/run.php CentralAuth:migratePass1.php
  • Probably need to do a bunch of configuration, as detailed at mw:Extension:CentralAuth#Setup. I'm going to skip that, since all I need at the moment is for Special:GlobalGroupPermissions to work.
  • Turn on Special:GlobalRenameQueue and Special:GlobalVanishRequest
    • Assign your admin account steward rights
    • $wgCentralAuthAutomaticVanishPerformer = 'Admin';
      $wgCentralAuthRejectVanishUserNotification = 'Admin';
      $wgCentralAuthEnableGlobalRenameRequest = true;
      $wgGroupPermissions[ 'steward' ][ 'centralauth-merge' ] = true;
      $wgGroupPermissions[ 'steward' ][ 'globalgrouppermissions' ] = true;
      $wgGroupPermissions[ 'steward' ][ 'globalgroupmembership' ] = true;
      $wgGroupPermissions[ 'steward' ][ 'centralauth-rename' ] = true;
      

DiscussionTools

  • install dependencies
    • Linter
    • Echo
    • VisualEditor
  • install as normal
  • LocalSettings.php
    • $wgLocaltimezone = "America/Los_Angeles";
    • date_default_timezone_set( $wgLocaltimezone );
    • $wgFragmentMode = [ 'html5' ];

FlaggedRevs


FlaggedRevs is packed with features, so it is important to get its settings right so that it behaves the way you expect. It is basically two extensions in one, with two major modes: override and protection.

  • override - requires all pages to go through review before that revision is displayed to logged-out users
  • protection - requires protected pages to go through review before that revision is displayed to logged-out users

enwiki (override = false, protection = true)

// enwiki
// InitializeSettings.php
$wgFlaggedRevsOverride = false;
$wgFlaggedRevsProtection = true;
$wgSimpleFlaggedRevsUI = true;
$wgFlaggedRevsHandleIncludes = 0;
$wgFlaggedRevsAutoReview = 3;
$wgFlaggedRevsLowProfile = true;
// CommonSettings.php
$wgAvailableRights[] = 'autoreview';
$wgAvailableRights[] = 'autoreviewrestore';
$wgAvailableRights[] = 'movestable';
$wgAvailableRights[] = 'review';
$wgAvailableRights[] = 'stablesettings';
$wgAvailableRights[] = 'unreviewedpages';
$wgAvailableRights[] = 'validate';
$wgGrantPermissions['editprotected']['movestable'] = true;
// flaggedrevs.php
wfLoadExtension( 'FlaggedRevs' );
$wgFlaggedRevsAutopromote = false;
$wgHooks['MediaWikiServices'][] = static function () {
	global $wgAddGroups, $wgDBname, $wgDefaultUserOptions,
		$wgFlaggedRevsNamespaces, $wgFlaggedRevsRestrictionLevels,
		$wgFlaggedRevsTags, $wgFlaggedRevsTagsRestrictions,
		$wgGroupPermissions, $wgRemoveGroups;

	$wgFlaggedRevsNamespaces[] = 828; // NS_MODULE
	$wgFlaggedRevsTags = [ 'accuracy' => [ 'levels' => 2 ] ];
	$wgFlaggedRevsTagsRestrictions = [
		'accuracy' => [ 'review' => 1, 'autoreview' => 1 ],
	];
	$wgGroupPermissions['autoconfirmed']['movestable'] = true; // T16166
	$wgGroupPermissions['sysop']['stablesettings'] = false; // -aaron 3/20/10
	$allowSysopsAssignEditor = true;

	$wgFlaggedRevsNamespaces = [ NS_MAIN, NS_PROJECT ];
	# We have only one tag with one level
	$wgFlaggedRevsTags = [ 'status' => [ 'levels' => 1 ] ];
	# Restrict autoconfirmed to flagging semi-protected
	$wgFlaggedRevsTagsRestrictions = [
		'status' => [ 'review' => 1, 'autoreview' => 1 ],
	];
	# Restriction levels for auto-review/review rights
	$wgFlaggedRevsRestrictionLevels = [ 'autoconfirmed' ];
	# Group permissions for autoconfirmed
	$wgGroupPermissions['autoconfirmed']['autoreview'] = true;
	# Group permissions for sysops
	$wgGroupPermissions['sysop']['review'] = true;
	$wgGroupPermissions['sysop']['stablesettings'] = true;
	# Use 'reviewer' group
	$wgAddGroups['sysop'][] = 'reviewer';
	$wgRemoveGroups['sysop'][] = 'reviewer';
	# Remove 'editor' and 'autoreview' (T91934) user groups
	unset( $wgGroupPermissions['editor'], $wgGroupPermissions['autoreview'] );

	# Rights for Bureaucrats (b/c)
	if ( isset( $wgGroupPermissions['reviewer'] ) ) {
		if ( !in_array( 'reviewer', $wgAddGroups['bureaucrat'] ?? [] ) ) {
			// promote to full reviewers
			$wgAddGroups['bureaucrat'][] = 'reviewer';
		}
		if ( !in_array( 'reviewer', $wgRemoveGroups['bureaucrat'] ?? [] ) ) {
			// demote from full reviewers
			$wgRemoveGroups['bureaucrat'][] = 'reviewer';
		}
	}
	# Rights for Sysops
	if ( isset( $wgGroupPermissions['editor'] ) && $allowSysopsAssignEditor ) {
		if ( !in_array( 'editor', $wgAddGroups['sysop'] ) ) {
			// promote to basic reviewer (established editors)
			$wgAddGroups['sysop'][] = 'editor';
		}
		if ( !in_array( 'editor', $wgRemoveGroups['sysop'] ) ) {
			// demote from basic reviewer (established editors)
			$wgRemoveGroups['sysop'][] = 'editor';
		}
	}
	if ( isset( $wgGroupPermissions['autoreview'] ) ) {
		if ( !in_array( 'autoreview', $wgAddGroups['sysop'] ) ) {
			// promote to basic auto-reviewer (semi-trusted users)
			$wgAddGroups['sysop'][] = 'autoreview';
		}
		if ( !in_array( 'autoreview', $wgRemoveGroups['sysop'] ) ) {
			// demote from basic auto-reviewer (semi-trusted users)
			$wgRemoveGroups['sysop'][] = 'autoreview';
		}
	}
};

dewiki (override = true, protection = false)

// dewiki
// InitializeSettings.php
$wgFlaggedRevsOverride = true;
$wgFlaggedRevsProtection = false;
$wgSimpleFlaggedRevsUI = true;
$wgFlaggedRevsHandleIncludes = 2;
$wgFlaggedRevsAutoReview = 3;
$wgFlaggedRevsLowProfile = true;
// CommonSettings.php
$wgAvailableRights[] = 'autoreview';
$wgAvailableRights[] = 'autoreviewrestore';
$wgAvailableRights[] = 'movestable';
$wgAvailableRights[] = 'review';
$wgAvailableRights[] = 'stablesettings';
$wgAvailableRights[] = 'unreviewedpages';
$wgAvailableRights[] = 'validate';
$wgGrantPermissions['editprotected']['movestable'] = true;
// flaggedrevs.php
wfLoadExtension( 'FlaggedRevs' );
$wgFlaggedRevsAutopromote = false;
call_user_func( static function () {
	global $wgDBname,
		$wgFlaggedRevsAutopromote, $wgFlaggedRevsAutoconfirm;

	$wmgStandardAutoPromote = [
		'days'                  => 60, # days since registration
		'edits'                 => 250, # total edit count
		'excludeLastDays'       => 1, # exclude the last X days of edits from below edit counts
		'benchmarks'            => 15, # number of "spread out" edits
		'spacing'               => 3, # number of days between these edits (the "spread")
		'totalContentEdits'     => 300, # edits to pages in $wgContentNamespaces
		'totalCheckedEdits'     => 200, # edits before the stable version of pages
		'uniqueContentPages'    => 14, # unique pages in $wgContentNamespaces edited
		'editComments'          => 50, # number of manual edit summaries used
		'userpageBytes'         => 0, # size of userpage (use 0 to not require a userpage)
		'neverBlocked'          => true, # username was never blocked before?
		'maxRevertedEditRatio'  => 0.03, # max fraction of edits reverted via "rollback"/"undo"
	];

	$wgFlaggedRevsAutopromote = $wmgStandardAutoPromote;
	$wgFlaggedRevsAutopromote['edits'] = 300;
	$wgFlaggedRevsAutopromote['editComments'] = 30;

	$wgFlaggedRevsAutoconfirm = [
		'days'                => 30, # days since registration
		'edits'               => 50, # total edit count
		'spacing'             => 3, # spacing of edit intervals
		'benchmarks'          => 7, # how many edit intervals are needed?
		'excludeLastDays'     => 2, # exclude the last X days of edits from edit counts
		// Either totalContentEdits reqs OR totalCheckedEdits requirements needed
		'totalContentEdits'   => 150, # $wgContentNamespaces edits OR...
		'totalCheckedEdits'   => 50, # ...Edits before the stable version of pages
		'uniqueContentPages'  => 8, # $wgContentNamespaces unique pages edited
		'editComments'        => 20, # how many edit comments used?
		'email'               => false, # user must be emailconfirmed?
		'neverBlocked'        => true, # Can users that were blocked be promoted?
	];
} );
$wgHooks['MediaWikiServices'][] = static function () {
	global $wgAddGroups, $wgDBname, $wgDefaultUserOptions,
		$wgFlaggedRevsNamespaces, $wgFlaggedRevsRestrictionLevels,
		$wgFlaggedRevsTags, $wgFlaggedRevsTagsRestrictions,
		$wgGroupPermissions, $wgRemoveGroups;

	$wgFlaggedRevsNamespaces[] = 828; // NS_MODULE
	$wgFlaggedRevsTags = [ 'accuracy' => [ 'levels' => 2 ] ];
	$wgFlaggedRevsTagsRestrictions = [
		'accuracy' => [ 'review' => 1, 'autoreview' => 1 ],
	];
	$wgGroupPermissions['autoconfirmed']['movestable'] = true; // T16166
	$wgGroupPermissions['sysop']['stablesettings'] = false; // -aaron 3/20/10
	$allowSysopsAssignEditor = true;
	
	$wgFlaggedRevsNamespaces[] = NS_CATEGORY;
	$wgFlaggedRevsTags['accuracy']['levels'] = 1;

	$wgGroupPermissions['sysop']['stablesettings'] = true; // -aaron 3/20/10
	
	# Rights for Bureaucrats (b/c)
	if ( isset( $wgGroupPermissions['reviewer'] ) ) {
		if ( !in_array( 'reviewer', $wgAddGroups['bureaucrat'] ?? [] ) ) {
			// promote to full reviewers
			$wgAddGroups['bureaucrat'][] = 'reviewer';
		}
		if ( !in_array( 'reviewer', $wgRemoveGroups['bureaucrat'] ?? [] ) ) {
			// demote from full reviewers
			$wgRemoveGroups['bureaucrat'][] = 'reviewer';
		}
	}
	# Rights for Sysops
	if ( isset( $wgGroupPermissions['editor'] ) && $allowSysopsAssignEditor ) {
		if ( !in_array( 'editor', $wgAddGroups['sysop'] ) ) {
			// promote to basic reviewer (established editors)
			$wgAddGroups['sysop'][] = 'editor';
		}
		if ( !in_array( 'editor', $wgRemoveGroups['sysop'] ) ) {
			// demote from basic reviewer (established editors)
			$wgRemoveGroups['sysop'][] = 'editor';
		}
	}
	if ( isset( $wgGroupPermissions['autoreview'] ) ) {
		if ( !in_array( 'autoreview', $wgAddGroups['sysop'] ) ) {
			// promote to basic auto-reviewer (semi-trusted users)
			$wgAddGroups['sysop'][] = 'autoreview';
		}
		if ( !in_array( 'autoreview', $wgRemoveGroups['sysop'] ) ) {
			// demote from basic auto-reviewer (semi-trusted users)
			$wgRemoveGroups['sysop'][] = 'autoreview';
		}
	}
};

plwiki (override = true, protection = false)

// plwiki
// InitializeSettings.php
$wgFlaggedRevsOverride = true;
$wgFlaggedRevsProtection = false;
$wgSimpleFlaggedRevsUI = true;
$wgFlaggedRevsHandleIncludes = 2;
$wgFlaggedRevsAutoReview = 3;
$wgFlaggedRevsLowProfile = true;
// CommonSettings.php
$wgAvailableRights[] = 'autoreview';
$wgAvailableRights[] = 'autoreviewrestore';
$wgAvailableRights[] = 'movestable';
$wgAvailableRights[] = 'review';
$wgAvailableRights[] = 'stablesettings';
$wgAvailableRights[] = 'unreviewedpages';
$wgAvailableRights[] = 'validate';
$wgGrantPermissions['editprotected']['movestable'] = true;
// flaggedrevs.php
wfLoadExtension( 'FlaggedRevs' );
$wgFlaggedRevsAutopromote = false;
call_user_func( static function () {
	global $wgDBname,
		$wgFlaggedRevsAutopromote, $wgFlaggedRevsAutoconfirm;

	$wmgStandardAutoPromote = [
		'days'                  => 60, # days since registration
		'edits'                 => 250, # total edit count
		'excludeLastDays'       => 1, # exclude the last X days of edits from below edit counts
		'benchmarks'            => 15, # number of "spread out" edits
		'spacing'               => 3, # number of days between these edits (the "spread")
		'totalContentEdits'     => 300, # edits to pages in $wgContentNamespaces
		'totalCheckedEdits'     => 200, # edits before the stable version of pages
		'uniqueContentPages'    => 14, # unique pages in $wgContentNamespaces edited
		'editComments'          => 50, # number of manual edit summaries used
		'userpageBytes'         => 0, # size of userpage (use 0 to not require a userpage)
		'neverBlocked'          => true, # username was never blocked before?
		'maxRevertedEditRatio'  => 0.03, # max fraction of edits reverted via "rollback"/"undo"
	];

	$wgFlaggedRevsAutopromote = $wmgStandardAutoPromote;
	$wgFlaggedRevsAutopromote['days'] = 90;
	$wgFlaggedRevsAutopromote['edits'] = 500;
	$wgFlaggedRevsAutopromote['spacing'] = 3;
	$wgFlaggedRevsAutopromote['benchmarks'] = 15;
	$wgFlaggedRevsAutopromote['totalContentEdits'] = 500;
	$wgFlaggedRevsAutopromote['uniqueContentPages'] = 10;
	$wgFlaggedRevsAutopromote['editComments'] = 500;
} );
$wgHooks['MediaWikiServices'][] = static function () {
	global $wgAddGroups, $wgDBname, $wgDefaultUserOptions,
		$wgFlaggedRevsNamespaces, $wgFlaggedRevsRestrictionLevels,
		$wgFlaggedRevsTags, $wgFlaggedRevsTagsRestrictions,
		$wgGroupPermissions, $wgRemoveGroups;

	$wgFlaggedRevsNamespaces[] = 828; // NS_MODULE
	$wgFlaggedRevsTags = [ 'accuracy' => [ 'levels' => 2 ] ];
	$wgFlaggedRevsTagsRestrictions = [
		'accuracy' => [ 'review' => 1, 'autoreview' => 1 ],
	];
	$wgGroupPermissions['autoconfirmed']['movestable'] = true; // T16166
	$wgGroupPermissions['sysop']['stablesettings'] = false; // -aaron 3/20/10
	$allowSysopsAssignEditor = true;
	
	$wgFlaggedRevsNamespaces = [ NS_MAIN, NS_TEMPLATE, NS_CATEGORY, NS_HELP, 100, 828 ];
	$wgFlaggedRevsTags['accuracy']['levels'] = 1;
	
	# Rights for Bureaucrats (b/c)
	if ( isset( $wgGroupPermissions['reviewer'] ) ) {
		if ( !in_array( 'reviewer', $wgAddGroups['bureaucrat'] ?? [] ) ) {
			// promote to full reviewers
			$wgAddGroups['bureaucrat'][] = 'reviewer';
		}
		if ( !in_array( 'reviewer', $wgRemoveGroups['bureaucrat'] ?? [] ) ) {
			// demote from full reviewers
			$wgRemoveGroups['bureaucrat'][] = 'reviewer';
		}
	}
	# Rights for Sysops
	if ( isset( $wgGroupPermissions['editor'] ) && $allowSysopsAssignEditor ) {
		if ( !in_array( 'editor', $wgAddGroups['sysop'] ) ) {
			// promote to basic reviewer (established editors)
			$wgAddGroups['sysop'][] = 'editor';
		}
		if ( !in_array( 'editor', $wgRemoveGroups['sysop'] ) ) {
			// demote from basic reviewer (established editors)
			$wgRemoveGroups['sysop'][] = 'editor';
		}
	}
	if ( isset( $wgGroupPermissions['autoreview'] ) ) {
		if ( !in_array( 'autoreview', $wgAddGroups['sysop'] ) ) {
			// promote to basic auto-reviewer (semi-trusted users)
			$wgAddGroups['sysop'][] = 'autoreview';
		}
		if ( !in_array( 'autoreview', $wgRemoveGroups['sysop'] ) ) {
			// demote from basic auto-reviewer (semi-trusted users)
			$wgRemoveGroups['sysop'][] = 'autoreview';
		}
	}
};

ruwiki (override = false, protection = false)


Note: You also need to add yourself to the "editor" group to review pages.

// ruwiki
// InitializeSettings.php
$wgFlaggedRevsOverride = false;
$wgFlaggedRevsProtection = false;
$wgSimpleFlaggedRevsUI = true;
$wgFlaggedRevsHandleIncludes = 0;
$wgFlaggedRevsAutoReview = 3;
$wgFlaggedRevsLowProfile = true;
// CommonSettings.php
$wgAvailableRights[] = 'autoreview';
$wgAvailableRights[] = 'autoreviewrestore';
$wgAvailableRights[] = 'movestable';
$wgAvailableRights[] = 'review';
$wgAvailableRights[] = 'stablesettings';
$wgAvailableRights[] = 'unreviewedpages';
$wgAvailableRights[] = 'validate';
$wgGrantPermissions['editprotected']['movestable'] = true;
// flaggedrevs.php
wfLoadExtension( 'FlaggedRevs' );
$wgFlaggedRevsAutopromote = false;
$wgHooks['MediaWikiServices'][] = static function () {
	global $wgAddGroups, $wgDBname, $wgDefaultUserOptions,
		$wgFlaggedRevsNamespaces, $wgFlaggedRevsRestrictionLevels,
		$wgFlaggedRevsTags, $wgFlaggedRevsTagsRestrictions,
		$wgGroupPermissions, $wgRemoveGroups;

	$wgFlaggedRevsNamespaces[] = 828; // NS_MODULE
	$wgFlaggedRevsTags = [ 'accuracy' => [ 'levels' => 2 ] ];
	$wgFlaggedRevsTagsRestrictions = [
		'accuracy' => [ 'review' => 1, 'autoreview' => 1 ],
	];
	$wgGroupPermissions['autoconfirmed']['movestable'] = true; // T16166
	$wgGroupPermissions['sysop']['stablesettings'] = false; // -aaron 3/20/10
	$allowSysopsAssignEditor = true;
	
	// T39675, T49337
	$wgFlaggedRevsNamespaces = [ NS_MAIN, NS_FILE, NS_TEMPLATE, NS_CATEGORY, 100, 828 ];
	$wgFlaggedRevsTags['accuracy']['levels'] = 1;
	$wgGroupPermissions['sysop']['stablesettings'] = true; // -aaron 3/20/10
	$wgGroupPermissions['sysop']['review'] = false; // T275811
	# Remove reviewer group
	unset( $wgGroupPermissions['reviewer'] );
	
	# Rights for Bureaucrats (b/c)
	if ( isset( $wgGroupPermissions['reviewer'] ) ) {
		if ( !in_array( 'reviewer', $wgAddGroups['bureaucrat'] ?? [] ) ) {
			// promote to full reviewers
			$wgAddGroups['bureaucrat'][] = 'reviewer';
		}
		if ( !in_array( 'reviewer', $wgRemoveGroups['bureaucrat'] ?? [] ) ) {
			// demote from full reviewers
			$wgRemoveGroups['bureaucrat'][] = 'reviewer';
		}
	}
	# Rights for Sysops
	if ( isset( $wgGroupPermissions['editor'] ) && $allowSysopsAssignEditor ) {
		if ( !in_array( 'editor', $wgAddGroups['sysop'] ) ) {
			// promote to basic reviewer (established editors)
			$wgAddGroups['sysop'][] = 'editor';
		}
		if ( !in_array( 'editor', $wgRemoveGroups['sysop'] ) ) {
			// demote from basic reviewer (established editors)
			$wgRemoveGroups['sysop'][] = 'editor';
		}
	}
	if ( isset( $wgGroupPermissions['autoreview'] ) ) {
		if ( !in_array( 'autoreview', $wgAddGroups['sysop'] ) ) {
			// promote to basic auto-reviewer (semi-trusted users)
			$wgAddGroups['sysop'][] = 'autoreview';
		}
		if ( !in_array( 'autoreview', $wgRemoveGroups['sysop'] ) ) {
			// demote from basic auto-reviewer (semi-trusted users)
			$wgRemoveGroups['sysop'][] = 'autoreview';
		}
	}
};

Dogu's configuration

// Dogu's configuration
wfLoadExtension( 'FlaggedRevs' );

# FlaggedRevs configuration
$wgGroupPermissions['reviewer']['review'] = true;
$wgGroupPermissions['reviewer']['validate'] = true;
$wgGroupPermissions['reviewer']['autoreview'] = true;
$wgGroupPermissions['reviewer']['unreviewedpages'] = true;

$wgGroupPermissions['editor']['autoreview'] = true;
$wgGroupPermissions['editor']['unreviewedpages'] = true;

$wgGroupPermissions['autoreview']['autoreview'] = true;

$wgGroupPermissions['sysop']['stablesettings'] = true;
$wgGroupPermissions['sysop']['movestable'] = true;

$wgAvailableRights[] = 'review';
$wgAvailableRights[] = 'validate';
$wgAvailableRights[] = 'autoreview';
$wgAvailableRights[] = 'autoreviewrestore';
$wgAvailableRights[] = 'unreviewedpages';
$wgAvailableRights[] = 'stablesettings';
$wgAvailableRights[] = 'movestable';
$wgFlaggedRevsLowProfile = true;

MassMessage

$wgGroupPermissions['user']['editcontentmodel'] = false;
$wgGroupPermissions['sysop']['editcontentmodel'] = true;
$wgGroupPermissions['massmessage-sender']['massmessage'] = true;

ORES

  • Add this to LocalSettings.php:
$wgPageTriageEnableOresFilters = true;
$wgOresWikiId = 'enwiki';
$wgOresModels = [
	'articlequality' => [ 'enabled' => true, 'namespaces' => [ 0 ], 'cleanParent' => true ],
	'draftquality' => [ 'enabled' => true, 'namespaces' => [ 0 ], 'types' => [ 1 ] ]
];
  • docker compose exec mediawiki php maintenance/run.php ORES:BackfillPageTriageQueue.php
  • ORES fails to fetch data for some of the earlier revisions in the enwiki database. To work around this, you can use the Selenium tests (npm run selenium-test) to create a bunch of articles and revisions in the DB.
  • NOTE: Some MediaWiki documentation mentions the $wgJobRunRate variable, which controls how many jobs get executed. By default this variable is set to 0. Setting it is not necessary, however, since MediaWiki-Docker provides a separate container that periodically runs accumulated jobs.

PageTriage


Config settings:

wfLoadExtension( 'PageTriage' );
	$wgPageTriageDraftNamespaceId = 118;
	$wgExtraNamespaces[ $wgPageTriageDraftNamespaceId ] = 'Draft';
	$wgExtraNamespaces[ $wgPageTriageDraftNamespaceId + 1 ] = 'Draft_talk';
	$wgPageTriageNoIndexUnreviewedNewArticles = true;
	// Special:NewPagesFeed has some code that puts "created by new editor" if they are not autoconfirmed. But autoconfirmed needs to be turned on.
	$wgAutoConfirmCount = 10;
	$wgAutoConfirmAge = 4;
	$wgPageTriageEnableCopyvio = true;
wfLoadExtension( 'ORES' );
	$wgPageTriageEnableOresFilters = true;
	$wgOresWikiId = 'enwiki';
	$wgOresModels = [
		'articlequality' => [ 'enabled' => true, 'namespaces' => [ 0 ], 'cleanParent' => true ],
		'draftquality' => [ 'enabled' => true, 'namespaces' => [ 0 ], 'types' => [ 1 ] ]
	];
wfLoadExtension( 'Echo' );
wfLoadExtension( 'WikiLove' );

ProofreadPage


Scribunto (Modules, Lua)

  • The documentation says there are extra steps, but it works out of the box for me.

SecurePoll

  • install as normal
  • $wgGroupPermissions['electionadmin']['securepoll-create-poll'] = true;
  • $wgGroupPermissions['electionadmin']['securepoll-administrate-poll'] = true;
  • $wgGroupPermissions['electionadmin']['securepoll-view-voter-pii'] = true;
  • $wgSecurePollSingleTransferableVoteEnabled = true;
  • $wgSecurePollUseLogging = true;
  • // $wgSecurePollUseNamespace = true; // commenting out since this is currently broken in localhost
  • make yourself an electionadmin, so you can add yourself as an admin when creating polls
  • Special:SecurePoll/create

SyntaxHighlight

  • Careful when git cloning. The extension is actually named SyntaxHighlight_GeSHi
  • chmod a+x extensions/SyntaxHighlight_GeSHi/pygments/pygmentize

VisualEditor

  • install as normal
  • cd extensions/VisualEditor
  • git submodule update --init - this git clones the lib/ve repo into a subdirectory, so that VisualEditor works on your localhost wiki. Do not edit those files, though; see below.
  • there are two repos:
    • mediawiki/extensions/VisualEditor
    • VisualEditor/VisualEditor - the contents of the lib/ve folder. patches for this repo need to be done completely separately. git clone it into its own folder completely outside of /mediawiki/ when you work on it and submit patches for it.

Wikibase (Wikidata)

  • Wikibase Repository and Wikibase Client have separate pages on mediawiki.org, but they are both located in a single repo named Wikibase.
  • The repo is divided into a few components, contained in folders within the main repo:
    • client
    • lib
    • repo

VS Code


First time

  • ubuntu
  • code . - This opens VS Code inside WSL
  • Go to your list of extensions and filter by Installed. They are installed in Windows but not in WSL yet. You'll need to click the blue "Install in WSL: Ubuntu" button to reinstall most of them.

Window #1 - Open the mediawiki folder in VS Code

  • ubuntu
  • cd mediawiki
  • code . - This opens VS Code inside WSL
  • In the future, this will show up in File -> Open Recent, so you can quickly open it.

Window #2 - Open the extension folder in VS Code

  • If you're working on a MediaWiki extension or skin, open two windows: one for MediaWiki core, and one for the extension you're working on.
    • Run your step debugger in the MediaWiki core window (including setting breakpoints)
    • Do your coding work in the extension window. This will give you "search within repo", git, etc.
  • ubuntu
  • cd mediawiki
  • cd extensions/PageTriage
  • code . - This opens VS Code inside WSL
  • Add this to your extension, in a file called .vscode/settings.json, so that MediaWiki core's libraries get imported and detected by PHP IntelliSense:
{
    "intelephense.environment.includePaths": [
        "../../"
    ]
}

Linters

  • Sniffer Mode - onType
  • JavaScript linting - I use the VS Code extension "ESLint". It works out of the box.
  • PHP linting
    • PHP Sniffer - for detecting sniffs
      • settings -> tick "auto detect"
    • phpcbf - for fixing sniffs
      • to use it, right click -> format document

Other extensions

  • Git Blame
  • GitHub Pull Requests
  • PHP Debug
  • PHP Intelephense
  • Sort lines
  • WSL

Debugging


PHP step debugging: XDebug

  • Always run XDebug from the /mediawiki/ directory, not from an extension directory. According to the documentation, this is mandatory.
  • Make sure VS Code has WSL and PHP Debug extensions installed.
  • Add this to your .env file:
XDEBUG_CONFIG='mode=debug start_with_request=yes client_host=host.docker.internal client_port=9003 idekey=VSCODE' 
XDEBUG_MODE=debug,coverage
  • If you like XDebug's feature of providing big orange errors/warnings/stack traces, include develop in your XDEBUG_MODE.
  • Replace your launch.json with the below. The "hostname": "0.0.0.0" and pathMappings parts are very important for getting XDebug to work inside WSL.
{
	// Use IntelliSense to learn about possible attributes.
	// Hover to view descriptions of existing attributes.
	// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
	"version": "0.2.0",
	"configurations": [
		{
			"name": "Listen for XDebug",
			"type": "php",
			"request": "launch",
			"hostname": "0.0.0.0",
			"port": 9003,
			"pathMappings": {
			  "/var/www/html/w": "${workspaceFolder}"
			}
		},
		{
			"name": "Launch currently open script",
			"type": "php",
			"request": "launch",
			"program": "${file}",
			"cwd": "${fileDirname}",
			"port": 9003
		}
	]
}

JavaScript step debugging: Google Chrome devtools

  • TODO: see if I can get this working in VS Code instead
  • If you're having trouble setting a breakpoint (for example, the code you need is minified by ResourceLoader), add debugger; to your code.
  • If you're having trouble with minification or caching (which can last 15 minutes), add ?debug=1 to the URL

Vue debugging: Vue devtools (browser extension)


Running tests


How to run an extension's tests:

PHPUnit

  • First time:
    • Add this to your .env file
      • to get PHPUnit to stop outputting detailed debug logs (recommended, otherwise your unit test output is really noisy): PHPUNIT_LOGS=0
      • to get PHPUnit to use your actual database instead of a TEMPORARY database, so that you can peek at the tables when you step debug: PHPUNIT_USE_NORMAL_TABLES=1
    • sudo chmod 0775 vendor/bin/phpunit
  • Core
    • docker compose exec mediawiki composer phpunit:entrypoint - all
  • Folder/type
    • docker compose exec mediawiki composer phpunit:unit - tests in the /unit/ subfolder only
    • docker compose exec mediawiki composer phpunit:integration - tests in the /integration/ subfolder only
  • Extensions and skins
    • docker compose exec mediawiki composer phpunit:entrypoint -- extensions/PageTriage/tests/phpunit/ - an extension's tests only
  • Specific file
    • docker compose exec mediawiki composer phpunit:entrypoint -- extensions/PageTriage/tests/phpunit/ApiPageTriageActionTest.php - a specific test file only
  • Specific test
    • docker compose exec mediawiki composer phpunit:entrypoint -- --filter testSubmissionSortingByCreatedDate extensions/PageTriage/tests/phpunit/integration/ApiPageTriageListTest.php
  • @group
    • 🔎(todo)
  • Debugging CI
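For the @group bullet above, PHPUnit's standard --group flag should work through the same entrypoint (a hedged example; Database is a real group used throughout MediaWiki's tests):

docker compose exec mediawiki composer phpunit:entrypoint -- --group Database extensions/PageTriage/tests/phpunit/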

Jest

  • First time - install nvm (node version manager) so you can switch to the correct version of node used by Wikimedia
    • curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.0/install.sh | bash
    • restart bash
    • nvm install 18 - installs Node version 18, which is what is currently used by Wikimedia
  • cd mediawiki/extensions/PageTriage
  • npm test - does linting too
  • npm run test:unit - does tests (for this extension only) and code coverage
  • npm run test:unit --silent:false - shows console.log output, in case you want to spy on a variable
  • npm run test:unit -- ext.pageTriage.defaultTagsOptions.test.js - run a single test file
  • npm run test:unit ext.pageTriage.defaultTagsOptions.test.js -- --coverage=false - if a code coverage report is on by default in your repo, this silences it
  • npm run test:unit -- --updateSnapshot - this will regenerate snapshots for your snapshot tests
  • non-MediaWiki repos: generate HTML coverage reports using npx jest --coverage

QUnit

  • Note that QUnit tests will run in CI even if they are not wired up in npm test or package.json. One of the CI test entry points is Special:JavaScriptTest. In fact, npm test is only supposed to run linters, not tests.
  • how to run the tests
    • in bash
    • via web interface
      • add $wgEnableJavaScriptTest = true; to LocalSettings.php
      • then visit Special:JavaScriptTest

Selenium

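Based on the selenium-test command mentioned in the ORES section above, running the suites looks roughly like this (a sketch; the wiki must be up at the MW_SERVER address from your .env, and an extension only has Selenium tests if its package.json defines a selenium-test script):

cd ~/mediawiki
# if the test runner doesn't pick these up from .env, export them first:
# export MW_SERVER=http://localhost:8080 MW_SCRIPT_PATH=/w MEDIAWIKI_USER=Admin MEDIAWIKI_PASSWORD=dockerpass
npm run selenium-test                 # core's Selenium suite
cd extensions/PageTriage
npm run selenium-test                 # an extension's suite, if it has one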

Parser tests

  • All
    • docker compose exec mediawiki php tests/parser/parserTests.php
  • Specific extension
    • docker compose exec mediawiki php tests/parser/parserTests.php --file=extensions/SyntaxHighlight_GeSHi/tests/parser/parserTests.txt

Code coverage


How to generate code coverage reports:

  • PHPUnit
    • In your .env file, XDEBUG_MODE must include "coverage". Example: XDEBUG_MODE=debug,coverage. Restart your mediawiki docker after changing this.
    • Open the file mediawiki/tests/phpunit/suite.xml. Replace the <coverage></coverage> section with something similar to the following. You need to list every extension file and directory you want checked, and you need to remove all of the MediaWiki core directories.
      • 	<coverage includeUncoveredFiles="true">
        		<include>
        			<directory suffix=".php">../../extensions/FlaggedRevs/api</directory>
        			<directory suffix=".php">../../extensions/FlaggedRevs/backend</directory>
        			<directory suffix=".php">../../extensions/FlaggedRevs/business</directory>
        			<directory suffix=".php">../../extensions/FlaggedRevs/frontend</directory>
        			<directory suffix=".php">../../extensions/FlaggedRevs/maintenance</directory>
        			<directory suffix=".php">../../extensions/FlaggedRevs/rest</directory>
        			<directory suffix=".php">../../extensions/FlaggedRevs/scribunto</directory>
        			<file>../../extensions/FlaggedRevs/FlaggedRevsSetup.php</file>
        		</include>
        	</coverage>
        
    • docker compose exec mediawiki php tests/phpunit/phpunit.php --testsuite extensions --coverage-html extensions/FlaggedRevs/coverage extensions/FlaggedRevs/tests/phpunit

Generating documentation


PHP

  • First time:
    • docker compose exec mediawiki apt update
    • docker compose exec mediawiki apt install doxygen
  • docker compose exec mediawiki php maintenance/run.php mwdocgen
    • For core, this will take around 10 minutes.
  • The configuration file is located at maintenance/Doxyfile in core, Doxyfile (root directory) everywhere else
  • mw:Manual:Mwdocgen.php
  • mw:Manual:Coding conventions/PHP#Comments and documentation
  • Requires extra steps to get it published to doc.wikimedia.org, including getting the permission of the maintainers, adding it to integration/config, and adding it to the homepage of doc.wikimedia.org.

JavaScript

  • npm run doc - Generates documentation in the /docs/js/ folder. Navigate to /docs/js/index.html to view.
  • The configuration file is located at jsdoc.json or .jsdoc.json
  • mw:JSDoc#Configuration example
  • Requires extra steps to get it published to doc.wikimedia.org, including getting the permission of the maintainers, adding it to integration/config, and adding it to the homepage of doc.wikimedia.org.
  • JS documentation does not ride the train; it publishes instantly. doc.wikimedia.org has a 1 hour server-side cache, so it may be up to an hour before changes show up.

Running maintenance scripts

  • core
    • docker compose exec mediawiki php maintenance/run.php showSiteStats will run maintenance/showSiteStats.php
  • extension
    • docker compose exec mediawiki php maintenance/run.php Adiutor:updateConfiguration will run extensions/Adiutor/maintenance/updateConfiguration.php

SQL database

  • how to install the database if you already have a LocalSettings.php file with correct database connection info, and a created database
    • harder than it should be. I've created a ticket. But in the meantime...
    • 🔎go into HeidiSQL, delete all the tables
    • rename your LocalSettings.php file to something else
    • re-run docker compose exec mediawiki php maintenance/run.php install, with all the correct CLI parameters
    • delete LocalSettings.php
    • rename your old LocalSettings.php back to LocalSettings.php
  • how to update the database (installs SQL tables for extensions)
    • docker compose exec mediawiki php maintenance/run.php update
  • how to drop all tables on a MariaDB
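One way to handle the last bullet, using the mysql client installed earlier and the credentials from the HeidiSQL section below (assumes port 3306 is published as shown there; dropping and recreating the database is usually simpler than dropping each table individually):

mysql -h 127.0.0.1 -P 3306 -u root -proot_password -e "DROP DATABASE my_database; CREATE DATABASE my_database;"
# then re-run the installer or update.php as described above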

SQLite or MariaDB?

  • SQLite is the default. Pros and cons:
    • Pro - Keeps your localhost database synced between computers, e.g. desktop and laptop, because the database is just a file in the /cache/ directory of your MediaWiki core folder.
    • Pro - Easily clear the database by simply deleting the /cache/ directory.
    • Pro - Easy to set up a database viewer and editor, since you just need to point it to /cache/sqlite/my_wiki.sqlite
    • Con - Different than Wikimedia production, which uses MariaDB
    • Con - Subtle bugs, such as using raw SQL instead of $this->db->expr(), can break tests and break the extension in general. In Gerrit, check experimental can be used to test for some of this.
    • Con - JSherman (WMF) says he's had problems with DeferredUpdate and job queue behavior in SQLite.
  • MariaDB is an alternative, and it removes an entire class of possible bugs since it is much closer to how Wikimedia production is set up. How to set it up:
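A sketch of one way to set it up: add a MariaDB service via a docker-compose.override.yml in the MediaWiki core folder. The service name, image tag, and credentials below are assumptions, chosen to match the HeidiSQL settings in the next section (careful: this overwrites any existing override file). Point LocalSettings.php or the installer at the host mariadb with those credentials.

cd ~/mediawiki
cat > docker-compose.override.yml <<'EOF'
services:
  mariadb:
    image: mariadb:10.6
    environment:
      MYSQL_ROOT_PASSWORD: root_password
      MYSQL_DATABASE: my_database
      MYSQL_USER: my_username
      MYSQL_PASSWORD: my_password
    ports:
      - 3306:3306
EOF
docker compose up -d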

Viewing and modifying the database: HeidiSQL

  • to view/edit the SQL database, install HeidiSQL (download page)
  • sqlite
    • 🔎point HeidiSQL at mediawiki/cache/sqlite
  • mariadb
    • make sure your docker-compose.override.yml file has the following:
          ports:
            - 3306:3306
    • configure HeidiSQL with the settings in docker-compose.override.yml
      • root
        • hostname = localhost
        • username = root
        • password = root_password
      • or a specific database
        • hostname = localhost
        • username = my_username
        • password = my_password
        • database = my_database
    • I couldn't figure out how to shell into the database, so use HeidiSQL logged in as root for creating databases, editing users, etc.

LocalSettings.php

  • To get uploading working...
    • LocalSettings.php: $wgEnableUploads = true;
    • bash: chmod 0777 images
    • Then visit Special:Upload

Things to do at the start of every session

  • ubuntu
  • Fire up your 2 VS Code windows (1 for MediaWiki Core, 1 for the extension you're working on)
  • Activate XDebug for MediaWiki Core
  • eval `ssh-agent -s`; ssh-add /home/novemlinguae/.ssh/id_ed25519; cd ~/mediawiki; docker compose up -d; nvm use 18
  • GitHub change to Gerrit.ps1

Miscellaneous

  • File sizes
    • MediaWiki + skin + extension files is around 1.1 GB
    • Docker files are around ?? GB
  • how to remote into Docker so that you don't have to add docker compose exec mediawiki to the start of every command, and so that you can cd around more easily
    • docker compose exec mediawiki bash
    • exit
  • how to run an extension's maintenance script
    • docker compose exec mediawiki php extensions/PageTriage/maintenance/DeleteAfcStates.php
  • restarts
    • any changes to the .env file require a restart of the Docker container: docker compose up -d

Troubleshooting

  • PHP errors when loading the wiki in a browser, after taking a break for a couple weeks and then doing git pull on core or one extension
    • Update core, all extensions, and all skins with git pull, docker compose exec mediawiki composer update, and npm ci.
    • Comment out the extensions and skins you're not using in LocalSettings.php, so you have fewer extensions and skins to update.
    • Don't forget to update the Vector skin. This is often forgotten and is often the source of the problem.
  • Container mediawiki-mariadb-1: Error response from daemon: Ports are not available: exposing port TCP 0.0.0.0:3306 -> 0.0.0.0:0: listen tcp 0.0.0.0:3306: bind: Only one usage of each socket address (protocol/network address/port) is normally permitted.
    • Are you also running XAMPP? Close XAMPP, then go into Task Manager and terminate mysqld.exe.
  • error during connect: This error may indicate that the docker daemon is not running.: Get "http://%2F%2F.%2Fpipe%2Fdocker_engine/v1.24/containers/json?all=1&filters=%7B%22label%22%3A%7B%22com.docker.compose.project%3Dmediawiki%22%3Atrue%7D%7D&limit=0": open //./pipe/docker_engine: The system cannot find the file specified.
    • Start Docker Desktop, then try your CLI command again.
  • There's a bunch of files in WSL that end in .dropbox.attrs
    • Delete them from within Ubuntu: find . -name "*.dropbox.attrs" -type f -delete
  • fatal: fsync error on '//wsl.localhost/Ubuntu/home/novemlinguae/mediawiki/extensions/AbuseFilter/.git/objects/pack/tmp_idx_83ZVF3': Bad file descriptor. fatal: fetch-pack: invalid index-pack output
    • Did you git clone in PowerShell instead of WSL by accident? Need to git clone from within WSL.
  • npm ERR! code EUSAGE. The `npm ci` command can only install with an existing package-lock.json or npm-shrinkwrap.json with lockfileVersion >= 1. Run an install with npm@5 or later to generate a package-lock.json file, then try again.
    • Did you npm ci in PowerShell instead of WSL by accident? Need to npm ci from within WSL.
  • sh: 1: phpunit: Permission denied
    • sudo chmod 0775 vendor/bin/phpunit
  • cmd.exe was started with the above path as the current directory. unc paths are not supported
    • You're trying to run npm/Jest in Ubuntu, but npm is not installed in Ubuntu, so it is using the Windows version. The fix is to install the Ubuntu version. See the Unit Test -> Jest section above.
  • sh: 1: eslint: Permission denied
    • Your npm packages are corrupted. Did you install them using npm for Windows instead of npm for Ubuntu by accident? The fix is to install the Ubuntu version. See the Unit Test -> Jest section above. Then npm ci to repair your packages.
  • Error: Class "ResourceLoaderSkinModule" not found
    • Update your skins (git checkout master, git pull, docker compose exec mediawiki composer update)
  • Special:NewPagesFeed / pagetriagelist API query times out
    • Change the filters it is using. The combination of filters you're using is buggy. phab:T356833
  • Notice: Did not find alias for special page 'NewPagesFeed'. Perhaps no aliases are defined for it?
    • Restart your Docker container. docker compose down && docker compose up -d
  • git pull gives a "divergent branches" error
    • git reset --hard origin/master
  • VS Code: Failed to save 'X': Unable to write file 'Y' (NoPermissions (FileSystemError): Error: EACCES: permission denied, open 'Z')
    • Some of your files are owned by "root" instead of "novemlinguae". Fix with...
    • sudo chown -R novemlinguae:novemlinguae ~/mediawiki
  • Windows system gets laggy. Vmmem consumes huge amount of memory in task manager (8-14 GB)
    • WSL has a memory leak
    • exit
    • wsl --shutdown
    • When Docker pops up a window that WSL has crashed, click Restart
  • /bin/bash: warning: setlocale: LC_ALL: cannot change locale (en_US.UTF-8)
    • This repo has an npm pre-commit hook installed, and VS Code's "commit" button doesn't work well with it for some reason. Do git commit -m "Your commit message" in bash instead. It should work there.
  • After I build the assets, I noticed differences in the contents to what you committed. Try running `npm run build` again or removing the node_modules folder and running npm install with the correct node version.
    • This repo has an npm pre-commit hook installed, and you forgot to run npm run build before committing. Run npm run build, then commit again.
  • ssh: Could not resolve hostname gerrit.wikimedia.org: Temporary failure in name resolution
    • Turn off your VPN
  • When loading localhost:8080 in a browser: "This site can’t provide a secure connection. localhost sent an invalid response. Try running Windows Network Diagnostics. ERR_SSL_PROTOCOL_ERROR"

Notes

  1. Do not use git clone https://gerrit.wikimedia.org/r/mediawiki/core.git mediawiki. This will mess up Gerrit / Git Review when submitting patches.
  2. In my case, not running this inside the Docker shell will use XAMPP instead of Docker, and my XAMPP is on PHP 7.4 instead of PHP 8.1, so I will get PHP version errors when trying to run it.