User:Novem Linguae/Essays/Docker tutorial for Windows (WSL)

From Wikipedia, the free encyclopedia

My notes on how to get a local development environment up and running for MediaWiki development. Written from a Windows and VS Code perspective.

Local development environments are essential for the patch-writing process. They allow you to instantly test your changes before submitting your patch. They are also essential for the debugging process, since they let you step debug the issue if it's reproducible.

🔎TODO: I'm currently running things in like 4 different containers (PowerShell, Ubuntu, MediaWiki Docker, Fresh), and switching between the consoles. Should probably get absolutely everything running in 1 container such as Fresh, then rewrite these directions. Would simplify things.

Things to do at the start of every session

  • ubuntu
  • Fire up your 2 VS Code windows (1 for MediaWiki Core, 1 for the extension you're working on)
  • Activate XDebug for MediaWiki Core
  • eval `ssh-agent -s`; ssh-add /home/novemlinguae/.ssh/id_ed25519
  • cd mediawiki; docker compose up -d; nvm use 18
  • GitHub change to Gerrit.ps1
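The per-session checklist above can be collected into one bash function (a sketch for your ~/.bashrc; the key path, the mediawiki directory location, and the Node version are assumptions taken from this page — adjust to your setup):

```shell
# Start-of-session routine: load the SSH key, start the Docker
# containers, and switch to the Node version Wikimedia uses.
start_session() {
  eval "$(ssh-agent -s)"
  ssh-add /home/novemlinguae/.ssh/id_ed25519
  cd ~/mediawiki || return
  docker compose up -d
  nvm use 18
}
```

The VS Code windows and the XDebug listener still need to be started by hand.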

Some pessimistic advice

Expect to spend more time setting up your dev environment than you do coding, until you've got it set up perfectly on all your computers and you've mastered the ins and outs of this work instruction. It can take months to become fluent. MediaWiki has a complicated toolchain.

Windows Subsystem for Linux (WSL)

This is the Windows Subsystem for Linux (WSL) version of this work instruction. The no-WSL version is located at User:Novem Linguae/Essays/Docker tutorial for Windows (no-WSL).

Why use WSL?

  • Advantages
    • Without WSL, some pages can take 25 seconds to load (barely usable). With WSL, load times drop to around 3 seconds (normal, much better).
  • Disadvantages
    • Can't keep files in Dropbox anymore.
    • More complicated to set up.

Docker

Docker is a fancy XAMPP. It lets whatever codebase you're working on pick what OS, what version of PHP/Python/Node, what database, etc. to use instead of depending on whatever version of XAMPP you happened to install. Then it automates the installation of everything for you.

If you try to use PHP 8.1 with a repo that is using a bunch of PHP 7.4 dependencies, for example, you may not be able to get a dev environment up and running, even if you do composer update instead of composer install. You'll get a bunch of errors. You'd be forced to uninstall XAMPP 8.1 and install XAMPP 7.4, which is a pain. Maybe you need XAMPP 8.1 for your other project, so would have to do this all over again when switching projects. Docker automates all this.
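You can see this kind of mismatch directly by comparing the host's PHP with the container's PHP (a sketch; the second command assumes a running MediaWiki-Docker stack):

```shell
# Print the host PHP version next to the container PHP version.
# A mismatch here is exactly why composer should be run inside
# the container rather than on the host.
compare_php_versions() {
  php -v | head -n 1
  docker compose exec mediawiki php -v | head -n 1
}
```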

Install WSL

  • In PowerShell...
  • Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux
  • wsl --install -d ubuntu
    • When prompted, enter a username such as novemlinguae
    • When prompted, enter a password
    • When prompted, retype your password
  • wsl --set-version ubuntu 2
  • install Docker Desktop for Windows
  • Docker -> Settings -> General -> tick "Use the WSL 2 based engine"
  • Docker -> Settings -> Resources -> WSL Integration -> tick "Ubuntu"

Install useful software (composer, git, etc.)

  • Update the operating system
    • sudo apt update && sudo apt upgrade -y
  • Make sure Docker is not running. Otherwise the next step will have trouble modifying MySQL while it is running.
  • Install common dev programs that aren't already installed such as git-review and composer
    • sudo apt install apache2 composer git git-review imagemagick mysql-client mysql-server php php-apcu php-cli php-gd php-intl php-mbstring php-mysql php-xml zip php-curl
  • Configure git and git-review
    • git config --global user.name "Novem Linguae"; git config --global user.email "novemlinguae@gmail.com"; git config --global gitreview.remote origin; git config --global gitreview.username novemlinguae;
  • Install nvm and Node (for running unit tests and downloading JS packages). If you don't install Node in Ubuntu now, npm will run the Windows version instead of the Ubuntu version and corrupt your packages.
    • curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.0/install.sh | bash
    • restart bash
    • nvm install 18 - installs Node version 18, which is what is currently used by Wikimedia
  • In general, anytime you touch anything in composer, you'll want to use docker compose exec mediawiki composer ..., to avoid some nasty situations that can arise when you use the wrong PHP version. Your Ubuntu's PHP version may not be the same PHP version that is running in the Docker container.
  • Most console commands from now on in the rest of the tutorial will be done from within WSL (type ubuntu in PowerShell to access a Ubuntu shell) unless otherwise noted.
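The "always use the container's composer" rule is easy to forget, so it can be baked into a small wrapper function in ~/.bashrc (the function name is my own invention):

```shell
# Forward any composer invocation into the mediawiki container,
# e.g. `mwcomposer update` instead of running composer on the host
# (where the PHP version may differ from the container's).
mwcomposer() {
  docker compose exec mediawiki composer "$@"
}
```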

Eliminate password prompts

  • Get git and git-review to stop asking you for your password until you close the window:
  • Add to ~/.profile:
    • if test "$PS1"; then
        if [ -z "$SSH_AUTH_SOCK" ]; then
          eval $(ssh-agent -s)
        fi
      fi
      
  • eval `ssh-agent -s`
  • ssh-add /home/novemlinguae/.ssh/id_ed25519

Install MediaWiki core (1)

  • Set up your SSH keys in Ubuntu. You can generate new ones, or copy them over from Windows.
    • If you copy them over from Windows, they need to go from the C:\Users\NovemLinguae\.ssh\ directory to the /home/novemlinguae/.ssh/ directory.
    • You also need to make sure to set the private key file's permissions to 0600. chmod 0600 .ssh/id_ed25519
  • ubuntu
  • git clone "ssh://novemlinguae@gerrit.wikimedia.org:29418/mediawiki/core" - replace "novemlinguae" with your Gerrit username[1]
  • create .env file. This is similar to the .env file provided at https://github.com/wikimedia/mediawiki/blob/master/DEVELOPERS.md, with a couple of tweaks to make XDebug work, set the correct UID/GID for Windows, make PHPUnit throw fewer notices, etc.
MW_SCRIPT_PATH=/w
MW_SERVER=http://localhost:8080
MW_DOCKER_PORT=8080
MEDIAWIKI_USER=Admin
MEDIAWIKI_PASSWORD=dockerpass
XDEBUG_ENABLE=true
XDEBUG_CONFIG='mode=debug start_with_request=yes client_host=host.docker.internal client_port=9003 idekey=VSCODE'
XDEBUG_MODE=debug,coverage
XHPROF_ENABLE=true
PHPUNIT_LOGS=0
PHPUNIT_USE_NORMAL_TABLES=1
MW_DOCKER_UID=
MW_DOCKER_GID=
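The two blank MW_DOCKER_UID/MW_DOCKER_GID values should be filled in with your WSL user's IDs, so that files created inside the container are owned by you. A sketch of doing that from the shell (the helper name and the default .env path are assumptions):

```shell
# Append your WSL user's UID and GID to the .env file.
append_docker_ids() {
  # Target .env file; defaults to ./.env in the current directory.
  local env_file="${1:-.env}"
  {
    echo "MW_DOCKER_UID=$(id -u)"
    echo "MW_DOCKER_GID=$(id -g)"
  } >> "$env_file"
}
```

Run it from the mediawiki checkout after creating the rest of the .env file, e.g. `append_docker_ids .env`.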

Start Docker

  • ubuntu
  • cd mediawiki
  • docker compose up -d

Install MediaWiki core (2)

  • ubuntu
  • follow the official instructions at https://github.com/wikimedia/mediawiki/blob/master/DEVELOPERS.md
    • docker compose exec mediawiki composer update[2]
    • docker compose exec mediawiki /bin/bash /docker/install.sh - does initial configuration and database creation. assumes sqlite. if you already have a LocalSettings.php file and want to install mariadb, see below.
  • VERY IMPORTANT FOR WINDOWS USERS: docker compose exec mediawiki chmod -R o+rwx cache/sqlite
  • npm ci
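The install steps above, collected into one function as a sketch (run from the mediawiki checkout; assumes the Docker containers are already up):

```shell
# Post-clone setup: vendor libraries, initial configuration and
# SQLite database creation, the Windows permissions fix, and
# node packages.
install_core() {
  docker compose exec mediawiki composer update
  docker compose exec mediawiki /bin/bash /docker/install.sh
  docker compose exec mediawiki chmod -R o+rwx cache/sqlite
  npm ci
}
```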

Install MediaWiki extensions and skins

Automatically

Place this bash script in your home directory, name it install-extensions.sh, then run it by typing bash install-extensions.sh.

#!/bin/bash
# make sure Docker is running
cd mediawiki || exit
docker compose up -d
#git checkout master
#git pull
#docker compose exec mediawiki composer update
#npm ci
echo "What's the name of the extension? Capitalize it correctly please."
read -r extensionName
cd extensions || exit
git clone "ssh://novemlinguae@gerrit.wikimedia.org:29418/mediawiki/extensions/$extensionName"
docker compose exec mediawiki composer update --working-dir "extensions/$extensionName"
cd "$extensionName" || exit
npm ci
mkdir -p .vscode
printf "{\n\t\"intelephense.environment.includePaths\": [\n\t\t\"../../\"\n\t]\n}\n" > .vscode/settings.json
cd ../..
echo "wfLoadExtension( '$extensionName' );" >> LocalSettings.php
docker compose exec mediawiki composer update
docker compose exec mediawiki php maintenance/run.php update

Manually

  • ubuntu
  • cd extensions or cd skins
  • foreach (skin/extension):
    • git clone "ssh://novemlinguae@gerrit.wikimedia.org:29418/mediawiki/extensions/PageTriage" - replace "novemlinguae" with your Gerrit username, and replace "PageTriage" with the extension name[1]
    • docker compose exec mediawiki composer update --working-dir "extensions/PageTriage"
    • cd PageTriage (or whatever the name is)
    • npm ci
    • add wfLoadExtension( 'PageTriage' );, wfLoadSkin( 'Vector' );, or similar to LocalSettings.php
    • create .vscode/settings.json (and populate it with the text in the section below)
  • docker compose exec mediawiki php maintenance/run.php update - does database updates for skins and extensions

Installing complicated extensions

Adiutor

  • install Echo
  • install BetaFeatures
  • docker compose exec mediawiki php maintenance/run.php Adiutor:updateConfiguration - this will create 7 .json pages onwiki. Check Special:RecentChanges to see them.
  • Special:Preferences -> beta features -> tick "Adiutor"
  • Special:Preferences -> moderation -> tick all
  • Special:AdiutorSettings
  • When I tried this, I was getting a blank page, with a JS error in the console. Did I not update a dependency? Try again someday.

CentralAuth

  • install mw:Extension:AntiSpoof (a mandatory dependency)
    • docker compose exec mediawiki php maintenance/run.php AntiSpoof:batchAntiSpoof.php
  • test here if you want, to make sure AntiSpoof is working: http://localhost:8080/
  • add to config: $wgSharedTables[] = 'spoofuser';
  • install mw:Extension:CentralAuth, skipping the maintenance/run.php update step.
  • log into HeidiSQL as root
    • create database named centralauth
    • Tools -> User manager -> my_user -> Add object -> centralauth
    • Tick the check box, granting access to all
    • Save
    • INSERT INTO global_group_permissions (ggp_group,ggp_permission) VALUES ('steward','globalgrouppermissions'), ('steward','globalgroupmembership');
    • take a backup of the Extension:AntiSpoof table (counter-intuitively named spoofuser), then upload that table to the new centralauth database
  • ubuntu
  • docker compose exec mediawiki php maintenance/run.php sql --wikidb centralauth extensions/CentralAuth/schema/mysql/tables-generated.sql - use mysql for mariadb, sqlite for sqlite
  • docker compose exec mediawiki php maintenance/run.php CentralAuth:migratePass0.php
  • docker compose exec mediawiki php maintenance/run.php CentralAuth:migratePass1.php
  • Probably need to do a bunch of configuration, as detailed at mw:Extension:CentralAuth#Setup. I'm going to skip that, since all I need at the moment is for Special:GlobalGroupPermissions to work.

DiscussionTools

  • install dependencies
    • Linter
    • Echo
    • VisualEditor
  • install as normal
  • LocalSettings.php
    • $wgLocaltimezone = "America/Los_Angeles";
    • date_default_timezone_set( $wgLocaltimezone );
    • $wgFragmentMode = [ 'html5' ];

ORES

  • Add this to LocalSettings.php:
$wgPageTriageEnableOresFilters = true;
$wgOresWikiId = 'enwiki';
$wgOresModels = [
	'articlequality' => [ 'enabled' => true, 'namespaces' => [ 0 ], 'cleanParent' => true ],
	'draftquality' => [ 'enabled' => true, 'namespaces' => [ 0 ], 'types' => [ 1 ] ]
];
  • docker compose exec mediawiki php maintenance/run.php ORES:BackfillPageTriageQueue.php

PageTriage

Config settings:

wfLoadExtension( 'PageTriage' );
	$wgPageTriageDraftNamespaceId = 118;
	$wgExtraNamespaces[ $wgPageTriageDraftNamespaceId ] = 'Draft';
	$wgExtraNamespaces[ $wgPageTriageDraftNamespaceId + 1 ] = 'Draft_talk';
	$wgPageTriageNoIndexUnreviewedNewArticles = true;
	// Special:NewPagesFeed has some code that puts "created by new editor" if they are not autoconfirmed. But autoconfirmed needs to be turned on.
	$wgAutoConfirmCount = 10;
	$wgAutoConfirmAge = 4;
	$wgPageTriageEnableCopyvio = true;
wfLoadExtension( 'ORES' );
	$wgPageTriageEnableOresFilters = true;
	$wgOresWikiId = 'enwiki';
	$wgOresModels = [
		'articlequality' => [ 'enabled' => true, 'namespaces' => [ 0 ], 'cleanParent' => true ],
		'draftquality' => [ 'enabled' => true, 'namespaces' => [ 0 ], 'types' => [ 1 ] ]
	];
wfLoadExtension( 'Echo' );
wfLoadExtension( 'WikiLove' );

ProofreadPage

Scribunto (Modules, Lua)

  • The documentation says there are extra steps, but it works out of the box for me

SyntaxHighlight

  • Careful when git cloning. The extension is actually named SyntaxHighlight_GeSHi
  • chmod a+x extensions/SyntaxHighlight_GeSHi/pygments/pygmentize

VisualEditor

  • install as normal
  • cd extensions/VisualEditor
  • git submodule update --init - this git clones the lib/ve repo into a subdirectory

Wikibase (Wikidata)

  • Wikibase Repository and Wikibase Client have separate pages on MediaWiki wiki, but they are both located in a repo named Wikibase.
  • The repo is divided into a couple different sub-repos, contained in folders in the main repo
    • client
    • lib
    • repo

VS Code

First time

  • ubuntu
  • code . - This opens VS Code inside WSL
  • Go to your list of extensions. Filter by installed. They are installed in Windows but not WSL yet. You'll need to click a blue button ("Install in WSL: Ubuntu") to reinstall most of them.

Window #1 - Open the mediawiki folder in VS Code

  • ubuntu
  • cd mediawiki
  • code . - This opens VS Code inside WSL
  • In the future, this will show up in File -> Open Recent, so you can quickly open it.

Window #2 - Open the extension folder in VS Code

  • If you're working on a MediaWiki extension or skin, open two windows: one for MediaWiki core, and one for the extension you're working on.
    • Run your step debugger in the MediaWiki core window (including setting breakpoints)
    • Do your coding work in the extension window. This will give you "search within repo", git, etc.
  • ubuntu
  • cd mediawiki
  • cd extensions/PageTriage
  • code . - This opens VS Code inside WSL
  • Add this to your extension, in a file called .vscode/settings.json, so that MediaWiki core's libraries get imported and detected by PHP IntelliSense:
{
    "intelephense.environment.includePaths": [
        "../../"
    ]
}

Linters

  • Sniffer Mode - onType
  • JavaScript linting - I use the VS Code extension "ESLint". It works out of the box.
  • PHP linting - Broken, need to fix - I use the VS Code extension "PHP Sniffer & Beautifier". It requires some configuration.
    • Download the .bat files from the official repo. Place them somewhere. Then point to them.
    • Executable Path CS - D:\Dropbox\MV\BitBucket\vendor\bin\phpcs.bat
    • Executable Path CBF - D:\Dropbox\MV\BitBucket\vendor\bin\phpcbf.bat
    • Note to self: Change drive letter to F for home computer, D for laptop.
    • Windows 10 single slash, Windows 11 double slash.
    • I tried like 5 other extensions. None work out of the box like ESLint does.

Debugging

PHP step debugging: XDebug

Always run XDebug from the /mediawiki/ directory, not from an extension directory. According to the documentation, this is mandatory.

Add this to your .env file:

XDEBUG_CONFIG='mode=debug start_with_request=yes client_host=host.docker.internal client_port=9003 idekey=VSCODE' 
XDEBUG_MODE=debug,coverage

Replace your launch.json with this. The "hostname": "0.0.0.0" part is very important for getting XDebug to work inside WSL.

{
	// Use IntelliSense to learn about possible attributes.
	// Hover to view descriptions of existing attributes.
	// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
	"version": "0.2.0",
	"configurations": [
		{
			"name": "Listen for XDebug",
			"type": "php",
			"request": "launch",
			"hostname": "0.0.0.0",
			"port": 9003,
			"pathMappings": {
			  "/var/www/html/w": "${workspaceFolder}"
			}
		},
		{
			"name": "Launch currently open script",
			"type": "php",
			"request": "launch",
			"program": "${file}",
			"cwd": "${fileDirname}",
			"port": 9003
		}
	]
}

JavaScript step debugging: Google Chrome devtools

  • TODO: see if I can get this working in VS Code instead
  • If you're having trouble setting a breakpoint (for example, the code you need is minified by ResourceLoader), add debugger; to your code.
  • If you're having trouble with minification or caching (the cache lasts 15 minutes), add ?debug=1 to the URL

Vue debugging: Vue devtools (browser extension)

Running tests

How to run an extension's tests:

PHPUnit

  • First time:
    • Add this to your .env file
      • to get PHPUnit to stop outputting detailed debugging (recommended, else your unit test output is really noisy): PHPUNIT_LOGS=0
      • to get PHPUnit to use your actual database instead of a TEMPORARY database, so that you can peek at the tables when you step debug: PHPUNIT_USE_NORMAL_TABLES=1
    • sudo chmod 0775 vendor/bin/phpunit
  • Core
    • docker compose exec mediawiki composer phpunit:entrypoint - all
  • Folder/type
    • docker compose exec mediawiki composer phpunit:unit - tests in the /unit/ subfolder only
    • docker compose exec mediawiki composer phpunit:integration - tests in the /integration/ subfolder only
  • Extensions and skins
    • docker compose exec mediawiki composer phpunit:entrypoint -- extensions/PageTriage/tests/phpunit/ - an extension's tests only
  • Specific file
    • docker compose exec mediawiki composer phpunit:entrypoint -- extensions/PageTriage/tests/phpunit/ApiPageTriageActionTest.php - a specific test file only
  • Specific test
    • docker compose exec mediawiki composer phpunit:entrypoint -- --filter testSubmissionSortingByCreatedDate extensions/PageTriage/tests/phpunit/integration/ApiPageTriageListTest.php
  • @group
    • 🔎(todo)
  • Debugging CI
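The long docker compose prefixes above can be wrapped in a helper function (the name and argument layout are my own invention) so that running a suite or a single test is shorter:

```shell
# Run an extension's PHPUnit suite, or a single test within it.
#   mw_phpunit extensions/PageTriage/tests/phpunit/
#   mw_phpunit extensions/PageTriage/tests/phpunit/integration/ApiPageTriageListTest.php testSubmissionSortingByCreatedDate
mw_phpunit() {
  local target=$1
  local filter=$2
  if [ -n "$filter" ]; then
    docker compose exec mediawiki composer phpunit:entrypoint -- --filter "$filter" "$target"
  else
    docker compose exec mediawiki composer phpunit:entrypoint -- "$target"
  fi
}
```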

Jest

  • First time - install nvm (node version manager) so you can switch to the correct version of node used by Wikimedia
    • curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.0/install.sh | bash
    • restart bash
    • nvm install 18 - installs Node version 18, which is what is currently used by Wikimedia
  • cd mediawiki/extensions/PageTriage
  • npm test - does linting too
  • npm run test:unit - does tests (for this extension only) and code coverage
  • npm run test:unit --silent:false - shows console.log output, in case you want to spy on a variable
  • npm run test:unit -- ext.pageTriage.defaultTagsOptions.test.js - run a single test file
  • npm run test:unit ext.pageTriage.defaultTagsOptions.test.js -- --coverage=false - if a code coverage report is on by default in your repo, this silences it
  • npm run test:unit -- --updateSnapshot - this will regenerate snapshots for your snapshot tests
  • non-MediaWiki repos: generate HTML coverage reports using npx jest --coverage

QUnit

Selenium

Parser tests

  • All
    • docker compose exec mediawiki php tests/parser/parserTests.php
  • Specific extension
    • docker compose exec mediawiki php tests/parser/parserTests.php --file=extensions/SyntaxHighlight_GeSHi/tests/parser/parserTests.txt

Code coverage

How to generate code coverage reports:

  • PHPUnit
    • In your .env file, XDEBUG_MODE must include "coverage". Example: XDEBUG_MODE=debug,coverage. Restart your mediawiki docker after changing this.
    • Open the file mediawiki/tests/phpunit/suite.xml. Replace the <coverage></coverage> section with something similar to the following. You need to specify every extension file and directory you want checked, and delete the MediaWiki core directories that are already listed there.
      • 	<coverage includeUncoveredFiles="true">
        		<include>
        			<directory suffix=".php">../../extensions/FlaggedRevs/api</directory>
        			<directory suffix=".php">../../extensions/FlaggedRevs/backend</directory>
        			<directory suffix=".php">../../extensions/FlaggedRevs/business</directory>
        			<directory suffix=".php">../../extensions/FlaggedRevs/frontend</directory>
        			<directory suffix=".php">../../extensions/FlaggedRevs/maintenance</directory>
        			<directory suffix=".php">../../extensions/FlaggedRevs/rest</directory>
        			<directory suffix=".php">../../extensions/FlaggedRevs/scribunto</directory>
        			<file>../../extensions/FlaggedRevs/FlaggedRevsSetup.php</file>
        		</include>
        	</coverage>
        
    • docker compose exec mediawiki php tests/phpunit/phpunit.php --testsuite extensions --coverage-html extensions/FlaggedRevs/coverage extensions/FlaggedRevs/tests/phpunit

Running maintenance scripts

  • core
    • docker compose exec mediawiki php maintenance/run.php showSiteStats will run maintenance/showSiteStats.php
  • extension
    • docker compose exec mediawiki php maintenance/run.php Adiutor:updateConfiguration will run extensions/Adiutor/maintenance/updateConfiguration.php
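Since the prefix is always the same, a one-line wrapper (hypothetical helper name) covers both cases:

```shell
# Forward to maintenance/run.php inside the container, e.g.
#   mwmaint showSiteStats
#   mwmaint Adiutor:updateConfiguration
mwmaint() {
  docker compose exec mediawiki php maintenance/run.php "$@"
}
```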

SQL database

  • how to install the database if you already have a LocalSettings.php file with correct database connection info, and a created database
    • harder than it should be. I've created a ticket. But in the meantime...
    • 🔎go into HeidiSQL, delete all the tables
    • rename your LocalSettings.php file to something else
    • re-run docker compose exec mediawiki php maintenance/run.php install, with all the correct CLI parameters
    • delete LocalSettings.php
    • rename your old LocalSettings.php back to LocalSettings.php
  • how to update the database (installs SQL tables for extensions)
    • docker compose exec mediawiki php maintenance/run.php update
  • how to drop all tables on a MariaDB
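For the last item, one approach (an assumption on my part, not something this page has verified) is to drop and recreate the whole database rather than enumerating tables. The host, credentials, and database name below are the example values from the HeidiSQL section:

```shell
# Drop every table by dropping and recreating the database itself.
# Destroys all wiki data; host/user/password/database are assumptions.
drop_all_tables() {
  mysql -h 127.0.0.1 -u root -proot_password \
    -e "DROP DATABASE my_database; CREATE DATABASE my_database;"
}
```

Afterwards, re-run the install/update maintenance scripts to recreate the schema.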

SQLite or MariaDB?

  • SQLite is the default. Pros and cons:
    • Pro - Keep your localhost database synced between computers, e.g. desktop and laptop, because the database is stored in the docker container in the /cache/ directory.
    • Pro - Easily clear the database by simply deleting the /cache/ directory.
    • Pro - Easy to set up a database viewer and editor, since you just need to point it to /cache/sqlite/my_wiki.sqlite
    • Con - Causes integration tests to fail for certain extensions such as PageTriage, likely due to atomicity issues.
    • Con - Different from Wikimedia production, which uses MariaDB
  • MariaDB is an alternative. How to set it up:

Viewing and modifying the database: HeidiSQL

  • to view/edit the SQL database, install HeidiSQL (download page)
  • sqlite
    • 🔎point HeidiSQL at mediawiki/cache/sqlite
  • mariadb
    • make sure your docker-compose.override.yml file has the following:
          ports:
            - 3306:3306
    • configure HeidiSQL with the settings in docker-compose.override.yml
      • root
        • hostname = localhost
        • username = root
        • password = root_password
      • or a specific database
        • hostname = localhost
        • username = my_username
        • password = my_password
        • database = my_database
    • I couldn't figure out how to shell into the database, so use HeidiSQL logged in as root for creating databases, editing users, etc.

LocalSettings.php

  • To get uploading working...
    • LocalSettings.php: $wgEnableUploads = true;
    • bash: chmod 0777 images
    • Then visit Special:Upload

Miscellaneous

  • File sizes
    • MediaWiki + skin + extension files is around 1.1 GB
    • Docker files are around ?? GB
  • how to remote into Docker so that you don't have to add docker compose exec mediawiki to the start of every command, and so that you can cd around more easily
    • docker compose exec mediawiki bash
    • exit
  • how to run an extension's maintenance script
    • docker compose exec mediawiki php extensions/PageTriage/maintenance/DeleteAfcStates.php
  • restarts
    • any changes to the .env file require a restart of the Docker container: docker compose up -d

Troubleshooting

  • PHP errors when loading the wiki in a browser, after taking a break for a couple weeks and then doing git pull on core or one extension
    • Update core, all extensions, and all skins with git pull, docker compose exec mediawiki composer update, and npm ci.
    • Comment out the extensions and skins you're not using in LocalSettings.php, so you have fewer extensions and skins to update.
    • Don't forget to update Vector. This is often forgotten and is often the source of the problem.
  • Container mediawiki-mariadb-1: Error response from daemon: Ports are not available: exposing port TCP 0.0.0.0:3306 -> 0.0.0.0:0: listen tcp 0.0.0.0:3306: bind: Only one usage of each socket address (protocol/network address/port) is normally permitted.
    • Are you also running XAMPP? Close XAMPP, then go into Task Manager and terminate mysqld.exe.
  • error during connect: This error may indicate that the docker daemon is not running.: Get "http://%2F%2F.%2Fpipe%2Fdocker_engine/v1.24/containers/json?all=1&filters=%7B%22label%22%3A%7B%22com.docker.compose.project%3Dmediawiki%22%3Atrue%7D%7D&limit=0": open //./pipe/docker_engine: The system cannot find the file specified.
    • Start Docker Desktop, then try your CLI command again.
  • There's a bunch of files in WSL that end in .dropbox.attrs
    • Delete them with ubuntu, find . -name "*.dropbox.attrs" -type f -delete
  • fatal: fsync error on '//wsl.localhost/Ubuntu/home/novemlinguae/mediawiki/extensions/AbuseFilter/.git/objects/pack/tmp_idx_83ZVF3': Bad file descriptor. fatal: fetch-pack: invalid index-pack output
    • Did you git clone in PowerShell instead of WSL by accident? Need to git clone from within WSL.
  • npm ERR! code EUSAGE. The `npm ci` command can only install with an existing package-lock.json or npm-shrinkwrap.json with lockfileVersion >= 1. Run an install with npm@5 or later to generate a package-lock.json file, then try again.
    • Did you npm ci in PowerShell instead of WSL by accident? Need to npm ci from within WSL.
  • sh: 1: phpunit: Permission denied
    • sudo chmod 0775 vendor/bin/phpunit
  • cmd.exe was started with the above path as the current directory. unc paths are not supported
    • You're trying to run npm/Jest in Ubuntu, but npm is not installed in Ubuntu, so it is using the Windows version. The fix is to install the Ubuntu version. See the Running tests -> Jest section above.
  • sh: 1: eslint: Permission denied
    • Your npm packages are corrupted. Did you install them using npm for Windows instead of npm for Ubuntu by accident? The fix is to install the Ubuntu version. See the Running tests -> Jest section above. Then npm ci to repair your packages.
  • Error: Class "ResourceLoaderSkinModule" not found
    • Update your skins (git checkout master, git pull, docker compose exec mediawiki composer update)
  • Special:NewPagesFeed / pagetriagelist API query times out
    • Change the filters it is using. The combination of filters you're using is buggy. phab:T356833
  • Notice: Did not find alias for special page 'NewPagesFeed'. Perhaps no aliases are defined for it?
    • Restart your Docker container. docker compose down && docker compose up -d
  • git pull gives a "divergent branches" error
    • git reset --hard origin/master

Notes

  1. ^ a b Do not use git clone https://gerrit.wikimedia.org/r/mediawiki/core.git mediawiki. This will mess up Gerrit / Git Review when submitting patches.
  2. ^ In my case, not running this inside the Docker shell will use XAMPP instead of Docker, and my XAMPP is on PHP 7.4 instead of PHP 8.1, so I will get PHP version errors when trying to run it.