Backing up the wiki

Any confirmed user on this wiki can create an XML, JSON or image backup with Special:DataDump.

The JSON file contains basic site information, such as a list of the extensions in use. The XML dump contains page content and history, but not user accounts.

WikiTeam's dumpgenerator.py
Another method is to use WikiTeam's dumpgenerator.py script (which requires Python 2) from the command line.

Example usage (this will produce a JSON file, an XML dump with page histories, and a folder of files):
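A sketch of the invocation, assuming the wiki's api.php endpoint is at the URL shown (substitute your own wiki's address):

```shell
# --xml requests a full-history XML dump; --images also downloads the file folder
python dumpgenerator.py --api=https://wiki.example.org/w/api.php --xml --images
```

If the download is interrupted, it can usually be continued by re-running the script with --resume and --path pointing at the existing dump directory.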

However, large wikis may fail to export, leaving an incomplete XML dump. The presence of a siteinfo.json file probably indicates a successful XML dump.

Full instructions are at the WikiTeam tutorial.

Restoring from backup
See MediaWiki.org for more detailed instructions (specifically Manual:Importing XML dumps and Manual:importImages.php).

After installing MediaWiki and the extensions, use importDump.php from the shell to import the XML; this can take a long time. For example, from the MediaWiki folder:
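A minimal sketch of the dry-run import, assuming the dump file is named dump.xml (substitute your own file name):

```shell
# --dry-run parses the dump without writing to the database,
# which is a cheap way to check the file before a real import
php maintenance/importDump.php --dry-run dump.xml
```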

If that works, repeat the command without --dry-run. It does not matter whether the XML dump file is compressed (has a .gz or .bz2 extension); importDump.php reads compressed dumps directly.

Due to bug T206683 it may be necessary to also include  in the command.

Afterwards, use importImages.php to import the images.
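A minimal sketch, assuming the image backup has been unpacked into a local folder (the path below is illustrative):

```shell
# importImages.php takes the directory containing the files to import
php maintenance/importImages.php /path/to/images
```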

Afterwards, run  in order to update the content of Special:RecentChanges.
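The script name is elided in this document; assuming the standard MediaWiki maintenance script for rebuilding the recent changes table, the command would look like:

```shell
# rebuilds the recentchanges table so Special:RecentChanges reflects the import
php maintenance/rebuildrecentchanges.php
```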