5000+ wikis saved to date
WikiTeam, a set of tools for wiki preservation and a repository of wikis
|Project status||Online! (at least some of them)|
|Archiving status||In progress... (you can help)|
|Project tracker||manual for now; check unarchived wikis on wikiapiary|
Welcome to WikiTeam. A wiki is a website that allows the creation and editing of any number of interlinked web pages, generally used to store information on a specific subject or subjects. This is done from an ordinary web browser, using a simplified markup language (wikitext, for example) or a WYSIWYG (what-you-see-is-what-you-get) text editor.
Most wikis don't offer public backups. How bad!
Wikis to archive
Please add a wiki to wikiapiary if you want someone to archive it sooner or later; or tell us on the #wikiteam channel if it's particularly urgent. Remember that there are thousands of wikis we don't even know about yet.
You can help by downloading wikis yourself. If you don't know where to start, pick a wiki that hasn't been archived yet from the lists on wikiapiary. If you can't, edit those pages to link to existing dumps! You'll help others focus their work.
Examples of huge wikis:
- Wikipedia - arguably the largest and one of the oldest wikis on the planet. It offers public backups (also for its sister projects): http://dumps.wikimedia.org
- They have some mirrors but not many.
- Every now and then we upload a copy to archive.org, but this is not automated. You can do it in our stead. ;)
- Wikimedia Commons - a wiki of media files available for free use. It offers public backups: http://dumps.wikimedia.org
- But there is no image dump available, only the image descriptions.
- So we made one! http://archive.org/details/wikimediacommons
- Wikia - a website that allows the creation and hosting of wikis. It doesn't make regular backups.
There are also several wikifarms with hundreds of wikis. On this wiki we only create pages for the wikifarms we have special information about that we don't want to lose (like archiving history and tips). For a full list, please use wikiapiary: see its wikifarms main page.
Tools and source code
Official WikiTeam tools
- WikiTeam Google Code repository
- dumpgenerator.py to download MediaWiki wikis: python dumpgenerator.py --api=http://archiveteam.org/api.php --xml --images
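For archiving several wikis in a row, one hedged sketch is to generate a dumpgenerator.py invocation per API endpoint from a list (the file name apis.txt and the example-wiki.org URL are hypothetical; only the archiveteam.org endpoint comes from this page):

```shell
# Hypothetical list of MediaWiki API endpoints to archive.
printf '%s\n' \
  'http://archiveteam.org/api.php' \
  'http://example-wiki.org/w/api.php' > apis.txt

# Print one dumpgenerator.py command per wiki; pipe the output to sh to run them.
while read -r api; do
  echo "python dumpgenerator.py --api=$api --xml --images"
done < apis.txt
```

Printing the commands first, instead of running them directly, lets you sanity-check the list before committing hours of bandwidth.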
- wikipediadownloader.py to download Wikipedia dumps from download.wikimedia.org: python wikipediadownloader.py
- Scripts of a guy who saved Wikitravel
- OddMuseWiki backup
- UseModWiki: use wget/curl and raw mode (might have a different URL scheme, like this)
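For the UseModWiki case, a minimal sketch of raw-mode fetching with curl; the base URL and page names below are placeholders, and as noted above, individual installs may use a different URL scheme:

```shell
# Hedged sketch: UseModWiki raw mode via its CGI script. BASE and the page
# names are hypothetical; adjust to the target wiki's actual URL scheme.
BASE='http://example.com/cgi-bin/wiki.pl'
for page in HomePage SandBox; do
  url="${BASE}?action=browse&raw=1&id=${page}"
  echo "would fetch: $url"
  # curl -s "$url" -o "${page}.txt"
done
```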
Most of our dumps are in the wikiteam collection at the Internet Archive. If you want an item to land there, just upload it to the "opensource" collection and remember the "WikiTeam" keyword; it will be moved at some point. When you've uploaded enough wikis, you'll probably be made a collection admin to save others the effort of moving your stuff.
For a manually curated list, visit the download section on Google Code.
- When downloading Wikipedia/Wikimedia Commons dumps, pages-meta-history.xml.7z and pages-meta-history.xml.bz2 contain the same data, but 7z tends to be smaller (better compression ratio), so prefer 7z.
- To download a mass of wikis, just split $list in N chunks, then start N instances of launcher.py in tmux, and at the same time a window or screen running a loop of uploader.py $list --prune-directories --prune-wikidump and a sleep (to upload dumps as they're ready and clean up your storage). Occasionally attach to the tmux session and look (ctrl-b f) for windows stuck on "is wrong", "is slow" or "......" loops, or which are inactive. Even with a couple of cores you can run a hundred instances; just make sure to have enough disk space for the occasional huge ones (tens of GB).
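The mass-download setup described above can be sketched as shell. This is a sketch under assumptions: launcher.py and uploader.py are the WikiTeam scripts, mylist.txt is a hypothetical list of wikis, and chunking uses GNU split:

```shell
# Hedged sketch of the mass-download setup. mylist.txt is a hypothetical
# list of wikis; launcher.py and uploader.py are the WikiTeam scripts.
printf '%s\n' wiki1 wiki2 wiki3 wiki4 wiki5 > mylist.txt   # sample list

N=4
# Split the list into N chunks by lines: chunk.00 .. chunk.03 (GNU split).
split --numeric-suffixes --number=l/$N mylist.txt chunk.

# One tmux window per chunk, plus an upload-and-prune loop (shown, not run here):
# for f in chunk.*; do tmux new-window -n "$f" "python launcher.py $f"; done
# tmux new-window -n upload \
#   'while true; do python uploader.py mylist.txt --prune-directories --prune-wikidump; sleep 3600; done'
ls chunk.*
```

The sleep between uploader runs keeps finished dumps flowing to archive.org while the pruning options free disk space for the still-running downloads.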
You can download and seed the torrents from the archive.org collection.
- http://wikiindex.org - A lot of wikis to save
- http://wiki1001.com/ (offline?)
- http://www.cs.brown.edu/~pavlo/mediawiki/mediawikis.csv - 20,000 wikis
- List of largest wikis in the world
- Dump of nostalgia, an ancient version of Wikipedia from 2001
- http://code.google.com/p/wikiteam/wiki/AvailableBackups many dumps