{{Infobox project
| title = WikiTeam XML
| image = Wikiteam.jpg
| description = WikiTeam, we preserve wikis
| project_status = {{specialcase}}
| archiving_status = {{inprogress}} (manual)
| archiving_type = other
| source = [https://github.com/WikiTeam/wikiteam WikiTeam GitHub]
| irc = wikiteam
}}


{{Infobox project
| title = WikiBot
| image = Dummy.png
| description = IRC bot run by [[User:DigitalDragon|DigitalDragon]] using WikiTeam3 tools
| project_status = {{specialcase}}
| archiving_status = {{inprogress}} (manual)
| archiving_type = other
| source = [https://github.com/DigitalDwagon/WikiBot WikiBot GitHub]
| irc = wikibot
}}
 
'''WikiTeam''' software is a set of tools for archiving wikis. They work on [[MediaWiki]] wikis, but we want to expand to other wiki engines. As of 2019, WikiTeam has preserved more than 250,000 wikis.
 
You can check [https://archive.org/details/wikiteam our collection] at [[Internet Archive]], the [https://github.com/WikiTeam/wikiteam source code] on [[GitHub]] and some [https://wikiapiary.com/wiki/Websites/WikiTeam lists of wikis by status] on [[WikiApiary]]. There's also a [https://wikiapiary.com/wiki/Category:Website_not_archived list] of not yet archived wikis on WikiApiary.
 
There are two completely separate projects under the umbrella of '''WikiTeam''':
* The archival of the wikis in the form of XML dumps. This is what most of this page is about.
* The archival of external links found in wikis to WARCs. See the [[#Links warrior project|Links warrior project]] section.
 
The archival of the wikis themselves to WARCs is also desirable but has not been attempted yet.


== Current status ==
The total number of MediaWiki wikis is unknown, but some estimates exist.
 
According to [[WikiApiary]], the most up-to-date database, there were 21,139 independent wikis (1,718 of them semantic) and 4,819 wikis in wikifarms as of 2018-08-02.<ref>[https://wikiapiary.com/wiki/Websites Websites] - WikiApiary</ref> However, it doesn't include the 400,000+ [[Wikia]] wikis, and its coverage of independent wikis is certainly incomplete.
 
According to Pavlo's list, generated in December 2008, there are about 20,000 wikis.<ref>[http://cs.brown.edu/~pavlo/mediawiki/ Pavlo's list of wikis] ([http://www.cs.brown.edu/~pavlo/mediawiki/mediawikis.csv mediawiki.csv]) ([https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/mediawikis_pavlo.csv backup])</ref> This list was imported into WikiApiary.


According to [[WikiIndex]], there are 20,698 wikis.<ref>[http://wikiindex.org/Special:Statistics WikiIndex Statistics]</ref> The URLs from this project were also added to WikiApiary in the past.


A number of [[#Wikifarms|wikifarms]] have vanished and about 180 are still online.<ref>[https://wikiapiary.com/wiki/Farm:Farms Wikifarms]</ref><ref>[https://en.wikipedia.org/wiki/Comparison_of_wiki_hosting_services Comparison of wiki hosting services]</ref><ref>[http://wikiindex.org/Category:WikiFarm Category:WikiFarm]</ref>


Most wikis are small, containing about 100 pages or fewer, but there are some very large wikis:<ref>[http://meta.wikimedia.org/wiki/List_of_largest_wikis List of largest wikis]</ref><ref>[http://s23.org/wikistats/largest_html.php?th=15000&lines=500 List of largest wikis in the world]</ref>
* By '''number of pages''': Wikimedia Commons (77 million), Wikidata (72 million), English Wikipedia (49 million), DailyWeeKee (35 million), WikiBusiness (22 million).
* By '''number of files''': Wikimedia Commons (57 million), English Wikipedia (800,000).


The oldest dumps are probably some 2001 dumps of Wikipedia when it used UseModWiki.<ref>[https://dumps.wikimedia.org/archive/ Wikimedia Downloads Historical Archives]</ref><ref>[http://dumps.wikimedia.org/nostalgiawiki Dump] of [http://nostalgia.wikipedia.org/ Nostalgia], an ancient version of Wikipedia from 2001</ref>


{{-}}
As of 2019, our collection at the Internet Archive holds dumps for 250,000 wikis (including independent wikis, wikifarm wikis, some packages of wikis, and Wiki[pm]edia).<ref>[https://archive.org/details/wikiteam WikiTeam collection] at Internet Archive</ref>


== Wikifarms ==


There are also wikifarms with hundreds of wikis. Here we only create pages for wikifarms where we have special information we don't want to lose (such as archiving history and tips). For a full list, please use the WikiApiary [https://wikiapiary.com/wiki/Farm:Main_Page wikifarms main page].


Before backing up a wikifarm, try to update the list of wikis for it. There are [https://github.com/WikiTeam/wikiteam/tree/master/listsofwikis/mediawiki Python scripts to generate those lists] for many wikifarms.
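For example, a list can be regenerated by running the corresponding script from that directory. A hypothetical run; the script and output names below are placeholders, so check the repository for the actual file names:
<pre>
# hypothetical names; each wikifarm has its own spider script in listsofwikis/mediawiki
cd wikiteam/listsofwikis/mediawiki
python somefarm-spider.py > somefarm.com
</pre>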


{| class="wikitable"
{| class="wikitable sortable plainlinks" style="text-align: center;"
| colspan=2 | '''Legend'''
! width=140px | Wikifarm !! width=80px | Wikis !! Status !! width=80px | Dumps !! Comments
|-
| style="background: lightgreen" |&nbsp;&nbsp;&nbsp;&nbsp;
| Good
|-
| style="background: lightyellow" |&nbsp;&nbsp;&nbsp;&nbsp;
| Could be better
|-
| style="background: lightcoral" |&nbsp;&nbsp;&nbsp;&nbsp;
| Bad
|-
| &nbsp;&nbsp;&nbsp;&nbsp;
| Unknown
|-
|}
{| class="wikitable" border=1 width=99% style="text-align: center;"
! Wiki !! Wiki is online? !! Dumps available? (official or home-made) !! Comments/Details !! Saved by us? Who? Where?
|-
| [http://s23.org/wikistats/anarchopedias_html.php Anarchopedias] || style="background: lightgreen" | Yes || style="background: lightyellow" | Official: no. Home-made: [http://www.mediafire.com/file/t73az9cwhzco2wb/Anarchopedia_Jun2011.7z Yes] || - || idiolect
|-
| [http://archiveteam.org Archive Team Wiki] || style="background: lightgreen" | Yes || style="background: lightyellow" | Official: no. Home-made: [http://code.google.com/p/wikiteam/downloads/list?can=1&q=archiveteam yes] || - || WikiTeam
|-
|-
| Bulbapedia || style="background: lightgreen" | Yes || style="background: lightcoral" | Official: no. Home-made: no || - || dr-spangle is working on it with a self-built PHP downloader
| [[Battlestar Wiki]] ([http://battlestarwiki.org site]) || data-sort-value=4 | 4<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/battlestarwiki.org battlestarwiki.org - list of wikis]</ref> || {{green|Online}} || data-sort-value=3 | 3<ref>[https://archive.org/search.php?query=identifier%3Awiki%2Abattlestarwikiorg%2A battlestarwikiorg - dumps]</ref> || Last dumped Mar/Apr 2022, [https://fr.battlestarwiki.ddns.net/ fr.battlestarwiki.ddns.net] shows wrong XML when dumping
|-
|-
| [[Citizendium]] || style="background: lightgreen" | Yes || style="background: lightyellow" | Official: [http://en.citizendium.org/wiki/CZ:Downloads daily] (no full history). Home-made: [[Citizendium|yes]], April 2011 || style="background: lightyellow" | No image dumps available || -
| [[BluWiki]] ([http://wayback.archive.org/web/20090301060338/http://bluwiki.com/go/Main_Page site]) || data-sort-value=0 | Unknown || {{red|Offline}} || data-sort-value=24 | 24<ref>[https://archive.org/search.php?query=bluwiki%20subject%3Awikiteam bluwiki - dumps]</ref> ||  
|-
|-
| [http://s23.org/wikistats/editthis_html.php EditThis] || style="background: lightgreen" | Yes || style="background: lightyellow" | Official: no. Home-made: in progress || - || -
| [[Communpedia]] ([https://wikiapiary.com/wiki/Communpedia_%28ru%29 site]) || data-sort-value=1 | 1 || {{orange|Unstable}} || data-sort-value=6 | 6<ref>[https://archive.org/search.php?query=subject%3A%22Comunpedia%22%20OR%20subject%3A%22Communpedia%22%20OR%20subject%3A%22kommynistru%22 communpedia - dumps]</ref> || Appears to no longer use MediaWiki.
|-
|-
| enciclopedia.us.es || style="background: lightgreen" | Yes || style="background: lightcoral" | Official: no. Home-made: no || style="background: lightyellow" | Sysop sent me page text sql tables || emijrp
| [[EditThis]] ([http://editthis.info site]) || data-sort-value=1357 | 1,357<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/editthis.info editthis.info - list of wikis]</ref>  || {{orange|Unstable}} || data-sort-value=1297 | 1,297 WikiDump<ref>[https://archive.org/search.php?query=editthisinfo%20subject%3Awikiteam editthis.info - dumps]</ref><br />1,357 WARC<ref>[https://archive.org/search.php?query=subject%3Aeditthis.info editthis.info warcarchives]</ref> || Most dumps were done in 2014. This wikifarm is not well covered in WikiApiary.<ref>[https://wikiapiary.com/wiki/Farm:EditThis Farm:EditThis]</ref> Uses MediaWiki 1.15.1 and therefore is not supported by dumpgenerator.py<ref>[https://editthis.info/1337/Special:Version editthis.info/1337 - Special:Version]</ref> WARC dumps were done in July 2022.
|-
|-
| [[Encyclopedia Dramatica]] || style="background: lightcoral" | No || style="background: lightyellow" | Official: no. Home-made: partial  || style="background: lightyellow" | WebEcology Project Article Dump (~9000 Articles)<br />Most of the Images probably Lost || -
| [[elwiki.com]] ([https://web.archive.org/web/20070917110429/http://www.elwiki.com/ site]) || data-sort-value=0 | Unknown<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/elwiki.com elwiki.com - list of wikis]</ref> || {{red|Offline}} || data-sort-value=0 | None<ref>[https://archive.org/search.php?query=elwiki%20subject%3Awikiteam elwiki.com - dumps]</ref> || Last seen online in 2008.<ref>[https://web.archive.org/web/20080221125135/http://www.elwiki.com/ We're sorry about the downtime we've been having lately]</ref> There is no dumps, presumably lost. Perhaps [https://web.archive.org/web/form-submit.jsp?type=prefixquery&url=http://elwiki.com/ some pages] are in the Wayback Machine.
|-
|-
| [[Encyclopedia Dramatica|Encyclopedia Dramatica.ch]]<br />(new ED) || style="background: lightgreen" | Yes || Official: ? Home-made: ?  || style="background: lightyellow" | Slowly being rebuilt from old sources.<br />Should be up for a while but for who knows how long? || -
| [[Fandom]] ([http://www.fandom.com site]) || data-sort-value=261146 | 261,146<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/fandom.com fandom.com - list of wikis]</ref> || {{green|Online}} || data-sort-value=300000 | 300,000+ || [http://community.fandom.com/wiki/Help:Database_download Help:Database download], [https://github.com/Wikia/app/tree/dev/extensions/wikia/WikiFactory/Dumps Their dumping code]
|-
|-
| [http://s23.org/wikistats/gentoo_html.php Gentoo wikis] || style="background: lightgreen" | Yes || style="background: lightyellow" | Official: no. Home-made: [http://code.google.com/p/wikiteam/downloads/list?can=1&q=gentoo yes] || - || WikiTeam
| [[Miraheze]] ([https://meta.miraheze.org site]) || data-sort-value=6438 | 6,438<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/miraheze.org miraheze.org - list of wikis]</ref> || {{green|Online}} || data-sort-value=2200 | ~2,200<ref>[https://archive.org/search.php?query=miraheze%20subject%3Awikiteam miraheze - dumps]</ref> || Non-profit. Dumps were made in September 2016. Later in 2019 more dumps were uploaded.
|-
|-
| GNUpedia || style="background: lightcoral" | No || style="background: lightcoral" | Official: no. Home-made: no || style="background: lightcoral" | No database. This "wiki encyclopedia" was only HTML pages. Only ~3 articles were sent to the mailing list. After that, the project was closed || -
| [[Neoseeker.com]] ([https://neowiki.neoseeker.com site])|| data-sort-value=183 | 183<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/neoseeker.com neoseeker.com - list of wikis]</ref> || {{green|Online}} || data-sort-value=583 |583<ref>[https://archive.org/search.php?query=identifier%3Awiki%2Aneoseekercom%2A neoseeker.com - dumps]</ref> || Last dumped in June 2022, [https://sandbox.neoseeker.com/ sandbox.neoseeker.com] is a broken wiki
|-
|-
| [[MeatBall]] || style="background: lightgreen" | Yes || style="background: lightyellow" | Official: no. Home-made: [http://mirrors.sdboyd56.com/WikiTeam/meatball_wiki-20110706.7z yes] ([http://code.google.com/p/wikiteam/downloads/detail?name=meatball_wiki-20110706.7z mirror]) || style="background: lightyellow" | No histories, no xml format || SDBoyd
| [[Orain]] ([https://meta.orain.org site]) || data-sort-value=425 | 425<ref>[https://raw.githubusercontent.com/WikiTeam/wikiteam/master/listsofwikis/mediawiki/orain.org orain.com - list of wikis]</ref> || {{red|Offline}} || data-sort-value=380 |380<ref>[https://archive.org/search.php?query=orain%20subject%3Awikiteam orain - dumps]</ref><ref>[https://archive.org/details/wikifarm-orain.org-20130824 Orain wikifarm dump (August 2013)]</ref> || Last seen online in September 2015. Dumps were made in August 2013, January 2014 and August 2015.
|-
|-
| [http://s23.org/wikistats/metapedias_html.php Metapedia] || style="background: lightgreen" | Yes || style="background: lightcoral" | Official: ?. Home-made: no || - || -
| [[Referata]] ([http://www.referata.com site]) || data-sort-value=156 | 156<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/referata.com referata.com - list of wikis]</ref> || {{orange|Unstable}} || data-sort-value=80 | ~80<ref>[https://archive.org/search.php?query=referata%20subject%3Awikiteam referata.com - dumps]</ref><ref>[https://archive.org/details/referata.com-20111204 Referata wikifarm dump 20111204]</ref><ref>[https://archive.org/details/wikifarm-referata.com-20130824 Referata wikifarm dump (August 2013)]</ref> || Check why there are dozens of wikis without dump.
|-
|-
| [http://s23.org/wikistats/scoutwiki_html.php Neoseeker] aka Scout wikis || style="background: lightgreen" | Yes || style="background: lightcoral" | Official: ?. Home-made: no || - || -
| [[ScribbleWiki]] ([http://scribblewiki.com site]) || data-sort-value=119 | 119<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/scribblewiki.com scribblewiki.com - list of wikis]</ref> || {{red|Offline}} || data-sort-value=0 | None<ref>[https://archive.org/search.php?query=scribblewiki%20subject%3Awikiteam scribblewiki.com - dumps]</ref> || Last seen online in 2008.<ref>[https://web.archive.org/web/20080404093502/http://scribblewiki.com/main.php What is ScribbleWiki?]</ref> There is no dumps, presumably lost. Perhaps [https://web.archive.org/web/form-submit.jsp?type=prefixquery&url=http://scribblewiki.com/ some pages] are in the Wayback Machine.
|-
|-
| [[Nupedia]] || style="background: lightcoral" | No || style="background: lightyellow" | Official: ?. Home-made: Yes, saved from IA || - || -
| [[ShoutWiki]] ([http://www.shoutwiki.com site]) || data-sort-value=2173 | 2,173<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/shoutwiki.com shoutwiki.com - list of wikis]</ref> || {{green|Online}} || data-sort-value=1300 | ~3,400<ref>[https://archive.org/search.php?query=identifier%3Awiki-%2Ashoutwikicom%2A%20subject%3Awikiteam shoutwiki.com - dumps]</ref><ref>[http://www.archive.org/details/shoutwiki.com ShoutWiki wikifarm dump]</ref> || Last dumped in June 2022. ~100 wikis cannot be archived because they are private.
|-
|-
| OmegaWiki || style="background: lightgreen" | Yes || style="background: lightgreen" | Official: [http://www.omegawiki.org/Development daily] || - || -
| [[Sourceforge]] || data-sort-value=0 | Unknown || {{green|Online}} || data-sort-value=315 | 315<ref>[https://archive.org/search.php?query=sourceforge%20subject%3Awikiteam sourceforge - dumps]</ref> ||  
|-
|-
| OpenStreetMap || style="background: lightgreen" | Yes || Official: Yes. Home-made: no || - || -
| [[Telepedia]] ([https://meta.telepedia.net/wiki/Telepedia_Meta_Wiki site]) || data-sort-value=3267 | 37 || {{green|Online}} || data-sort-value=2 | 2 ||  
|-
|-
| [http://s23.org/wikistats/opensuse_html.php OpenSUSE wikis] || style="background: lightgreen" | Yes || style="background: lightyellow" | Official: no. Home-made: [http://code.google.com/p/wikiteam/downloads/list?can=1&q=opensuse yes] || - || Hydriz
| [[TropicalWikis]] ([http://tropicalwikis.com site]) || data-sort-value=187 | 187<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/tropicalwikis.com tropicalwikis.com - list of wikis]</ref> || {{red|Offline}} || data-sort-value=152 | 152<ref>[https://archive.org/search.php?query=tropicalwikis%20subject%3Awikiteam tropicalwikis.com - dumps]</ref> || Killed off in November 2013. Allegedly pending move to [[Orain]] (which became offline too). Data from February 2013 and earlier saved.
|-
|-
| OSDev || style="background: lightgreen" | Yes || style="background: lightgreen" | Official: [http://wiki.osdev.org/OSDev_Wiki:About weekly] || - || Not yet
| [[Wik.is]] ([http://wik.is site]) || data-sort-value=0 | Unknown || {{red|Offline}} || data-sort-value=0 | Unknown || Non-MediaWiki.
|-
|-
| [http://tvtropes.org TV Tropes] || style="background: lightgreen" | Yes || style="background: lightyellow" | Official: No Unofficial: In progress || style="background: lightyellow" | No dump mechanism, using wget -nc -r -p -l 0 -np -w 45 -E -k -T 10 -nv -x "http://tvtropes.org" || DoubleJ
| [[Wiki-Site]] ([http://www.wiki-site.com site]) || data-sort-value=2659 | 2,659<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/wiki-site.com wiki-site.com - list of wikis]</ref> || {{green|Online}} || data-sort-value=367 | 367 || No uploaded dumps yet.
|-
|-
| [http://s23.org/wikistats/uncyclomedia_html.php Uncyclomedias] || - || - || - || -
| [[WikiHub]] ([http://wikihub.ssu.lt site]) || data-sort-value=0 | Unknown || {{red|Offline}} || data-sort-value=7 | 7<ref>[https://archive.org/details/wikifarm-wikihub.ssu.lt-20131110 wikihub - dumps]</ref> ||  
|-
|-
| Wikanda || style="background: lightgreen" | Yes || style="background: lightyellow" | Official: no. Home-made: [http://code.google.com/p/wikiteam/downloads/list?can=1&q=wikanda yes] || - || emijrp
| [[Wiki.Wiki]] ([https://wiki.wiki site]) || data-sort-value=100 | 100<ref>[https://raw.githubusercontent.com/WikiTeam/wikiteam/master/listsofwikis/mediawiki/wiki.wiki wiki.wiki - list of wikis]</ref> || {{red|Offline}} || data-sort-value=0 | Unknown || Last seen online in 2017.
|-
|-
| [[Wikia]] || style="background: lightgreen" | Yes || style="background: lightyellow" | Official: [http://wiki-stats.wikia.com/ on demand] || style="background: lightyellow" | No image dumps available || Not yet
| [[WikiForge]] ([https://meta.wikiforge.net site]) || data-sort-value=0 | Unknown || {{green|Online}} || data-sort-value=0 | None || Premium counterpart of [[WikiTide]].
|-
|-
| [http://wikifur.com WikiFur] || style="background: lightgreen" | Yes || style="background: lightgreen" | Official: [http://dumps.wikifur.com/ yes] || style="background: lightyellow" | No image dumps available || Not yet
| [[WikiTide]] ([https://meta.wikitide.com site]) || data-sort-value=174 | 174<ref>[https://meta.wikitide.com/wiki/WikiTide list of wikis]</ref> || {{green|Online}} || data-sort-value=0 | None || Non-profit. Formed by group of ex-[[Miraheze]] volunteers, some who had also volunteered at [[Orain]] and [[TropicalWikis]]. Self-dumps every month.
|-
|-
| WikiHow  || - || - || - || -
| [[Wikkii]] ([https://web.archive.org/web/20140621054654/http://wikkii.com/wiki/Free_Wiki_Hosting site]) || data-sort-value=3267 | 3,267 || {{red|Offline}} || data-sort-value=1300 | 1,300<ref>[https://archive.org/search.php?query=wikkii%20subject%3Awikiteam wikki.com - dumps]</ref> ||  
|-
|-
| [[Wikimedia Commons]] || style="background: lightgreen" | Yes || style="background: lightyellow" | Official: [http://dumps.wikimedia.org/commonswiki/latest/ periodically] || style="background: lightyellow" | No image dumps available || Not yet
| [[YourWiki.net]] ([https://web.archive.org/web/20100124003107/http://www.yourwiki.net/wiki/YourWiki site]) || data-sort-value=0 | Unknown || {{red|Offline}} || data-sort-value=0 | Unknown ||  
|-
| [[Wikipedia]] || style="background: lightgreen" | Yes || style="background: lightyellow" | Official: [http://dumps.wikimedia.org/backup-index.html periodically] || style="background: lightyellow" | No image dumps available. English Wikipedia dump uses to be very old || Not yet
|-
| [http://s23.org/wikistats/wikisite_html.php Wiki-site.com] || - || - || - || -
|-
|  WikiTravel || style="background: lightgreen" | Yes || style="background: lightyellow" | Official: [http://wikitravel.org/en/Wikitravel:Database_dump not yet]. Home-made: [http://code.google.com/p/wikiteam/downloads/list?can=1&q=wikitravel yes], another of [http://dl.dropbox.com/u/63233/Wikitravel/Complete%20zip/WikitravelComplete14-June-2010.7z 2010-06-14] || - || WikiTeam
|-
|  WikiWikiWeb || style="background: lightgreen" | Yes || style="background: lightyellow" | Home-made: [http://www.multiupload.com/BGGCFUHOE7 yes] || - || Ca7
|-
|  [http://co-forum.de/ (o:forum] || style="background: lightgreen" | Yes || style="background: lightcoral" | No || - || Not yet, to figure out how
|-
|  [http://www.wikiwiki.de/newwiki/pmwiki.php WikiWiki.de] || style="background: lightgreen" | Yes || style="background: lightcoral" | No || - || Not yet, to figure out how
|-
|  [http://www.wikiservice.at/gruender/wiki.cgi?action=HomePage GruenderWiki] || style="background: lightgreen" | Yes || style="background: lightcoral" | No || - || Not yet, to figure out how
|}
|}
== Wikis to archive ==
Please [https://wikiapiary.com/wiki/Special:FormEdit/Website add a wiki to WikiApiary] if you want someone to archive it sooner or later, or tell us on IRC ({{IRC|wikiteam}}) if it's particularly urgent. Remember that there are thousands of wikis we don't even know about yet.

[https://github.com/WikiTeam/wikiteam/wiki/Tutorial You can help] by downloading wikis yourself. If you don't know where to start, pick a [https://wikiapiary.com/wiki/Category:Website_not_archived wiki which has not been archived yet] from the lists on WikiApiary. Also, you can edit those pages to link existing dumps! You'll help others focus their work.

Examples of huge wikis:
* '''[[Wikipedia]]''' - arguably the largest and one of the oldest wikis on the planet. It offers public backups (also for sister projects): https://dumps.wikimedia.org
** They have some mirrors but not many.
** The transfer of the dumps to the Internet Archive is automated and is currently managed by [[User:Hydriz|Hydriz]].
* '''[[Wikimedia Commons]]''' - a wiki of media files available for free usage. It offers public backups: https://dumps.wikimedia.org
** But there is no image dump available, only the image descriptions.
** So we made one ourselves: http://archive.org/details/wikimediacommons
* '''[[Wikia]]''' - a website that allows the creation and hosting of wikis. It doesn't publish regular backups.
We're trying to decide which [https://groups.google.com/forum/#!topic/wikiteam-discuss/TxzfrkN4ohA other wiki engines] to work on: suggestions needed!
== Tools and source code ==
=== Official WikiTeam tools ===
* [https://github.com/WikiTeam/wikiteam WikiTeam in GitHub]
* '''[https://raw.githubusercontent.com/WikiTeam/wikiteam/master/dumpgenerator.py dumpgenerator.py] to download MediaWiki wikis:''' <tt>python dumpgenerator.py --api=https://www.archiveteam.org/api.php --xml --images</tt> (a resume example follows this list)
* [https://raw.githubusercontent.com/WikiTeam/wikiteam/master/wikipediadownloader.py wikipediadownloader.py] to download Wikipedia dumps from download.wikimedia.org: <tt>python wikipediadownloader.py</tt>
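If a download is interrupted, dumpgenerator.py can usually pick up where it left off. A sketch, assuming the dump directory created by the first run; the URL and path below are examples:
<pre>
# resume an interrupted dump in its existing directory
python dumpgenerator.py --api=https://www.archiveteam.org/api.php --xml --images --resume --path=archiveteamorg-20230101-wikidump
</pre>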
=== Other ===
* [https://web.archive.org/web/20150403081903/http://dl.dropbox.com/u/63233/Wikitravel/Source%20Code%20and%20tools/Source%20Code%20and%20tools.7z Scripts of a guy who saved Wikitravel]
* [http://www.communitywiki.org/en/BackupThisWiki OddMuseWiki backup]
* UseModWiki: use wget/curl and [http://www.usemod.com/cgi-bin/wiki.pl?WikiPatches/RawMode raw mode] (might have a different URL scheme, like [http://meatballwiki.org/wiki/action=browse&id=TheTippingPoint&raw=1 this]); see the sketch after this list
** Some wikis: [[UseMod:SiteList]]
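A minimal sketch of the raw mode mentioned above, using the default <code>wiki.pl</code> URL scheme (the page name is just an example; adjust the script path for each wiki):
<pre>
# fetch the raw wikitext of one UseModWiki page
curl -s 'http://www.usemod.com/cgi-bin/wiki.pl?action=browse&id=SiteList&raw=1' > SiteList.txt
</pre>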
=== MediaWiki Dump Generator ===
The MediaWiki Client Tools' [https://github.com/mediawiki-client-tools/mediawiki-dump-generator MediaWiki Dump Generator] dumpgenerator script is a [https://en.wikipedia.org/wiki/Python_(programming_language) Python] 3.x port of the [https://github.com/WikiTeam/wikiteam WikiTeam] Python 2.7 dumpgenerator.py script. It is run from the command line in a terminal.

The XML dump can include either the full page history or only the most recent revision of each page. The image dump will contain all file types with their associated descriptions. The <code>siteinfo.json</code> and <code>SpecialVersion.html</code> files will contain information about wiki features such as the installed extensions and skins. User account information won't be preserved.

Full instructions are at the MediaWiki Client Tools' [https://github.com/mediawiki-client-tools/mediawiki-dump-generator MediaWiki Dump Generator] GitHub repository.
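A sketch of a typical invocation, assuming the ported <code>dumpgenerator</code> command keeps the original script's options; check the repository README for the exact interface:
<pre>
# full history plus images from a MediaWiki API endpoint (URL is an example)
dumpgenerator --api=https://www.archiveteam.org/api.php --xml --images
</pre>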
* WikiBot is an IRC bot designed to manage running MediaWiki Dump Generator and [[DokuWiki]] grabs. {{IRC|wikibot}}
== Wiki dumps ==
Most of our dumps are in the [http://www.archive.org/details/wikiteam wikiteam collection at the Internet Archive]. If you want an item to land there, just upload it to the "opensource" collection and add the "WikiTeam" keyword; it will be moved at some point. When you've uploaded enough wikis, you'll probably be made a collection admin to save others the effort of moving your stuff.
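A hedged sketch of such an upload with the [https://github.com/jjjake/internetarchive ia] tool (the identifier and file name are examples):
<pre>
# upload a dump to the "opensource" collection with the WikiTeam keyword
ia upload wiki-examplewikiorg_w-20230101 examplewikiorg_w-20230101-wikidump.7z \
  --metadata="collection:opensource" --metadata="subject:wikiteam"
</pre>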
For a manually curated list, [https://github.com/WikiTeam/wikiteam/wiki/Available-Backups visit the download section] on GitHub.
Another set of MediaWiki dumps is hosted [http://mirrors.sdboyd56.com/WikiTeam/index.html here] on [http://www.archiveteam.org/index.php?title=User:Sdboyd Scott's] website.


=== Tips ===
Some tips follow.
 
Before archiving or asking for archiving, check that the wiki is suitable to be archived:
 
* Check the size of the wiki; very large wikis might fill your disk or be hard to archive. Visit the <code>Special:SpecialPages</code> page and click through to <code>Special:MediaStatistics</code> (or <code>Special:ListFiles</code> for older wikis) and <code>Special:Statistics</code>. You can also enter those pages directly in the search bar or edit the URL; for non-English wikis, they will redirect to the correct localized title. (An API-based size check is sketched after this list.)
* Check the date of the last upload of the wiki you are after, either [https://archive.org/search?query=originalurl%3A*examplewikiname* in the browser], or on the command-line using the [https://github.com/jjjake/internetarchive ia] tool:
 
<pre>
ia search originalurl:*examplewikiname* | jq -r .identifier | xargs ia metadata | jq -r '.metadata.addeddate, .metadata.originalurl'
</pre>
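The same statistics can also be read from the MediaWiki API; a sketch (replace the hostname and script path):
<pre>
# page, file and user counts straight from the API
curl -s "https://wiki.example.org/w/api.php?action=query&meta=siteinfo&siprop=statistics&format=json" | jq .query.statistics
</pre>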
 
Don't issue commands you don't understand, especially batch commands which use loops or find and xargs, unless you're ready to lose all the data you got.
 
When downloading Wikipedia/Wikimedia Commons dumps, pages-meta-history.xml.7z and pages-meta-history.xml.bz2 contain the same data, but the 7z file is usually smaller (better compression ratio), so use 7z.
 
To download a mass of wikis with N parallel threads, just <code>split</code> your full <code>$list</code> into N chunks, then start N instances of <code>launcher.py</code> ([https://github.com/WikiTeam/wikiteam/wiki/Tutorial tutorial]), one for each chunk, as sketched below.
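A sketch of that workflow with N=4, assuming GNU split (file names are examples):
<pre>
# split the list into 4 chunks without breaking lines, then run one launcher per chunk
split -n l/4 -d mylist.txt chunk_
for f in chunk_*; do python launcher.py "$f" > "$f.log" 2>&1 & done
</pre>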
* If you want to upload dumps as they're ready and clean up your storage: at the same time, in a separate window or screen, run a loop of the kind <code>while true; do ./uploader.py $list --prune-directories --prune-wikidump; sleep 12h; done;</code> (the <code>sleep</code> ensures each run has something to do).
* If you're using --xmlrevisions, dumpgenerator.py will use much less memory because it won't get giant blobs of XML from Special:Export when a big page has a thousand revisions or more. You can then afford to run 100 instances of launcher.py/dumpgenerator.py with just 2 cores and 8 GiB of RAM. Watch your ulimit for the number of files, individual and total memory: 7z may consume up to 5 GiB of RAM for the biggest dumps (over 10 GiB). CPU usage tends to be lower at the beginning (launcher.py is not yet launching any 7z task because few dumps have completed) and the disk is usually hit harder at the beginning of a resume (launcher.py needs to scan the directories multiple times and dumpgenerator.py needs to read the lists of titles, XML and image directories). Before increasing concurrency, make sure you have enough resources for those stressful times, not just for the easy ride at the beginning of the dump.
* If you want to go advanced and run really ''many'' instances, use <code>tmux</code>[http://blog.hawkhost.com/2010/07/02/tmux-%E2%80%93-the-terminal-multiplexer-part-2/]! Use [https://serverfault.com/a/814089/203035 tmux new-window] to launch several instances in the same session; a loop for this is sketched after this list. Every now and then, attach to the tmux session and look (<code>ctrl-b f</code>) for windows stuck on "is wrong", "is slow" or "......" loops, or which are inactive[http://unix.stackexchange.com/questions/78093/how-can-i-make-tmux-monitor-a-window-for-inactivity]. Even with a couple of cores you can run a hundred instances, just make sure to have enough disk space for the occasional huge ones (tens of GB).
* If you get close to 1,000 instances of launcher.py, it may be too much for tmux to handle. You're probably not actually going to look at the output of hundreds of windows anyway, so just run everything in the background with xargs, monitor the crashes and then check the directories manually.<pre>split -a 4 -d -l 10 wikistoarchive.txt wt_ ; ls -1 wt_* | xargs -n1 -I§ -P300 sh -c "python launcher.py § 2>&1 > /dev/null ; "</pre>
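A sketch of the tmux approach mentioned above (the session name and list prefix are examples):
<pre>
# one detached tmux window per list chunk
tmux new-session -d -s wikiteam
for f in wt_*; do tmux new-window -d -t wikiteam "python launcher.py $f"; done
</pre>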
 
If you have many wikidump directories, some of the following commands may be useful. Sometimes a dump is complete but the 7z is missing or broken (e.g. for lack of memory), or you're running low on disk and you can't wait for uploader.py to verify the uploads one by one. A hint of a complete dump is the presence of siteinfo.json: that means dumpgenerator thought the XML was done, but an image download may still be running.
* Check errors in 7z files. It's better to avoid running uploader.py on many archives if you're not sure they're fine (for instance if you've not monitored crashes of dumpgenerator.py/launcher.py). It's much harder for other people to download the 7z files from archive.org and check them after they've been uploaded, and the presence of an archive may discourage someone else from making a new one even if the archive is not actually usable. <pre>find -maxdepth 1 -type f -name "*7z" | xargs -n1 -P4 -I§ sh -c "7z l § 2>&1 | grep ^ERROR "</pre>
* Delete directories corresponding to a 7z file.<pre>find -maxdepth 1 -name "*wikidump.7z" | cut -d/ -f2 | sed 's,.7z,,g' | xargs -P8 rm -rf</pre>
* If launcher.py has failed to create 7z files due to running low on resources, you may make them manually with a loop and a lower compression level.<pre>find -maxdepth 2 -name siteinfo.json | cut -d/ -f2 | sed 's,wikidump,,g' | xargs -n1 -P6 -I§ sh -c "cd §wikidump/ ; 7za a -ms=off -mx=3 ../§history.xml.7z §history.xml §titles.txt errors.log index.html config.txt siteinfo.json Special:Version.html ; cp ../§history.xml.7z ../§wikidump.7z ; 7za a -mx=1 ../§wikidump.7z images/ §images.txt ; "</pre>
* Find the biggest ongoing wikidump directories: when you don't have something as nice as [https://dev.yorhel.nl/ncdu ncdu], something simple may suffice, like  <code>du -shc * | grep  G</code> or <code>find -maxdepth 2 -type f -name "*xml" -size +1G</code>.


=== BitTorrent downloads ===
You can download and seed the torrents from the archive.org collection. Every item has a "Torrent" link.
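Besides the web link, the torrent file of any item can be fetched with the [https://github.com/jjjake/internetarchive ia] tool; a sketch (the identifier is an example):
<pre>
# archive.org names an item's torrent <identifier>_archive.torrent
ia download wiki-examplewikiorg_w --glob="*_archive.torrent"
</pre>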


=== Old mirrors ===
<span class="plainlinks">
# [https://sourceforge.net/projects/wikiteam/files/ Sourceforge] (also mirrored to another 26 mirrors)
# Internet Archive (direct link to directory)
</span>


=== Recursive ===
We also have dumps for our coordination wikis:
* [[ArchiveTeam wiki]] ([https://archive.org/details/wiki-archiveteamorg 2014-03-26])
* [[WikiApiary]] ([https://archive.org/details/wiki-wikiapiarycom_w 2015-03-25])
 
== Restoring wikis ==
 
Anyone can restore a wiki using its XML dump and images.
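A minimal sketch of such a restore on a fresh MediaWiki install, using the standard maintenance scripts (file and directory names are examples):
<pre>
# unpack the dump, import the page history, then the images
7z x examplewiki-20230101-history.xml.7z
php maintenance/importDump.php --uploads examplewiki-20230101-history.xml
php maintenance/importImages.php images/
php maintenance/rebuildrecentchanges.php
</pre>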
 
Wikis.cc is [https://www.wikis.cc/wiki/Wikis_recuperados restoring some sites].
 
== Links warrior project ==
{{Infobox project
| title = WikiTeam links
| image = Wikiteam.jpg
| description = We preserve external links used in wikis
| project_status = {{specialcase}}
| archiving_status = {{inprogress}} (dormant since 2017)
| source = [https://github.com/Archiveteam/wikis-grab wikis-grab]
| tracker = [https://tracker.archiveteam.org/wikis/ wikis]
| irc = wikiteam
}}
 
There is a (currently dormant) warrior project to archive external links used in wikis. The target format for this archival is [[WARC]]. The data from this project is uploaded to [https://archive.org/details/archiveteam_wiki this collection] on the Internet Archive.
 
== References ==
<references/>


== External links ==
* [https://github.com/WikiTeam/wikiteam WikiTeam] on GitHub
* [https://github.com/mediawiki-client-tools MediaWiki Client Tools] on GitHub
* [http://wikiindex.org WikiIndex] - an index of wikis
* [http://s23.org/wikistats/ S23 wikistats] - stats for over 40,000 wikis
* [https://www.mediawiki.org/wiki/Hosting_services Comparison of wiki farms]
* [http://en.wikipedia.org/wiki/User:Emijrp/Wikipedia_Archive Wikipedia Archive]


{{wikis}}


[[Category:Archive Team]]
[[Category:Wikis| ]]
