Difference between revisions of "WikiTeam"

From Archiveteam
| image = Wikiteam.jpg
| description = WikiTeam, we preserve wikis
| URL = [https://github.com/WikiTeam/wikiteam wikiteam github], manual for now, check [https://wikiapiary.com/wiki/Category:Website_not_archived not archived wikis on wikiapiary]
| project_status = {{online}} (at least some of them)
| source = [https://github.com/Archiveteam/wikis-grab wikis-grab]
| tracker = [http://tracker.archiveteam.org/wikis/ wiki tracker]
| archiving_status = {{inprogress}}
| irc = wikiteam
}}


'''WikiTeam''' software is a set of tools for archiving wikis. They work on [[MediaWiki]] wikis, but we want to expand to other wiki engines. As of January 2017, WikiTeam has preserved more than 27,000 stand-alone wikis.


You can check [https://archive.org/details/wikiteam our collection] at the [[Internet Archive]], the [https://github.com/WikiTeam/wikiteam source code] on [[GitHub]] and some [https://wikiapiary.com/wiki/Websites/WikiTeam lists of wikis by status] on [[WikiApiary]].

== Current status ==
The total number of MediaWiki wikis is unknown, but some estimates exist.


According to [[WikiApiary]], which is the most up-to-date database, there are 21,369 independent wikis (1,508 of them semantic) and 4,554 in wikifarms.<ref>[https://wikiapiary.com/wiki/Websites Websites] - WikiApiary</ref> However, it doesn't include [[Wikia]]'s 400,000+ wikis, and the coverage of independent wikis can certainly be improved.


According to Pavlo's list generated in December 2008, there are 20,000 wikis.<ref>[http://cs.brown.edu/~pavlo/mediawiki/ Pavlo's list of wikis] ([http://www.cs.brown.edu/~pavlo/mediawiki/mediawikis.csv mediawiki.csv]) ([https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/mediawikis_pavlo.csv backup])</ref> This list was imported into WikiApiary.


According to [[WikiIndex]], there are 20,698 wikis.<ref>[http://wikiindex.org/Special:Statistics WikiIndex Statistics]</ref> The URLs in this project were added to WikiApiary in the past too.


A number of [[#Wikifarms|wikifarms]] have vanished and about 150 are still online.<ref>[https://wikiapiary.com/wiki/Farm:Farms Wikifarms]</ref><ref>[https://en.wikipedia.org/wiki/Comparison_of_wiki_hosting_services Comparison of wiki hosting services]</ref><ref>[http://wikiindex.org/Category:WikiFarm Category:WikiFarm]</ref>


Most wikis are small, containing about 100 pages or fewer, but there are some very large wikis:<ref>[http://meta.wikimedia.org/wiki/List_of_largest_wikis List of largest wikis]</ref><ref>[http://s23.org/wikistats/largest_html.php?th=15000&lines=500 List of largest wikis in the world]</ref>
* By number of pages: Wikimedia Commons (40 million), English Wikipedia (37 million), DailyWeeKee (35 million), WikiBusiness (22 million) and Wikidata (19 million).
* By number of files: Wikimedia Commons (28 million), English Wikipedia (800,000).


The oldest dumps are probably some 2001 dumps of Wikipedia when it used UseModWiki.<ref>[https://dumps.wikimedia.org/archive/ Wikimedia Downloads Historical Archives]</ref><ref>[http://dumps.wikimedia.org/nostalgiawiki Dump] of [http://nostalgia.wikipedia.org/ Nostalgia], an ancient version of Wikipedia from 2001</ref>


As of January 2017, our collection at the Internet Archive holds dumps for 27,867 wikis (including independent and wikifarm wikis, some packages of wikis and Wiki[pm]edia).<ref>[https://archive.org/details/wikiteam WikiTeam collection] at Internet Archive</ref>


== Wikifarms ==
There are also wikifarms with hundreds of wikis. Here we only create pages for those we have some special information about that we don't want to lose (like archiving history and tips). For a full list, please use the WikiApiary [https://wikiapiary.com/wiki/Farm:Main_Page wikifarms main page].

Before backing up a wikifarm, try to update the list of wikis for it. There are [https://github.com/WikiTeam/wikiteam/tree/master/listsofwikis/mediawiki Python scripts to generate those lists] for many wikifarms.
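As an illustration of what those list-generating scripts do, here is a minimal sketch of the common pattern: fetch a farm's directory page and extract one URL per wiki subdomain. This is not one of the actual scripts from the repository; the function name, farm domain and HTML snippet are made up for the example.

```python
import re

def extract_wiki_urls(html, farm_domain):
    """Extract unique wiki subdomain URLs for a given farm from an HTML page.

    Illustrative sketch only: most farms expose each wiki as
    <subdomain>.<farm_domain>, so a regex over the directory page is
    usually enough to build a list for launcher.py.
    """
    pattern = re.compile(
        r'https?://([a-z0-9-]+)\.' + re.escape(farm_domain), re.IGNORECASE
    )
    subdomains = sorted({m.group(1).lower() for m in pattern.finditer(html)})
    return ['http://%s.%s' % (s, farm_domain) for s in subdomains]

# In practice the HTML would be fetched from the farm's directory page;
# here an inline snippet stands in for it:
html = '''
<a href="http://pokemon.example-farm.com/wiki/Main_Page">Pokemon</a>
<a href="http://trains.example-farm.com/">Trains</a>
<a href="http://pokemon.example-farm.com/wiki/Help">Help</a>
'''
print(extract_wiki_urls(html, 'example-farm.com'))
# ['http://pokemon.example-farm.com', 'http://trains.example-farm.com']
```

Deduplicating and sorting keeps the list stable across runs, which makes it easy to diff against the copy checked into listsofwikis.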
! width=140px | Wikifarm !! width=80px | Wikis !! Status !! width=80px | Dumps !! Comments
|-
| [[Battlestar Wiki]] ([http://battlestarwiki.org site]) || 8 || {{green|Online}} || 0<ref>[https://archive.org/search.php?query=battlestarwikiorg%20subject%3Awikiteam battlestarwikiorg - dumps]</ref> ||
|-
| [[BluWiki]] ([http://wayback.archive.org/web/20090301060338/http://bluwiki.com/go/Main_Page site]) || ? || {{red|Offline}} || ~20<ref>[https://archive.org/search.php?query=bluwiki%20subject%3Awikiteam bluwiki - dumps]</ref> ||
|-
| [[Communpedia]] ([https://wikiapiary.com/wiki/Communpedia_%28ru%29 site]) || 5 || {{yellow|Unstable}} || 4<ref>[https://archive.org/search.php?query=subject%3A%22Comunpedia%22%20OR%20subject%3A%22Communpedia%22%20OR%20subject%3A%22kommynistru%22 communpedia - dumps]</ref> ||
|-
| [[EditThis]] ([http://editthis.info site]) || 1,350<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/editthis.info editthis.info - list of wikis]</ref> || {{yellow|Unstable}} || 1,307+ (IA: 1,297<ref>[https://archive.org/search.php?query=editthisinfo%20subject%3Awikiteam editthis.info - dumps]</ref>) || Most dumps were done in 2014. This wikifarm is not well covered in WikiApiary.<ref>[https://wikiapiary.com/wiki/Farm:EditThis Farm:EditThis]</ref>
|-
| [[elwiki.com]] ([https://web.archive.org/web/20070917110429/http://www.elwiki.com/ site]) || Unknown<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/elwiki.com elwiki.com - list of wikis]</ref> || {{red|Offline}} || None<ref>[https://archive.org/search.php?query=elwiki%20subject%3Awikiteam elwiki.com - dumps]</ref> || Last seen online in 2008.<ref>[https://web.archive.org/web/20080221125135/http://www.elwiki.com/ We're sorry about the downtime we've been having lately]</ref> There are no dumps; the content is presumably lost. Perhaps [https://web.archive.org/web/form-submit.jsp?type=prefixquery&url=http://elwiki.com/ some pages] are in the Wayback Machine.
|-
| [[Miraheze]] ([https://meta.miraheze.org site]) || 2,319 || {{green|Online}} || 685 (outdated) || Non-profit. Dumps were made in September 2016.
|-
| [[Neoseeker.com]] ([https://neowiki.neoseeker.com site]) || 229<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/neoseeker.com neoseeker.com - list of wikis]</ref> || {{green|Online}} || 159<ref>[https://archive.org/search.php?query=neoseeker+subject%3Awikiteam neoseeker.com - dumps]</ref> || Check why there are dozens of wikis without dumps.
|-
| [[Orain]] ([https://meta.orain.org site]) || 425<ref>[https://raw.githubusercontent.com/WikiTeam/wikiteam/master/listsofwikis/mediawiki/orain.org orain.com - list of wikis]</ref> || {{red|Offline}} || ~380<ref>[https://archive.org/search.php?query=orain%20subject%3Awikiteam orain - dumps]</ref><ref>[https://archive.org/details/wikifarm-orain.org-20130824 Orain wikifarm dump (August 2013)]</ref> || Last seen online in September 2015. Dumps were made in August 2013, January 2014 and August 2015.
|-
| [[Referata]] ([http://www.referata.com site]) || 156<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/referata.com referata.com - list of wikis]</ref> || {{green|Online}} || ~80<ref>[https://archive.org/search.php?query=referata%20subject%3Awikiteam referata.com - dumps]</ref><ref>[https://archive.org/details/referata.com-20111204 Referata wikifarm dump 20111204]</ref><ref>[https://archive.org/details/wikifarm-referata.com-20130824 Referata wikifarm dump (August 2013)]</ref> || Check why there are dozens of wikis without dumps.
|-
| [[ShoutWiki]] ([http://www.shoutwiki.com site]) || 1,879<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/shoutwiki.com shoutwiki.com - list of wikis]</ref> || {{green|Online}} || ~1,300<ref>[https://archive.org/search.php?query=shoutwiki%20subject%3Awikiteam shoutwiki.com - dumps]</ref><ref>[http://www.archive.org/details/shoutwiki.com ShoutWiki wikifarm dump]</ref> || Check why there are dozens of wikis without dumps.
|-
| [[Sourceforge]] || ? || {{green|Online}} || 315<ref>[https://archive.org/search.php?query=sourceforge%20subject%3Awikiteam sourceforge - dumps]</ref> ||
|-
| [[TropicalWikis]] ([http://tropicalwikis.com site]) || 187<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/tropicalwikis.com tropicalwikis.com - list of wikis]</ref> || {{red|Offline}} || 152<ref>[https://archive.org/search.php?query=tropicalwikis%20subject%3Awikiteam tropicalwikis.com - dumps]</ref> || Killed off in November 2013. Allegedly pending a move to [[Orain]] (which later went offline too). Data from February 2013 and earlier saved.
|-
| [[Wik.is]] ([http://wik.is site]) || ? || {{red|Offline}} || ? || Non-MediaWiki.
|-
| [[Wiki-Site]] ([http://www.wiki-site.com site]) || 5,839<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/wiki-site.com wiki-site.com - list of wikis]</ref> || {{green|Online}} || 367 || No uploaded dumps yet.
|-
| [[Wikia]] ([http://www.wikia.com site]) || 400,000<ref>[https://github.com/WikiTeam/wikiteam/blob/master/listsofwikis/mediawiki/wikia.com wikia.com - list of wikis]</ref> || {{green|Online}} || ~34,000<ref>[https://archive.org/details/wikia_dump_20121204 Wikia wikis data dumps]</ref> || [http://community.wikia.com/wiki/Help:Database_download Help:Database download], [https://github.com/Wikia/app/tree/dev/extensions/wikia/WikiFactory/Dumps their dumping code]
|-
| [[WikiHub]] ([http://wikihub.ssu.lt site]) || ? || {{red|Offline}} || 7<ref>[https://archive.org/details/wikifarm-wikihub.ssu.lt-20131110 wikihub - dumps]</ref> ||
|-
| [[Wiki.Wiki]] ([https://wiki.wiki site]) || 100<ref>[https://raw.githubusercontent.com/WikiTeam/wikiteam/master/listsofwikis/mediawiki/wiki.wiki wiki.wiki - list of wikis]</ref> || {{green|Online}} || ? ||
|-
| [[Wikkii]] ([https://web.archive.org/web/20140621054654/http://wikkii.com/wiki/Free_Wiki_Hosting site]) || 3,267 || {{red|Offline}} || 1,300<ref>[https://archive.org/search.php?query=wikkii%20subject%3Awikiteam wikkii.com - dumps]</ref> ||
|-
| [[YourWiki.net]] ([https://web.archive.org/web/20100124003107/http://www.yourwiki.net/wiki/YourWiki site]) || ? || {{red|Offline}} || ? ||
|}


== Wikis to archive ==


Please [https://wikiapiary.com/wiki/Special:FormEdit/Website add a wiki to WikiApiary] if you want someone to archive it sooner or later; or tell us on the #wikiteam channel if it's particularly urgent. Remember that there are thousands of wikis we don't even know about yet.


[https://github.com/WikiTeam/wikiteam/wiki/Tutorial You can help] by downloading wikis yourself. If you don't know where to start, pick a [https://wikiapiary.com/wiki/Category:Website_not_archived wiki which has not been archived yet] from the lists on WikiApiary. You can also edit those pages to link existing dumps; you'll help others focus their work.
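Before pointing the tools at a wiki, you need its api.php endpoint. MediaWiki advertises it in a <code>rel="EditURI"</code> link on every page, which is one way a downloader can discover the API. The helper below is a simplified, hypothetical sketch (the real dumpgenerator.py handles many more cases and attribute orders):

```python
import re

def find_api_endpoint(html):
    """Locate a MediaWiki api.php endpoint from a page's HTML.

    MediaWiki emits <link rel="EditURI" href="//host/w/api.php?action=rsd">;
    this sketch assumes the rel attribute appears before href, which the
    real tools do not rely on.
    """
    m = re.search(r'rel="EditURI"[^>]*href="([^"?]+)', html)
    if not m:
        return None
    url = m.group(1)
    if url.startswith('//'):   # protocol-relative URL
        url = 'http:' + url
    return url

html = ('<link rel="EditURI" type="application/rsd+xml" '
        'href="//wiki.example.org/w/api.php?action=rsd"/>')
print(find_api_endpoint(html))
# http://wiki.example.org/w/api.php
```

Once the API URL is known, it is what you would pass to dumpgenerator.py (see the tutorial linked above for the actual invocation).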


Examples of huge wikis:
* '''[[Wikipedia]]''' - arguably the largest and one of the oldest wikis on the planet. It offers public backups (also for sister projects): http://dumps.wikimedia.org
** They have some mirrors but not many.
** The transfer of the dumps to the Internet Archive is automated and is currently managed by [[User:Hydriz|Hydriz]].
* '''[[Wikimedia Commons]]''' - a wiki of media files available for free usage. It offers public backups: http://dumps.wikimedia.org
** But there is no image dump available, only the image descriptions.
** So we made it! http://archive.org/details/wikimediacommons
* '''[[Wikia]]''' - a website that allows the creation and hosting of wikis. It doesn't make regular backups.


We're trying to decide which [https://groups.google.com/forum/#!topic/wikiteam-discuss/TxzfrkN4ohA other wiki engines] to work on: suggestions needed!


=== BitTorrent downloads ===
You can download and seed the torrents from the archive.org collection. Every item has a "Torrent" link.
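For example, an item's torrent URL can usually be derived from its identifier. The naming below is an observed archive.org convention, not a documented guarantee, and the identifier used is only an example:

```python
def torrent_url(item_identifier):
    """Build the conventional torrent URL for an archive.org item.

    Items typically expose a torrent at
    /download/<id>/<id>_archive.torrent (observed convention).
    """
    return ('https://archive.org/download/%s/%s_archive.torrent'
            % (item_identifier, item_identifier))

print(torrent_url('wiki-archiveteamorg'))
# https://archive.org/download/wiki-archiveteamorg/wiki-archiveteamorg_archive.torrent
```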


</span>


== See also ==
=== Recursive ===
* [[List of wikifarms]]

We also have dumps for our coordination wikis:
* [[ArchiveTeam wiki]] ([https://archive.org/details/wiki-archiveteamorg 2014-03-26])
* [[WikiApiary]] ([https://archive.org/details/wiki-wikiapiarycom_w 2015-03-25])

== Restoring wikis ==

Anyone can restore a wiki using its XML dump and images.

Wikis.cc is [https://www.wikis.cc/wiki/Wikis_recuperados restoring some sites].
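Before a restore, it helps to inspect what a dump actually contains. The sketch below lists page titles from a MediaWiki XML export with the standard library; the actual import is done with MediaWiki's importDump.php maintenance script (plus copying the images directory), and the export namespace version shown is just an example — check the <code>&lt;mediawiki&gt;</code> root element of a real dump:

```python
import xml.etree.ElementTree as ET

# A tiny stand-in for a real dump; real exports are much larger and may
# use a different export-X.Y namespace version.
SAMPLE = '''<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.10/">
  <page><title>Main Page</title><ns>0</ns>
    <revision><text>Welcome!</text></revision>
  </page>
  <page><title>Help:Contents</title><ns>12</ns>
    <revision><text>How to edit.</text></revision>
  </page>
</mediawiki>'''

def list_titles(xml_text):
    """Return page titles from a MediaWiki XML dump, ignoring the
    namespace URI so any export version works."""
    root = ET.fromstring(xml_text)
    return [el.text for el in root.iter() if el.tag.endswith('}title')]

print(list_titles(SAMPLE))
# ['Main Page', 'Help:Contents']
```

Matching on the local tag name rather than the full namespaced tag keeps the helper independent of the export schema version.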


== References ==

== External links ==
* http://wikiheaven.blogspot.com/


 
{{wikis}}

{{Navigation box}}


[[Category:Archive Team]]
[[Category:Wikis| ]]

Revision as of 11:16, 31 July 2017


== Tools and source code ==
=== Official WikiTeam tools ===
=== Other ===

== Wiki dumps ==
Most of our dumps are in the [https://archive.org/details/wikiteam wikiteam collection] at the [[Internet Archive]]. If you want an item to land there, just upload it to the "opensource" collection and remember the "WikiTeam" keyword; it will be moved at some point. When you've uploaded enough wikis, you'll probably be made a collection admin to save others the effort of moving your stuff.

For a manually curated list, visit the download section on GitHub.

There is another collection of MediaWiki dumps on Scott's website.
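A sketch of that upload convention as a metadata dictionary. Only the "opensource" collection and the "wikiteam" keyword come from the text above; the function name, mediatype and title format are illustrative assumptions:

```python
def dump_item_metadata(wiki_domain, dump_date):
    """Metadata for uploading a wiki dump to archive.org (sketch).

    Uploading into "opensource" with a "wikiteam" subject keyword lets
    collection admins find and move the item later; the other fields
    here are example values, not a required schema.
    """
    return {
        'collection': 'opensource',
        'mediatype': 'web',
        'subject': 'wiki;wikiteam',
        'title': 'Wiki - %s (dump of %s)' % (wiki_domain, dump_date),
    }

# With the internetarchive Python library this would be passed as:
#   internetarchive.upload(identifier, files=['dump.xml.7z'],
#                          metadata=dump_item_metadata(...))
print(dump_item_metadata('wiki.example.org', '2017-01-31'))
```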

=== Tips ===
Some tips:
* When downloading Wikipedia/Wikimedia Commons dumps, pages-meta-history.xml.7z and pages-meta-history.xml.bz2 are the same, but 7z tends to be smaller (better compression ratio), so use 7z.
* To download a mass of wikis with N parallel threads, just split your full $list in N chunks, then start N instances of launcher.py (see the tutorial), one for each list.
** If you want to upload dumps as they're ready and clean up your storage: at the same time, in a separate window or screen, run a loop of the kind <code>while true; do ./uploader.py $list --prune-directories --prune-wikidump; sleep 12h; done</code> (the sleep ensures each run has something to do).
** If you want to go advanced and run really many instances, use tmux! Every now and then, attach to the tmux session and look (ctrl-b f) for windows stuck on "is wrong", "is slow" or "......" loops, or which are inactive. Even with a couple of cores you can run a hundred instances; just make sure to have enough disk space for the occasional huge ones (tens of GB).
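The chunk-splitting step above can be sketched in a few lines of Python. The function name is made up, and in practice each chunk would be written to its own file and passed to launcher.py:

```python
def split_list(lines, n):
    """Split a list of wiki URLs into n roughly equal chunks, one per
    launcher.py instance. Round-robin slicing keeps chunk sizes within
    one of each other."""
    chunks = [lines[i::n] for i in range(n)]
    return [c for c in chunks if c]   # drop empty chunks when n > len(lines)

wikis = ['http://w%d.example.org' % i for i in range(7)]
for i, chunk in enumerate(split_list(wikis, 3)):
    # Each chunk would be saved as its own list file for one instance.
    print('list_%d: %d wikis' % (i, len(chunk)))
# list_0: 3 wikis
# list_1: 2 wikis
# list_2: 2 wikis
```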


=== Old mirrors ===
# Sourceforge (also mirrored to another 26 mirrors)
# Internet Archive (direct link to directory)

