Talk:Internet Archive

From Archiveteam
Revision as of 07:18, 5 May 2019 by Bzc6p (talk | contribs) (robots.txt immunity?)

hi guys,

I was thinking of updating the links in the "unofficial bookmarklet and shell function" section of the Uploading page, as they don't seem to be working. Here is what I get when I click on them:

 function ia-save() { curl -s -m 60 -I $* | grep Content-Location | awk '{print "" $2}' }

 javascript:void(open('' + document.location))

So I have found this, which works pretty well and has helped archive many pages on the Internet Archive. I'll go ahead and make the change. Let me know if you know of any additional add-ons!
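For reference, a working replacement might look like the sketch below. It assumes the Wayback Machine's public "Save Page Now" endpoint at https://web.archive.org/save/<url>, which is presumably the URL the broken snippets above lost; the function names (`ia_save_url`, `ia_save`) are my own, and the curl/grep/awk pipeline mirrors the original function:

```shell
#!/bin/sh
# Sketch only: assumes the Save Page Now endpoint
# https://web.archive.org/save/<url> (not confirmed by the thread above).

# Build the capture-request URL for a given page URL.
ia_save_url() {
  printf 'https://web.archive.org/save/%s\n' "$1"
}

# Request a capture and print the location of the archived copy,
# using the same curl -s -m 60 -I | grep | awk pipeline as the original.
ia_save() {
  curl -s -m 60 -I "$(ia_save_url "$1")" \
    | grep -i '^content-location:' \
    | awk '{print "https://web.archive.org" $2}'
}
```

The bookmarklet equivalent, under the same endpoint assumption, would be javascript:void(open('https://web.archive.org/save/' + document.location)).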


When you upload an item via torrent, does IA keep seeding the torrent "forever"? HadeanEon (talk) 17:27, 7 February 2019 (UTC)

It does leech it (tries to download it) for a maximum of 7 days. Also, IIRC, it stops downloading after being idle (no seeds) for 24 hours. (I used to upload stuff via BitTorrent.)
The torrent file you upload won't be seeded. The Archive creates a new torrent, which contains all files of the item. That one is probably seeded forever. bzc6p (talk) 07:17, 5 May 2019 (UTC)

robots.txt immunity?

Apparently, the Wayback Machine is now taking its revenge on robots.txt and increasingly ignoring it.
Is it just me, or did any of you also notice it? (I am extremely glad about it!) --ATrescue (talk) 18:12, 4 May 2019 (UTC)

Not just you. bzc6p (talk) 07:18, 5 May 2019 (UTC)