Anthony Mallet authored
Some sites, most notably github redirecting to amazon s3 storage, and googlecode, refuse HEAD requests (403). For those, fall back to a GET request with a 'Range: bytes=0-0' header. Other sites refuse the curl user agent (403); because of those, a 'robotpkg' user agent is now sent (it is still curl, but what else can be done?). While here, add a NO_MASTER_SITES_CHECK variable to disable the check-master-sites target for a package.
00fe22b5
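The fallback described above can be sketched as a small shell helper. This is an illustration only, not the actual robotpkg code: the `fetch_check` function name and the example URL are assumptions; the curl flags (`-I` for HEAD, `-H` for an extra header, `-A` for the user agent, `-f` to fail on HTTP errors) are standard curl options.

```shell
#!/bin/sh
# Sketch of the probe logic (hypothetical fetch_check helper): check that a
# distfile URL is reachable, falling back to a ranged GET when the server
# refuses HEAD requests (HTTP 403), as S3 redirects and googlecode do.
fetch_check() {
  url="$1"
  # First try a plain HEAD request, with the 'robotpkg' user agent to avoid
  # sites that blacklist the default curl agent string.
  if curl -A robotpkg -s -f -I -o /dev/null "$url"; then
    return 0
  fi
  # HEAD was refused: retry as a GET with 'Range: bytes=0-0' so that at
  # most one byte of the body is actually transferred.
  curl -A robotpkg -s -f -H 'Range: bytes=0-0' -o /dev/null "$url"
}
```

A `Range: bytes=0-0` GET behaves much like HEAD for this purpose: it confirms the resource exists (the server answers 206 or 200) without downloading the file.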