Tags: python, download, urllib2, python-requests, urllib

python requests is slow


I am developing a download manager and using the requests module in Python to check for valid links (and hopefully detect broken ones). My code for checking a link is below:

import requests

url = 'http://pyscripter.googlecode.com/files/PyScripter-v2.5.3-Setup.exe'
r = requests.get(url, allow_redirects=False) # this line takes 40 seconds
if r.status_code==200:
    print("link valid")
else:
    print("link invalid")

Now, the issue is that this check takes approximately 40 seconds, which is huge. My question is: how can I speed this up, maybe by using urllib2 or something else?

Note: if I replace url with a different URL, the check takes about one second, so it appears to be an issue with requests.


Solution

  • Not all hosts support HEAD requests. You can use this instead:

    r = requests.get(url, stream=True)
    

    This actually downloads only the headers, not the response body. Moreover, if the idea is to download the file afterwards, you don't have to make another request.

    See here for more info.