python requests is slow


Solution 1

Not all hosts support HEAD requests. You can use this instead:

r = requests.get(url, stream=True)  # with stream=True, the body is not downloaded yet

This actually downloads only the headers, not the response content. Moreover, if the idea is to fetch the file afterwards, you don't have to make another request.

See the requests documentation on body content workflow for more information.
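
As a rough sketch of how this applies to the link check from the question (the url and allow_redirects=False are taken from there; the explicit close() is my addition, to make sure the connection is released without ever reading the body):

import requests

url = 'http://pyscripter.googlecode.com/files/PyScripter-v2.5.3-Setup.exe'

# stream=True fetches only the status line and headers; the body stays undownloaded
r = requests.get(url, stream=True, allow_redirects=False)
try:
    if r.status_code == 200:
        print("link valid")
    else:
        print("link invalid")
finally:
    r.close()  # release the connection without downloading the body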

Solution 2

Don't use get, which actually retrieves the file; use:

r = requests.head(url, allow_redirects=False)  # HEAD returns only the status line and headers, never the body

This brings the check from 6.9 seconds on my machine down to 0.4 seconds.
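
A minimal sketch of the same link check with a HEAD request (reusing the url from the question; keep in mind that, as Solution 1 notes, some hosts don't support HEAD requests):

import requests

url = 'http://pyscripter.googlecode.com/files/PyScripter-v2.5.3-Setup.exe'

# HEAD asks the server for the status line and headers only
r = requests.head(url, allow_redirects=False)
if r.status_code == 200:
    print("link valid")
else:
    print("link invalid")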


Comments

  • scandalous, almost 2 years ago

    I am developing a download manager and using the requests module in Python to check for a valid link (and, hopefully, broken links). My code for checking a link is below:

    import requests

    url = 'http://pyscripter.googlecode.com/files/PyScripter-v2.5.3-Setup.exe'
    r = requests.get(url, allow_redirects=False)  # this line takes 40 seconds
    if r.status_code == 200:
        print("link valid")
    else:
        print("link invalid")
    

    Now, the issue is that this check takes approximately 40 seconds, which is huge. My question is: how can I speed this up, maybe using urllib2 or something else?

    Note: also, if I replace url with the actual URL, 'http://pyscripter.googlecode.com/files/PyScripter-v2.5.3-Setup.exe', it takes one second, so it appears to be an issue with requests.

  • scandalous, about 11 years ago

    Hey, thanks doukremt... indeed, it was showing 404 even when the item was there in some situations... thanks!