How to limit the download rate of HTTP requests in the Python requests library?
Solution 1
There are several approaches to rate limiting; one of them is the token bucket algorithm, for which recipes are readily available online.
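A minimal token-bucket sketch follows (the class and parameter names are illustrative, not taken from any particular recipe): tokens accrue at a fixed rate up to a capacity, and each consume(n) call blocks until n bytes' worth of tokens are available.

```python
import time


class TokenBucket:
    """Minimal token-bucket rate limiter.

    Tokens (here: bytes) accrue at `rate` per second, up to `capacity`.
    """

    def __init__(self, rate, capacity):
        self.rate = rate            # tokens added per second
        self.capacity = capacity   # maximum tokens the bucket can hold
        self.tokens = capacity     # start full
        self.last = time.monotonic()

    def consume(self, amount):
        """Block until `amount` tokens are available, then spend them."""
        while True:
            now = time.monotonic()
            # Refill based on time elapsed since the last update.
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= amount:
                self.tokens -= amount
                return
            # Sleep just long enough for the missing tokens to accrue.
            time.sleep((amount - self.tokens) / self.rate)
```

Calling consume(len(chunk)) before writing each downloaded chunk caps the average transfer rate at roughly `rate` bytes per second.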
Usually you would want to do throttling or rate limiting on socket.send() and socket.recv(). You could try the socket-throttle package and see if it does what you need.

This is not to be confused with x-ratelimit response headers, which concern the number of requests rather than the download / transfer rate.
Solution 2
There is no built-in support, but it is possible to use the streaming API.
>>> import requests
>>> import time
>>> req = requests.request('GET', 'https://httpbin.org/get', stream=True)
>>> for data in req.iter_content(chunk_size=1024):
...     time.sleep(0.001)
...
The advanced usage section of the documentation notes that streaming lets you retrieve smaller quantities of the response at a time.
On my network, the example above (downloading a file several GB in size) ran at 17.4 MB/s without the sleep and at 2.5 MB/s with the 1 ms sleep.
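A fixed sleep per chunk only throttles indirectly; to target a specific rate (like wget --limit-rate), you can compute how far ahead of schedule the download is and sleep by that amount. The sketch below builds on the same iter_content streaming approach; the function and parameter names are illustrative, not part of the requests API.

```python
import time

import requests  # third-party library this question is about


def pace_delay(bytes_received, elapsed, max_bytes_per_sec):
    """How long to sleep so the average rate stays at or below the cap."""
    expected = bytes_received / max_bytes_per_sec  # seconds we *should* have taken
    return max(0.0, expected - elapsed)


def download_limited(url, path, max_bytes_per_sec=20 * 1024, chunk_size=1024):
    """Stream `url` to `path`, pacing chunks to ~max_bytes_per_sec on average."""
    start = time.monotonic()
    received = 0
    with requests.get(url, stream=True) as resp:
        resp.raise_for_status()
        with open(path, "wb") as fh:
            for chunk in resp.iter_content(chunk_size=chunk_size):
                fh.write(chunk)
                received += len(chunk)
                time.sleep(pace_delay(received,
                                      time.monotonic() - start,
                                      max_bytes_per_sec))
```

This caps the average rate rather than the instantaneous rate: each chunk still arrives at full speed, but the loop sleeps whenever the cumulative transfer gets ahead of the target pace.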
Joe Mornin
Updated on June 07, 2022

Comments
- Joe Mornin almost 2 years: Is it possible to limit the download rate of GET requests using the requests Python library? For instance, with a command like this:

  r = requests.get('https://stackoverflow.com/')

  ...is it possible to limit the download rate? I'm hoping for something similar to this wget command:

  wget --limit-rate=20k https://stackoverflow.com/

  I know it's possible with urllib2. I'm asking specifically about the requests library.
- Dan H over 5 years: Since the OP asked about doing this with the requests library, this doesn't really answer the question... except that you are suggesting he rewrite deep portions of requests to do what he wants.
- dnozay over 5 years: @DanH, you are more than welcome to provide a better answer; at the time of writing, the answer was "no, unless you mess with the internals", which isn't helpful to the OP.