How to retry urllib2.request when it fails?

Solution 1

I would use a retry decorator. There are other ones out there, but this one works pretty well. Here's how you can use it:

import urllib2

@retry(urllib2.URLError, tries=4, delay=3, backoff=2)
def urlopen_with_retry():
    return urllib2.urlopen("http://example.com")

This will retry the function whenever URLError is raised. With these parameters it makes a maximum of 4 attempts, with an exponential backoff delay that doubles after each failure: 3 seconds, 6 seconds, then 12 seconds.
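For reference, a minimal sketch of what such a `retry(exception, tries, delay, backoff)` decorator could look like. This is an assumed reconstruction matching the signature used above, not the recipe's exact code:

```python
import functools
import time


def retry(exception_to_check, tries=4, delay=3, backoff=2):
    """Retry the wrapped callable when exception_to_check is raised.

    tries: total number of attempts; delay: initial sleep in seconds;
    backoff: multiplier applied to the delay after each failure.
    """
    def deco_retry(f):
        @functools.wraps(f)
        def f_retry(*args, **kwargs):
            mtries, mdelay = tries, delay
            while mtries > 1:
                try:
                    return f(*args, **kwargs)
                except exception_to_check:
                    time.sleep(mdelay)
                    mtries -= 1
                    mdelay *= backoff
            # last attempt: let any exception propagate to the caller
            return f(*args, **kwargs)
        return f_retry
    return deco_retry
```

With `tries=4, delay=3, backoff=2` this produces the 3 s, 6 s, 12 s schedule described above.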

Solution 2

There are a few libraries out there that specialize in this.

One is backoff, which is designed with a particularly functional sensibility. Decorators are passed arbitrary callables returning generators which yield successive delay values. A simple exponential backoff whose delay is capped at 32 seconds could be defined as:

import backoff
import urllib2

@backoff.on_exception(backoff.expo,
                      urllib2.URLError,
                      max_value=32)
def url_open(url):
    return urllib2.urlopen(url)

Another is retrying which has very similar functionality but an API where retry parameters are specified by way of predefined keyword args.

Solution 3

To retry on timeout you could catch the exception as @Karl Barker suggested in the comment:

import socket
from urllib2 import urlopen, URLError

assert ntries >= 1
for _ in range(ntries):
    try:
        page = urlopen(request, timeout=timeout)
        break  # success
    except URLError as err:
        if not isinstance(err.reason, socket.timeout):
            raise  # propagate non-timeout errors
        last_err = err  # keep a reference; the except target is unbound after the block in Python 3
else:  # all ntries attempts timed out
    raise last_err  # re-raise the last timeout error
# use page here

Solution 4

For Python 3, using the urllib3 library:

from urllib3 import Retry, PoolManager


retries = Retry(connect=5, read=2, redirect=5, backoff_factor=0.1)
http = PoolManager(retries=retries)
response = http.request('GET', 'http://example.com/')

If the backoff_factor is 0.1, then the sleep between retries will be [0.0s, 0.2s, 0.4s, ...]. It will never be longer than Retry.BACKOFF_MAX. urllib3 will sleep for:

    {backoff factor} * (2 ** ({number of total retries} - 1))
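As a quick check of that formula, here is a small sketch reproducing the documented schedule (`backoff_delays` is a hypothetical helper for illustration, not part of urllib3; the zero-sleep first retry is taken from the documented behaviour quoted above):

```python
def backoff_delays(backoff_factor, retries):
    """Reproduce urllib3's documented sleep schedule: the first retry
    sleeps 0s, then backoff_factor * 2**(n - 1) for retry number n."""
    delays = []
    for n in range(1, retries + 1):
        if n == 1:
            delays.append(0.0)  # no backoff before the first retry
        else:
            delays.append(backoff_factor * (2 ** (n - 1)))
    return delays

print(backoff_delays(0.1, 4))  # [0.0, 0.2, 0.4, 0.8]
```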
Author: iTayb

Updated on September 02, 2020

Comments

  • iTayb
    iTayb almost 4 years

    When urllib2.request reaches timeout, a urllib2.URLError exception is raised. What is the pythonic way to retry establishing a connection?

  • e-satis
    e-satis over 12 years
    This is a really cool snippet. Do you know an alternative, but as a context manager ?
  • jterrace
    jterrace over 12 years
    Hmm, I think you could probably rewrite it as a context manager pretty easily, but I don't have one offhand.
  • e-satis
    e-satis over 12 years
It's not easy to do, since there is no easy way to capture the block inside the with statement. You need some deep introspection.
  • jterrace
    jterrace over 12 years
    No, I don't think that's true. Exceptions are re-raised inside a context manager after the yield.
  • e-satis
    e-satis over 12 years
The problem is not the exception, but the code raising the exception. How do you retry code if you can't run it again? There is no notion of an anonymous block in Python. It's doable, but not intuitive.
  • jterrace
    jterrace over 12 years
    Ah, I see. It would be even harder if the calling code was not a with block. I have no idea how to do that.