How to retry urllib2.request when it fails?

How can I retry a urllib2.request when it fails?


I would use a retry decorator. There are various ones out there, but the one shown here works pretty well. Here's how you can use it:

@retry(urllib2.URLError, tries=4, delay=3, backoff=2)
def urlopen_with_retry():
    return urllib2.urlopen("http://example.com")

This will retry the function if URLError is raised. It makes up to 4 attempts, with an exponential backoff delay that doubles after each failure: 3 seconds, then 6 seconds, then 12 seconds.
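The decorator itself isn't reproduced here, but a minimal sketch of one with the tries/delay/backoff semantics used above would look something like this (real implementations typically add logging and more options):

import time
from functools import wraps

def retry(ExceptionToCheck, tries=4, delay=3, backoff=2):
    # Retry the decorated function when ExceptionToCheck is raised,
    # sleeping `delay` seconds after a failure and multiplying the
    # delay by `backoff` each time, for at most `tries` attempts.
    def deco_retry(f):
        @wraps(f)
        def f_retry(*args, **kwargs):
            mtries, mdelay = tries, delay
            while mtries > 1:
                try:
                    return f(*args, **kwargs)
                except ExceptionToCheck:
                    time.sleep(mdelay)
                    mtries -= 1
                    mdelay *= backoff
            return f(*args, **kwargs)  # final attempt; exceptions propagate
        return f_retry
    return deco_retry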


There are a few libraries out there that specialize in this.

One is backoff, which is designed with a particularly functional sensibility: decorators are passed arbitrary callables that return generators, which in turn yield successive delay values. A simple exponential backoff, with the delay between retries capped at 32 seconds, could be defined as:

import backoff
import urllib2

@backoff.on_exception(backoff.expo,
                      urllib2.URLError,
                      max_value=32)
def url_open(url):
    return urllib2.urlopen(url)
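Because the first argument is just a callable returning a generator, you can supply your own delay schedule; fibo_waits below is a hypothetical custom generator written for illustration (backoff also ships ready-made ones such as backoff.fibo):

import backoff
import urllib2

def fibo_waits():
    # hypothetical custom wait generator: yields Fibonacci delays in seconds
    a, b = 1, 1
    while True:
        yield a
        a, b = b, a + b

@backoff.on_exception(fibo_waits, urllib2.URLError, max_tries=5)
def url_open(url):
    return urllib2.urlopen(url)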

Another is retrying, which has very similar functionality, but an API where retry parameters are specified by way of predefined keyword arguments.
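As a rough sketch of what that looks like, assuming the same goal as the backoff example above (the keyword arguments are retrying's predefined parameters, but the values here are illustrative):

from retrying import retry
import urllib2

def retry_on_urlerror(exc):
    # only retry on URLError; anything else propagates immediately
    return isinstance(exc, urllib2.URLError)

@retry(retry_on_exception=retry_on_urlerror,
       wait_exponential_multiplier=1000,  # first wait ~1 s, doubling after that
       wait_exponential_max=32000,        # cap waits at 32 s (values in ms)
       stop_max_attempt_number=4)
def url_open(url):
    return urllib2.urlopen(url)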


To retry on timeout, you could catch the exception, as @Karl Barker suggested in the comments:

import socket
from urllib2 import urlopen, URLError

# request, timeout, and ntries are assumed to be defined by the caller
assert ntries >= 1
for _ in range(ntries):
    try:
        page = urlopen(request, timeout=timeout)
        break  # success
    except URLError as err:
        if not isinstance(err.reason, socket.timeout):
            raise  # propagate non-timeout errors
else:  # all ntries failed
    raise err  # re-raise the last timeout error
# use page here