Check if the internet cannot be accessed in Python


Solution 1

You should wrap the request in a try/except block so that you catch the error and can tell the user what went wrong.

import urllib2
from urllib2 import HTTPError, URLError

# "req" is the prepared request (or URL string) you want to fetch.
try:
    u = urllib2.urlopen(req)
except HTTPError as e:
    # the server responded, but with an error status code
    print "The server couldn't fulfil the request (error code %d)." % e.code
except URLError as e:
    # the request never reached a server (DNS failure, refused connection, ...)
    print "Failed to reach the server: %s" % e.reason
except Exception as e:
    # something else went wrong
    print "An unexpected error occurred: %s" % e

Solution 2

urllib2 - The Missing Manual has a good section on how to handle URLError and HTTPError exceptions and how to differentiate the conditions that caused them.
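For reference, here is a minimal sketch along those lines (not the manual's exact code): since HTTPError is a subclass of URLError, a single except clause can cover both, and the two cases can be told apart by their attributes. "req" stands for your prepared request or URL.

import urllib2

try:
    response = urllib2.urlopen(req)
except urllib2.URLError as e:
    if hasattr(e, 'code'):
        # an HTTPError: the server answered, but with an error status
        print "The server couldn't fulfil the request. Error code:", e.code
    elif hasattr(e, 'reason'):
        # we never reached a server at all
        print "We failed to reach a server. Reason:", e.reason
else:
    # everything went fine; use the response here
    body = response.read()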

Solution 3

How about catching URLError, then testing the reason attribute? If the reason isn't one you're interested in, re-raise the URLError and handle it somewhere else.
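A rough sketch of that idea, assuming the reason you care about is a failed DNS lookup (urllib2 wraps the underlying socket.gaierror in the URLError's reason attribute):

import socket
import urllib2

def fetch(req):
    try:
        return urllib2.urlopen(req)
    except urllib2.URLError as e:
        reason = getattr(e, 'reason', None)
        if isinstance(reason, socket.gaierror):
            # DNS lookup failed -- most likely the network is down
            print "The server cannot be reached; your network connection may be down."
            return None
        raise  # not a reason we care about here; let a caller handle it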

Alternatively, you could try httplib2. Its ServerNotFoundError exception would probably suit your needs.
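A minimal sketch with httplib2 (the URL here is just a placeholder):

import httplib2

h = httplib2.Http()
try:
    response, content = h.request("http://example.com/")
except httplib2.ServerNotFoundError:
    print "The server cannot be reached; it may be down or your network may be offline."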

Author: Sridhar Ratnakumar

Updated on June 04, 2022

Comments

  • Sridhar Ratnakumar almost 2 years ago

    I have an app that makes an HTTP GET request to a particular URL on the internet. But when the network is down (say, no public Wi-Fi, or my ISP is down, or some such thing), I get the following traceback at urllib2.urlopen:

    70, in get
        u = urllib2.urlopen(req)
      File "/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 126, in urlopen
        return _opener.open(url, data, timeout)
      File "/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 391, in open
        response = self._open(req, data)
      File "/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 409, in _open
        '_open', req)
      File "/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 369, in _call_chain
        result = func(*args)
      File "/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 1161, in http_open
        return self.do_open(httplib.HTTPConnection, req)
      File "/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 1136, in do_open
        raise URLError(err)
    URLError: <urlopen error [Errno 8] nodename nor servname provided, or not known>
    

    I want to print a friendly error message telling the user that his network may be down, instead of this unfriendly "nodename nor servname provided" message. Sure, I can catch URLError, but that would catch every URL error, not just the ones related to network downtime.

    I am not a purist, so even an error message like "The server example.com cannot be reached; either the server is indeed having problems or your network connection is down" would be nice. How do I go about selectively catching such errors? (For a start, if DNS resolution fails at urllib2.urlopen, can that reasonably be taken to mean the network is inaccessible? If so, how do I catch it in the except block?)