Tags: python, python-2.7, exception, urllib2

Can't catch exceptions with urllib2


I have a script printing out the response from an API, but I can't seem to catch any exceptions. I think I've gone through every question asked on this topic without any luck.

How can I check if the script will catch any errors/exceptions?

I'm testing the script on a site I know returns 403 Forbidden, but it doesn't show.

My script:

import urllib2

url_se = 'http://www.example.com'

opener = urllib2.build_opener()
opener.addheaders = [('User-agent', 'API to File')]

try:
    request = opener.open(url_se)
except urllib2.HTTPError, e:
    print e.code
except urllib2.URLError, e:
    print e.args
except Exception:
    import traceback
    print 'Generic exception ' + traceback.format_exc()

response = request.read()
print response

Is this the right approach? What's the best practice for catching exceptions concerning urllib2?

Solution

  • There is a bug in your program: if any exception occurs in the try block, the variable request is never assigned, so the response = request.read() line that follows raises a NameError. Correct it as follows:

    import traceback
    import urllib2
    
    url_se = 'http://www.example.com'
    
    opener = urllib2.build_opener()
    opener.addheaders = [('User-agent', 'API to File')]
    
    try:
        request = opener.open(url_se)
        response = request.read()
        print response
    except urllib2.HTTPError as e:
        # Raised for HTTP error status codes such as 403 Forbidden
        print e.code
    except urllib2.URLError as e:
        # Raised for network-level failures (DNS errors, refused connections, ...)
        print e.args
    except Exception:
        print 'Generic exception: ' + traceback.format_exc()
    

    Test it on your machine and you will surely see the exceptions. Catching exceptions individually only makes sense if you are doing something specific with each of them; otherwise, if you only want to log the exceptions, a universal except block will do the job, as sketched below.
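
    A minimal sketch of the two styles side by side, reusing the placeholder URL and User-agent header from the question; the logging setup and the log message are illustrative additions, not part of the original script:

    import logging
    import urllib2

    logging.basicConfig()

    url_se = 'http://www.example.com'

    opener = urllib2.build_opener()
    opener.addheaders = [('User-agent', 'API to File')]

    try:
        print opener.open(url_se).read()
    except urllib2.HTTPError as e:
        # "Something specific": an HTTPError is itself a file-like response,
        # so the status code and the error body the server sent are both usable.
        print 'Server returned %d: %s' % (e.code, e.read())
    except Exception:
        # Universal fallback: logging.exception records the message together
        # with the full traceback of whatever else went wrong.
        logging.exception('Request to %s failed', url_se)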