
python, urllib2, crashes on 404 error


I have a program that grabs content from URLs stored in a database, using BeautifulSoup and urllib2. When I output the result, I see that the program crashes when it encounters (what looks like) a 403 error. How do I prevent my program from crashing on 403/404 and similar errors?

Relevant output:

Traceback (most recent call last):
  File "web_content.py", line 29, in <module>
    grab_text(row) 
  File "web_content.py", line 21, in grab_text
    f = urllib2.urlopen(row)
  File "/usr/lib/python2.7/urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.7/urllib2.py", line 400, in open
    response = meth(req, response)
  File "/usr/lib/python2.7/urllib2.py", line 513, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.7/urllib2.py", line 438, in error
    return self._call_chain(*args)
  File "/usr/lib/python2.7/urllib2.py", line 372, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 521, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
urllib2.HTTPError: HTTP Error 403: Forbidden

Solution

  • You can surround the request with a try/except, e.g.

    try:
        urllib2.urlopen(url)
    except urllib2.HTTPError, e:
        print e
    

    See http://www.voidspace.org.uk/python/articles/urllib2.shtml#handling-exceptions for some good examples and information.
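    Applied to the loop in the question, the idea is to catch the error per URL, log it, and move on so one bad row can't kill the whole run. Here is a minimal sketch; the helper name `grab_text_safe` and the 2/3-compatibility imports are my own additions, not from the original `web_content.py`:

    ```python
    try:
        # Python 2, as in the traceback above
        from urllib2 import urlopen, HTTPError, URLError
    except ImportError:
        # Python 3 moved the same names here
        from urllib.request import urlopen
        from urllib.error import HTTPError, URLError

    def grab_text_safe(url):
        """Return the page body, or None if the URL can't be fetched."""
        try:
            return urlopen(url).read()
        except HTTPError as e:
            # The server answered, but with an error status (403, 404, ...)
            print("Skipping %s: %s" % (url, e))
            return None
        except URLError as e:
            # We never got an answer (bad hostname, refused connection, ...)
            print("Skipping %s: %s" % (url, e.reason))
            return None
    ```

    Note that `HTTPError` is a subclass of `URLError`, so it must be listed first; catching only `URLError` would also work, but you lose easy access to the status code.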