I am facing an SSLError when trying to scrape websites.
import requests

url = 'https://www.amazon.com/'
page = requests.get(url)
content = page.content
print(content)
Output:

~\Anaconda3\lib\site-packages\requests\adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    429             except (_SSLError, _HTTPError) as e:
    430                 if isinstance(e, _SSLError):
--> 431                     raise SSLError(e, request=request)
    432                 elif isinstance(e, ReadTimeoutError):
    433                     raise ReadTimeout(e, request=request)
SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1076)
There is a workaround that works (disabling certificate verification):
import requests

url = 'https://www.amazon.com/'
page = requests.get(url=url, verify=False)
content = page.content
print(content)
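Rather than turning verification off, the usual fix is to tell requests which CA bundle to trust. A minimal sketch of the two mechanisms requests supports (the `verify` parameter and the `REQUESTS_CA_BUNDLE` environment variable) — here `certifi.where()` is used as a stand-in path; in practice you would substitute the PEM file from your IT department:

```python
import os
import certifi

# certifi.where() is the CA bundle that requests uses by default.
# Replace ca_bundle with the path to your corporate root certificate
# (PEM format) to make verification succeed behind a TLS-intercepting proxy.
ca_bundle = certifi.where()

# Option 1: per call
#   requests.get('https://www.amazon.com/', verify=ca_bundle)

# Option 2: process-wide; requests picks this variable up automatically
os.environ['REQUESTS_CA_BUNDLE'] = ca_bundle
print(ca_bundle)
```

Either way keeps verification enabled, unlike `verify=False`.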
But I would love to properly fix the certificate issue instead of disabling verification!
I have updated everything including requests, reinstalled Anaconda3, and checked the site's certificate with https://www.ssllabs.com - it is fine.
System info: Windows 10, pip 20.0.2, Anaconda3, Python 3.7.
Any idea what that damn _ssl.c:1076 error actually refers to and how to fix it?
Thanks a ton in advance
Update: it was the 'wonderful' Zscaler installed by global IT, as company policy now requires. I got the certificate from the IT department and am now struggling to install it into Anaconda3 - hence posting this question here.
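One way to "install" the corporate root certificate for Anaconda's Python is to append its PEM block to the certifi bundle that requests reads. A sketch, assuming the file from IT is saved as `zscaler_root.pem` (a hypothetical name) and is in PEM format (starts with `-----BEGIN CERTIFICATE-----`):

```python
import os
import shutil
import certifi

cert_path = 'zscaler_root.pem'   # hypothetical: the PEM file from IT
bundle = certifi.where()         # the CA bundle requests trusts by default

if os.path.exists(cert_path):
    shutil.copy(bundle, bundle + '.bak')   # keep a backup of the bundle
    with open(cert_path) as src, open(bundle, 'a') as dst:
        dst.write('\n' + src.read())       # append the corporate root cert
else:
    print('Place the corporate root certificate at', cert_path, 'first')
```

Note that upgrading or reinstalling certifi overwrites this file, so setting `REQUESTS_CA_BUNDLE` to a separate bundle that you maintain yourself is the more durable option.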