I've got an app using the requests module to handle connecting to a remote webserver. This works perfectly, but I want to deploy it within an organisation that uses an enterprise proxy server. The machines in the organisation have the proxy configured at the operating system level (i.e. the Windows system proxy settings).
I'd prefer to have my app automatically use the already configured OS proxy settings, rather than have to ask them for the info (especially as they use basic authentication, so I'd have to securely store a username/password, not just the proxy host/port).
Does Requests automatically use the operating system's proxy settings if you do not specify a proxy directly yourself?
I couldn't find a definitive answer to this after reading the documentation for Requests, or for the underlying urllib3.
On my dev machine I don't have a proxy to test with, and so would like to know the answer before I go and code manual proxy handling in my app that might not actually be necessary...
As a point of comparison, urllib does do this - see https://docs.python.org/3/library/urllib.request.html#urllib.request.ProxyHandler - if no proxy is specified, it will use the system-configured one.
It seemed from my initial review of Requests' documentation that it didn't use the system configuration, instead only using environment variables if they were set: https://2.python-requests.org/en/master/user/advanced/#proxies
But, after a bit more digging, I found a way to at least obtain the OS proxy configuration, using urllib.request.getproxies(): https://stackoverflow.com/a/16311657/9423009
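For reference, that function is easy to inspect on any machine; a minimal sketch (the printed dict depends entirely on the machine's settings, and the example value in the comment is made up):

```python
# Inspect what Python can see of the OS/environment proxy configuration.
# On Linux this reflects environment variables; on Windows it also consults
# the registry, and on macOS the System Configuration Framework.
from urllib.request import getproxies

proxies = getproxies()
print(proxies)  # e.g. {'http': 'http://proxy.corp.example:8080'}, or {} if none
```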
At this point I thought I'd at least be able to use the above at run time to get the OS proxy config and pass that to requests...

...but then I found this post, which states that requests will use the OS-level configuration if nothing is specified: How to use requests library without system-configured proxies

So, at this point, I can't find a definitive answer in the documentation for either requests or urllib3, but I do have an SO post stating that requests will use the OS-level config, by calling urllib.request.getproxies() itself.
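In case it's useful, the fallback I had in mind would look something like this (an untested sketch; it assumes the requests package is installed, and the proxy values are whatever getproxies() reports on the target machine):

```python
# Fallback sketch: explicitly hand the OS proxy settings to requests.
from urllib.request import getproxies

import requests

session = requests.Session()
# Merge whatever the OS/environment reports into the session's proxy map.
session.proxies.update(getproxies())
# session.get("https://example.com")  # would now go through the proxy, if any
```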
...so can anyone confirm/deny this is the case?
thanks!
There are two aspects to your question.
As of version requests==2.25.1, from the Session.request source: if not provided, proxy information is obtained from self.merge_environment_settings:
if self.trust_env:
    # Set environment's proxies.
    no_proxy = proxies.get('no_proxy') if proxies is not None else None
    env_proxies = get_environ_proxies(url, no_proxy=no_proxy)
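You can observe this merging from the outside without sending any request; a small sketch (the output depends on your environment, and http://example.com is just a placeholder URL):

```python
import requests

session = requests.Session()
print(session.trust_env)  # True by default, so environment settings are consulted

# merge_environment_settings is the method quoted above; calling it directly
# shows which proxies would be merged in for a given URL.
settings = session.merge_environment_settings(
    url="http://example.com", proxies={}, stream=None, verify=None, cert=None
)
print(settings["proxies"])  # whatever proxies the environment provides, if any
```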
And get_environ_proxies uses getproxies, which is imported either from urllib (Python 2) or from urllib.request (Python 3).

So the answer is YES.
As far as I know, "the OS configured one" is not reliable on Windows. At least on my corporate machine, urllib.request.getproxies does not pick up the proxy. Its documentation (and that of ProxyHandler) states:
If no proxy environment variables are set, then in a Windows environment proxy settings are obtained from the registry’s Internet Settings section, and in a Mac OS X environment proxy information is retrieved from the OS X System Configuration Framework.
From the source code I see that it reads, under HKEY_CURRENT_USER > Software\Microsoft\Windows\CurrentVersion\Internet Settings, the values of ProxyEnable and ProxyServer. On my machine, which does have a proxy configured, these are empty - the settings seem to be stored elsewhere, in Internet Explorer / the .NET stack.
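To check what those registry values contain on a given machine, here is a small sketch (Windows-only; it returns None on other platforms or when the values are missing):

```python
import sys


def read_windows_proxy():
    """Read ProxyEnable/ProxyServer from the registry (Windows only).

    Returns (enabled, server) or None when not on Windows or the values
    cannot be read - which is exactly the situation described above.
    """
    if sys.platform != "win32":
        return None
    import winreg  # stdlib, Windows only
    path = r"Software\Microsoft\Windows\CurrentVersion\Internet Settings"
    try:
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, path) as key:
            enabled, _ = winreg.QueryValueEx(key, "ProxyEnable")
            server, _ = winreg.QueryValueEx(key, "ProxyServer")
            return bool(enabled), server
    except OSError:
        return None


print(read_windows_proxy())
```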
Note that very often in corporate environments the proxy is set from a .pac (proxy auto-configuration) script rather than a fixed host/port, which these registry values do not reflect.
So to conclude: on Windows, at least as of today, we cannot reliably trust urllib.request.getproxies. This is why I developed envswitch, to make it extremely easy for me and my colleagues to switch all the proxy-related environment variables in one click, back and forth (home/train/plane vs office). At least urllib (and requests) use them reliably when they are set. (Note: the tool works fine even if there is a "build failed" badge on its doc page :) )
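For illustration, once the environment variables are set (by a tool or by hand), urllib picks them up immediately; the host, port and credentials below are made-up placeholders:

```python
import os
from urllib.request import getproxies

# Made-up corporate proxy; substitute your real host, port and credentials.
os.environ["http_proxy"] = "http://user:pass@proxy.corp.example:8080"
os.environ["https_proxy"] = "http://user:pass@proxy.corp.example:8080"
os.environ["no_proxy"] = "localhost,127.0.0.1"

print(getproxies())  # now contains at least the 'http' and 'https' entries above
```

Since requests consults the same environment (via get_environ_proxies, as shown in the first part of the answer), a plain requests.get() would also honour these variables.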