I am facing a strange issue. I was asked to do a POC, so I wrote a quick and dirty console application. It had to hit a URL and parse a certain part of the response; let's suppose that URL is www.google.co.uk. The code ran fine in the console app. I then ported it to my web application (ASP.NET 4.5, Web API controller method). The same code now throws an error as soon as it tries to hit the URL. The code is as follows:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(new Uri(contentDeliveryURL));
request.CookieContainer = cookieContainer;
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
The error was: "Unable to connect to the remote server: No connection could be made because the target machine actively refused it."
It fails on the GetResponse() call in the Web API controller. I know it has to do with the proxy, because when I hit a URL hosted inside our corporate network, the code ran fine. My questions are: why did I not have to provide any proxy information in the desktop application, while the web application needs it? How does the desktop application read the proxy information? I also elevated the app pool identity to run under my own account, and it still did not work (suggestions about changing the user the app pool runs under are welcome).
An application reads the default proxy setting, which is bound to the user's profile (the simplest way to set it interactively is via IE). A .NET app honors that setting, and many other applications honor the very same setting: Fiddler, Chrome, Firefox, etc. An application that does not honor it can actually be considered incompatible (Java applications, for example).
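As a quick sanity check, you can ask .NET which proxy it resolves for your target URL under whatever identity the code runs as; this is a minimal sketch, using www.google.co.uk from the question as the example destination:

// Ask the framework which proxy (if any) it would use for this destination
// under the current identity. GetProxy returns the destination itself when
// the request would go direct (no proxy).
Uri target = new Uri("http://www.google.co.uk");
IWebProxy defaultProxy = WebRequest.DefaultWebProxy;
Uri proxyUri = defaultProxy.GetProxy(target);
Console.WriteLine(proxyUri == target
    ? "No proxy configured - going direct"
    : "Using proxy: " + proxyUri);

Run that both in the console app and inside the Web API controller; if the results differ, the app pool identity is not picking up the proxy setting your interactive user has.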
The catch is "bound to the user's profile". A service that runs under a special user account does not necessarily have the same setting as your interactive user (or it may have no proxy setting at all).
The safest solution is to set the proxy in code for your .NET HTTP classes. Of course, you can read the proxy address from a config file instead of hardcoding it.
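For example, something along these lines; the appSettings key "ProxyAddress" and the proxy URL are assumptions, substitute whatever fits your setup:

using System;
using System.Configuration;
using System.Net;

// Hypothetical appSettings key, e.g. <add key="ProxyAddress" value="http://proxy.mycorp.local:8080" />
string proxyAddress = ConfigurationManager.AppSettings["ProxyAddress"];

var proxy = new WebProxy(proxyAddress)
{
    // Pass the app pool identity's credentials if the proxy requires authentication
    Credentials = CredentialCache.DefaultNetworkCredentials
};

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(new Uri(contentDeliveryURL));
request.Proxy = proxy;
request.CookieContainer = cookieContainer;

using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    // read and parse the response as before
}

This way the web application no longer depends on whatever WinINET proxy settings happen to exist for the app pool account.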