I'm writing a program that reads data from a series of web pages. It works for the first page, but then hangs while waiting for a response from the next page until the request times out. Whether I start by reading page 1, 2, 140, etc., it always reads the first page successfully but none after it.
I think this might be related to the "cookieHeader" variable, which is needed to access the pages because the website requires a user to log in first. However, when I inspect the cookie, its expiration time is set to the following day, so I don't see how it could already be expired.
I am new to this, so I'm hoping someone who has run into this problem before, or who has a better understanding of cookies, can help. I'd appreciate any input! Below is a snippet of the code; the timeout exception is being caught by the try/catch block.
// loop through each page
for (int i = 1; i <= totalPages; i++)
{
    string thisUrl = chatUrl + i; // add page number to url
    WebRequest getReq = WebRequest.Create(thisUrl);
    getReq.Headers.Add("Cookie", cookieHeader);
    try
    {
        WebResponse getResp = getReq.GetResponse();
        Console.WriteLine("Page " + i + " read successfully");
    }
    catch (Exception e)
    {
        Console.WriteLine("Page " + i + " failed");
    }
}
I think this is caused by not closing the response properly. WebResponse implements IDisposable, so dispose each response with a using statement. Also, on some client profiles of .NET there is a default limit of two simultaneous connections to the same server. If you never dispose your responses, the underlying connections are kept open (especially with keep-alive enabled), so once the limit is reached the next request blocks until it times out, which matches the behavior you're seeing.
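As a sketch of the fix, here is the loop from the question rewritten so each response (and its stream) is disposed before the next request is made. It assumes chatUrl, totalPages, and cookieHeader are defined as in your code, and that the files have using directives for System, System.IO, and System.Net:

```csharp
// loop through each page, disposing each response so its
// connection is returned to the pool before the next request
for (int i = 1; i <= totalPages; i++)
{
    string thisUrl = chatUrl + i; // add page number to url
    WebRequest getReq = WebRequest.Create(thisUrl);
    getReq.Headers.Add("Cookie", cookieHeader);
    try
    {
        // the using statements call Dispose() even if reading throws
        using (WebResponse getResp = getReq.GetResponse())
        using (StreamReader reader = new StreamReader(getResp.GetResponseStream()))
        {
            // consume the body so the keep-alive connection can be reused
            string body = reader.ReadToEnd();
            Console.WriteLine("Page " + i + " read successfully");
        }
    }
    catch (Exception e)
    {
        Console.WriteLine("Page " + i + " failed: " + e.Message);
    }
}
```

If you ever do need more simultaneous connections to one host, you can raise ServicePointManager.DefaultConnectionLimit, but for a sequential loop like this, disposing the responses should be the real fix.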