Tags: c#, parallel-processing, webclient, parallel.foreach, downloadstring

C# Download data from huge list of urls


I have a huge list of web pages that display a status I need to check. Some of the URLs are on the same site; the rest are located on other sites.

Right now I'm trying to do this in parallel with code like the following, but I have the feeling I'm causing too much overhead.

while (ListOfUrls.Count > 0)
{
    Parallel.ForEach(ListOfUrls, url =>
    {
        WebClient webClient = new WebClient();
        webClient.DownloadString(url);
        // ... run my checks here ...
    });

    ListOfUrls = GetNewUrls.....
}

Can this be done with less overhead, and with more control over how many WebClients and connections I use/reuse, so that in the end the job gets done faster?


Solution

  • Parallel.ForEach is a good fit for CPU-bound computational work, but it will unnecessarily block thread-pool threads on synchronous IO-bound calls like DownloadString in your case. You can improve the scalability of your code and reduce the number of threads it uses by switching to DownloadStringTaskAsync and tasks instead:

    // non-blocking async method
    async Task<string> ProcessUrlAsync(string url)
    {
        using (var webClient = new WebClient())
        {
            string data = await webClient.DownloadStringTaskAsync(new Uri(url));
            // run checks here.. 
            return data;
        }
    }
    
    // ...
    
    if (ListOfUrls.Count > 0) {
        var tasks = new List<Task>();
        foreach (var url in ListOfUrls)
        {
            tasks.Add(ProcessUrlAsync(url));
        }
    
        Task.WaitAll(tasks.ToArray()); // blocking wait
    
        // could use await here and make this method async:
        // await Task.WhenAll(tasks.ToArray());
    }
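
    As for controlling how many downloads run at once (the other part of your question), one common approach is to throttle the async tasks with a SemaphoreSlim. The sketch below is just one way to do it, assuming the ProcessUrlAsync method defined above and an illustrative maxConcurrency of 10 (it needs using System.Linq, System.Threading, and System.Threading.Tasks):

    // Sketch: limit the number of concurrent downloads with SemaphoreSlim.
    // maxConcurrency is an arbitrary example value; tune it for your sites.
    async Task ProcessAllUrlsAsync(IEnumerable<string> listOfUrls, int maxConcurrency = 10)
    {
        using (var throttler = new SemaphoreSlim(maxConcurrency))
        {
            var tasks = listOfUrls.Select(async url =>
            {
                await throttler.WaitAsync();      // wait for a free slot
                try
                {
                    await ProcessUrlAsync(url);   // the async method defined above
                }
                finally
                {
                    throttler.Release();          // free the slot for the next url
                }
            }).ToList();

            await Task.WhenAll(tasks);
        }
    }

    This keeps at most maxConcurrency requests in flight at any time, which also bounds how many WebClient instances exist concurrently, without blocking any thread-pool threads while the downloads are pending.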