I am building an app that downloads a large number of images, sometimes 1500-5000 depending on what the user requests. To do this, I am using AFNetworking 2. At first, I was just looping through all of my URLs and making a request for each one:
for (NSString *urlString in urls) {
    NSURL *url = [NSURL URLWithString:urlString];
    NSURLRequest *request = [NSURLRequest requestWithURL:url];
    AFHTTPRequestOperation *requestOperation = [[AFHTTPRequestOperation alloc] initWithRequest:request];
    requestOperation.responseSerializer = [AFImageResponseSerializer serializer];
    [requestOperation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *operation, id responseObject) {
        completion(responseObject);
    } failure:^(AFHTTPRequestOperation *operation, NSError *error) {
        failure(error);
    }];
    [requestOperation setDownloadProgressBlock:^(NSUInteger bytesRead, long long totalBytesRead, long long totalBytesExpectedToRead) {
        double percentDone = (double)totalBytesRead / (double)totalBytesExpectedToRead;
        progress(percentDone);
    }];
    [requestOperation start];
}
But after I got to about 900 downloads/requests, I would start to get the following error:
The request timed out
I am assuming this error came directly from AFNetworking.
What is the best and most efficient way to make a large number of download requests like this without timing out? Should I be using dispatch_group to batch the requests as outlined here? Or should I use a recursive method that downloads one image at a time, only starting the next request once the previous one finishes (a rough sketch of what I mean follows)?
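For reference, this is roughly what I mean by the recursive approach (just a sketch; downloadImageAtIndex:urls:completion: is a hypothetical helper method of mine, and the error handling is simplified):

- (void)downloadImageAtIndex:(NSUInteger)index
                        urls:(NSArray *)urls
                  completion:(void (^)(void))completion {
    // Stop once every URL has been processed.
    if (index >= urls.count) {
        if (completion) completion();
        return;
    }

    NSURL *url = [NSURL URLWithString:urls[index]];
    NSURLRequest *request = [NSURLRequest requestWithURL:url];
    AFHTTPRequestOperation *operation = [[AFHTTPRequestOperation alloc] initWithRequest:request];
    operation.responseSerializer = [AFImageResponseSerializer serializer];

    __weak typeof(self) weakSelf = self;
    [operation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *op, id responseObject) {
        // Save or display the image here, then move on to the next URL.
        [weakSelf downloadImageAtIndex:index + 1 urls:urls completion:completion];
    } failure:^(AFHTTPRequestOperation *op, NSError *error) {
        // Decide whether to retry, skip, or abort; this sketch simply skips to the next URL.
        [weakSelf downloadImageAtIndex:index + 1 urls:urls completion:completion];
    }];
    [operation start];
}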
Try raising the timeout on each request. When you start hundreds of operations at once, many of them wait on the network long enough to exceed the default 60-second request timeout:
NSMutableURLRequest *request = [[NSMutableURLRequest alloc] initWithURL:url];
[request setTimeoutInterval:600]; /* 10 minutes */
But the best solution would be to download a single archive of the images and then unpack it on the device.
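A rough sketch of that idea, assuming the server can produce one ZIP of the requested images (the archive URL and paths below are hypothetical) and using SSZipArchive as an example unzipping library:

NSURL *archiveURL = [NSURL URLWithString:@"https://example.com/images/archive.zip"]; // hypothetical endpoint
NSString *zipPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"images.zip"];
NSString *unzipDirectory = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) firstObject];

NSURLRequest *request = [NSURLRequest requestWithURL:archiveURL];
AFHTTPRequestOperation *operation = [[AFHTTPRequestOperation alloc] initWithRequest:request];
// Stream the archive straight to disk instead of holding it all in memory.
operation.outputStream = [NSOutputStream outputStreamToFileAtPath:zipPath append:NO];

[operation setCompletionBlockWithSuccess:^(AFHTTPRequestOperation *op, id responseObject) {
    // Unpack the archive once the single download finishes.
    [SSZipArchive unzipFileAtPath:zipPath toDestination:unzipDirectory];
} failure:^(AFHTTPRequestOperation *op, NSError *error) {
    NSLog(@"Archive download failed: %@", error);
}];
[operation start];

This way there is only one request that can time out, and the payload is streamed to disk rather than held in memory.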