
getURL gets stuck, need a wait function


I am trying to use R to surf the web, but I have a strange problem. Let's say I have a list named URLlist containing some URLs. Here is my code:

library(RCurl)  # getURL() is provided by the RCurl package

for (k in 1:length(URLlist)) {
    temp = getURL(URLlist[k])
}

I don't know why, but at some random URL, R blocks. It has nothing to do with the URL itself, as the same URL can work on one run of the loop but not on another. I think the loop is going too fast and the download of the data can't keep up. So I was thinking of making the code wait for 1 second before each new call of the getURL function, but I couldn't find such a wait function. Any ideas please? Thank you!


Solution

  • ?Sys.sleep()

    Description:

     Suspend execution of R expressions for a given number of seconds
    

    Usage:

     Sys.sleep(time)
    

    Arguments:

    time: The time interval to suspend execution for, in seconds.
    

    Whether or not this will solve your problem is another issue.
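    For example, here is a minimal sketch of your loop with a one-second pause added before each request (assuming URLlist and getURL() from RCurl, as in your question):

        library(RCurl)

        for (k in 1:length(URLlist)) {
            Sys.sleep(1)               # pause one second before each request
            temp = getURL(URLlist[k])
        }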

    I would suggest looking at the XML package and using htmlParse() to surf the web with R, since there are few cases where you actually want the HTML returned as plain text.
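    As a rough sketch of that approach (again assuming the same URLlist; htmlParse() fetches and parses a page into a document tree you can query, though https pages may need to be downloaded with getURL() first depending on your XML package version):

        library(XML)

        for (k in 1:length(URLlist)) {
            Sys.sleep(1)                     # still pause between requests
            doc = htmlParse(URLlist[k])      # parse the page into a queryable HTML tree
            # extract what you need, e.g. with xpathSApply(doc, "//title", xmlValue)
        }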