RCurl web scraping: timeout exits the program
- by user1742368
I am using a loop and RCurl to scrape data from multiple pages. This works fine most of the time, but it fails when there is a timeout because the server does not respond. I am using timeout = 30, which traps the timeout error; however, the program stops after the timeout. I would like the program to continue to the next page when the timeout occurs, but I can't figure out how to do this.
curl = getCurlHandle(cookiefile = "", verbose = TRUE)
Here is the statement I am using that causes the timeout. I am happy to share the code if there is interest.
webpage = getURLContent(url, followlocation = TRUE, curl = curl, .opts = list(verbose = TRUE, timeout = 90, maxredirs = 2))
woodwardjj
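
For reference, one way to keep the loop going is to wrap the fetch in tryCatch, so the timeout error is caught and handled instead of stopping the program. A minimal sketch, assuming a urls vector and a simple for loop, since the full code isn't shown in the question:

library(RCurl)

curl = getCurlHandle(cookiefile = "", verbose = TRUE)

# Hypothetical page list; the real loop and URLs are not shown in the question
urls = c("http://example.com/page1", "http://example.com/page2")
pages = vector("list", length(urls))

for (i in seq_along(urls)) {
  pages[[i]] = tryCatch(
    getURLContent(urls[i], followlocation = TRUE, curl = curl,
                  .opts = list(verbose = TRUE, timeout = 90, maxredirs = 2)),
    error = function(e) {
      # RCurl signals a timeout as an error condition; catch it, log the
      # failure, and return NA so the loop moves on to the next page
      message("Failed on ", urls[i], ": ", conditionMessage(e))
      NA
    }
  )
}

tryCatch returns the error handler's value when getURLContent fails, so a timed-out page just leaves NA in pages and the loop proceeds to the next URL.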