I'm writing a webapp (Firefox-compatible only) which uses long polling (via jQuery's ajax abilities) to send more-or-less constant updates from the server to the client. I'm concerned about the effects of leaving this running for long periods of time, say, all day or overnight. The basic code skeleton is this:
function processResults(xml)
{
    // do stuff with the xml from the server
}

function fetch()
{
    setTimeout(function ()
    {
        $.ajax({
            type: 'GET',
            url: 'foo/bar/baz',
            dataType: 'xml',
            success: function (xml)
            {
                processResults(xml);
                fetch();
            },
            error: function (xhr, type, exception)
            {
                if (xhr.status === 0)
                {
                    console.log('XMLHttpRequest cancelled');
                }
                else
                {
                    console.debug(xhr);
                    fetch();
                }
            }
        });
    }, 500);
}
(The half-second "sleep" is so that the client doesn't hammer the server if the updates are coming back to the client quickly - which they usually are.)
After leaving this running overnight, it tends to make Firefox crawl. I'd been thinking this could be partly caused by a large stack depth, since I've basically written an infinitely recursive function. However, if I use Firebug and throw a breakpoint into fetch, it looks like that's not the case: the stack Firebug shows me is only about 4 or 5 frames deep, even after an hour (which makes some sense in hindsight, since each new call to fetch starts from a timer or Ajax callback after the earlier frames have already returned, so the stack never actually grows).
One of the solutions I'm considering is changing my recursive function to an iterative one, but I can't figure out how I would insert the delay between Ajax requests without spinning. I've looked at the JS 1.7 "yield" keyword, but I can't wrap my head around it well enough to tell whether it's what I need here.
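The closest I've come to an iterative version is a sketch like the one below, driving the poll from setInterval and using a guard flag so a new request doesn't start while the previous one is still in flight (the inFlight flag and the 500 ms interval are just my own placeholders, not something I've tested over a long run):

    var inFlight = false;

    // Poll on a fixed interval instead of re-scheduling from the success handler.
    var pollTimer = setInterval(function ()
    {
        if (inFlight)
        {
            return; // skip this tick; the previous request hasn't finished yet
        }

        inFlight = true;

        $.ajax({
            type: 'GET',
            url: 'foo/bar/baz',
            dataType: 'xml',
            success: function (xml)
            {
                processResults(xml);
            },
            error: function (xhr, type, exception)
            {
                if (xhr.status === 0)
                {
                    console.log('XMLHttpRequest cancelled');
                }
                else
                {
                    console.debug(xhr);
                }
            },
            complete: function ()
            {
                inFlight = false; // allow the next tick to fire a new request
            }
        });
    }, 500);

I'm not sure whether that's actually any lighter on the browser than the recursive version, though, which is really the heart of my question.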
Is the best solution just to do a hard refresh on the page periodically, say, once every hour? Is there a better/leaner long-polling design pattern that won't put a hurt on the browser even after running for 8 or 12 hours? Or should I just skip the long polling altogether and use a different "constant update" pattern since I usually know how frequently the server will have a response for me?
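For what it's worth, the periodic-refresh fallback I have in mind is nothing fancier than this (the one-hour figure is arbitrary):

    // Brute-force fallback: reload the whole page once an hour so Firefox starts
    // from a clean slate. The freshly loaded page re-registers the timer itself.
    setTimeout(function ()
    {
        window.location.reload();
    }, 60 * 60 * 1000);

But that feels like treating the symptom rather than the cause.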