Client side page call/scrape?
- by Silvre
Here is the problem:
I have a web application - a frequently changing notification system - that runs on a series of local computers. The application refreshes every couple of seconds to display new information. The computers only display information and have no keyboards or ANY other input devices.
The issue is that if the connection to the server is lost (say, updates are installed and the server must be rebooted), a page-not-found error is displayed. We must then either reboot every computer running this app, OR add a keyboard and refresh the browser, OR access each computer remotely and refresh the browser. None of these are good options, and they all result in a lot of frustration.
I cannot change the actual application OR server environment.
So what I need is some way to test the call to the application and, if an error is returned or the request times out, keep retrying every minute or so until the connection is reestablished.
My idea is to create a client-side page scraper that makes a JS request to the application (which serves basic HTML) and can run locally on the machine, no server required. If the scrape returns the correct content, it displays it; if not, it keeps requesting the page until the actual content is returned.
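Here is a rough sketch of what I am picturing - APP_URL and the 'Notifications' marker string are just placeholders for the real application URL and some text that is always present in a good response, and I am assuming the watchdog page is served from the same origin as the app (or that the app sends CORS headers), since fetch() cannot read a cross-origin response otherwise:

<!-- watchdog.html: loads the app in an iframe once the server answers,
     and keeps probing every minute while it is down -->
<!DOCTYPE html>
<html>
<head><meta charset="utf-8"><title>Notification watchdog</title></head>
<body style="margin:0">
<iframe id="app" style="border:0;width:100vw;height:100vh"></iframe>
<script>
const APP_URL = 'http://server/notifications'; // placeholder URL
const RETRY_MS = 60 * 1000;                    // retry interval while down
let appIsUp = false;

async function probe() {
  try {
    // Give up on the request if the server does not answer within 10 s.
    const ctrl = new AbortController();
    const timer = setTimeout(() => ctrl.abort(), 10000);
    const res = await fetch(APP_URL, { cache: 'no-store', signal: ctrl.signal });
    clearTimeout(timer);
    const body = await res.text();
    // Count the server as up only if the response is OK and looks like
    // the real page ('Notifications' is a placeholder marker string).
    if (res.ok && body.includes('Notifications')) {
      if (!appIsUp) {
        appIsUp = true;
        document.getElementById('app').src = APP_URL; // (re)load the app
      }
      return;
    }
  } catch (e) {
    // Network error or timeout: fall through and keep retrying.
  }
  appIsUp = false;
}

probe();
setInterval(probe, RETRY_MS);
</script>
</body>
</html>

The iframe would only be reloaded on the down-to-up transition, so the app's own refresh cycle is left alone while the connection is healthy.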
Is this possible? What is the best way to do it?