I'm looking for architectural advice. I've built a website for a client that essentially allows users to view their web cameras remotely.
The current flow of data is as follows:
User opens page to view web camera image.
JavaScript polls a URL on the server (appended with a unique timestamp) every 1000ms.
FTP access is enabled for the camera's FTP user.
Web camera opens an FTP connection to the server.
Web camera begins taking photos.
Web camera uploads the photo to the FTP server.
On image URL request:
Server reads the latest image for that camera from the hard drive (uploaded via FTP).
Server deletes any older images for that camera (a rough sketch of this handler is below).
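Roughly, the read side looks like this today. The directory layout, file naming and `camera` query parameter in the sketch are simplified assumptions, not the exact production code:

```php
<?php
// Rough sketch of the current image endpoint. Assumed layout:
// one directory per camera, timestamped JPEGs uploaded via FTP.
$cameraId = basename($_GET['camera'] ?? '');   // hypothetical parameter
$dir      = "/var/ftp/cameras/{$cameraId}/";   // assumed upload path

$files = glob($dir . '*.jpg');
if ($files === false || count($files) === 0) {
    http_response_code(404);
    exit;
}

// Sort newest first by modification time.
usort($files, function ($a, $b) {
    return filemtime($b) - filemtime($a);
});
$latest = array_shift($files);

header('Content-Type: image/jpeg');
header('Cache-Control: no-store');
readfile($latest);

// Clean up frames that have been superseded.
foreach ($files as $old) {
    @unlink($old);
}
```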
This is working okay at the moment for a small number of users/cameras (about 10 users and around the same number of cameras), but we're starting to worry about the scalability of this approach.
My original plan was that instead of having the files read from the local disk, the web server would open an FTP connection to the FTP server and read the latest images directly from there, meaning we should have been able to scale horizontally fairly easily. But FTP connection establishment times were too slow (mainly because PHP out of the box is unable to persist FTP connections between requests), so we abandoned this approach and went straight for reading from the hard drive.
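For reference, the per-request cost looked roughly like the sketch below (hostname, credentials and paths are made up); it was the connect/login at the top, paid on every single poll, that killed us:

```php
<?php
// Sketch of the abandoned approach: every page request pays the full
// FTP connect/login round trips, because PHP can't keep the FTP
// connection alive between requests.
$cameraId = basename($_GET['camera'] ?? '');   // hypothetical parameter

$conn = ftp_connect('ftp.internal.example', 21, 2);   // setup cost on *every* request
if ($conn === false || !ftp_login($conn, 'camera_reader', 'secret')) {
    http_response_code(502);
    exit;
}
ftp_pasv($conn, true);

// List the camera's directory and pick the newest file by name
// (assumes timestamped file names, so a string sort is enough).
$files = ftp_nlist($conn, "/cameras/{$cameraId}");
if ($files === false || count($files) === 0) {
    ftp_close($conn);
    http_response_code(404);
    exit;
}
rsort($files);
$latest = $files[0];

// Download to a temp file and stream it to the browser.
$tmp = tempnam(sys_get_temp_dir(), 'cam');
ftp_get($conn, $tmp, $latest, FTP_BINARY);
ftp_close($conn);

header('Content-Type: image/jpeg');
readfile($tmp);
unlink($tmp);
```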
The firmware provider for the cameras states they're able to build an HTTP client that, instead of uploading the image via FTP, could POST it to a web server. This seems plausible enough to me, but I'm looking for some architectural advice.
My current thought is a simple Nginx/PHP/Redis stack.
The web camera POSTs its latest image to Nginx/PHP, and the latest image for that camera is stored in Redis.
Clients can then pull the latest image from Redis, which should be extremely quick since the images are always held in memory.
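As a rough sketch of the ingest side (assuming the phpredis extension, raw JPEG bytes in the request body, and a camera:&lt;id&gt;:latest key scheme; all of these are just my assumptions):

```php
<?php
// Sketch of the ingest endpoint the camera would POST to.
$cameraId = basename($_GET['camera'] ?? '');   // hypothetical identifier
$image    = file_get_contents('php://input'); // raw JPEG bytes

if ($cameraId === '' || $image === false || $image === '') {
    http_response_code(400);
    exit;
}

$redis = new Redis();                          // phpredis extension
$redis->connect('127.0.0.1', 6379);

// Overwrite the previous frame; the TTL means a camera that stops
// posting doesn't leave a stale image sitting in memory forever.
$redis->setEx("camera:{$cameraId}:latest", 30, $image);

http_response_code(204);
```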
The data flow would then become:
User opens page to view web camera image.
JavaScript polls a URL on the server (appended with a unique timestamp) every 1000ms.
Camera is sent an HTTP request telling it to start POSTing images to a provided URL.
Web camera begins taking photos.
Web camera sends POST requests to the server as fast as it can.
On image URL request:
Server reads the latest image from Redis.
Server tells Redis to delete the old image for that camera.
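The read side would be roughly the mirror image of the ingest sketch above (same phpredis and key-name assumptions). With a single key per camera, each POST overwrites the previous frame anyway, so the explicit delete step may be redundant, or a TTL could handle the cleanup:

```php
<?php
// Sketch of the read endpoint the browser polls.
$cameraId = basename($_GET['camera'] ?? '');

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$image = $redis->get("camera:{$cameraId}:latest");
if ($image === false) {
    http_response_code(404);   // no frame yet, or the TTL expired
    exit;
}

header('Content-Type: image/jpeg');
header('Cache-Control: no-store');
echo $image;
```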
My questions are:
Is there any greater overhead to transferring images via HTTP instead of FTP?
Is there a simple way to calculate how many potential cameras we could have streaming at once?
Is there any way to prevent the web camera requests from effectively DoS'ing our own servers?
Is Redis a good solution to this problem?
Should I abandon the PHP/Nginx combination and go for something else?
Is this proposed solution actually any good?
Will adding HTTPS to the mix cause posting the images to become too slow?
Thanks in advance
Alan