At what point does caching become necessary for a web application?

Posted by Zaemz on Programmers, 2013-10-22

I'm considering the architecture for a web application. It will be a single-page application that updates itself whenever the user changes their selections on any of the several forms on the page.

My thinking is that it wouldn't be wise to rely on the user's browser to correctly interpret the data and update the view on its own, so I'll send the user's choices to the server, fetch the relevant data there, send it back to the browser, and update the view.
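For concreteness, the round trip I have in mind would look roughly like the sketch below. The `/api/results` endpoint, the payload shape, and `renderTable()` are placeholder names for illustration, not anything that exists yet:

```typescript
// Placeholder for whatever code actually updates the single-page view.
declare function renderTable(rows: unknown[]): void;

// Send the user's current form selections to the server, get the matching
// rows back as JSON, and redraw the view with them.
async function refreshView(selections: Record<string, string>): Promise<void> {
  const response = await fetch("/api/results", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(selections),
  });
  const rows: unknown[] = await response.json();
  renderTable(rows);
}
```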

There's a table with roughly 10,000 rows in a MySQL database that will be accessed fairly often, around once every 5-30 seconds per user, and I'm expecting 200-300 concurrent users at a time. I've read that a well-designed relational database with simple queries is nothing for an RDBMS to handle, but I would still like to keep things fast for the client.
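As a rough upper bound on load: 300 users each issuing a query every 5 seconds works out to about 60 queries per second, and at one query every 30 seconds it's closer to 10 per second.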

Should this even be a concern for me at the moment? At what point would it be helpful to start using a separate caching service like Memcached or Redis, or would it even be necessary?

I know that MySQL caches popular queries and their results; would that suffice?
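To make the kind of caching I'm asking about concrete, here is a rough cache-aside sketch with Redis in front of MySQL. The table name, key scheme, 30-second TTL, and the ioredis/mysql2 libraries are just assumptions for illustration:

```typescript
import Redis from "ioredis";
import mysql from "mysql2/promise";

const redis = new Redis(); // local Redis on the default port
const pool = mysql.createPool({ host: "localhost", user: "app", database: "app" });

async function getResults(filter: string): Promise<unknown> {
  const cacheKey = `results:${filter}`;

  // 1. Check the cache first.
  const cached = await redis.get(cacheKey);
  if (cached !== null) {
    return JSON.parse(cached);
  }

  // 2. Cache miss: query the ~10,000-row MySQL table.
  const [rows] = await pool.query(
    "SELECT * FROM results WHERE category = ? LIMIT 100",
    [filter]
  );

  // 3. Store the result for 30 seconds so repeated identical selections
  //    from many users are served from memory instead of hitting MySQL.
  await redis.set(cacheKey, JSON.stringify(rows), "EX", 30);

  return rows;
}
```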
