De-normalization alternative for a specific MySQL problem?
- by Booker
I am facing quite a specific optimization problem.
I currently have 4 normalized tables of data.
Every second, possibly thousands of users will pull down up-to-date info from these tables using AJAX.
The thing is, I can predict relatively easily which subset of data they need: the most recent 100 or so entries in those 4 normalized tables.
I have been researching de-normalization... but feel that perhaps there is an easier solution.
I was thinking that, every second, I could run one SQL query that condenses the needed info, store the result in a temporary cached table, and then have all of the user queries draw from that. This way the complex join of the 4 tables only runs once per second, and from there each user just does a simple lookup against the cached table.
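To make it concrete, here is a rough sketch of what I have in mind. The table and column names below are made up to stand in for my real schema, and the refresh would be kicked off by something like cron or the MySQL event scheduler:

-- Hypothetical cache table holding the pre-joined "latest 100" rows.
CREATE TABLE latest_entries_cache (
    entry_id      INT UNSIGNED NOT NULL PRIMARY KEY,
    user_name     VARCHAR(64)  NOT NULL,
    category_name VARCHAR(64)  NOT NULL,
    body          TEXT         NOT NULL,
    created_at    DATETIME     NOT NULL
) ENGINE=InnoDB;

-- Run roughly once per second: rebuild the cache from the 4 normalized
-- tables inside one transaction so readers never see a half-empty table.
START TRANSACTION;
DELETE FROM latest_entries_cache;
INSERT INTO latest_entries_cache (entry_id, user_name, category_name, body, created_at)
SELECT   e.id, u.name, cat.name, c.body, e.created_at
FROM     entries    e
JOIN     users      u   ON u.id       = e.user_id
JOIN     categories cat ON cat.id     = e.category_id
JOIN     contents   c   ON c.entry_id = e.id
ORDER BY e.created_at DESC
LIMIT    100;
COMMIT;

-- Each AJAX poll would then only need:
SELECT * FROM latest_entries_cache ORDER BY created_at DESC;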
I really don't know if this is feasible. Comments on this or any other suggestions would be much appreciated.
Thanks!