De-normalization alternative for a specific MySQL problem?

Posted by Booker on Stack Overflow, published 2010-03-31.

I am facing quite a specific optimization problem.

I currently have 4 normalized tables of data.

Every second, possibly thousands of users will pull down up-to-date info from these tables using AJAX.

The thing is, I can predict relatively easily which subset of data they need: the most recent 100 or so entries in those 4 normalized tables.

I have been researching de-normalization, but I feel there may be an easier solution.

I was thinking that, every second, I could run one SQL query that condenses the needed info, store the result in a temporary cached table, and then have all of the user queries draw from that table. That way the complex join across the 4 tables runs only once per second, and each user just does a simple lookup against the cached table.
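To make the idea concrete, here is a rough MySQL sketch of that cached-table approach. All table and column names (recent_feed, entries, users, threads, boards, and so on) are placeholders, since the question doesn't describe the real schema, and the CREATE EVENT requires the event scheduler to be enabled (a cron job issuing the same statements would work just as well).

    -- Hypothetical cache table holding the pre-joined "most recent 100" rows.
    CREATE TABLE recent_feed (
        entry_id    INT UNSIGNED NOT NULL PRIMARY KEY,
        author_name VARCHAR(64)  NOT NULL,
        body        TEXT         NOT NULL,
        created_at  DATETIME     NOT NULL,
        KEY idx_created (created_at)
    ) ENGINE=InnoDB;

    -- Rebuild the cache once per second (requires event_scheduler = ON).
    CREATE EVENT refresh_recent_feed
        ON SCHEDULE EVERY 1 SECOND
        DO
        BEGIN
            -- Wrap the swap in a transaction so readers never see an empty table.
            START TRANSACTION;
            DELETE FROM recent_feed;
            -- The expensive four-table join now runs once per second, not per user.
            INSERT INTO recent_feed (entry_id, author_name, body, created_at)
            SELECT e.id, u.name, e.body, e.created_at
            FROM   entries e
            JOIN   users   u ON u.id = e.user_id
            JOIN   threads t ON t.id = e.thread_id
            JOIN   boards  b ON b.id = t.board_id
            ORDER BY e.created_at DESC
            LIMIT  100;
            COMMIT;
        END;

    -- Each AJAX request then only needs a cheap lookup:
    SELECT entry_id, author_name, body, created_at
    FROM   recent_feed
    ORDER BY created_at DESC;

This is only a sketch of the caching idea under those assumed names; the same effect could be had by writing the joined result into a MEMORY table, or by caching the rows outside MySQL entirely (e.g. in memcached) and skipping the database on most requests.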

I really don't know if this is feasible. Comments on this or any other suggestions would be much appreciated.

Thanks!

