Analyzing Web Application Speed

Posted by Amy on Stack Overflow on 2011-02-08.

I'm a bit confused because the logical/programmer brain in me says that if all things are constant, the speed of a function must be constant.

I am working on a PHP web application with jqGrid as a front end for showing the data. I am testing on my personal computer, so network traffic does not apply. I make an HTTP request to a PHP function, it returns the data, and jqGrid renders it. What has me befuddled is that Firebug sometimes reports this taking 300-600 milliseconds and sometimes 3.68 seconds. I can run the same request over and over again and get radically different response times.
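
In case it helps, here is roughly how I could instrument the PHP side to see where the time goes, splitting the query from the JSON encoding (a minimal sketch; getGridData.php, the items table, the columns, and the credentials are just placeholders for my real handler):

```php
<?php
// getGridData.php -- placeholder name for the jqGrid data handler.
// Times the DB query and the JSON encoding separately and exposes the
// numbers as response headers, without touching the payload jqGrid parses.

$t0 = microtime(true);

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$rows = $pdo->query('SELECT id, name, created_at FROM items LIMIT 500')
            ->fetchAll(PDO::FETCH_ASSOC);

$t1 = microtime(true);

$json = json_encode(array('rows' => $rows, 'records' => count($rows)));

$t2 = microtime(true);

header(sprintf('X-Query-Ms: %.1f',  ($t1 - $t0) * 1000));
header(sprintf('X-Encode-Ms: %.1f', ($t2 - $t1) * 1000));
header('Content-Type: application/json');

echo $json;
```

The custom headers then show up in Firebug next to the overall request time, so I can at least tell whether the slow runs are slow inside PHP or somewhere else.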

The query is the same. The number of users on the system is the same. No network latency. Same code. I'm not running other applications on the computer while testing. I could understand query caching improving performance on subsequent requests, but the speed is just fluctuating wildly with no rhyme or reason.
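
To rule the MySQL query cache in or out, I could time the same query with and without the SQL_NO_CACHE hint, something like this sketch (table, columns, and credentials are again placeholders):

```php
<?php
// Compare the same query with and without MySQL's query cache.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

foreach (array('SELECT', 'SELECT SQL_NO_CACHE') as $prefix) {
    $start = microtime(true);
    $pdo->query($prefix . ' id, name, created_at FROM items LIMIT 500')->fetchAll();
    printf("%-22s %.1f ms\n", $prefix, (microtime(true) - $start) * 1000);
}
```

If the cache were the explanation I would expect consistently fast repeats rather than wild swings, but at least this would take it off the list.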

So, my question is, what else can cause such variability in the response time? How can I determine what's causing it? More importantly, is there any way to get things more consistent?
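
To check whether the variance even exists without the browser and jqGrid in the picture, I was also thinking of hitting the endpoint repeatedly from the command line and looking at the spread, roughly like this (the URL is a placeholder):

```php
<?php
// bench.php -- request the data endpoint repeatedly, bypassing the browser
// and jqGrid entirely, to see if the server-side timings alone fluctuate.
$url   = 'http://localhost/myapp/getGridData.php'; // placeholder URL
$times = array();

for ($i = 0; $i < 20; $i++) {
    $start = microtime(true);
    file_get_contents($url);
    $times[] = (microtime(true) - $start) * 1000;
}

sort($times);
printf("min %.0f ms / median %.0f ms / max %.0f ms\n",
       $times[0], $times[(int) (count($times) / 2)], end($times));
```

If the spread is tight here but wide in Firebug, that would point at rendering or the browser rather than PHP or MySQL.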
