Frame rate on one of two machines running the same code seems to be capped at 60 for no reason

Posted by dennmat on Game Development
Published on 2012-10-06T06:47:16Z
ISSUE

I recently moved a project from my laptop to my desktop (machine info below). On my laptop the exact same code displays the fps (and ms/frame) correctly. On my desktop it does not. What I mean by this is that the laptop will display 300 fps (for example), where the desktop will show only up to 60. If I add 100 objects to the game on the laptop, I'll see my frame rate drop accordingly; the same test on the desktop results in no change and the frames stay at 60. It takes a lot (~300) of entities before I'll see a frame drop on the desktop, and only then will it descend. It seems as though its "theoretical" frame rate would be 400 or 500, but it never actually gets there and only does 60 until there's too much to handle at 60. This 60-frame cap is coming from nowhere.

I'm not doing any frame limiting myself.

It seems like something external is limiting my loop iterations on the desktop, but for the last couple of days I've been scratching my head trying to figure out how to debug this.

SETUPS

Desktop:

  • Visual Studio Express 2012
  • Windows 7 Ultimate 64-bit

Laptop:

  • Visual Studio Express 2010
  • Windows 7 Ultimate 64-bit

The libraries (Allegro, Box2D) are the same versions on both setups.

CODE

Main Loop:

while(!abort) {
    frameTime = al_get_time();

    // Once per second, publish the stats: average fps over the last second
    // and average ms spent per frame, then reset the accumulators.
    if (frameTime - lastTime >= 1.0) {
        lastFps = fps/(frameTime - lastTime);
        lastTime = frameTime;
        avgMspf = cumMspf/fps;
        cumMspf = 0.0;
        fps = 0;
    }

    /** DRAWING/UPDATE CODE **/

    // Count this frame and accumulate how long it took.
    fps++;
    cumMspf += al_get_time() - frameTime;
}

Note: There is no blocking code in the loop at any point.
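Something I plan to try is suggesting vsync off when the display is created, since a driver-forced vsync would make al_flip_display (called from the drawing code) wait for the retrace without it looking like blocking code on my end. A rough sketch of that, assuming the display comes from al_create_display with a placeholder size:

#include <allegro5/allegro.h>

int main() {
    al_init();

    // Sketch: suggest vsync off before the display is created, so a driver
    // default can't silently cap the loop at the monitor's refresh rate.
    // ALLEGRO_VSYNC values: 0 = driver default, 1 = force on, 2 = force off.
    al_set_new_display_option(ALLEGRO_VSYNC, 2, ALLEGRO_SUGGEST);
    ALLEGRO_DISPLAY *display = al_create_display(800, 600); // placeholder size

    // ... main loop as above ...

    al_destroy_display(display);
    return 0;
}

I don't know yet whether the desktop's driver would honour the suggestion, which is why it's only ALLEGRO_SUGGEST.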

WHERE I'M AT

My understanding of al_get_time() is that it can return different resolutions depending on the system. However, the resolution is never worse than seconds, and the double it returns is [seconds].[finer-resolution]; since I'm only checking for a whole second, al_get_time() shouldn't be responsible.
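To double-check that, here is a quick standalone sketch (not part of the project) I could run on both machines to see the smallest step al_get_time() actually reports:

#include <allegro5/allegro.h>
#include <cstdio>

int main() {
    al_init();

    // Sketch: sample al_get_time() in a tight loop and report the smallest
    // non-zero step it takes, to compare timer resolution on both machines.
    double smallest = 1.0;
    double prev = al_get_time();
    for (int i = 0; i < 1000000; ++i) {
        double now = al_get_time();
        if (now > prev && now - prev < smallest)
            smallest = now - prev;
        prev = now;
    }
    printf("smallest observed al_get_time() step: %.9f s\n", smallest);
    return 0;
}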

My project settings and compiler options are the same, and I promise it's the same code on both machines.

My googling really didn't help me much, and although technically it's not that big of a deal, I'd really like to figure this out or at least have it explained, whichever comes first. Even just an idea of how to go about finding possible causes would help, because I'm running out of ideas.
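About the only check left on my list is asking Allegro what the created display actually got, in case the desktop's driver forces vsync on regardless of what the code requests. A rough sketch, assuming display is the ALLEGRO_DISPLAY* returned by al_create_display:

#include <allegro5/allegro.h>
#include <cstdio>

// Sketch: report whether the display we actually got has vsync enabled,
// and what refresh rate Allegro thinks it runs at.
void report_vsync(ALLEGRO_DISPLAY *display) {
    int vsync   = al_get_display_option(display, ALLEGRO_VSYNC);
    int refresh = al_get_display_refresh_rate(display);
    printf("ALLEGRO_VSYNC option: %d (0 = default, 1 = on, 2 = off)\n", vsync);
    printf("display refresh rate: %d Hz (0 if unknown)\n", refresh);
}

If the desktop reports vsync on (or the driver control panel has vertical sync forced), that would at least explain the 60.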

Any help at all is greatly appreciated.
