Why is the framerate (fps) capped at 60?
Posted by dennmat on Game Development
Published on 2012-10-06T06:47:16Z
ISSUE
I recently moved a project from my laptop to my desktop (machine info below). On my laptop the exact same code displays the fps (and ms/frame) correctly. On my desktop it does not. What I mean is that the laptop will display, say, 300 fps, while the desktop never shows more than 60. If I add 100 objects to the game on the laptop, the frame rate drops accordingly; the same test on the desktop results in no change and the frame rate stays at 60. It takes a lot (~300) of entities before I see a frame drop on the desktop, and only then does it start to descend. It seems as though its "theoretical" frame rate would be 400 or 500, but it never actually gets there and sits at 60 until there's too much to handle at 60. This 60 fps cap is coming from nowhere that I can see.
I'm not doing any frame limiting myself.
It seems like something external is limiting my loop iterations on the desktop, but for the last couple of days I've been scratching my head trying to figure out how to debug it.
SETUPS
Desktop:
- Visual Studio Express 2012
- Windows 7 Ultimate 64-bit
Laptop:
- Visual Studio Express 2010
- Windows 7 Ultimate 64-bit
The libraries (Allegro, Box2D) are the same versions on both setups.
CODE
Main Loop:
while (!abort) {
    frameTime = al_get_time();

    // Once per second, compute the average fps and ms/frame for the last second.
    if (frameTime - lastTime >= 1.0) {
        lastFps  = fps / (frameTime - lastTime);
        lastTime = frameTime;
        avgMspf  = cumMspf / fps;
        cumMspf  = 0.0;
        fps      = 0;
    }

    /** DRAWING/UPDATE CODE **/

    fps++;
    cumMspf += al_get_time() - frameTime;
}
Note: There is no blocking code in the loop at any point.
Where I'm at
My understanding of al_get_time() is that it can return different resolutions depending on the system. However, the resolution is never worse than seconds, and the double it returns is [seconds].[finer-resolution]; seeing as I'm only checking for a whole second, al_get_time() shouldn't be responsible.
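If there's any doubt about the timer itself, one quick check (a hedged sketch, not from the original code, assuming Allegro 5 has already been initialised with al_init()) is to print a few consecutive al_get_time() deltas; values well below a millisecond mean the timer isn't what's rounding everything to 60:

    #include <allegro5/allegro.h>
    #include <cstdio>

    // Sanity-check the resolution of al_get_time() by printing consecutive
    // deltas. Sub-millisecond deltas rule the timer out as the 60 fps cap.
    void checkTimerResolution() {
        double prev = al_get_time();
        for (int i = 0; i < 10; ++i) {
            double now = al_get_time();
            std::printf("delta = %.9f s\n", now - prev);
            prev = now;
        }
    }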
My project settings and compiler options are the same, and I promise it's the same code on both machines.
My googling really didn't help much, and although technically it's not that big of a deal, I'd really like to figure this out or have it explained, whichever comes first. Even just an idea of how to go about narrowing down possible causes would help, because I'm out of ideas.
Any help at all is greatly appreciated.
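One way to narrow down where the time is going (a hedged sketch, not part of the original code, assuming the drawing block ends with al_flip_display()) is to time the flip call on its own; if it consistently blocks for ~16.7 ms per frame on the desktop, something is synchronising the buffer swap to the monitor's 60 Hz refresh, i.e. vsync:

    #include <allegro5/allegro.h>
    #include <cstdio>

    // Call this in place of the plain al_flip_display() inside the main loop.
    // A flip that consistently blocks for ~16.7 ms on a 60 Hz monitor is the
    // signature of vsync throttling the loop.
    static void timedFlip() {
        double before = al_get_time();
        al_flip_display();
        double ms = (al_get_time() - before) * 1000.0;
        if (ms > 10.0) {
            std::printf("flip blocked for %.2f ms\n", ms);
        }
    }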
EDIT:
Thanks all. For anyone else who finds this, here's how to disable vsync (Windows only) under OpenGL:
First get "wglext.h"; it's all over the web.
Then you can use a tool like GLee, or just write your own quick extension check like:
bool WGLExtensionSupported(const char *extension_name) {
    // Grab the WGL extension-string entry point; bail out if the driver
    // doesn't provide it.
    PFNWGLGETEXTENSIONSSTRINGEXTPROC _wglGetExtensionsStringEXT =
        (PFNWGLGETEXTENSIONSSTRINGEXTPROC) wglGetProcAddress("wglGetExtensionsStringEXT");
    if (_wglGetExtensionsStringEXT == NULL) {
        return false;
    }
    // The extension string is a space-separated list of extension names.
    return strstr(_wglGetExtensionsStringEXT(), extension_name) != NULL;
}
and then create and set up your function pointers:
PFNWGLSWAPINTERVALEXTPROC    wglSwapIntervalEXT    = NULL;
PFNWGLGETSWAPINTERVALEXTPROC wglGetSwapIntervalEXT = NULL;

if (WGLExtensionSupported("WGL_EXT_swap_control")) {
    // Extension is supported, init pointers.
    wglSwapIntervalEXT = (PFNWGLSWAPINTERVALEXTPROC) wglGetProcAddress("wglSwapIntervalEXT");

    // This is another function from the WGL_EXT_swap_control extension.
    wglGetSwapIntervalEXT = (PFNWGLGETSWAPINTERVALEXTPROC) wglGetProcAddress("wglGetSwapIntervalEXT");
}
Then just call wglSwapIntervalEXT(0) to disable vsync, or wglSwapIntervalEXT(1) to enable it. As I understand it, the reason this is Windows-only is that OpenGL itself only deals with rendering; swap timing is left to the OS and the hardware/driver.
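For what it's worth, Allegro 5 also exposes a vsync hint through its display options, which avoids the WGL route entirely; a minimal sketch (not from the original post, assuming allegro5/allegro.h is included and Allegro is initialised):

    // Ask Allegro to turn vsync off before the display is created. For
    // ALLEGRO_VSYNC, 1 forces vsync on and 2 forces it off; ALLEGRO_SUGGEST
    // lets the driver override the hint if it insists.
    al_set_new_display_option(ALLEGRO_VSYNC, 2, ALLEGRO_SUGGEST);
    ALLEGRO_DISPLAY *display = al_create_display(800, 600);

Either way, the graphics driver's control panel can still force vsync on regardless, so it's worth checking there too if the cap persists.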
Thanks everyone, you saved me a lot of time!