Simple OpenGL program: major slowdown at high resolution
Posted by Grieverheart on Game Development
Published on 2012-06-14T03:05:52Z
I have created a small OpenGL 3.3 (Core) program using freeglut. The whole geometry is two boxes and one plane with some textures. I can move around like in an FPS, and that's it.

The problem is that I see a big drop in fps when I make my window large (i.e. above 1920x1080). I have monitored GPU usage in full-screen and it shows a GPU load of nearly 100% and a Memory Controller load of ~85%. At 600x600 these numbers are at about 45%; my CPU is also at full load. I use deferred rendering at the moment, but even with forward rendering the slowdown was nearly as severe. I can't imagine my GPU is not powerful enough for something this simple when I can play many games at 1080p (I have a GeForce GT 120M, by the way).

Below are my shaders, along with rough sketches of the G-buffer setup and the lighting-pass draw for context.
First Pass
#VS
#version 330 core

uniform mat4 ModelViewMatrix;
uniform mat3 NormalMatrix;
uniform mat4 MVPMatrix;
uniform float scale;

layout(location = 0) in vec3 in_Position;
layout(location = 1) in vec3 in_Normal;
layout(location = 2) in vec2 in_TexCoord;

smooth out vec3 pass_Normal;
smooth out vec3 pass_Position;
smooth out vec2 TexCoord;

void main(void){
    // Pass view-space position and normal on to the fragment shader.
    pass_Position = (ModelViewMatrix * vec4(scale * in_Position, 1.0)).xyz;
    pass_Normal = NormalMatrix * in_Normal;
    TexCoord = in_TexCoord;
    gl_Position = MVPMatrix * vec4(scale * in_Position, 1.0);
}
#FS
#version 330 core

uniform sampler2D inSampler;

smooth in vec3 pass_Normal;
smooth in vec3 pass_Position;
smooth in vec2 TexCoord;

layout(location = 0) out vec3 outPosition;
layout(location = 1) out vec3 outDiffuse;
layout(location = 2) out vec3 outNormal;

void main(void){
    // Write view-space position, albedo and normal to the G-buffer.
    outPosition = pass_Position;
    outDiffuse = texture(inSampler, TexCoord).xyz;
    outNormal = pass_Normal;
}
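The three fragment shader outputs above go into an FBO with three colour attachments (position, diffuse, normal). A minimal sketch of that kind of G-buffer setup; the GL_RGB32F formats, the GLEW header and the names here are illustrative assumptions, not my exact code:

#include <GL/glew.h>  // or any other GL loader; a valid 3.3 context is assumed

// Create a G-buffer FBO with three float colour attachments and a depth
// renderbuffer. Returns the FBO handle and fills gTextures[3].
GLuint CreateGBuffer(int width, int height, GLuint gTextures[3])
{
    GLuint gBuffer;
    glGenFramebuffers(1, &gBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, gBuffer);

    glGenTextures(3, gTextures);
    for(int i = 0; i < 3; ++i){
        glBindTexture(GL_TEXTURE_2D, gTextures[i]);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32F, width, height, 0,
                     GL_RGB, GL_FLOAT, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0 + i,
                               GL_TEXTURE_2D, gTextures[i], 0);
    }

    // Depth renderbuffer so the geometry pass can depth-test.
    GLuint depthRbo;
    glGenRenderbuffers(1, &depthRbo);
    glBindRenderbuffer(GL_RENDERBUFFER, depthRbo);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                              GL_RENDERBUFFER, depthRbo);

    // Route the three fragment shader outputs to the three attachments.
    GLenum drawBuffers[3] = {GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1,
                             GL_COLOR_ATTACHMENT2};
    glDrawBuffers(3, drawBuffers);

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    return gBuffer;
}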
Second Pass
#VS
#version 330 core

uniform float scale;

layout(location = 0) in vec3 in_Position;

void main(void){
    // Identity transform: the quad's vertex positions are passed through unchanged.
    gl_Position = mat4(1.0) * vec4(scale * in_Position, 1.0);
}
#FS
#version 330 core

struct Light{
    vec3 direction;
};

uniform ivec2 ScreenSize;
uniform Light light;
uniform sampler2D PositionMap;
uniform sampler2D ColorMap;
uniform sampler2D NormalMap;

out vec4 out_Color;

// Map the fragment's window position to G-buffer texture coordinates.
vec2 CalcTexCoord(void){
    return gl_FragCoord.xy / ScreenSize;
}

// Blinn-Phong directional light plus a constant ambient term.
vec4 CalcLight(vec3 position, vec3 normal){
    vec4 DiffuseColor = vec4(0.0);
    vec4 SpecularColor = vec4(0.0);

    vec3 light_Direction = -normalize(light.direction);
    float diffuse = max(0.0, dot(normal, light_Direction));

    if(diffuse > 0.0){
        DiffuseColor = diffuse * vec4(1.0);

        vec3 camera_Direction = normalize(-position);
        vec3 half_vector = normalize(camera_Direction + light_Direction);
        float specular = max(0.0, dot(normal, half_vector));
        float fspecular = pow(specular, 128.0);
        SpecularColor = fspecular * vec4(1.0);
    }
    return DiffuseColor + SpecularColor + vec4(0.1);
}

void main(void){
    vec2 TexCoord = CalcTexCoord();
    vec3 Position = texture(PositionMap, TexCoord).xyz;
    vec3 Color = texture(ColorMap, TexCoord).xyz;
    vec3 Normal = normalize(texture(NormalMap, TexCoord).xyz);

    out_Color = vec4(Color, 1.0) * CalcLight(Position, Normal);
}
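The second pass itself is just a full-screen quad in NDC with the three G-buffer textures bound as PositionMap, ColorMap and NormalMap. Roughly, on the C++ side (again only a sketch; lightProgram, quadVao and gTextures are placeholder names, not my exact code):

// Lighting pass: render to the default framebuffer, sampling the G-buffer.
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glUseProgram(lightProgram);

// Bind the three G-buffer textures to units 0..2 and point the samplers at them.
const char* samplerNames[3] = {"PositionMap", "ColorMap", "NormalMap"};
for(int i = 0; i < 3; ++i){
    glActiveTexture(GL_TEXTURE0 + i);
    glBindTexture(GL_TEXTURE_2D, gTextures[i]);
    glUniform1i(glGetUniformLocation(lightProgram, samplerNames[i]), i);
}
glUniform2i(glGetUniformLocation(lightProgram, "ScreenSize"), width, height);

// Two triangles covering [-1,1] x [-1,1]; the vertex shader passes them through.
glBindVertexArray(quadVao);
glDrawArrays(GL_TRIANGLES, 0, 6);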
Is it normal for the GPU to be used that much under the described circumstances? Is it due to poor performance of freeglut?
I understand that the problem could be specific to my code, but I can't paste the whole code here. If you need more info, please tell me.