Problems when compiling shader code on an Nvidia GPU
Posted by 2am on Game Development, 2013-11-02.
I am following the OpenGL 4.0 Shading Language Cookbook. I have rendered a tessellated quad, as you can see in the screenshot below, and I am displacing the Y coordinate of every vertex with a time-based sine function, as given in the code in the book.
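For context, here is a minimal sketch of the displacement, following the wave pattern the book uses. The uniform names (time, waveAmp, waveK, velocity, MVP) are my placeholders, not necessarily the book's exact identifiers, and whether this lives in the vertex shader or the tessellation evaluation shader depends on the book's setup:

    #version 400

    layout (location = 0) in vec3 VertexPosition;

    uniform float time;      // elapsed time in seconds
    uniform float waveAmp;   // wave amplitude
    uniform float waveK;     // wavenumber
    uniform float velocity;  // wave propagation speed
    uniform mat4  MVP;       // model-view-projection matrix

    void main()
    {
        vec4 pos = vec4(VertexPosition, 1.0);
        // Phase travels along X over time; Y is displaced by its sine.
        float u = waveK * (pos.x - velocity * time);
        pos.y = waveAmp * sin(u);
        gl_Position = MVP * pos;
    }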
This program, as the text in the image shows, runs perfectly on the built-in Intel HD graphics of my processor. But my laptop also has an Nvidia GT 555m (it has switchable graphics), and when I run the program on that card, the OpenGL shader compilation fails.
It fails on the following instruction:

pos.y = sin.waveAmp * sin(u);

with this error:

Error C1105: Cannot call a non-function
I know the error is coming from the sin(u) call in that instruction, but I am not able to understand why. When I removed sin(u) from the code, the program ran fine on the Nvidia card, and it runs fine with sin(u) on the Intel HD 3000 graphics.
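In case it helps, here is a stripped-down shader built around that one line. Note that the declaration is my reconstruction: the expression sin.waveAmp implies my code has some uniform instance actually named sin with a waveAmp member, and I do not know whether that naming is what trips the Nvidia compiler:

    #version 400

    layout (location = 0) in vec3 VertexPosition;

    // Assumed declaration: something named "sin" with a waveAmp member,
    // reconstructed from the failing expression sin.waveAmp.
    struct WaveParams {
        float waveAmp;
    };
    uniform WaveParams sin;
    uniform float time;

    void main()
    {
        vec4 pos = vec4(VertexPosition, 1.0);
        float u = pos.x - time;
        // Compiles on Intel HD 3000; the Nvidia compiler reports
        // "error C1105: cannot call a non-function" on this line.
        pos.y = sin.waveAmp * sin(u);
        gl_Position = pos;
    }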
Also, notice that the program is almost unusable on the Intel HD 3000 graphics: I am getting only 9 FPS, which is not enough. It is too much load for the Intel HD 3000.
So, is the sin(x) function not defined in the OpenGL implementation provided by the Nvidia drivers, or is something else going on?