Understanding GPU clock rates
- by trizicus
I know how to overclock my CPU (adjust the multiplier and bus speed). However, I've noticed that it seems a bit more complicated with GPUs.
How and where do I start? I've noticed that I can adjust the GPU clock speed in my BIOS.
Card I'm overclocking: http://www.nvidia.com/object/product_geforce_gt_240_us.html
I found that memory bandwidth is (memory speed * bus width) / 8. So obviously a good way to increase memory bandwidth is to raise the memory speed.
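To sanity-check that formula, here's a minimal sketch in Python. The 3400 MT/s effective memory rate is an assumption based on the GDDR5 variant of the GT 240 (the GDDR3 variant runs slower); the 128-bit bus width comes from the spec page linked above.

```python
def memory_bandwidth_gbps(effective_rate_mts, bus_width_bits):
    """Bandwidth in GB/s: (transfers/s * bits per transfer) / 8 bits per byte."""
    return effective_rate_mts * 1e6 * bus_width_bits / 8 / 1e9

# Assumed figures for the GDDR5 GT 240: 3400 MT/s effective
# data rate on a 128-bit bus.
print(memory_bandwidth_gbps(3400, 128))  # ~54.4 GB/s
```

Raising the memory speed scales that number linearly, which is why memory overclocks translate directly into bandwidth.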
Now, the GPU speed is 550 MHz. How do I find its effective speed as well? Do I multiply it by the bus width (128)?
What is the ideal GPU speed relative to memory bandwidth?