At what point does the performance gap between GPU & CPU become so great that the CPU is holding back a system?
Posted by Matthew Galloway on Super User
Published on 2012-09-21T06:09:49Z
I know that, generally speaking, the GPU is the primary factor limiting gaming performance, with everything else (RAM, motherboard, PSU, CPU) being secondary in importance to the graphics card.
But at some point the other components ARE going to hold the whole system back!
For instance, nobody would be silly enough to play modern games with 512MB of RAM and the very latest graphics card (such as an HD 7970); I bet the performance increase over the same 512MB system with only a mid-range card would be non-existent! It would therefore be a "waste" for such a person to buy any high-end graphics card without first resolving the system's other problems. The same applies to other components: if the system only had a Pentium II, a current high-end graphics card would be wasted on it.
So my core question is: how do you determine the point at which spending extra on GPU power for your system would be completely "wasted"? (A slightly more nuanced question is working out at what point the extra graphics power wouldn't be "wasted" outright but would be sub-optimal value for money, so that the spending should instead be split between the graphics card and other components. Obviously a gamer shouldn't always just upgrade the graphics card, but needs to balance it out.)
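To make the "wasted" point concrete, here is a minimal sketch (my own simplification, not a real benchmark or tool) that models frame time as whichever of a hypothetical CPU stage or GPU stage takes longer per frame; all the millisecond figures are invented purely for illustration.

```python
# Minimal sketch: treat the frame rate as limited by whichever of the
# (hypothetical) CPU stage or GPU stage takes longer per frame.

def effective_fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Frames per second when the slower stage sets the pace."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Invented numbers: a CPU that needs 10 ms of work per frame caps the
# system at 100 FPS no matter how much faster the GPU gets.
for gpu_ms in (20.0, 10.0, 5.0, 2.5):  # each step is a "twice as fast" GPU
    print(f"GPU at {gpu_ms:4.1f} ms/frame -> {effective_fps(10.0, gpu_ms):5.1f} FPS")
```

In this toy model the jump from 20 ms to 10 ms of GPU time doubles the frame rate, but every GPU improvement past the 10 ms CPU budget changes nothing, which is exactly the point where further GPU spending is "wasted".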