How to handle wildly varying rendering hardware / getting baseline

Posted by edA-qa mort-ora-y on Game Development
Published on 2012-08-30T12:02:34Z

Filed under: opengl | opengl-es
I've recently started mobile programming (cross-platform, alongside desktop) and am encountering wildly differing hardware performance, in particular with OpenGL and the GPU. I know I'll basically have to adjust my rendering code, but I'm uncertain how to detect a device's performance and what reasonable default settings are.
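To make the detection question concrete, the most I know how to do at startup is read the implementation strings; the tier heuristic below is just an invented placeholder, not a real classification:

```cpp
// Minimal sketch: query the OpenGL implementation strings once a context
// is current, and guess a crude device tier from them.
#include <cstdio>
#include <cstring>
#include <GLES2/gl2.h>   // or <GL/gl.h> on desktop builds

enum class GpuTier { Low, High };

GpuTier guessTier() {
    // Both calls require a current GL context.
    const char* renderer = reinterpret_cast<const char*>(glGetString(GL_RENDERER));
    const char* version  = reinterpret_cast<const char*>(glGetString(GL_VERSION));
    std::printf("GPU: %s (%s)\n", renderer, version);

    // Invented heuristic: GLES version strings start with "OpenGL ES",
    // so treat any ES device as low-end by default.
    if (version && std::strstr(version, "OpenGL ES"))
        return GpuTier::Low;
    return GpuTier::High;
}
```

But this obviously tells me nothing about how fast the GPU actually is, which is the real problem.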

I notice that certain shader functions are essentially free in a desktop implementation but can be unusably slow on a mobile device. The problem is I have no way of knowing which features will cause which performance issues across all devices. So my first issue is that even if I allow configurable options, I'm uncertain which options I need to make configurable. I'm also wondering whether one writes a single, highly configurable pipeline, or whether I should offer two distinct presets (high/low).
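For comparison, here is roughly what I imagine the two-preset route looking like; every field name and value is a hypothetical example:

```cpp
// Sketch of the "two distinct options" approach: one settings struct,
// two presets, and the renderer only ever reads the struct.
struct RenderSettings {
    bool  perPixelLighting;  // fragment lighting vs. cheaper vertex lighting
    bool  shadows;           // shadow mapping on/off
    int   maxLights;         // dynamic light budget per draw
    float resolutionScale;   // render at a fraction of the screen size
};

constexpr RenderSettings kLowPreset  { false, false, 1, 0.75f };
constexpr RenderSettings kHighPreset { true,  true,  4, 1.0f  };
```

The fully configurable pipeline would expose each of those fields to the user individually instead of as bundles, which is exactly where I don't know which knobs are worth exposing.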

I'm also unsure where to set the default. If I target the poorest performer, the graphics will be so minimal that any user with a modern device would dismiss the game. If I set it even at some moderate point, the low-end devices will basically become a slide-show. I was thinking I could just run a benchmark when the user first installs and guess from the result what will work, but I've not seen a game do this before.
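As a sketch, the benchmark idea would look something like this, reusing the RenderSettings presets above; renderTestScene and swapBuffers are hypothetical stand-ins for my real loop, and it assumes vsync is off (otherwise frame times are capped at the display rate):

```cpp
// Sketch of a first-install benchmark: render a representative scene for a
// fixed number of frames at the high preset, and fall back to the low
// preset if the average frame time can't hold ~30 FPS.
#include <chrono>

void renderTestScene(const RenderSettings& s);  // hypothetical
void swapBuffers();                             // hypothetical

RenderSettings pickDefaultSettings() {
    using clock = std::chrono::steady_clock;
    const int frames = 120;

    auto start = clock::now();
    for (int i = 0; i < frames; ++i) {
        renderTestScene(kHighPreset);  // representative worst-case scene
        swapBuffers();
    }
    double avgMs =
        std::chrono::duration<double, std::milli>(clock::now() - start).count()
        / frames;

    return (avgMs > 33.0) ? kLowPreset : kHighPreset;
}
```

Is something along these lines reasonable, or is there a standard way of establishing a baseline?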

