How can I get the correct DisplayMetrics from an AppWidget in Android?

Posted by Gary on Stack Overflow
Published on 2010-04-15T06:36:38Z

I need to determine the screen density at runtime in an Android AppWidget. I've set up an hdpi emulator device (AVD). If I set up a regular Activity project and insert this code into the onCreate method:

DisplayMetrics dm = getResources().getDisplayMetrics();
Log.d("MyTag", "screen density " + dm.densityDpi);

This outputs "screen density 240" as expected.

However, if I set up an AppWidget project, and insert this code into the onUpdate method:

DisplayMetrics dm = context.getResources().getDisplayMetrics();
Log.d("MyTag", "screen density " + dm.densityDpi);

This outputs "screen density 160". Hooking up the debugger, I noticed that the mDefaultDisplay member of the Resources object is null in the AppWidget case.
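One workaround worth trying (a sketch, untested in the widget context) is to query the metrics from the default Display via the WindowManager service, rather than relying on the Resources object that the widget's Context hands back:

```java
import android.content.Context;
import android.util.DisplayMetrics;
import android.view.WindowManager;

// Sketch: read metrics from the default Display instead of
// context.getResources(), which may report the default (mdpi)
// density inside an AppWidgetProvider's onUpdate.
DisplayMetrics dm = new DisplayMetrics();
WindowManager wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
wm.getDefaultDisplay().getMetrics(dm);
Log.d("MyTag", "screen density " + dm.densityDpi);
```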

Similarly, if I get a resource at runtime using the Resources object obtained from context.getResources() in the AppWidget, it returns the wrong resource based on screen density. For instance, I have a 60x60px drawable for mdpi and an 80x80px drawable for hdpi. If I get this Drawable object using context.getResources().getDrawable(...), it returns the 60x60 version.
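For reference, Android picks resources by mapping densityDpi to a density bucket, and a dp value scales to pixels as px = dp * densityDpi / 160. A small hypothetical helper (the bucketFor/dpToPx names are made up for illustration) shows why a Resources object that wrongly reports density 160 selects the mdpi asset on a 240 dpi device:

```java
// Hypothetical helpers illustrating Android's 2010-era density buckets
// (ldpi/mdpi/hdpi) and the dp-to-px scaling formula.
public class DensityDemo {
    // Map a densityDpi value to the resource qualifier Android would pick.
    public static String bucketFor(int densityDpi) {
        if (densityDpi <= 120) return "ldpi";
        if (densityDpi <= 160) return "mdpi";
        return "hdpi";
    }

    // px = dp * densityDpi / 160 (mdpi is the 1:1 baseline).
    public static int dpToPx(int dp, int densityDpi) {
        return dp * densityDpi / 160;
    }

    public static void main(String[] args) {
        // A Resources object reporting 160 selects the mdpi (60x60) asset:
        System.out.println(bucketFor(160)); // mdpi
        // The actual hdpi device should select the 80x80 asset:
        System.out.println(bucketFor(240)); // hdpi
        // 60dp on an hdpi screen corresponds to 90 physical pixels:
        System.out.println(dpToPx(60, 240)); // 90
    }
}
```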

Is there any way to correctly deal with resources at runtime from the context of an AppWidget?

Thanks!
