How can I detect endianness on a system where all primitive integer sizes are the same?

Posted by Joe Wreschnig on Stack Overflow, 2011-01-17
(This question came out of explaining the details of CHAR_BIT, sizeof, and endianness to someone yesterday. It's entirely hypothetical.)

Let's say I'm on a platform where CHAR_BIT is 32, so sizeof(char) == sizeof(short) == sizeof(int) == sizeof(long) == 1. I believe this is still a standards-conformant environment.
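For concreteness, a minimal sketch of what such a platform would report (on an ordinary desktop this prints 8 and sizes such as 1, 2, 4, 8; on the hypothetical machine CHAR_BIT is 32 and every sizeof below is 1):

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical platform: CHAR_BIT == 32, all sizes below == 1.
           Typical desktop: CHAR_BIT == 8, sizes 1, 2, 4, and 4 or 8. */
        printf("CHAR_BIT      = %d\n",  CHAR_BIT);
        printf("sizeof(char)  = %zu\n", sizeof(char));
        printf("sizeof(short) = %zu\n", sizeof(short));
        printf("sizeof(int)   = %zu\n", sizeof(int));
        printf("sizeof(long)  = %zu\n", sizeof(long));
        return 0;
    }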

The usual way to detect endianness at runtime (because there is no reliable way to do it at compile time) is to make a union { int i; char c[sizeof(int)]; } x;, set x.i = 1, and see whether x.c[0] or x.c[sizeof(int)-1] got set.
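For reference, a complete version of that check as it would look on an ordinary machine; this is just the standard trick described above, nothing specific to the hypothetical platform:

    #include <stdio.h>

    int main(void)
    {
        union {
            int  i;
            char c[sizeof(int)];
        } x;

        x.i = 1;

        if (x.c[0] == 1)
            printf("little-endian\n");      /* low byte at lowest address */
        else if (x.c[sizeof(int) - 1] == 1)
            printf("big-endian\n");         /* low byte at highest address */
        else
            printf("something stranger\n"); /* e.g. a middle-endian layout */

        return 0;
    }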

But that doesn't work on this platform: sizeof(int) is 1, so I end up with a char[1], and c[0] and c[sizeof(int)-1] are the same element.

Is there a way to detect whether such a platform is big-endian or little-endian, at runtime? Obviously it doesn't matter inside this hypothetical system, but one can imagine it writing to a file, or to some kind of memory-mapped area, which another machine reads and reconstructs according to its (saner) memory model.
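To make the file scenario concrete, here is a hedged sketch of what the other machine might do; the file name data.bin and the sentinel 0x01020304 are made up for illustration, and it assumes the writer stored that sentinel as its first 32-bit word:

    #include <stdio.h>

    /* Runs on the conventional reader (CHAR_BIT == 8). Assumes --
       hypothetically -- that the writer put the 32-bit sentinel
       0x01020304 at the start of "data.bin". */
    int main(void)
    {
        unsigned char b[4];
        FILE *f = fopen("data.bin", "rb");

        if (f == NULL || fread(b, 1, 4, f) != 4) {
            fprintf(stderr, "could not read sentinel\n");
            return 1;
        }
        fclose(f);

        if (b[0] == 0x01 && b[1] == 0x02 && b[2] == 0x03 && b[3] == 0x04)
            printf("writer's bytes arrived big-endian\n");
        else if (b[0] == 0x04 && b[1] == 0x03 && b[2] == 0x02 && b[3] == 0x01)
            printf("writer's bytes arrived little-endian\n");
        else
            printf("some other byte order\n");

        return 0;
    }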
