How can I detect endianness on a system where all primitive integer sizes are the same?
- by Joe Wreschnig
(This question came out of explaining the details of CHAR_BIT, sizeof, and endianness to someone yesterday. It's entirely hypothetical.)
Let's say I'm on a platform where CHAR_BIT is 32, so sizeof(char) == sizeof(short) == sizeof(int) == sizeof(long) == 1. I believe this is still a standards-conformant environment.
The usual way to detect endianness at runtime (because there is no reliable way to do it at compile time) is to make a union { int i; char c[sizeof(int)]; } x, set x.i = 1, and see whether x.c[0] or x.c[sizeof(int)-1] got set.
But that doesn't work on this platform: sizeof(int) is 1, so I end up with a char[1] and there is only one element to inspect.
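For reference, a minimal sketch of that conventional check on an ordinary platform; on the hypothetical one, c has a single element, so both indices name the same byte and the test tells you nothing:

    #include <stdio.h>

    int main(void)
    {
        /* Overlay a char array on an int and see which end
           the low-order byte lands in. */
        union {
            int  i;
            char c[sizeof(int)];
        } x;

        x.i = 1;

        if (x.c[0] == 1)
            puts("little-endian");
        else if (x.c[sizeof(int) - 1] == 1)
            puts("big-endian");
        else
            puts("something stranger");

        /* On the hypothetical platform, sizeof(int) == 1, so c[0] and
           c[sizeof(int) - 1] are the same element and the comparison
           is vacuous. */
        return 0;
    }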
Is there a way to detect whether such a platform is big-endian or little-endian, at runtime? Obviously it doesn't matter inside this hypothetical system, but one can imagine it writing to a file, or to some kind of memory-mapped area, which another machine with a (saner) memory model then reads and reconstructs.