Setting enum values to 4-byte strings - why?
Posted by psychotik on Stack Overflow, 2010-03-29
I saw code similar to this in the Mac OS SDK:
enum {
    kAudioFileStreamProperty_ReadyToProducePackets = 'redy',
    kAudioFileStreamProperty_FileFormat = 'ffmt',
    kAudioFileStreamProperty_DataFormat = 'dfmt',
    kAudioFileStreamProperty_FormatList = 'flst',
    kAudioFileStreamProperty_MagicCookieData = 'mgic',
    kAudioFileStreamProperty_AudioDataByteCount = 'bcnt',
    kAudioFileStreamProperty_AudioDataPacketCount = 'pcnt',
    kAudioFileStreamProperty_MaximumPacketSize = 'psze',
    kAudioFileStreamProperty_DataOffset = 'doff',
    kAudioFileStreamProperty_ChannelLayout = 'cmap',
    kAudioFileStreamProperty_PacketToFrame = 'pkfr',
    kAudioFileStreamProperty_FrameToPacket = 'frpk',
    kAudioFileStreamProperty_PacketToByte = 'pkby',
    kAudioFileStreamProperty_ByteToPacket = 'bypk',
    kAudioFileStreamProperty_PacketTableInfo = 'pnfo',
    kAudioFileStreamProperty_PacketSizeUpperBound = 'pkub',
    kAudioFileStreamProperty_AverageBytesPerPacket = 'abpp',
    kAudioFileStreamProperty_BitRate = 'brat'
};
It's the first time I've seen this. I assume the compiler assigns each enumerator the 32-bit integer value of its four-character constant. I cannot think of a single good reason why this would be preferred over simple integers: it looks hideous in a debugger (how do you tell which of these values corresponds to 1919247481?) and makes debugging harder in general.
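To make the problem concrete, here's a minimal sketch of what I understand is going on: the four characters get packed into a 32-bit integer, which you then have to unpack by hand to read it back. The fourcc_to_string helper below is my own, not part of any Apple SDK, and the packing order of multi-character constants is technically implementation-defined (Apple's compilers pack 'redy' as 0x72656479).

#include <stdio.h>
#include <stdint.h>

/* Illustrative helper (mine, not from the SDK): unpack a FourCC
   integer back into a readable 4-character string. */
static void fourcc_to_string(uint32_t code, char out[5]) {
    out[0] = (char)((code >> 24) & 0xFF);
    out[1] = (char)((code >> 16) & 0xFF);
    out[2] = (char)((code >> 8) & 0xFF);
    out[3] = (char)(code & 0xFF);
    out[4] = '\0';
}

int main(void) {
    /* Multi-character constant, as used in the enum above. */
    uint32_t ready = 'redy';
    char buf[5];
    fourcc_to_string(ready, buf);
    printf("%u -> '%s'\n", (unsigned)ready, buf); /* 1919247481 -> 'redy' */
    return 0;
}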
So, is there any case where assigning such four-character codes to enum values actually makes sense?