int considered harmful?
Posted by Chris Becke on Stack Overflow
Published on 2010-04-12T12:13:55Z
Indexed on 2010/04/12 12:23 UTC
c++
Working on code meant to be portable between Win32, Win64 and Cocoa, I am really struggling to get to grips with what the @#$% the various standards committees involved over the past decades were thinking when they first came up with, and then perpetuated, the crime against humanity that is the set of C native types - char, short, int and long.
On the one hand, as an old-school c++ programmer, there are few statements as elegant and/or as simple as
for(int i=0; i<some_max; i++)
but now it seems that, in the general case, this code can never be correct. Oh sure, given a particular version of MSVC or GCC, with specific targets, the size of 'int' can be safely assumed. But, in the case of writing very generic c/c++ code that might one day be used on 16 bit hardware, or 128, or just be exposed to a particularly weirdly set up 32/64 bit compiler, how does one use int in c++ code in a way that the resulting program has predictable behavior on any and all possible c++ compilers that implement c++ according to the spec?
To resolve these unpredictabilities, C89/C++98 gave us size_t and ptrdiff_t, and C99 added uintptr_t, int8_t, int16_t, int32_t, int64_t and so on.
Which leaves me thinking that a raw int, anywhere in pure c++ code, should really be considered harmful, as there is some (completely c++xx conforming) compiler that's going to produce an unexpected or incorrect result with it (and probably open an attack vector as well).
© Stack Overflow or respective owner