Signed and unsigned, and how bit extension works
Posted by hatorade on Super User
Published on 2010-04-24T23:24:28Z
Tags: datatypes
unsigned short s;
s = 0xffff;
int i = s;
How does the extension work here? Two higher-order bytes are added, but I'm confused about whether they are filled with 1's or 0's. This is probably platform dependent, so let's focus on what Unix does. Would the two higher-order bytes of the int be filled with 1's or 0's, and why?
Basically, does the computer know that s is unsigned, and correctly assign 0's to the higher-order bits of the int, so that i is now 0x0000ffff? Or, since ints are signed by default on Unix, does it take the sign bit from s (a 1) and copy that into the higher-order bytes?