Quote:
Original post by Alpha_ProgDes
what is the answer?
guaranteed to be less than or equal to an int? or a short?
edit: good luck on your interview... if they ask about your game, make sure you can explain the game, story, and process of making it very clearly.
The answer is whatever the compiler says it is, within the constraints set out in the standard: sizeof(char) <= sizeof(short) <= sizeof(int), where sizeof(char) is always 1 and a char is at least 8 bits wide (CHAR_BIT >= 8).
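The quickest way to see what your particular compiler chose is just to print the sizes - a minimal sketch, and the output depends entirely on your compiler and target:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* sizeof reports sizes in units of char; CHAR_BIT says how wide a char is. */
    printf("CHAR_BIT      = %d\n", CHAR_BIT);
    printf("sizeof(char)  = %zu\n", sizeof(char));
    printf("sizeof(short) = %zu\n", sizeof(short));
    printf("sizeof(int)   = %zu\n", sizeof(int));
    printf("sizeof(long)  = %zu\n", sizeof(long));
    return 0;
}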
It's perfectly legal to have an IA32 C compiler that defines char as 16 bits. It wouldn't be a good idea, though, because 16-bit values aren't particularly efficient when the CPU is in 32-bit mode - they would require operand-size prefixes on the instructions everywhere.
People sometimes think it's the CPU architecture that dictates the size - which isn't entirely wrong: compiler writers define the sizes of the various types to map onto sizes the CPU handles well. I remember when PC C compilers had 16-bit ints - before the advent of Win32. In fact, the window procedure (WndProc) in Win32 programming still displays the 16-bit legacy:
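For those who haven't seen it, the standard window procedure signature looks like this:

LRESULT CALLBACK WndProc(HWND hWnd, UINT uMsg, WPARAM wParam, LPARAM lParam);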
WPARAM for an int? Surely that's Hungarian Notation for a WORD value, which is 16 bits? Which is indeed what it was in Win3.1 and earlier. (And that's one reason Hungarian Notation is bad - even the people who invented it couldn't be bothered to go through all their code and update the Hungarian prefixes when the underlying type changed.)
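You can see the drift in the typedefs themselves - roughly, from memory of the headers (treat the 16-bit line as an assumption; the Win32 one is what current WinDef.h says):

/* 16-bit Windows (Win 3.x) - WPARAM really was a 16-bit WORD: */
typedef WORD     WPARAM;

/* Win32 (WinDef.h) - same name, now a pointer-sized unsigned integer: */
typedef UINT_PTR WPARAM;

The name stayed, the type underneath didn't - which is exactly the problem with encoding the type into the name.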
Skizz