#define vs const
I have been using #defines for all the constants in my games, because I assumed that consts took up memory. I wanted to clear that up, though. Because the const keyword is type safe, does that mean it takes up memory? Or is it just like the #define directive in that it is a literal constant?
Thanks
------------------------------
Jonathan Little
invader@hushmail.com
http://www.crosswinds.net/~uselessknowledge
I believe (and I've been known to be wrong) that a constant is stored like a variable. That means that when you use it, it's referenced via a pointer, which is 32 bits (the same size as an int or a float on Windows), so it doesn't save any memory.
-the logistical one- http://members.bellatlantic.net/~olsongt
#define T ThisGetsReplacedInYourCodeByThePreprocessor
const int T = 1; // T is a constant type integer with a value of 1
The preprocessor will go through your code during compilation and replace every occurrence of the #define as though you had typed it in yourself.
A const is just that: a constant. It has a type and follows all the rules just like the other types, with the exception that it's constant and subject to a few restrictions regarding scope and alteration.
These are two very different things.
~deadlinegrunt
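To make the difference concrete, here is a minimal sketch; the names MAX_ENEMIES and kMaxEnemies are invented for the example, and the comments describe typical behaviour rather than a guarantee:
#include <iostream>

#define MAX_ENEMIES 32          // the preprocessor pastes 32 wherever this name appears
const int kMaxEnemies = 32;     // a typed constant the compiler itself knows about

int main()
{
    // Both typically end up folded into the generated code as literals.
    std::cout << MAX_ENEMIES + 1 << "\n";   // after preprocessing: 32 + 1
    std::cout << kMaxEnemies + 1 << "\n";   // the compiler folds this to 33 as well

    // Only the const has a type and obeys scope rules; for example you can
    // take its address, which usually forces the compiler to give it storage.
    // There is nothing comparable you can do with the macro.
    std::cout << &kMaxEnemies << "\n";
    return 0;
}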
OK, that answers the question. But as an extension, is it the same way with a typedef? I mean, would typing
typedef unsigned int UINT;
take up memory, instead of typing
#define UINT unsigned int
??
It is possible to change a const value during runtime. Not that you would want to.
quote: Original post by Gromit
It is possible to change a const value during runtime. Not that you would want to.
subject to restriction != impossible to alter
( in case you did want to )
~deadlinegrunt
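In case you did want to: the usual trick is to cast the constness away. A minimal sketch only, and note that writing through the pointer is undefined behaviour in standard C++, which is exactly the kind of restriction being talked about:
#include <iostream>

int main()
{
    const int locked = 7;

    // Casting away const compiles, but modifying a genuinely const object
    // through the result is undefined behaviour.
    int* p = const_cast<int*>(&locked);
    *p = 42;

    // Depending on whether the compiler already folded 'locked' into this
    // statement as the literal 7, this prints "7 42" or "42 42".
    std::cout << locked << " " << *p << "\n";
    return 0;
}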
Don't think that #define ABC nnn doesn't take up memory. Both constants and #defines take up memory. A #define only takes up memory if you actually use it in your code or at global scope. Constants take up memory even if you aren't using them.
You can try to #define ABCXYZ 0x12345678 and then use it in a program. After building, search the EXE for that hex value and you can find it. So is this using up memory?
(I'm still not sure)
"after many years of singularity, i'm still searching on the event horizon"
quote: Original post by Qoy
would typing
typedef unsigned int UINT;
take up memory, instead of typing
#define UINT unsigned int
??
typedef unsigned int UINT;
and
#define UINT unsigned int
equate to the same thing at this point: an alias
As far as taking up room goes, neither one does until you declare an instance of a variable with that alias (for the sake of conversation).
So to recap: as a general rule, #define escapes type checking and has some other quirks if you're not careful. Keep playing with them and you'll be sure to find them. (It'll be fun when you do, too. Hint: make function macros; there's a sketch below.)
A #define replaces the code you typed with the definition you gave it at compile time, through the preprocessor. This circumvents type checking. It has its uses, but it can also create problems if you don't use it correctly.
~deadlinegrunt
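A hedged sketch of both points above; the names UINT_MACRO and SQUARE are invented here, and the function-style macro shows the kind of quirk the hint is pointing at:
#include <iostream>

typedef unsigned int UINT;       // an alias the compiler knows about
#define UINT_MACRO unsigned int  // an alias made by pure text substitution

#define SQUARE(x) x * x          // classic function-macro pitfall

int main()
{
    // Neither alias costs anything by itself; storage only appears
    // when variables like 'a' and 'b' are declared.
    UINT a = 5;
    UINT_MACRO b = 5;

    // The macro expands textually: SQUARE(a + 1) becomes a + 1 * a + 1,
    // which is 11 here rather than the expected 36.
    std::cout << SQUARE(a + 1) << "\n";
    std::cout << a + b << "\n";
    return 0;
}
The typedef is a real type name that participates in compiler checks; the macro is gone before the compiler ever sees your code.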
They should generate the same code on a good compiler, and here is proof:
; 15 : b = b * defined_constant;
mov ecx, DWORD PTR _b$[ebp]
shl ecx, 1
mov DWORD PTR _b$[ebp], ecx
; 16 : b = b * const_constant;
mov edx, DWORD PTR _b$[ebp]
shl edx, 1
mov DWORD PTR _b$[ebp], edx
Identical code generated on VC++ 6.0 SP3, in debug build.
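For reference, source along these lines could have produced a listing like the one above; the names are taken from the listing, and the value 2 is a guess based on the shift-by-one in the assembly:
#define defined_constant 2
const int const_constant = 2;

int main()
{
    int b = 3;
    b = b * defined_constant;   // line 15 in the listing above
    b = b * const_constant;     // line 16 in the listing above
    return b;
}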
I believe that #defines take up more memory.
Let's say you have an option:
#define blah 42
-or-
const int blah = 42;
Now, in the code, you use 'blah' twenty times.
The preprocessor literally replaces 'blah' with 42, and the compiler creates space for 20 ints in the actual exe file (at least this is how it SHOULD work, unless the compiler is super-optimised and notices this kind of stuff).
If you use const, however, it just uses pointers to the same piece of memory. Now, this was probably a bad example, because the #define method uses 640 bits (20*32), while the const uses 672 bits (20*32 for pointers, plus 1*32 for the int).
However, in more complex code, where the size of the constant structure/class is larger than the size of a pointer, const starts saving lots of space.
Let's say you have 20 uses of a 128-bit class (4 ints each):
#define = 2560 bits
const = 768 bits.
Of course, this is all invalidated by optimising compilers, but this is the basic theory behind them.
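Where the point about larger types can genuinely matter is when a macro expands to a brand-new object at every use while a const names a single object. A rough sketch, with Vec4, kOrigin, and the modern brace syntax invented for illustration:
struct Vec4 { int x, y, z, w; };      // 4 ints, typically 128 bits

#define ORIGIN Vec4{0, 0, 0, 1}       // each use expands to a brand-new temporary
const Vec4 kOrigin = {0, 0, 0, 1};    // one object that every use can refer to

int lengthish(const Vec4& v) { return v.x + v.y + v.z + v.w; }

int main()
{
    int a = lengthish(ORIGIN);    // a fresh Vec4 may be materialised here
    int b = lengthish(kOrigin);   // refers to the single existing object
    return a + b;
}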
A man said to the universe:
"Sir I exist!"
"However," replied the universe,
"The fact has not created in me
A sense of obligation."
This is my signature. There are many like it, but this one is mine. My signature is my best friend. It is my life. I must master it as I must master my life. My signature, without me, is useless. Without my signature, I am useless.