As SiCrane says, this is a moot point on current compilers. However, if you want to discuss older compilers, then defines are better for small data and worse for things like strings.
As people have pointed out, the value is substituted directly for defines. For strings, that can create a new copy of the literal for each use, which is wasteful. Consts are much better here.
As for integers, defines save space, because the compiler does not create 20 instances of an integer for a define. It creates a 2-5 byte instruction with the value embedded in the instruction, the size depending on the data size. When using consts, it references a variable, which loads the value from memory using a 6-byte instruction, and touches data that may not be in the cache. A pretty small difference, but it's there.
But those days seem to be gone, so don't worry about it.
Rock
#define vs const
February 05, 2000 12:36 PM
i just want to say that using #define's for every constant in your program is abuse of the pre-processor.
a "shooting offense", as Linus Torvalds would call it.
the typedef and the const are there for a reason -- use them.
#define's should only be used for macros -- that's why they exist.
a "shooting offense", as Linus Torvalds would call it.
the typedef and the const are there for a reason -- use them.
#define''s should only be used for macros -- that''s why they exist.
I beg to differ... If defines were intended only for macros, then what type of constants could you use in C? As far as I know, there are no consts in straight C, just defines. So obviously they were created to be used for constants and macros...
If you're actually interested in how constants and #define's are supposed to be used in C++, why not look at Bjarne Stroustrup's book, "The C++ Programming Language," pages 95-97 and 160-162. On page 160, he writes about macros,
"Macros are very important in C but have far fewer uses in C++. The first rule about macros is: Don''t use them unless you have to. Almost every macro demonstrates a flaw in the programming language, in the program, or in the programmer."
Macros have been replaced in C++, one, by constants, and two, by inline functions. There is hardly ever a need to use them, and they simply make code more complex and unreadable.
-Derek
"Macros are very important in C but have far fewer uses in C++. The first rule about macros is: Don''t use them unless you have to. Almost every macro demonstrates a flaw in the programming language, in the program, or in the programmer."
Macros have been replaced in C++, one, by constants, and two, by inline functions. There is hardly ever a need to use them, and they simply make code more complex and unreadable.
-Derek
February 05, 2000 05:11 PM
/* unreadable complex C code */
#define PI 3.14159
#define degreesToRadians(a) a * PI / 180
float radA = degreesToRadians(222);
double radB = degreesToRadians(222);
///////////////////////////////////////
// Readable C++ Code
///////////////////////////////////////
const float PI = 3.14159;
template <typename X> X degreesToRadians(X degrees)
{
return degrees * static_cast<X>(PI) / 180;
}
float radA = degreesToRadians(222);
double radB = degreesToRadians(222);
Good lord, are you saying that C++ is more readable than the C version??? Yikes. It's safer, but....
I'd say for typedefs, there's no question. Never use a define in place of a typedef.
For macros, I'd say use inline funcs when possible. It's pretty well understood that macros are very unsafe.
As for consts, I'd say that if they are optimized into the code now, then use those. But I'm actually curious now: if they are treated like defines when compiled, then did they lose the ability for the debugger to tell you what the value is? I don't use consts, but that was one of its best features.
Rock
Yeah, I always use typedefs (I was just curious about that one) and inline funcs (sometimes, if it's REALLY simple, I use a macro), but I also use #defines instead of consts...
quote: Original post by Rock2000
...
As for consts, I'd say that if they are optimized into the code now, then use those. But I'm actually curious now: if they are treated like defines when compiled, then did they lose the ability for the debugger to tell you what the value is? I don't use consts, but that was one of its best features.
Rock
It's not so much that consts are treated as defines; it's that inlining the value of the const is, in most cases, more optimal than using a variable reference. So consts still have values for when you're using your debugger -- it's just that the compiler feels free to substitute the value wherever the variable is referenced, in order to make the code faster and/or smaller.
This topic is closed to new replies.