
Use of define over const

Started by the_grip, July 13, 2001 02:15 PM
16 comments, last by the_grip 23 years, 7 months ago
Is there any reason most books use #define instead of using const (i.e. screen size, etc.)? I know it is usually considered better coding to go with const if you are using C++.

"You call him Dr. Grip, doll!"
It's two completely different things. #defines let you replace any text with any text in the code. Consts are constants: actual values with actual types attached. When it makes sense to use constants, then I agree, you should use constants. But when the typed nature of constants becomes a bother and a potential future obstacle to changing types within your program, it makes sense to just use #define. Really, it rarely causes as many problems as it is made out to, and it's certainly nowhere near as bad as using goto.
BetaShare - Run Your Beta Right!
Using const is, I guess, "cleaner" in that it's not just a simple pre-processor text substitution, but I worked with C for a very long time and old habits die hard. I still use #define.
Heya,

The main difference between them is that const is type safe!
I must admit that I can't give you a nice example right now, but I will post one if I find it.

Also, when you never take the address of a const, no object is created for it and only substitution is applied, so there is no overhead. This is actually a main difference between const in C++ and const in C!

Gr,
BoRReL





quote:
Original post by the_grip
Is there any reason most books use #define instead of using const (i.e. screen size, etc.)?

I know it is usually considered better coding to go with const if you are using C++.

"You call him Dr. Grip, doll!"


The only reason most books use #define instead of const is simply that #define is the old C idiom for constants. My teacher in college had the same problem: she would freak out when she saw me use const, since she had learned that constants in C should be declared using #define. It's an old habit. You can easily replace a #define with a const. Heck, you should replace #defines with consts...



"And that's the bottom line cause I said so!"

Cyberdrek
Headhunter Soft
A division of DLC Multimedia

Resist Windows XP's Invasive Product Activation Technology!

"gitty up" -- Kramer
My big thing with #defines is that they are EXTREMELY powerful if you know how to use them, since they also come with a cool set of preprocessor operators to assist you. Lemme take a snippet from the Half-Life source code for a second:

  #define SetThink( a ) ThinkSet( static_cast <void (CBaseEntity::*)(void)> (a), #a )  


Basically, the a substitutes the actual parameter, while the #a, which is one of those special preprocessor operators, takes the a and puts it in quotes. It then becomes obvious that the ThinkSet function requires a function pointer AND a char name of the function, which is so slickly and easily implemented here. It would be hard to do this in such a simple manner otherwise.
Another difference is that consts are treated like variables that can't be modified, so any expressions involving consts are evaluated at runtime, not compile time.

#define NUM_OBJ 30
#define OBJ_SIZE 5
...

totalObjSize = NUM_OBJ * OBJ_SIZE;

The right side of the assignment statement above is evaluated at compile time.

const int NUM_OBJ = 30;
const int OBJ_SIZE = 5;

...

totalObjSize = NUM_OBJ * OBJ_SIZE;

The right side above gets evaluated at runtime.
One of my co-workers was "bitten in the ass" as he put it on Friday due to using #defines:

  #define DSP_START_ADDR    0x4000
  #define BUFFER_START_ADDR DSP_START_ADDR + 0x1000
  #define BUFFER_END_ADDR   DSP_START_ADDR + 0x1FFF

  int get_buf_size ()
  {
    return BUFFER_END_ADDR - BUFFER_START_ADDR;
  }

Obviously, this returns a size of 0x2FFF instead of 0xFFF. What, it's not obvious? Thank you #define obfuscation.

If you used const instead, this wouldn't happen.

BTW, this target was in C (NT device development kit doesn't support C++, at least the one we have), so this has nothing to do with preferring C over C++. const has been in C since the dark ages; it's just that people don't use it when they should.

PS: I'd also like to add that the AP above is incorrect. const variables can be, and usually are, evaluated at compile time. Use your favorite compiler to compile his code and you will find that it compiles to the exact same thing both ways.

Edited by - Stoffel on July 16, 2001 12:17:24 PM
On the side of define:

Well, if your co-worker was being smart about things, he would have put parens around his expressions like most careful programmers are wont to do.


On the side of const:

If you use a variable very frequently, sometimes using const will net you a small memory savings, since it's read from one location in memory rather than having the value inserted multiple times throughout your code. (This makes more of a difference with strings.)
const can make your exe smaller than using #define. With #define, every time you use the constant, its value is put into the exe. With const, you have one declaration, and the exe refers back to that declaration for the value. Therefore, if you #define a as 20 and use a 500 times, the value is in the exe 500 times; if you say const a = 20 and use it 500 times, it's in your exe once. Disregard that if your compiler has super intelligent optimization.

Reality Makes Me XIC
I don't do spelling, I hack code: passion is my fuel. Use my programs, experience genius.
http://www.x-i-c.com/

This topic is closed to new replies.
