
#define vs const

Started by January 31, 2000 08:43 PM
26 comments, last by Qoy 24 years, 10 months ago
#define UINT unsigned int
is the same as
typedef unsigned int UINT

The compiler handles them the same, replacing UINT with unsigned int wherever UINT appears in the code. As for #define constants versus const variables: const variables are regular variables that take up memory like any other variable. When you use #define ABC 5, wherever the compiler sees ABC it replaces it with 5. Memory is still used, because the compiler has to store the 5 so it can use it later. It's like literal strings: if you use, say, "hello" in your program and parse the exe, you will eventually come across hello. The hello is loaded into memory when the program is loaded to run. To make a long story short, it doesn't matter which you use, because memory is going to be used either way, and any difference in how much is insignificant.

Domini
Mithrandir,

Your example using integers would save very little space, because for the most part a pointer to an integer and an integer take up the same amount of memory. Your example would be more compelling in the light of character constants versus character defines. You are correct in stating that the value is substituted throughout the code; therefore this will take up more memory unless you implement some sort of memory sharing for your classes. Macros for functions that accept pointers will also take up more room in memory. A smart compiler may be able to recognize this and get around it, though. Anyway, just thought I would make your example clearer.

Kressilac

Derek Licciardi (Kressilac)Elysian Productions Inc.
This thread is very interesting to me because I have wondered the same thing before... I always thought a const was replaced with the value in the source code. My compiler even replaces it with the proper value (instead of the name of the const) when it produces error messages.

About worrying about memory: I am currently making a HUGE program that is limited to an environment where only 7 MB of RAM are available. I am always trying to save a few bytes.

www.trak.to/rdp

Yanroy@usa.com

--------------------

You are not a real programmer until you end all your sentences with semicolons; (c) 2000 ROAD Programming
You are unique. Just like everybody else.
"Mechanical engineers design weapons; civil engineers design targets."
"Sensitivity is adjustable, so you can set it to detect elephants and other small creatures." -- Product Description for a vibration sensor

Yanroy@usa.com

const and #define'd constants in a modern compiler will emit almost identical code. This is because duplicated constants in code tend to be merged by the compiler into a single instance in the data segment.

For example:
#define COW "cow string"
and
const char cow_string[] = "cow string";
will both put the string "cow string" in the data segment, and references to either COW or cow_string will be represented by the address of "cow string" in the data segment.

For simple numerical constants, the compiler assumes the value of a const variable won't change, so it will feel free to substitute the actual value directly into the emitted code. That means if you try to change the value of the const (for example, by casting away the constness), computations may still use the original value.
#define UINT unsigned int
is the same as
typedef unsigned int UINT

but only if you don't do strict type checking. If you have that on, or run the code through lint, the typedef version can throw warnings if you do something like:

UINT x;
unsigned int y;

x = 3;
y = x;

Of course this example doesn't illustrate the benefits of strict type checking, but if you've ever done any Windows programming you've got an idea of how important that is to you: (LPARAM), (LPCSTR), blah blah.

-the logistical one-http://members.bellatlantic.net/~olsongt
I believe you took my post slightly out of context there. The point of my post was whether one way or the other takes up memory. Given the context of the post and the other variants that may or may not apply, I don't believe the information I gave was any more off base than my original post.

These are two very different things.

~deadlinegrunt

To my knowledge, #defines just expand into the code, whereas consts create actual variables with constant values in memory. Thus, using #defines makes your program code larger, while using consts just uses more data memory.

Brent Robinson
"What if this is as good as it gets?"
"The computer programmer is a creator of universes for which he alone is the lawgiver...No playwright, no stage director, no emperor, however powerful, has ever exercised such absolute athority to arrange a stage or a field of battle and to command such unswervingly dutiful actors or troops." - Joseph Weizenbaum-Brent Robinson
Ok, here's the deal. After looking at the disassembly of a little test program, I found to my surprise that #defines and const ints (and I assume this is true for most types) were implemented the exact same way: both inserted the actual integer value into the code. The program pretty much just assigns the #defined number 12 (0Ch) to the variable d, then tests whether d > blah (where const int blah = 8) and, if so, assigns blah's value to d.

Code Produced:
push ebp                    ; function prologue
mov ebp, esp
push ecx

mov dword ptr [d], 0Ch      ; set d to 12
cmp dword ptr [d], 8        ; compare d to 8
jle main(0x00401028)+18h    ; skip the next instruction if d <= 8
mov dword ptr [d], 8        ; otherwise set d to 8

Very interesting...

Derek Day
Oh, and by the way, that was with all optimization off.

-Derek
yo you need to leave you slimy fool. what derek said's cool an stuff but you don't have to get down on your knees fruitcake boy.

matyou

