
Are #defines good to use?

Started by May 14, 2002 08:12 PM
37 comments, last by The Lion King 22 years, 6 months ago
quote: Original post by jenova
i absolutely agree with your post. my comments weren't necessarily directed at you. my reference to religion is to display the warring nature that seems to erupt over one's preference in coding style.

Absolutely. This is probably a contentious comment on these forums, but I find a lot of posters here largely lacking the maturity to see that there *are* objectively better ways of doing some things. Not all issues are a matter of religion, and not all beauty is in the eye of the beholder.

There are certain topics I've mentioned on Gamedev that have provoked utter outrage. I've sometimes raised the same topics on "more mature" forums and managed to get good consensus, and I think that's a result of people being more ready to discard current beliefs and prejudices, and to *really* think about opposing opinions. It is necessary to challenge one's own beliefs to work towards enlightenment.

[ C++ FAQ Lite | ACCU | Boost | Python | Agile Manifesto! ]
quote: Original post by Stoffel
I did this to show two things:
1) file-scope constants should be static.

If we're talking about C++ (which we are), file-scope statics are deprecated in favour of anonymous namespaces.
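A minimal sketch of the two forms, for anyone unfamiliar with the idiom (the variable names are invented for illustration):

// file-scope static: internal linkage, but deprecated style in C++
static int g_frameCount = 0;

// anonymous namespace: same internal linkage, the preferred C++ way
namespace
{
    int g_tickCount = 0;    // visible only within this translation unit
}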

[ C++ FAQ Lite | ACCU | Boost | Python | Agile Manifesto! ]
quote: Original post by thedo
#define == global search and replace BY THE PREPROCESSOR. No runtime overhead, as the preprocessor substitutes the #defined value before the code is ever compiled.

Bad because - try to debug a program that uses #defines - you can't, as the program only contains the substituted values, not the names.

Dangerous because - a macro can expand to almost anything, and just try to spot that in a debugger.

Neil


1) reduced runtime overhead would be an absolutely correct statement. however, overhead may still appear on RISC architectures if the constant value is immediately needed and re-ordering cannot be done to avoid a data-dependency pipeline stall.

2) i'm pretty sure MS intellisense for VS.NET displays the value of #define constants while debugging. **i will have to check when i get home**.

3) dangerous only when used absolutely improperly. however, the type checking of const wins here, because the compiler can save your....
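a quick sketch of what that type checking buys you (identifiers invented for illustration):

#define BUFFER_SIZE "256"    // a string snuck in; the preprocessor doesn't care

const int bufferSize = 256;  // the compiler knows this is an int from the start

int buf1[BUFFER_SIZE];       // the error only surfaces here, as int buf1["256"],
                             // possibly far away from the #define itself
int buf2[bufferSize];        // fine; a bad value would be caught at the const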

To the vast majority of mankind, nothing is more agreeable than to escape the need for mental exertion... To most people, nothing is more troublesome than the effort of thinking.
quote: Original post by Useless Hacker
If #DEFINE is so bad (and I'm not disputing that), why do Microsoft's DirectX headers for C++ use them so much?

I'd guess it's to be compatible with C.

There are some things you can define that can't be typedef'ed (like #define PASCAL __stdcall)

And there are several things you can do with #define macros that can't be done with any other method (anything that uses __LINE__, __FILE__, # or ##, for instance).
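For example, here is a minimal sketch of a macro that no const or template can replace (the macro name WARN_IF is invented for illustration):

#include <cstdio>

// '#' stringizes the argument; __FILE__ and __LINE__ expand at the use site,
// which is exactly what no function or template can do
#define WARN_IF(expr) \
    do { if (expr) std::printf("%s:%d: warning: %s\n", __FILE__, __LINE__, #expr); } while (0)

int main()
{
    int players = 0;
    WARN_IF(players == 0);    // prints this file's name, this line's number, and "players == 0"
    return 0;
}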

I think the most important rule with #defines IS_TO_MAKE_THEM_ALL_CAPS, unless you have a really, really good reason not to (e.g. #define new DEBUG_NEW).

So unless you're doing something like the above cases, you should prefer templates or constants that provide type information.

And I think a point being missed is that #defines are better to use than hard-coding magic numbers or duplicating strings.
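For example (a made-up sketch; clearSquare is a hypothetical helper standing in for real work):

#include <cstdio>

#define BOARD_SIZE 8    // one place to change, instead of a magic 8 scattered everywhere

void clearSquare(int x, int y)    // hypothetical helper, just for illustration
{
    std::printf("clearing %d,%d\n", x, y);
}

int main()
{
    for (int y = 0; y < BOARD_SIZE; ++y)
        for (int x = 0; x < BOARD_SIZE; ++x)
            clearSquare(x, y);
    return 0;
}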

[edited by - Magmai Kai Holmlor on May 21, 2002 12:42:09 PM]
- The trade-off between price and quality does not exist in Japan. Rather, the idea that high quality brings on cost reduction is widely accepted.-- Tajima & Matsubara
quote: Original post by Magmai Kai Holmlor
And I think a point being missed is that #defines are better to use than hard-coding magic numbers or duplicating strings.

I don't understand what you're saying here...
Isn't it true that most of the Microsoft header files use #define? Some people even joke that Microsoft has hired a programmer just to write #defines all day.

Anyway, if #define is used in place of an int, then we can easily change the value from the header instead of going through the whole code and searching for the problem.

It's comfy, is it not?



I can survive anything ... even NUKES!!!

The Lion King
hm... why not try:
#include <cstdio>    // needed for printf

#define SOMECODE 0x12345678

const char *someMsg = "You should be able to find this.\n";

int main()
{
    printf(someMsg);    // forget about cout for now
    return SOMECODE;    // in the exe you should find the bytes: 78 56 34 12
}


I believe you could find both pieces of data in your .exe file.
"after many years of singularity, i'm still searching on the event horizon"
quote: Original post by The Lion King
Anyway, if #define is used in place of an int, then we can easily change the value from the header instead of going through the whole code and searching for the problem.

typedef?
quote: Original post by The Lion King
Anyway, if #define is used in place of an int, then we can easily change the value from the header instead of going through the whole code and searching for the problem.

I'm interested to know how you think using const int prevents doing this.
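A const in a header gives you exactly the same single point of change (a minimal sketch; the file and identifier names are invented):

// game_config.h - edit the value here and recompile, exactly as with a #define;
// const int at namespace scope has internal linkage in C++, so it is header-safe
const int maxEnemies = 32;

// wave.cpp - hypothetical usage
#include "game_config.h"

void spawnWave()    // hypothetical function, for illustration
{
    for (int i = 0; i < maxEnemies; ++i)
    {
        // spawn one enemy here...
    }
}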

[ C++ FAQ Lite | ACCU | Boost | Python | Agile Manifesto! ]

This topic is closed to new replies.
