Wasn't Hungarian notation invented only in an attempt to make the world adhere to another of Microsoft's standards?
If programmers were smart enough to figure out variable types from their context, it wouldn't be needed either.
-Mezz
How many of you use Hungarian notation?
Notation is useful, but only in the most common and limited sense (things like m_, g_, and p). In the full Hungarian sense I don't think it is useful.
1) Some argue that Hungarian notation makes code more readable; it's been said in this thread. Of course, it is only more readable if you are actually used to it and you like it. If you aren't familiar with it, it looks like a complete nightmare. I believe the majority don't use Hungarian notation (don't bother to debate this, as I don't have any proof, but compare how much code you see with it versus without), so to most people, I believe code with Hungarian notation is unreadable. The basic prefixes that people generally understand, like m_, g_, and p, don't do any harm.
2) Things like 'dw' can still be quite ambiguous. What is a 'dw'? We all know what it should be at the high level we work at, but the underlying machine can be quite different. I expect people to question what I mean by that. The answer is that people used to think quite differently about what WORD meant, and it was different on each machine. The meaning (or common use) has changed since then, and although we now think of these things as standard sizes, can you be sure that 'dw' is the same on every machine? It commonly is, but you can't be 100% sure.
3) Anyone who uses 'dw' as a prefix for things like numCattle might wish to continue to fool themselves that this is a good idea. It should be an int (unsigned in some cases), not a DWORD. That way you avoid (2), and by using int you guarantee your code is using the fast-path type for the machine it is running on (it's a portability and optimization thing). If you want a type of a definite size (which an int isn't), then make sure (2) doesn't catch you out; a good way is to define your own sized types (uint32 or u32, etc.). Things like numCattle rarely need a definite size, so question yourself about whether or not you actually need a sized type.
4) Someone mentioned that you might want to change a 'dw' to something else at a later date. Someone replied that if you use get() and set(), the change doesn't ripple through.
a) get() and set() aren't going to stop you from having to change 'dw'. You would still have to change all the references to this variable that don't go through get() and set().
b) Surely your get() and set() also return dw's? That means you need to change the signatures of your get() and set(), which causes a ripple in other people's code anyway. This is a bad thing, not only because of the ripples, but because a change to the underlying implementation has forced a change in your interface (even if it is private). That isn't always avoidable, but you should minimize this sort of thing, and 'dw' in get() and set() helps cause it.
c) Can you also be sure that friend classes you are not responsible for use your get() and set()? No, so you cannot guarantee that this wouldn't cause a serious ripple. (Debate about friends now anticipated. :-) )
5) Reusability. I mentioned above that I don't believe the majority use Hungarian. If you want other people to use your code, they aren't going to bother if your interface forces Hungarian onto them. Sure, they can do a cast, but then again they may not want to. I am sure this is the reason most publicly available code doesn't use Hungarian.
I normally find that those who use Hungarian feel their lives have changed for the better because of it. I also find that these people normally change their mind at a later date, when some big problem smacks them in the face.
quote: Original post by Mezz
Wasn't Hungarian notation invented only in an attempt to make the world adhere to another of Microsoft's standards?
Ah, everyone loves a good conspiracy theory. Can't Microsoft do anything without someone saying they are just doing it to take control of something? I mean, what possible good would it do them if they had created the whole Hungarian notation JUST so people followed "their standard"?
quote: Original post by Mezz
If programmers were smart enough to figure out variable types from their context, it wouldn't be needed either.
This has been stated before. What could you possibly name a variable that contains the number of buttons on the screen? ButtonCount could be one, but you can't tell from the context whether it is a byte, integer, word, or dword. What could you name it that would tell you the type without actually stating the type?
And yes, knowing the type IS very important. For example:
for (int x = 0; x < ButtonCount; x++)
{
    buttons[x]->Update();
}
Looks like it would work fine, yes? Too bad ButtonCount is an unsigned int. If you have your compiler warnings set high enough, you SHOULD get a warning on this. Or perhaps ButtonCount is just a byte, and now you are wasting memory using an integer when you only needed a byte; no compiler warning on that one.
Or better yet, how can you tell from context whether ButtonCount is a local variable, global variable, or a member variable?
quote: Original post by FordPrefect
True, but the idea is the same. WPARAM is word parameter, LPARAM is long parameter. It has the same problem as Hungarian notation.
Not really. It's a bad idea to use Hungarian notation on type names for this very reason. It makes no sense to typedef a type and embed the original type in the new type's name (I hope you followed that); you might as well not use the typedef at all. Microsoft made a mistake here, plain and simple.
quote: Original post by FordPrefect
Hardly. No matter how well you plan out your program, things like this are going to happen. If you can plan out your code in such minute detail and have the final working program adhere to it strictly, then you are a much better programmer than I.
True, you will have changes, but they should be small and few. And it is the same with any part of programming: changes will be made, plain and simple. That's no reason to shy away from using a variable name just because there is a CHANCE you'll have to change it later. If anything, putting the type in the variable name forces you to think about whether it is the best type to use.
quote: Original post by FordPrefect
Obviously it is, and this is not what I'm suggesting. The name of a variable should reflect what the variable represents, not how it is represented, as that detail can be tentative.
That makes sense from a coding standpoint, but not from a maintenance standpoint. Out of curiosity, have you ever had to maintain a 500,000-line program written by someone else? I've found Hungarian notation to be a lifesaver in these instances.
Also, what about using g for global and m for member variables? These should always stay the same throughout development, and once again they make maintaining someone else's code 100x easier.
- Houdini
Edited by - Houdini on December 4, 2000 4:10:44 PM
quote: Original post by freakchild
Notation is useful, but only in the most common and limited sense (things like m_, g_ and p). In the full hungarian sense I don't think it is useful.
Agreed that m_, g_ and p are extremely useful.
quote: Original post by freakchild
1) Some argue that hungarian notation makes code more readable. It's been said in this thread. Of course, it is only more readable if you are actually used to it and you like it. If you aren't familiar with it then it looks like a complete nightmare. I believe the majority don't use hungarian notation (don't bother to debate this as I don't have any proof, but how much code do you see with it and how much without), as such to most people I believe code with hungarian notation is unreadable. People generally understand the basic m_, g_, p and a few other ones don't do any harm.
Completely subjective, as you pointed out, and cannot be argued for or against.
quote: Original post by freakchild
2) Things like 'dw' can still be quite ambiguous. What is a 'dw'? Well, we all know what it should be on this higher-level that we are working on but the underlying machine can be quite different. I expect people to question what I mean about that. The answer is that people used to think quite differently to what we do today about what WORD meant, and it was different on each machine. The meaning of it (or common use of it) has changed since then and although we do think of these things commonly as standard sizes, can you be sure that 'dw' is the same on each machine. Although it commonly is...you can't be 100% sure.
As far as I know, a WORD is universally 2 bytes, like a byte is 8 bits, a nibble is 4 bits, etc. But on the chance that I'm wrong, you can just use what DWORD is typedef'd as: unsigned long. ul would be the prefix.
quote: Original post by freakchild
3) Anyone who uses 'dw' as a prefix for things like numCattle might wish to continue to fool themselves that this is a good idea. This should be an int (unsigned in some cases), not a DWORD. In that way you avoid (2) and by using the int you guarantee that your code is using the fastpath type for the machine it is running on (it's a portability and optimization thing). If you want a type of a definite size (which an int isn't) then make sure (2) doesn't catch you out...a good way is to define your own sized types (uint32 or u32, etc). Things like numCattle rarely need a definite size...question yourself about whether or not you need to use a sized type.
The first argument has nothing to do with Hungarian notation. You are probably right; they should name it iNumCattle. If they DID want exactly 4 bytes, you suggest using your own defined types, and that is exactly what a DWORD is: typedef unsigned long DWORD;
quote: Original post by freakchild
4) Someone mentioned that you might want to change a 'dw' to something else at a later date. Someone replied that if you use get() and set() then this doesn't ripple any changes through.
a) get() and set() aren't going to stop you having to change 'dw'. You personally would still have to change all the references to this variable that don't use get() and set().
b) Surely your get() and set() also return dw's? That means you need to change the signature of your get() and set(), which is going to cause a ripple in other peoples code anyway. This is a bad thing, not only because of the ripples, but because your change to the underlying implementation has caused a change in your interface (even if it is private). Not always avoidable in some cases, but you should minimize this sort of thing and 'dw' get() and set() helps to cause it.
c) Can you also be sure that friend classes you are not responsible for use your get() and set()? No, so you cannot guarantee that this wouldn't cause a serious ripple. (debate about friends now anticipated :-) ).
Changing what Get() returns and Set() accepts has nothing to do with Hungarian notation. If you don't use Hungarian notation and you change the type, it will still ripple through your program; using Hungarian notation or not does not affect this. In fact, having to rename the variable after a type change is the EASIEST part of the change. The ripples are what you have to worry about. This is one reason why you must really plan beforehand so your variable types don't change, whether you use Hungarian notation or not.
quote: Original post by freakchild
5) Re-usability. I mentioned above that I didn't believe the majority used hungarian. If you want other people to use your code, they aren't going to bother if you force hungarian onto them because your interface uses it. Sure, they can do a cast but then again they may not want to. I am sure, this is the reason why most publicly available code doesn't use hungarian.
Most people who do not use Hungarian notation follow no standard at all, meaning the interface for ANY library will most likely differ from the conventions of the person using it. So it wouldn't matter whether it is Hungarian notation or some other person's personal "standard".
This is also a good reason to use Hungarian notation: it's one of the most widely used "standards" out there. IMO, standards in coding are a GOOD THING (to a point, of course). One of the first things a group of people writing a program together does is create a standard naming convention. If everyone uses their own naming convention on the same program, it quickly becomes a huge mess.
And as for people not bothering to use code that has Hungarian notation... a lot of people use DirectX...
- Houdini
Edited by - Houdini on December 4, 2000 4:37:31 PM
Where I used to work, all the variables had standard names, e.g.
lives, count, level, and so on,
and these were a mixture of 8 and 16 bits.
You wouldn't believe the nightmare bugs this caused, because the compilers and linkers were too dumb to notice you extern'ing a variable as an int when it was declared as a char elsewhere.
Using Hungarian notation made this a hell of a lot easier, because people knew the exact variable type from the name.
Maybe we should generalize the problem and say that some kind of prefixing is a good idea. Hungarian notation is just a very good one, because almost all programmers understand it and it fits nicely into the very widely used Win32 and DX SDKs...
Tim
--------------------------
glvelocity.gamedev.net
www.gamedev.net/hosted/glvelocity
Gimme a good ol' i for an index variable over an iIndex or dwIndex or whatever letter you feel like for a prefix any day.
The one problem with everyone's argument is that Hungarian notation is neither the be-all and end-all of style nor an evil style made by the devil himself. Basically, follow whatever style you wish, but follow it throughout your code. I'd take code that follows a consistent style over code that uses about five different ones depending on the programmer's mood (sort of like mine) any day.
-----------------------------
A wise man once said "A person with half a clue is more dangerous than a person with or without one."
The Micro$haft BSOD T-Shirt
I think that Hungarian notation is a good thing, but I still don't use it. And about comments: I think comments make code overly hard to read at times; a single comment per function or complex piece of code is enough.
http://www.gdarchive.net/druidgames/
I like to comment a lot...
But I notice the following error that most heavy commenters make: they forget to comment WHAT a function does and WHAT it returns. If I want to understand code, or just use it, I need to know what the functions do and return before I can go deeper. Then I like to see what a variable does, what scope it has, and what type it is. m_pOctree, for example, tells me exactly that.
Tim
--------------------------
glvelocity.gamedev.net
www.gamedev.net/hosted/glvelocity
I use it almost exclusively unless I'm testing something out... but when code goes to the "final" version, I make sure it's notated properly. I don't like reading code that isn't notated.
(Note 1: Seems a lot of people who don't use Hungarian notation also like to insult it...)
(Note 2: Odd that Microsoft meandered from strict H.N. in previous DirectX releases to half-and-half in DirectX 8, isn't it?)
MatrixCubed
http://www.MatrixCubed.org
[ Odyssey Project ]