
How many of you use Hungarian notation?

Started by November 30, 2000 11:55 PM
81 comments, last by vbisme 24 years, 1 month ago
quote: Original post by freakchild

1) Some argue that hungarian notation makes code more readable. It's been said in this thread. Of course, it is only more readable if you are actually used to it and you like it. If you aren't familiar with it then it looks like a complete nightmare. I believe the majority don't use hungarian notation (I have no proof, but compare how much code you see with it and how much without), and as such I believe code with hungarian notation is unreadable to most people. People generally understand the basic m_, g_, p and a few other prefixes, and those don't do any harm.
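To make the contrast concrete, here is a minimal sketch of full hungarian notation next to the lighter m_/g_/p style; the variable names are invented for illustration, and the typedefs are stand-ins mimicking the Windows headers:

    // Stand-ins for the usual Windows typedefs, so the sketch compiles alone.
    typedef unsigned long DWORD;
    typedef char*         LPSTR;
    typedef int           BOOL;

    // Full hungarian notation: the type is encoded in every name.
    DWORD dwFrameCount;      // dw   = DWORD
    LPSTR lpszTitle;         // lpsz = long pointer to a zero-terminated string
    BOOL  bDone;             // b    = BOOL

    // The lighter convention mentioned above: scope/category prefixes only.
    int g_screenWidth;       // g_ = global
    class Renderer
    {
        int   m_frameCount;  // m_ = class member
        char* pBuffer;       // p  = pointer
    };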


quote: Original post by Houdini

Completely subjective, as you pointed out, and cannot be argued for or against.




Sorry, but I think my (subjective) belief that most people don't and won't use hungarian notation is an argument against. I am subjective because I do not have facts, but I do believe that my opinions about the majority not wanting to use it are correct.


quote: Original post by freakchild

2) Things like 'dw' can still be quite ambiguous. What is a 'dw'? Well, we all know what it should be at the higher level we are working on, but the underlying machine can be quite different. I expect people to question what I mean by that. The answer is that people used to think quite differently about what a WORD meant, and it was different on each machine. The meaning (or common use) has changed since then, and although we now commonly treat these things as standard sizes, can you be sure that 'dw' is the same on each machine? It commonly is, but you can't be 100% sure.

quote: Original post by Houdini

As far as I know, a WORD is universally 2 bytes, just as a byte is 8 bits, a nibble is 4 bits, etc. But on the chance that I'm wrong, you can just use what DWORD is typedef'd as: unsigned long. ul would be the prefix.




I understand what you say, but I'm not sure that you got the gist of what I was saying. People did previously think of WORDs as being related to what the machine could handle efficiently, nothing to do with 16 bits in particular. Some people even used to compare the performance of computers by their word size.



quote: Original post by freakchild

3) Anyone who uses 'dw' as a prefix for things like numCattle might wish to continue to fool themselves that this is a good idea. This should be an int (unsigned in some cases), not a DWORD. That way you avoid (2), and by using the int you guarantee that your code uses the fast-path type for the machine it runs on (it's a portability and optimization thing). If you want a type of a definite size (which an int isn't) then make sure (2) doesn't catch you out… a good way is to define your own sized types (uint32 or u32, etc.), as sketched below. Things like numCattle rarely need a definite size… ask yourself whether you really need a sized type.
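A minimal sketch of that suggestion, assuming a compiler where unsigned int is 32 bits and unsigned short is 16 (the uint32/uint16 names follow the post; today <cstdint> offers the same guarantees):

    // Project-defined sized types - adjust the underlying types per platform.
    typedef unsigned int   uint32;   // assumed 32 bits on this compiler
    typedef unsigned short uint16;   // assumed 16 bits on this compiler

    // Compile-time size checks (C++98-friendly): a negative array size
    // makes the build fail if the assumption is wrong.
    typedef char uint32_is_4_bytes[(sizeof(uint32) == 4) ? 1 : -1];
    typedef char uint16_is_2_bytes[(sizeof(uint16) == 2) ? 1 : -1];

    // Where no fixed size is needed, plain int stays on the fast path.
    int numCattle = 0;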


quote: Original post by Houdini

The first argument has nothing to do with Hungarian Notation. You are probably right, they should name it iNumCattle. If they DID want 4 bytes only, you suggest using your own defined types. This is exactly what a DWORD is: typedef unsigned long DWORD;



I don't think that what I wrote about this was unreasonable, and I understand that you think DWORD is 4 bytes. Of course, without realising that a WORD wasn't always 2 bytes, you would. It has everything to do with hungarian notation, or at least the misuse of it. My point is, DWORD is not universally 4 bytes, just commonly and in modern use. If what you want is a guaranteed 4-byte type, defining it as DWORD won't cut it in all circles. Hence I suggest defining your own, which does guarantee this.

quote: Original post by freakchild

4) Someone mentioned that you might want to change a 'dw' to something else at a later date. Someone replied that if you use get() and set() then this doesn't ripple any changes through.

a) get() and set() aren't going to stop you having to change 'dw'. You personally would still have to change all the references to this variable that don't use get() and set().
b) Surely your get() and set() also return dw's? That means you need to change the signature of your get() and set(), which is going to cause a ripple in other people's code anyway (see the sketch after this list). This is a bad thing, not only because of the ripples, but because your change to the underlying implementation has caused a change in your interface (even if it is private). Not always avoidable in some cases, but you should minimize this sort of thing, and 'dw' get() and set() helps to cause it.
c) Can you also be sure that friend classes you are not responsible for use your get() and set()? No. You cannot guarantee it, and this could cause a serious ripple. (debate about friends now anticipated :-) ).
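A minimal sketch of the ripple in (b), using an invented Herd class and a stand-in for the Windows DWORD typedef:

    typedef unsigned long DWORD;  // stand-in for the Windows typedef

    class Herd
    {
    public:
        DWORD GetCount() const      { return m_dwCount; } // 'dw' leaks into the interface
        void  SetCount(DWORD dwNew) { m_dwCount = dwNew; }
    private:
        DWORD m_dwCount;
    };

    // If m_dwCount later becomes a plain int, both accessor signatures must
    // change with it, so the implementation change ripples into every caller
    // that holds the result in a DWORD - and into any friend class that
    // touches m_dwCount directly, bypassing Get()/Set() altogether.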




quote: Original post by Houdini

Changing what Get() returns and Set() accepts has nothing to do with Hungarian Notation. If you don't use Hungarian Notation and you change the type then it will still ripple throughout your program. Using or not using Hungarian Notation does not affect this. In fact, having to change your variable name after a type change is the EASIEST part of the change. The "ripples" are what you have to worry about. This is one reason why you must really plan beforehand so your variable types don't change, whether you use Hungarian Notation or not.



I didn't say it had anything to do with hungarian notation. I said that someone pointed out one thing, and then someone pointed out another that is related to a common situation (problem) when using hungarian notation. I would have pointed out that this is probably because they didn't plan properly in the first place, but that is debatable - there are plenty of other reasons, and as such it's not worth starting a war over.

quote: Original post by freakchild

5) Re-usability. I mentioned above that I didn't believe the majority used hungarian. If you want other people to use your code, they aren't going to bother if you force hungarian onto them because your interface uses it. Sure, they can cast, but then again they may not want to. I am sure this is the reason why most publicly available code doesn't use hungarian.




quote: Original post by Houdini

Most people who do not use Hungarian Notation follow no standard at all. Meaning the interface for ANY library will most likely be different from that of the person who uses it. So it wouldn't matter if it is Hungarian Notation or some other person's personal "standard".



Your first sentence there… is that subjective? Or at least the same 'subjective' you accuse my comments of being? I think it is a bit of a sweeping generalization.

You do miss my point about the rest of the matter though… most people hate hungarian notation (subjective POV) and that's why it isn't used much in publicly available things; it's not going to make people rush out and use it.

The 'interface for ANY library' being different… that's obviously true, but the point is that by avoiding hungarian they make the library more appealing to the masses. Of course, only the ANSI standard is the lowest common denominator (or it should be).

quote: Original post by Houdini

This is also a good reason to use Hungarian Notation. It's one of the most widely used "standards" out there. IMO, standards in coding are a GOOD THING (to a point, of course). One of the first things a group of people trying to write a program do is create a standard naming convention. If everyone uses their own naming convention on the same program it quickly becomes a huge mess.



Given what I said about WORD I don't see that you are getting my point. The definition of WORD or DWORD can only 'probably' be held as a standard, so it isn't necessarily a good thing to define it and rely on it being a certain size. I agree that a standard is a good thing, but don't wish to go into it (perhaps another topic).

quote: Original post by Houdini

And as for people not bothering to use code that has Hungarian Notation… a lot of people use DirectX…



People don't bother with things they don't like… period. Although to avoid that sort of smart-arsed comment I should have added 'unless they have to', and I think this is that case. You see, not a lot of people have a choice about using DirectX, what with it being a commercial standard that many publishers require you to support.

I am quite sure that if people could use a version of DirectX without hungarian notation then they would choose that version (is this the subjective majority again?).

I don't know anyone who uses DirectX and adopts the exact same notation structure because of it. I don't see a horde of game developers shouting about it and having a party because DirectX has enlightened them. More to the point, most people who develop games wrap around DirectX, and _part_ of the reason behind that is that they don't like hungarian notation.


Edited by - freakchild on December 4, 2000 6:55:36 PM
quote: Original post by freakchild

Sorry, but I think my (subjective) belief that most people don't and won't use hungarian notation is an argument against. I am subjective because I do not have facts, but I do believe that my opinions about the majority not wanting to use it are correct.



No, your subjective belief is when you said that HN looks like a nightmare to people who aren't used to it. That's like saying something looks "big" to you. What is big to you is not necessarily "big" to someone else. Personally I thought HN made things easier to understand, which is why I, and I'm assuming most of the others who also switched, switched.

quote: Original post by freakchild

I understand what you say, but I'm not sure that you got the gist of what I was saying. People did previously think of WORDs as being related to what the machine could handle efficiently, nothing to do with 16 bits in particular. Some people even used to compare the performance of computers by their word size.



You are right, I'm not getting the gist =). I understand what you are saying about WORD and DWORD (assuming it is correct) but I don't see what it has to do with HN. DWORD and WORD are used by people who both use and don't use HN. So, HN aside, I agree with you.

quote: Original post by freakchild

I don't think that what I wrote about this was unreasonable, and I understand that you think DWORD is 4 bytes. Of course, without realising that a WORD wasn't always 2 bytes, you would. It has everything to do with hungarian notation, or at least the misuse of it. My point is, DWORD is not universally 4 bytes, just commonly and in modern use. If what you want is a guaranteed 4-byte type, defining it as DWORD won't cut it in all circles. Hence I suggest defining your own, which does guarantee this.



I still don't understand this. It's almost like you are saying that people who use DWORD automatically use HN, and vice versa. If WORD isn't necessarily 2 bytes then I agree it's a bad thing to use when you mean 2 bytes, but it really has nothing to do with HN. Most of the time HN docs don't even mention DWORD and WORD, just the normal types in C/C++. These two issues are totally separate.

quote: Original post by freakchild

I didn't say it had anything to do with hungarian notation. I said that someone pointed out one thing, and then someone pointed out another that is related to a common situation (problem) when using hungarian notation. I would have pointed out that this is probably because they didn't plan properly in the first place, but that is debatable - there are plenty of other reasons, and as such it's not worth starting a war over.



But it isn't a "common situation (problem) with using hungarian notation". It's a "common situation (problem) whether you use HN or not". HN has nothing to do with it.

quote: Original post by freakchild

Your first sentence there… is that subjective? Or at least the same 'subjective' you accuse my comments of being? I think it is a bit of a sweeping generalization.

You do miss my point about the rest of the matter though… most people hate hungarian notation (subjective POV) and that's why it isn't used much in publicly available things; it's not going to make people rush out and use it.

The 'interface for ANY library' being different… that's obviously true, but the point is that by avoiding hungarian they make the library more appealing to the masses. Of course, only the ANSI standard is the lowest common denominator (or it should be).



No, I was generalizing, as you stated in your second sentence. As for your point… just because most people "hate HN" doesn't mean that HN is a bad thing. Most kids hate doing their homework, but it is still a good thing to do. I'm not saying that you are saying HN is a bad thing, so don't get me wrong. The problem is that nobody is taught proper naming conventions when they learn programming. Of course people are going to be hesitant to use HN when they've been off on their own using whatever technique they like. It's not until people are forced into using it at, say, a company, that the usefulness becomes apparent. Once again this is a broad generalization, but it is based on what I've seen personally and what people have stated on this board.


quote: Original post by freakchild

Given what I said about WORD I don't see that you are getting my point. The definition of WORD or DWORD can only 'probably' be held as a standard, so it isn't necessarily a good thing to define it and rely on it being a certain size. I agree that a standard is a good thing, but don't wish to go into it (perhaps another topic).

quote: Original post by Houdini

And as for people not bothering to use code that has Hungarian Notation… a lot of people use DirectX…



People don't bother with things they don't like… period. Although to avoid that sort of smart-arsed comment I should have added 'unless they have to', and I think this is that case. You see, not a lot of people have a choice about using DirectX, what with it being a commercial standard that many publishers require you to support.

I am quite sure that if people could use a version of DirectX without hungarian notation then they would choose that version (is this the subjective majority again?).

I don't know anyone who uses DirectX and adopts the exact same notation structure because of it. I don't see a horde of game developers shouting about it and having a party because DirectX has enlightened them. More to the point, most people who develop games wrap around DirectX, and _part_ of the reason behind that is that they don't like hungarian notation.



For the people that don't like HN, they don't have to use it. I, and a lot of others, have personally found it very useful and will go on using it. To me it's the same as those people who hate comments and think they clutter code versus the others who use them and believe they help people. (NOTE: I'm not saying which group the HN users are and which the non-HN users are, I'm just making a comparison). Those who hate HN won't be swayed by what we say, and those who love HN won't be swayed by what others say.


- Houdini
Let me ask you, if you are assigned to work on an existing program which is rather huge and you find it doesn't use HN or any other similar notation (just plain names like 'samples' or 'gettwo'), would you stick with this convention when you add code to the program, or would your new code adhere to, e.g., Hungarian Notation? In the first case you remain consistent with the rest of the code, but you are further contributing to the eventual unreadability of the program. In the other case your own code may be easier to read, but there will be two different styles within the same program.
quote: Original post by spunge

Let me ask you, if you are assigned to work on an existing program which is rather huge and you find it doesn't use HN or any other similar notation (just plain names like 'samples' or 'gettwo'), would you stick with this convention when you add code to the program, or would your new code adhere to, e.g., Hungarian Notation? In the first case you remain consistent with the rest of the code, but you are further contributing to the eventual unreadability of the program. In the other case your own code may be easier to read, but there will be two different styles within the same program.


Most companies employ standards that programmers must follow when it comes to naming conventions and the like. In this case I would be forced to follow the standards laid down, whether I agree with them or not.

If the company didn't employ naming standards, or I was working on someone else's code for personal use at home, I would STILL stick to the convention previously used. Mixing naming conventions results in some very unreadable and confusing code and should be avoided at all costs.

If the program is small I may just actually do a "Find/Replace" and rename the functions/variables. This is assuming, of course, that the program is mine to do with as I please. I would never rename someone else's variables/functions and hand that code back to him/her.


- Houdini


Edited by - Houdini on December 5, 2000 12:51:31 PM
> Mixing naming conventions results in some very unreadable and
> confusing code and should be avoided at all costs.

I am of the same opinion. The program in question is written by a scientist (molecular biology) and is full of strange variable names and a ton of globals. Thank God for software analysis tools.
Okay Houdini, I can well appreciate your comments and think that what you replied to what I said is fair.

I think we got confused about the DWORD issue and why we were talking about it. You're right, DWORD isn't hungarian notation, it's something separate. I was addressing the problems pointed out with changing dw's to something else. Yes, this was a little off topic and not really relevant to the actual notation, or even a fault with it. I won't talk about it any more.

About the subjective issues though… There are probably many inter-related issues here, but HN is a nightmare to someone who doesn't read it; I think I state the obvious there. It just looks like a complete mess. In time they may feel differently after learning it, but even when it becomes more familiar, I think the feedback in this thread has suggested that it's quite unpopular. Whether they can read it or not, I think people just don't like to do it.

I accept that some people do like it and do think it's a good thing… but I think you have probably seen which side of the fence I sit on. Don't get me wrong about this though… I personally hate it*, but I've done things I hate just for the sake of others on many occasions. In this case, it's not something I worry about too much. I think that even if I did like HN I would not try to use it*, because I would just have to respect the fact that the other people coming onto my team probably don't want to use it.

* Actually, I did say this earlier on, but I do use the commonly used m_, g_, p type things and even b for bool. Rather, I avoid the complete extent of the HN system - which is quite exhaustive.

I guess with this I lie somewhere in the middle, in that I don't find it useful to read ddpfPixelFormat and all that sort of thing (which is of course not official HN, just inspired by it). I don't use dw or i; like I say, numCattle is just as descriptive to me. I think there are those like yourself who may use it to a more extreme extent, and then there are the others who use nothing - a class member or pointer just looks like any other variable. Thinking it through, I think your comment about these people not having any standards (the one I accused you of a sweeping generalization over) might actually be the case for them.
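For what it's worth, here is a minimal sketch of that middle ground (the class and variable names are invented for illustration):

    // Scope prefixes plus b for bool - and no type encoding beyond that.
    class Cattle;             // forward declaration for the pointer below
    bool g_bPaused;           // g_ marks a global, b marks a bool
    class Farm
    {
        int     m_numCattle;  // m_ marks a member; no 'dw' or 'i' prefix
        Cattle* m_pHerd;      // p marks a pointer
    };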

I'd rather have people learn to indent code than learn HN... I hate to waste days trying to figure out where the f****ing brackets open and close!

Just a sidenote which has not been mentioned yet:
Windows dynamic link libraries (plain dll's, that is) use the hungarian prefix notation… thought I'd add that too.

regards,
CJ
Personally, I don't use Hungarian notation, simply because I find it a bit ugly and a bit of a hassle. I'll probably switch to it someday though. I admit I haven't read all the posts in the thread, so forgive me if I'm just about to repeat what's already been said. However, the fact is that in a large project involving a large team you *must* have some sort of 'standard' naming convention - be it Hungarian or not. That is indisputable. Seeing as (usually) H.N does actually give a clear indication as to what sort of a variable you're dealing with, why not use it? You're unlikely to find a more concise and legible form of notation elsewhere, and by using H.N you give others who may someday read your code less of a headache trying to figure out what this or that variable is - especially if they're from Microsoft (although if they were, then I'd be worried). Besides, if everyone used H.N, the world would be such a better place, wouldn't it?

/me thinks I'm going to switch to H.N right now

Nick - Head Designer, Llamasoft.net

--
Visit our website...

Llamasoft.net
Games, goodies and ingenuity

