quote: Original post by freakchild
1) Some argue that Hungarian notation makes code more readable. It's been said in this thread. Of course, it is only more readable if you are actually used to it and you like it. If you aren't familiar with it then it looks like a complete nightmare. I believe the majority don't use Hungarian notation (don't bother to debate this, as I have no proof, but how much code do you see with it and how much without?), so to most people I believe code with Hungarian notation is unreadable. People generally understand the basic m_, g_ and p prefixes and a few others, so those do no harm.
quote: Original post by Houdini
Completely subjective, as you pointed out, and cannot be argued for or against.
Sorry, but I think my (subjective) belief that most people don't and won't use Hungarian notation is an argument against. I am being subjective because I don't have facts, but I do believe my opinion that the majority don't want to use it is correct.
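For anyone who hasn't run into them, the "basic" prefixes mentioned in point 1 look something like this. This is a minimal sketch with invented names, not anyone's official convention:

// Illustrative only: invented names showing the commonly tolerated prefixes.
int g_frameCount = 0;                 // g_ : global variable

class Herd
{
public:
    void SetSize(int size) { m_size = size; }
    int  GetSize() const   { return m_size; }

private:
    int  m_size = 0;                  // m_ : member variable
    int* m_pScratch = nullptr;        // p  : pointer (here combined with m_)
};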
quote: Original post by freakchild
2) Things like 'dw' can still be quite ambiguous. What is a 'dw'? We all know what it should be at the higher level we are working on, but the underlying machine can be quite different. I expect people to question what I mean by that. The answer is that people used to think quite differently from how we do today about what a WORD meant, and it was different on each machine. The meaning (or common use) of it has changed since then, and although we do commonly think of these things as standard sizes, can you be sure that 'dw' is the same on every machine? Although it commonly is... you can't be 100% sure.
quote: Original post by Houdini
As far as I know, a WORD is universally 2 bytes, like a byte is 8 bits, a nibble is 4 bits, etc. But on the chance that I'm wrong, you can just use what DWORD is typedef'd as: unsigned long. 'ul' would be the prefix.
I understand what you say, but I'm not sure you got the gist of what I was saying. People previously thought of a WORD as whatever the machine could handle efficiently, nothing to do with it being 16 bits. Some people even used to compare the performance of computers by their word size.
quote: Original post by freakchild
3) Anyone who uses 'dw' as a prefix for things like numCattle might wish to continue fooling themselves that this is a good idea. This should be an int (unsigned in some cases), not a DWORD. That way you avoid (2), and by using int you guarantee that your code uses the fast-path type of the machine it runs on (it's a portability and optimization thing). If you want a type of a definite size (which an int isn't), then make sure (2) doesn't catch you out... a good way is to define your own sized types (uint32 or u32, etc.). Things like numCattle rarely need a definite size... ask yourself whether you really need a sized type.
quote: Original post by Houdini
The first argument has nothing to do with Hungarian Notation. You are probably right, they should name it iNumCattle. If they DID want 4 bytes only, you suggest using your own defined types. This is exactly what a DWORD is: typedef unsigned long DWORD;
I don't think what I wrote about this was unreasonable, and I understand that you think DWORD is 4 bytes. Of course, without realising that a WORD wasn't always 2 bytes, you would. It has everything to do with Hungarian notation, or at least the misuse of it. My point is that DWORD is not universally 4 bytes, just commonly and in modern use. What you want to achieve by using it, defining something as 4 bytes, won't hold in all circles. Hence my suggestion to define your own type, which does guarantee it.
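To make that concrete, here is a minimal sketch of what "define your own type that guarantees it" can look like in C++. The alias name u32 is just an example, static_assert is a newer language feature than this thread, and modern code can simply use the fixed-width types from <cstdint> instead:

#include <climits>

// Hand-rolled sized type: pick whichever built-in is 32 bits on this platform...
typedef unsigned int u32;

// ...and fail the build if that assumption is ever wrong, rather than silently miscompiling.
static_assert(sizeof(u32) * CHAR_BIT == 32, "u32 must be exactly 32 bits");

// A count rarely needs a fixed width; plain int lets the compiler use the
// machine's natural, fast-path integer.
int numCattle = 0;

// Reserve sized types for places where the width really matters, e.g. a file-format field.
u32 fileChecksum = 0;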
quote: Original post by freakchild
4) Someone mentioned that you might want to change a 'dw' to something else at a later date. Someone replied that if you use get() and set() then this doesn't ripple any changes through.
a) get() and set() aren't going to stop you having to change 'dw'. You personally would still have to change all the references to this variable that don't use get() and set().
b) Surely your get() and set() also return and accept dw's? That means you need to change the signature of your get() and set(), which will cause a ripple in other people's code anyway. This is a bad thing, not only because of the ripples, but because your change to the underlying implementation has caused a change in your interface (even if it is private). Not always avoidable, but you should minimize this sort of thing, and putting 'dw' in your get() and set() helps cause it.
c) Can you also be sure that friend classes you are not responsible for use your get() and set()? No, you can't guarantee it, and that could cause a serious ripple. (Debate about friends now anticipated :-) ).
quote: Original post by Houdini
Changing what Get() returns and Set() accepts has nothing to do with Hungarian Notation. If you don't use Hungarian Notation and you change the type, it will still ripple throughout your program. Using or not using Hungarian Notation does not affect this. In fact, having to change your variable name after a type change is the EASIEST part of the change. The ripples are what you have to worry about. This is one reason why you must really plan beforehand so your variable types don't change, whether you use Hungarian Notation or not.
I didn't say it had anything to do with Hungarian notation. I said that someone pointed out one thing, and then someone else pointed out another that relates to a common problem when using Hungarian notation. I would have pointed out that this is probably because they didn't plan properly in the first place, but that is debatable - there are plenty of other reasons, so it's not worth starting a war over.
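For what it's worth, one way to limit that ripple, with or without Hungarian prefixes, is to expose a type alias from the class so callers of Get()/Set() never bake 'dw' or unsigned long into their own code. A hypothetical sketch with invented names:

class Herd
{
public:
    // Callers write Herd::Count, not DWORD or unsigned long, so changing the
    // underlying type later does not change the names in their code.
    typedef unsigned long Count;

    Count GetSize() const       { return m_size; }
    void  SetSize(Count size)   { m_size = size; }

private:
    Count m_size = 0;   // no 'dw' prefix to rename if the underlying type changes
};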
quote: Original post by freakchild
5) Re-usability. I mentioned above that I didn't believe the majority use Hungarian. If you want other people to use your code, they aren't going to bother if you force Hungarian onto them because your interface uses it. Sure, they can do a cast, but then again they may not want to. I am sure this is the reason most publicly available code doesn't use Hungarian.
quote: Original post by Houdini
Most people who do not use Hungarian Notation follow no standard at all, meaning the interface for ANY library will most likely differ from the conventions of the person who uses it. So it wouldn't matter whether it is Hungarian Notation or some other person's personal "standard".
Your first sentence there... is that subjective? Or at least subjective in the same way you accuse my comments of being? I think it is a bit of a sweeping generalization.
You do miss my point about the rest of the matter though... most people hate Hungarian notation (subjective POV) and that's why it isn't used much in publicly available things; it's not going to make people rush out and use it.
The 'interface for ANY library' being different... that's obviously true, but the point is that by avoiding Hungarian they make the library more appealing to the masses. Of course, only the ANSI standard is the lowest common denominator (or it should be).
quote: Original post by Houdini
This is also a good reason to use Hungarian Notation. It's one of the most widely used "standards" out there. IMO, standards in coding are a GOOD THING (to a point, of course). One of the first things a group of people trying to write a program does is create a standard naming convention. If everyone uses their own naming convention on the same program it quickly becomes a huge mess.
Given what I said about WORD, I don't see that you are getting my point. The definition of WORD or DWORD can only 'probably' be held as a standard, so it isn't necessarily a good thing to define it and rely on it being a particular size. I agree that a standard is a good thing, but I don't wish to go into that here (perhaps another topic).
quote: Original post by Houdini
And as for people not bothering to use code that has Hungarian Notation... a lot of people use DirectX...
People don't bother with things they don't like, period. Although to avoid that sort of smart-arsed comment I should have added 'unless they have to', and I think this is such a case. You see, not a lot of people have a choice about using DirectX, it being a commercial standard that many publishers require you to support.
I am quite sure that if people could use a version of DirectX without Hungarian notation then they would choose that version (is this the subjective majority again?).
I don't know anyone who uses DirectX and adopts the exact same notation because of it. I don't see a horde of game developers shouting about it and throwing a party because DirectX has enlightened them. More to the point, most people who develop games wrap DirectX, and _part_ of the reason for that is that they don't like Hungarian notation.
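To illustrate that last point, here is a hypothetical sketch of such a wrapper. The "third-party" interface below is invented for the example (it is not actual DirectX); the wrapper simply re-exposes it in the project's own naming style so the rest of the code never sees the dw-prefixed interface:

// Invented third-party interface, shown only to represent the style being wrapped.
struct ThirdPartyDevice
{
    void SetDisplayMode(unsigned long dwWidth, unsigned long dwHeight, unsigned long dwBpp)
    {
        // ... imagine the real work happening here ...
        (void)dwWidth; (void)dwHeight; (void)dwBpp;
    }
};

// Project-side wrapper: the rest of the codebase uses this, in its own convention.
class Display
{
public:
    explicit Display(ThirdPartyDevice& device) : m_device(device) {}

    void setMode(int width, int height, int bitsPerPixel)
    {
        m_device.SetDisplayMode(static_cast<unsigned long>(width),
                                static_cast<unsigned long>(height),
                                static_cast<unsigned long>(bitsPerPixel));
    }

private:
    ThirdPartyDevice& m_device;
};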