I’ve been developing solutions with databases for more than 11 years now, and it seems I’ve “developed” a rather controversial opinion about naming columns in my tables: I always give them a 3- or 4-character type prefix, e.g. intGroupID, nvcTitle, dtmCreated, bitPlayerHater. I’ve worked with several other developers who all absolutely despised the old-school prefix convention.
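To make the convention concrete, here's roughly the kind of table definition I mean (the table name and exact types are just illustrative, in SQL Server flavor):

    CREATE TABLE Groups (
        intGroupID     int IDENTITY(1,1) PRIMARY KEY,        -- "int" prefix: integer key
        nvcTitle       nvarchar(100) NOT NULL,               -- "nvc": nvarchar
        dtmCreated     datetime NOT NULL DEFAULT GETDATE(),  -- "dtm": datetime
        bitPlayerHater bit NOT NULL DEFAULT 0                -- "bit": boolean flag
    );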
(yeah, I know, I didn’t invent anything here, I’m just refusing to give it up:)
My primary reasoning is to provide as much information as possible to my fellow developers when they attempt to understand the structure of the data. Knowing the type of the columns instantly gives you (or me, at least) a better mental image of what you’re dealing with. And you usually don’t have the same IntelliSense support from the IDE when you’re writing queries compared to working with C# or VB.NET.
So far nobody has been able to come up with the killer argument that could change my mind on this particular topic. I have a couple of other equally controversial naming conventions that increase clarity, but the column prefix seems to piss people off the most.
Why is prefixing database columns considered such a bad practice?
It's called "Hungarian Notation".
As a developer (and data architect), I find it worthless. It doesn't provide much information.
It only provides a quick, inaccurate gloss on part of the type information. It omits length, for example.
In more complex database environments, where the objects are BLOBs, it provides no information at all about the type of object in the blob.
It makes changing data type painful.
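To illustrate the pain (a hypothetical scenario in SQL Server syntax, reusing the Groups table from the question): when intGroupID outgrows int, the column either keeps a name that now lies about its type, or it has to be renamed and every query, view, and stored procedure that mentions it has to follow:

    -- Widen the type; the "int" prefix is now wrong
    -- (if the column is a primary key, the constraint must be dropped and recreated around this)
    ALTER TABLE Groups ALTER COLUMN intGroupID bigint NOT NULL;

    -- Rename to keep the prefix honest, then chase down every reference to the old name
    EXEC sp_rename 'Groups.intGroupID', 'bigGroupID', 'COLUMN';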
It's necessary to remember an obscure prefix. Is it vcName, strName, or uniName?
SQL handles type conversions automatically, making fussy type-specific naming largely irrelevant.
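For instance (SQL Server behavior, using the hypothetical Groups table again), comparing an int column to a string literal just works, so the prefix buys you no type safety in queries:

    -- '42' is implicitly converted to int before the comparison
    SELECT nvcTitle
    FROM Groups
    WHERE intGroupID = '42';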
Most Important: It provides no useful documentation on the meaning of the data. In my experience, people almost always corrupt the meaning. They're rarely (if ever) confused about whether it's int or string; and when they want to know, they simply describe the table using TOAD or some other tool that gives the ACTUAL type, not a partial summary of the intended type.
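For example, the actual types are one metadata query away; INFORMATION_SCHEMA is standard, so something like this works in SQL Server, MySQL, and PostgreSQL (the table name is the hypothetical one from the question):

    -- Shows the real type, length, and nullability rather than a guess encoded in the name
    SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = 'Groups';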
[Having said that it's approximately useless, I realize that this is probably not the "killer argument" you're looking for. It would help if you could update your question with the actual reasons, point-by-point, that you feel are the value of Hungarian Notation, so they can be addressed point by point.]