Tags: language-agnostic, byte, nibble

Usage of Nibbles in Programming


Simple question, as the title says. I saw some questions on Stack Overflow (and elsewhere on the internet) about use cases for nibbles, but I don't understand why we need to use them. A byte is the smallest addressable unit of memory in computing, so performing operations on a byte just to manipulate half of it does not seem efficient. For example, there is an article on GeeksForGeeks about swapping the nibbles in a byte. So if there is a need for such a thing as a nibble, why is it not defined as a data type (like byte or int or any other) in any non-archaic programming language?

I know the history of bits, nibbles, and bytes. I have also read the Wikipedia articles and googled a lot of things to find an answer to my question, but I was not able to. Maybe this will be marked as opinion-based and closed, but I just want to have some discussion about this topic, and I don't know another good place to ask the same question, so please be kind.


Solution

  • “Those who don't know history are doomed to repeat it.” (attributed to Edmund Burke)

    In the old days, when memory was scarce (20K was considered a lot) and terminals were more like typewriters (think keypunch), BCD (Binary-Coded Decimal) was widely used. Even the Wikipedia article on the nibble notes BCD.

    Each nibble represented a decimal digit, 0-9, so a byte could hold two decimal digits. When these bytes were printed in code dumps on green-bar paper after your program crashed, you could look at the hex dump portion and read the values as straight-up decimal digits; there was no need to convert the bits to decimal in your head.
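
    As a minimal sketch (in C, chosen here only for illustration since the question is language-agnostic), here is how two decimal digits pack into one byte of BCD, and why the hex dump then reads as plain decimal:

    ```c
    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Pack the decimal digits 4 and 2 into one byte of BCD:
           high nibble = 4, low nibble = 2. */
        uint8_t bcd = (4 << 4) | 2;   /* bit pattern 0100 0010 */

        /* In a hex dump this byte prints as "42" -- the decimal
           digits can be read straight off, no conversion needed. */
        printf("%02X\n", bcd);
        return 0;
    }
    ```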

    So does that mean all code was written using BCD? No. If you had a problem with some values, you would just add a few lines to convert those values to BCD (think debug statements) and then read the BCD in the dump instead of doing all of the conversions by hand. You could try to do all of the work in BCD, but as with many types, sometimes not all of the operations are available, so you need to change types; likewise, some things you could do with integers you could not do with BCD. Make sense?
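
    A hedged sketch of that kind of debug conversion, assuming values in 0-99 (the helper names to_bcd and from_bcd are my own for illustration, not from the answer):

    ```c
    #include <stdint.h>

    /* Convert an ordinary integer 0-99 into a packed-BCD byte,
       as a debug statement might before dumping it. */
    uint8_t to_bcd(uint8_t value) {
        return (uint8_t)(((value / 10) << 4) | (value % 10));
    }

    /* Convert back, since packed BCD can't take part in
       ordinary integer arithmetic directly. */
    uint8_t from_bcd(uint8_t bcd) {
        return (uint8_t)((bcd >> 4) * 10 + (bcd & 0x0F));
    }
    ```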


    Added question from comment

    Is there any need for nibbles in the modern programming world, and if yes, why?

    I would not expect to see nibbles used in code written with modern high-level language compilers, but with microcontrollers I would not be surprised to see nibbles being used (see the sketch below).

    Basically, unless a nibble is hard-coded as part of a processor's instruction set, I would not expect to see one being used.
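
    For completeness, a small sketch of the kind of nibble manipulation that does show up in low-level and microcontroller code; this is the classic nibble swap the GeeksForGeeks article in the question describes, not anything specific to a given chip:

    ```c
    #include <stdint.h>

    /* Extract the two halves of a byte. */
    uint8_t high_nibble(uint8_t b) { return (uint8_t)(b >> 4); }
    uint8_t low_nibble(uint8_t b)  { return (uint8_t)(b & 0x0F); }

    /* Swap the nibbles of a byte: 0xAB becomes 0xBA.
       The shifts happen in int width; the final cast
       truncates back to the low 8 bits. */
    uint8_t swap_nibbles(uint8_t b) {
        return (uint8_t)((b << 4) | (b >> 4));
    }
    ```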