Hi everyone, my question, as dumb as it may seem, is the following:
According to man crypt(),
the salt is a two-character string chosen from the set [a-zA-Z0-9./]
and it is 12 bits.
How is that? Since it's a two-character string, it should be 16 bits, because a char is one byte, right?
If you count the total number of characters in the set, you'll see there are 64 (2^6) elements: 26 Latin uppercase letters, 26 lowercase letters, 10 digits, plus 2 extra characters: period and slash. So each salt character only needs 6 bits, and two 6-bit characters give 12 bits.
Evidently, crypt() must be mapping salt characters to bit sequences using something other than their ASCII codes (for user convenience, I suspect).
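To make the counting argument concrete, here is a minimal C sketch that maps each salt character to a 6-bit value and packs two of them into one 12-bit number. The alphabet ordering below ("./", digits, uppercase, lowercase) is the one conventionally used by DES-based crypt() implementations, but treat it as an illustration of the 64-characters-to-6-bits idea, not as the library's guaranteed internal table; the salt "Az" is just an arbitrary example.

```c
#include <stdio.h>
#include <string.h>

/* The 64-character salt alphabet. The ordering is the conventional one
 * for DES-based crypt(); the point here is only that 64 distinct
 * characters need just 6 bits each. */
static const char salt_chars[] =
    "./0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz";

/* Map one salt character to its 6-bit value (0..63), or -1 if invalid. */
static int salt_value(char c)
{
    const char *p = strchr(salt_chars, c);
    return p ? (int)(p - salt_chars) : -1;
}

int main(void)
{
    const char salt[] = "Az";   /* an arbitrary two-character salt */

    int hi = salt_value(salt[0]);
    int lo = salt_value(salt[1]);

    /* Pack the two 6-bit values into a single 12-bit number. */
    unsigned int bits12 = ((unsigned int)hi << 6) | (unsigned int)lo;

    printf("'%c' -> %2d, '%c' -> %2d, combined 12-bit value: %u (max %u)\n",
           salt[0], hi, salt[1], lo, bits12, (1u << 12) - 1);
    return 0;
}
```

Running it prints a combined value in the range 0..4095, i.e. exactly 12 bits of salt, even though the two characters occupy 16 bits of storage as a C string.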