Tags: python, encryption, ascii

Why is a whitespace character only represented by 6 bits in ASCII?


I have written code in Python to represent strings using their ASCII counterparts. I noticed that every character is replaced by 7 bits (as I expected). The problem is that every time I include a space in the string I am converting, it is represented by only 6 bits instead of 7. This is a problem for a Vernam cipher program I am writing, where my ASCII bit string always ends up a few bits shorter than my key because of the spaces. Here is the code and output below:

string = 'Hello t'
ASCII = ""
for c in string:
    ASCII += bin(ord(c)) 
ASCII = ASCII.replace('0b', ' ')

print(ASCII)

Output: 1001000 1100101 1101100 1101100 1101111 100000 1110100

As can be seen in the output, the 6th group of bits, which represents the space character, has only 6 bits and not 7 like the rest of the characters.


Solution

  • Instead of bin(ord(c)), which drops leading zeros (a space is 0x20 = 0b100000, so only 6 significant bits remain), use string formatting to pad each character to a fixed width of 7 bits:

    f'{ord(c):07b}'
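
    Applied to the original loop, a minimal sketch (variable names here are illustrative, not from the original post) could look like this:

    string = 'Hello t'
    ascii_bits = ""
    for c in string:
        # :07b formats the code point in binary, zero-padded to 7 digits
        ascii_bits += f'{ord(c):07b} '

    print(ascii_bits.strip())
    # 1001000 1100101 1101100 1101100 1101111 0100000 1110100

    With the fixed width, the space character becomes 0100000 (7 bits), so the total bit string always has exactly 7 bits per character and stays aligned with the key.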