
ASCII to binary calculation is flawed


#include <cs50.h>
#include <stdio.h>
#include <string.h>
#include <stdlib.h>

const int BYTE = 8;
void print_bulb(int bit);

int main(void)
{
    string message = get_string("message: ");
    int v = 0;
    for (int i = 0, len = strlen(message); i < len; i++)
    {
        int bits[BYTE];
        v = message[i];
        for (int j = 0; j < 8; j++)
        {
            bits[j] = v % 2;
            printf("%i\n" , bits[j]);
        }
    } 
}

void print_bulb(int bit)
{
    if (bit == 0)
    {
        // Dark emoji
        printf("\U000026AB");
    }
    else if (bit == 1)
    {
        // Light emoji
        printf("\U0001F7E1");
    }
}

My initial idea was to mod the ASCII values by 2 and store the results in an array, since mod 2 only outputs 1s and 0s, but I quickly realized that doesn't work at all. For the input HI the code outputs 0000000011111111 instead of the expected 01001000 01001001.

Would there be a better approach to this problem using mod, or is there something else I should be trying instead?
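To make the symptom concrete, here is a minimal sketch (not part of the original post) of what the inner loop effectively computes for 'H' (ASCII 72): because v is never changed inside the loop, v % 2 tests the same value on every pass, so the same digit comes out eight times.

#include <stdio.h>

int main(void)
{
    int v = 'H'; // ASCII 72
    for (int j = 0; j < 8; j++)
    {
        // v is never updated, so v % 2 is 72 % 2 == 0 on every iteration
        printf("%i", v % 2);
    }
    printf("\n"); // prints 00000000 rather than 01001000
}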


Solution

    1. Do not use magic numbers. limits.h provides a very handy macro definition called CHAR_BIT.
    2. Do all arithmetic on unsigned values.
    3. You do not need to call strlen.
    4. You forgot to divide by two (see the mod-and-divide sketch after the linked example below).
    5. Binary numbers are easier to read when printed starting from the most significant bit.
    #include <cs50.h>
    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        string message = get_string("message: ");
        while (*message)
        {
            int bits[CHAR_BIT];
            // Treat each character as an unsigned value before testing its bits
            unsigned char v = *message++;
            // Walk from the most significant bit down to bit 0
            for (int j = CHAR_BIT - 1; j >= 0; j--)
            {
                bits[j] = !!(v & (1 << j));
                printf("%i", bits[j]);
            }
            printf("\n");
        }
    }
    

    https://godbolt.org/z/oan56czGb
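
    For completeness, here is a minimal sketch of the mod-and-divide approach the question was aiming for (point 4 above): v % 2 yields the lowest bit, and dividing v by two moves the next bit into position for the following iteration. The array fills up from the least significant bit, so it is printed back to front. Identifiers mirror the original code, and the cs50.h helpers are assumed to be available.

    #include <cs50.h>
    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        string message = get_string("message: ");
        while (*message)
        {
            int bits[CHAR_BIT];
            unsigned char v = *message++;
            // Peel off one bit per iteration: the remainder gives the bit,
            // the division exposes the next one
            for (int j = 0; j < CHAR_BIT; j++)
            {
                bits[j] = v % 2;
                v /= 2;
            }
            // bits[0] holds the least significant bit, so print in reverse
            for (int j = CHAR_BIT - 1; j >= 0; j--)
            {
                printf("%i", bits[j]);
            }
            printf("\n");
        }
    }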