Tags: c, cs50, caesar-cipher

Why do I get negative values for my ciphertext?


This is a problem set from CS50x: a Caesar cipher. It needs to shift every alphabetic character (uppercase and lowercase) by a key supplied at run time, while preserving case, symbols, and numbers.
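
For example, a run with a key of 2 should look roughly like this (an illustrative session; the prompt wording comes from the code below):

$ ./caesar 2
plaintext: Hello, World!
ciphertext: Jgnnq, Yqtnf!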

I thought the code was correct, but I keep getting negative ASCII values for larger keys. On paper the math shouldn't produce negative values, so there's something I'm not understanding.

// Caesar cipher//
#include <cs50.h>
#include <stdio.h>
#include <string.h>
#include <stdlib.h>

//cipher key//
int K[];

int main(int argc, string argv[])
{

    //checking for only one command-line argument//
    if(argc == 2)
    {
        //checking if key is digit//
        for(int i = 0, n = strlen(argv[1]); i < n; i++)
        {
            K[i] = (argv[1][i] - '0');
            if((K[i] < 0) || (K[i] > 9))
            {
                printf("Usage: ./caesar key\n");
                return 1;
                break;
            }
        }

        string p = get_string("plaintext: ");
        printf("ciphertext: ");
        char c[strlen(p)];
        for(int i = 0, n = strlen(p); i < n; i++)
        {
            if(((p[i] > 64) && (p[i] < 91)) || ((p[i] > 96) && (p[i] < 123)))
            {
                int k = atoi(argv[1]);

                c[i] = (p[i]+(k % 26));

                if (c[i] > 122)
                {
                    c[i] = (c[i] % 122) + 96;
                }
                else if ((c[i] > 90) && (c[i] < 97))
                {
                    c[i] = (c[i] % 90) + 64;
                }
                else
                {
                    c[i] = c[i];
                }
            }
            else
            {
                c[i] = p[i];
            }
            printf("%c", c[i]);
        }

        printf("\n");
    }
    else
    {
        printf("Usage: ./caesar key\n");
        return 1;
    }
}

For example, a key of 100 and a plaintext of "z" should give me 118 (v): 'z' is 122 and 100 % 26 is 22, so the sum is 144; since 144 > 122, the wrap-around should give 144 % 122 + 96 = 118, which is 'v'. Instead I get -112, which doesn't exist in ASCII.


Solution

  • The error comes from the fact that char is signed in your C implementation, combined with this line:

    c[i] = (p[i]+(k % 26));
    

    When p[i] + k % 26 (no parentheses are needed) exceeds 127, the result does not fit in a char in your implementation, and it is converted to char in an implementation-defined way, most likely by wrapping modulo 256 (equivalently, keeping the low eight bits). Thus, with a key of 100 and the character 'z' with value 122, the result is 122 + 100 % 26 = 122 + 22 = 144, which gets stored in c[i] as 144 − 256 = −112.
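
    You can see the conversion in isolation with a small standalone program (a sketch; the exact result is implementation-defined, but on the common platforms where char is a signed 8-bit type it prints -112):

    #include <stdio.h>

    int main(void)
    {
        char c = 'z' + (100 % 26);   // 122 + 22 = 144, which does not fit in a signed 8-bit char
        printf("%d\n", c);           // typically prints -112 (144 - 256)
        return 0;
    }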

    This can be easily fixed by changing:

    char c[strlen(p)];
    

    to:

    unsigned char c[strlen(p)];
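
    With that change, the rest of your wrap-around logic behaves as you worked out on paper. A minimal check of the "z" with key 100 example, reusing the same arithmetic as your loop:

    #include <stdio.h>

    int main(void)
    {
        unsigned char c = 'z' + (100 % 26);   // 144 now fits without wrapping to a negative value
        if (c > 122)
        {
            c = (c % 122) + 96;               // 144 % 122 + 96 = 118
        }
        printf("%c\n", c);                    // prints v
        return 0;
    }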