I have a program that grabs a number from user input and adds the appropriate ordinal ending (1st, 11th, 312th, etc.). So I gave it to my friend to find bugs / break it. The code is:
int number;
printf("\n\nNumber to end?: ");
scanf("%d",&number);
getchar();
numberEnder(number);
When 098856854345712355 was input, scanf passed 115573475 to the rest of the program. Why? 115573475 is well below the maximum value an int can hold.
098856854345712355 is too big a number to fit in an int. Use a long long and the matching %lld conversion specifier instead:
long long number;
printf("\n\nNumber to end?: ");
scanf("%lld",&number);
getchar();
numberEnder(number);
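If you also want to reject numbers that don't fit, rather than silently getting whatever scanf leaves behind, one option (a sketch, assuming you can read a whole line first; not part of the original fix) is strtoll, which reports overflow through errno:

#include <errno.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    char buf[64];                       /* plenty for any integer */
    printf("\n\nNumber to end?: ");
    if (fgets(buf, sizeof buf, stdin) == NULL)
        return 1;

    errno = 0;
    char *end;
    long long number = strtoll(buf, &end, 10);
    if (end == buf) {                   /* no digits were consumed */
        fprintf(stderr, "not a number\n");
        return 1;
    }
    if (errno == ERANGE) {              /* value outside long long range */
        fprintf(stderr, "number out of range\n");
        return 1;
    }
    /* numberEnder(number);  -- your function from the question */
    printf("%lld\n", number);
    return 0;
}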
The answer to why you get this specific garbage value is GIGO: garbage in, garbage out. The C standard doesn't define the result when scanf's converted value can't be represented in the target object (the behavior is undefined), so what you see is implementation-specific. I would hazard a guess that if you apply a 32-bit mask to the input, that's the number you would get.
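For what it's worth, the guess checks out for this input: 98856854345712355 mod 2^32 is exactly 115573475. A minimal sketch that verifies the mask (my code, not from the question):

#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* The typed value, minus the leading zero (scanf's %d parses
       decimal, so the zero is insignificant). */
    uint64_t typed = 98856854345712355ULL;

    /* Keep only the low 32 bits -- the "32-bit mask" guess. */
    uint32_t low = (uint32_t)(typed & 0xFFFFFFFFu);

    printf("%u\n", low);  /* prints 115573475 */
    return 0;
}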
Just out of curiosity, I looked up an implementation of scanf.
It boils down to:
…
res = strtoq(buf, (char **)NULL, base);
…
*va_arg(ap, long *) = res;
…
Which truncates off the high bits when the 64-bit result of strtoq is narrowed through the long * (on platforms where long is 32 bits). That supports the bit-mask guess, but only for this implementation.
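Assuming a two's-complement target where long is 32 bits wide (an assumption on my part; the strtoq call suggests an older BSD libc, where that holds on i386), the same narrowing is easy to reproduce:

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    /* Parse into a wide integer, as the quoted implementation does
       (strtoll standing in for the BSD strtoq). */
    long long res = strtoll("098856854345712355", NULL, 10);

    /* Narrowing to 32 bits mirrors the store through the long *
       when long is 32 bits; the result of the conversion is
       implementation-defined, but modular on common targets. */
    int stored = (int)res;

    printf("%d\n", stored);  /* prints 115573475 on such targets */
    return 0;
}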