I wrote some simple code in Python, JavaScript and C. I found that the Python and JavaScript results are the same, but C gives me a different (wrong) result and I can't understand what the error is.
C code:
#include <stdio.h>

int fact(int n){
    if(n == 1){
        return 1;
    }else{
        return n * fact(n - 1);
    }
}

int main(void){
    printf("%i \n", fact(13));
    return 0;
}
JS code:
function fact(n){
    if (n == 1){
        return 1;
    }else{
        return n * fact(n - 1);
    }
}
console.log(fact(13));
Python code:
def fact(n):
    if n == 0:
        return 1
    else:
        return n * fact(n - 1)
print(fact(13))
Can you explain?
Python and JavaScript manage the size of numbers for you: Python's integers are arbitrary-precision, and JavaScript's numbers are 64-bit floats that can represent 13! exactly. In C, however, you told the compiler to use "int", and a 32-bit int is far too small to hold 13! = 6,227,020,800 (INT_MAX is 2,147,483,647), so the multiplication overflows. If you switch "int" to "unsigned long long int" (yes, "long" appears twice) and print the result with "%llu" instead of "%i", your program will return correct results for longer, until it fails again once the factorial exceeds 64 bits, which happens at 21!.
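A minimal sketch of that change (same recursion as your C code; only the type and the printf format specifier differ):

#include <stdio.h>

/* Same recursion, but with a 64-bit unsigned type.
   Correct up to fact(20); fact(21) overflows even this type. */
unsigned long long fact(unsigned long long n){
    if(n == 1){
        return 1;
    }else{
        return n * fact(n - 1);
    }
}

int main(void){
    printf("%llu \n", fact(13));   /* prints 6227020800 */
    return 0;
}

Note that "%llu" is required here; passing an unsigned long long to printf with "%i" is undefined behavior.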