We just stumbled on a bug in the Android version of our app that is caused by a compiler difference between iOS and Android. We are curious if someone can explain the difference and how to make the Android native C compiler behave like the iOS one. Here is a simplified version of the problem.
double zone = -7 / 24.0;
char command[80] = { '\0' };
// other stuff
command[7] = zone * 24.0;
Ignore the possible roundoff errors in the above. This is a simplified example.
On iOS, command[7] gets the value -7 put into it. This is what I would expect, since there should be an implicit conversion from double to char (or int) when doing the assignment.
On Android, using the native C compiler, we get 0 put into command[7]. If we explicitly cast it like
command[7] = (int)(zone * 24.0);
Then we get the same result as on iOS.
Does anyone know why the two compilers might be generating different code? Is there a compiler flag on Android to make the compiler behave as it does on iOS? The app has been extensively tested on iOS, and we are a bit leery about this problem.
In your Android C implementation, char is unsigned, and the conversion from -7 in double to char is producing zero. (The behavior of this conversion when the value cannot be represented in the destination type is not defined by the C standard.)

In iOS, char is signed, and the conversion of -7 in double to char produces -7.
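A minimal way to confirm which case a given toolchain is in (a sketch, not part of the original code) is to check CHAR_MIN from <limits.h>:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    double zone = -7 / 24.0;
    char c = zone * 24.0;   /* double -> plain char, as in the question */

    /* CHAR_MIN is 0 when plain char is unsigned, negative when it is signed. */
    printf("char is %s\n", CHAR_MIN == 0 ? "unsigned" : "signed");
    printf("c = %d\n", (int) c);   /* -7 where char is signed; not reliable where it is unsigned */
    return 0;
}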
The compiler may have an -fsigned-char switch that will make char signed, or you could change char command[80]… to signed char command[80]….
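As a rough sketch of the source-level change (assuming the rest of the code stays as in the question):

signed char command[80] = { '\0' };
/* ... */
command[7] = zone * 24.0;   /* double -> signed char: -7 is representable, so this matches iOS */

Note that signed char is a distinct type from plain char, so passing the buffer to string functions that expect char * will then require casts. The -fsigned-char switch avoids that, but it changes the signedness of plain char for the whole build (with ndk-build, something like LOCAL_CFLAGS += -fsigned-char in Android.mk would be the place to add it).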
This explains why command[7] = (int)(zone * 24.0); gets the desired results: because the value is converted to int before being assigned to command[7], the -7 in double is converted to -7 in int. Then this int is converted to an unsigned char. That conversion is defined by the C standard, and the result is CHAR_MAX+1-7 (249 in typical C implementations with an unsigned 8-bit char). That is not the same result on iOS (-7) and Android (249), but the two values are represented with the same bits. Presumably, they act the same in whatever further use you put them to.
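A small standalone sketch of that last point, assuming the usual 8-bit char:

#include <stdio.h>

int main(void)
{
    int value = -7;

    unsigned char u = value;   /* defined by the standard: wraps modulo UCHAR_MAX+1, giving 249 */
    signed char   s = value;   /* -7 is representable, so it is stored unchanged */

    printf("u = %d, s = %d\n", u, s);   /* prints 249 and -7 */
    printf("same bits: %s\n", (unsigned char) s == u ? "yes" : "no");
    return 0;
}

On either platform this prints 249 and -7 and reports that the two stored bit patterns are identical, which is why the explicitly cast version behaves the same in practice.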