I'm messing around with the source code of an old Java game written in the late nineties. If I remember correctly, it's written for JDK 1.1.
Somewhere in the code, int primitives (in the range of 0 to about 120) are converted to chars. Here's an example:
char c = (char)(i+32);
This causes a problem for ints greater than 95. Here's the code and some of the output from a test case:
for (int i = 120; i >= 0; i--)
    System.out.println(i + " -> " + (char) (i + 32));
Output:
...
100 -> ?
99 -> ?
98 -> ?
97 -> ?
96 -> ?
95 ->
94 -> ~
93 -> }
92 -> |
91 -> {
90 -> z
89 -> y
88 -> x
87 -> w
...
3 -> #
2 -> "
1 -> !
0 ->
The integer value seems to be lost once the index goes past the range of printable ASCII characters.
This seems to be the root cause of a bug in the client-side portion of the game's UI. The encoded integer is sent back to the client, which then performs the inverse operation (subtracting 32 from the char and casting to get the int back).
It seems that the '?' is taken literally by the client-side processing module, so the bar is repeatedly filled with the integer value that '?' maps to until the server starts sending back values smaller than 95.
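For reference, here is a minimal sketch of the round trip I think is happening; the US-ASCII encoding step is my assumption about what the server does before sending the value to the client:

import java.nio.charset.StandardCharsets;

public class RoundTripSketch {
    public static void main(String[] args) {
        int i = 100;
        char c = (char) (i + 32);  // 132, outside 7-bit ASCII

        // Assumed encoding step: characters the encoder cannot map
        // are replaced with '?' (0x3F)
        byte[] wire = String.valueOf(c).getBytes(StandardCharsets.US_ASCII);

        // Client-side inverse operation: subtract 32 to get the int back
        int decoded = (wire[0] & 0xFF) - 32;

        System.out.println(i + " -> " + (int) c + " -> " + decoded);  // 100 -> 132 -> 31
    }
}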
A char in Java is a 16-bit Unicode character. It is possible the old code was expecting to treat the int values as bytes; you could then convert the bytes to chars, specifying different character sets until you find one that makes sense (e.g. new String(byteArrayData, "ASCII")).
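For example, a rough sketch of trying different character sets on the raw bytes (the byte values here are just placeholders, not actual game data):

import java.nio.charset.StandardCharsets;

public class CharsetGuess {
    public static void main(String[] args) {
        // Placeholder raw values standing in for the encoded ints (i + 32)
        byte[] byteArrayData = { 65, 126, (byte) 131, (byte) 132 };

        // US-ASCII cannot represent values above 127, so they decode as U+FFFD
        System.out.println(new String(byteArrayData, StandardCharsets.US_ASCII));

        // ISO-8859-1 maps each byte 0..255 directly to the same code point,
        // so the original numeric values survive the conversion
        System.out.println(new String(byteArrayData, StandardCharsets.ISO_8859_1));
    }
}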
What you will need to take into account is that bytes are signed in Java and so range from -128 to +127. If your old game code was expecting to use values in the extended ASCII set (> 127), then you'll need to subtract 256 from any int > 127 to get the correct byte value. See: How does Java convert int into byte? for more details.
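A quick sketch of that adjustment and its inverse (the value 160 is just an example):

public class SignedByteSketch {
    public static void main(String[] args) {
        // Bytes are signed in Java (-128..127), so an extended-ASCII value
        // in the 128..255 range has to be shifted down before it fits
        int extended = 160;
        byte b = (byte) (extended > 127 ? extended - 256 : extended);  // -96

        // Masking with 0xFF undoes the sign extension and restores 0..255
        int recovered = b & 0xFF;  // 160
        System.out.println(b + " -> " + recovered);
    }
}

Note that a plain (byte) cast performs the same narrowing automatically; the explicit subtraction just mirrors the arithmetic described above.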