Tags: decimal, hex, reverse-engineering

Deconstructing Pokémon glitches?


(I apologize if this is the wrong place to ask this. I think it's definitely programming-related, though if this belongs on some other site please let me know.)

I grew up playing Pokémon Red and Blue, games that were great fun but are somewhat notorious for having numerous exploitable glitches (for example, see this ridiculous speedrun of the game that uses memory corruption to turn the item screen into a hex editor).

Recently, I found an interesting speedrun of the game that uses a glitch called the "ZZAZZ glitch" to corrupt important memory locations and allow the player to almost immediately win the game. According to the author's description of the speedrun, the ZZAZZ glitch works as follows:

To start a Trainer battle, the game needs to load a lot of data, such as [...] the money he'll concede if defeated. Where things can get really ugly is when it loads the money. For reasons that are beyond me, money is stored in a completely different manner: the game uses a data structure of three bytes and, instead of converting the value to binary, it stores it in "human" representation. For example, $123456 would be stored as 0x123456 instead of 0x01E240, the proper conversion.

[Some invalid entries in the Trainer table] point to locations with invalid money data. When the game tries to perform arithmetic on the data in said structure, it goes nuts and starts overwriting huge portions of RAM. More specifically, for every block of three bytes, two of them will contain 0x99 (together reading 0x9999, the maximum amount of money a trainer could give). This pattern repeats itself many times through RAM. To see this better, I recommend pausing the video on the emulator after the ZZAZZ trainer is faced and setting VBA's memory viewer to 0xD070.
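The storage scheme the author describes is what's usually called binary-coded decimal (BCD): two decimal digits per byte, one per nibble. To make it concrete, here's a minimal sketch in C (the helper name and layout are my own; the actual game is Game Boy assembly) of how $123456 ends up as the bytes 0x12 0x34 0x56 rather than the binary 0x01 0xE2 0x40:

```c
#include <stdio.h>
#include <stdint.h>

/* Sketch: encode a money value (0..999999) as three BCD bytes,
 * two decimal digits per byte, as described in the quote above.
 * Hypothetical helper; the real game works in Game Boy assembly. */
static void encode_bcd3(uint32_t value, uint8_t out[3])
{
    for (int i = 2; i >= 0; i--) {
        uint8_t lo = value % 10;  /* low decimal digit  */
        value /= 10;
        uint8_t hi = value % 10;  /* high decimal digit */
        value /= 10;
        out[i] = (uint8_t)((hi << 4) | lo);
    }
}

int main(void)
{
    uint8_t bcd[3];
    encode_bcd3(123456, bcd);
    /* Prints "12 34 56" -- the bytes 0x12 0x34 0x56, not 0x01 0xE2 0x40 */
    printf("%02X %02X %02X\n", bcd[0], bcd[1], bcd[2]);
    return 0;
}
```

In this scheme only nibbles 0-9 are meaningful; a byte like 0xFF is exactly the kind of "invalid money data" the glitched Trainer entries point at.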

This analysis makes sense, but as a programmer I can't help but wonder how on earth the programmers wrote the code that would make this possible. No approach I can think of for writing a function that converts a binary-coded decimal (BCD) value to binary would ever start filling huge blocks of memory with 0x99 just because the input wasn't valid BCD.

My question is: without specifically designing the algorithm to fail this way, is there a straightforward implementation of a conversion from binary-coded decimal to binary that could result in this sort of memory corruption when fed an invalid value?
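For contrast, here's what a straightforward BCD-to-binary conversion might look like (a hedged sketch in C; the function name is hypothetical). Even when fed invalid nibbles, a loop like this just returns a nonsense number; it has no code path that writes anywhere in memory, which is what makes the glitch's behavior so puzzling:

```c
#include <stdio.h>
#include <stdint.h>

/* Sketch: decode three BCD bytes into a binary integer.
 * Fed invalid nibbles (A-F), this yields a nonsense value, but it
 * reads exactly three bytes and writes nothing except its return
 * value -- no path leads to memory corruption. */
static uint32_t decode_bcd3(const uint8_t bcd[3])
{
    uint32_t value = 0;
    for (int i = 0; i < 3; i++) {
        value = value * 10 + (bcd[i] >> 4);   /* high digit */
        value = value * 10 + (bcd[i] & 0x0F); /* low digit  */
    }
    return value;
}

int main(void)
{
    const uint8_t ok[3]  = { 0x12, 0x34, 0x56 };  /* valid: $123456  */
    const uint8_t bad[3] = { 0xFF, 0xFF, 0xFF };  /* invalid "money" */
    printf("%u\n", (unsigned)decode_bcd3(ok));    /* 123456          */
    printf("%u\n", (unsigned)decode_bcd3(bad));   /* 1666665: wrong, but harmless */
    return 0;
}
```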

Again, if this is off-topic, my apologies. My thought is that other programmers on this site may also have grown up playing this game, and it sounds like an interesting reverse-engineering exercise to figure out how a glitch like this could be possible.


Solution

  • Mystery solved! It looks like user TheZZAZZGlitch figured out what causes this.

    The glitch is triggered when the game tries to compute an extremely large integer. Internally, the game has a routine that repeatedly adds values to simulate a multiplication, writing bytes as it goes and shifting an output write position along. The code is designed to cut off any value that exceeds 0x009999 so that the player can't earn more than $9999 from a trainer battle (the values are stored in binary-coded decimal). However, the game forgets to reset the output pointer when this cutoff occurs, so if an extremely large number is generated, the game repeatedly writes the pattern 0x009999 across RAM, shifting the write pointer along and writing 0x99 to two out of every three bytes. A loose C sketch of this failure mode follows below.

    Hope this helps!
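To illustrate, here is a loose C reconstruction of the failure mode described above (the real routine is Game Boy assembly; the function name, signature, and loop structure here are hypothetical, and only the 0x00 0x99 0x99 clamp pattern and the un-reset pointer come from the description):

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical sketch of the bug: each time the running total
 * overflows $9999, the routine clamps the three-byte BCD value
 * to 0x00 0x99 0x99 -- but instead of rewinding its output
 * pointer to the money field, it keeps advancing, so every
 * further overflow stamps the clamp pattern deeper into RAM. */
static void clamp_money_buggy(uint8_t *money, unsigned overflows)
{
    uint8_t *out = money;
    for (unsigned i = 0; i < overflows; i++) {
        out[0] = 0x00;  /* high BCD byte of $009999 */
        out[1] = 0x99;  /* middle BCD byte          */
        out[2] = 0x99;  /* low BCD byte             */
        out += 3;       /* BUG: should be out = money; */
    }
}

int main(void)
{
    uint8_t ram[12];
    memset(ram, 0xAA, sizeof ram);   /* stand-in for unrelated RAM */
    clamp_money_buggy(ram, 4);       /* four "overflows" smear all 12 bytes */
    for (int i = 0; i < 12; i++)
        printf("%02X ", ram[i]);     /* 00 99 99 00 99 99 00 99 99 00 99 99 */
    printf("\n");
    return 0;
}
```

Presumably, with valid trainer data the cutoff fires at most once and the clamp lands harmlessly on the money field itself; with the garbage BCD the ZZAZZ trainers point at, the overflow fires over and over and the 0x00 0x99 0x99 pattern marches across RAM, matching the pattern visible at 0xD070 in VBA's memory viewer.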