The TrueType and OpenType specifications both define a checkSumAdjustment field in the 'head' (or 'bhed') table of an SFNT. Both describe how to calculate this value, but I can't find any information on why it exists and what it is used for.
Bonus question: Why do I have to subtract from 0xB1B0AFBA?
The point of this value is to allow font engines to detect corruption in the font without having to parse all the font data first. Ideally, the checksum would sit at the very start of the file, but thanks to the need to unify various font formats, it doesn't; instead it lives in the head table. Silly, but we're stuck with it.
Each table in a font has its own checksum value, so an engine can verify that individual parts of the font are correct "as is". To make things easier, though, the font also carries a master checksum that is much cheaper to check: parse just enough data to find where checkSumAdjustment sits in the byte stream, then sum the entire stream as LONGs (big-endian uint32s) while treating the four bytes occupied by the field itself as 0x00000000. That lets an engine decide whether the font was encoded according to the OpenType spec without having to look up every table's declared checksum, offset, and length, and then rerun the same checksum computation several times over different parts of the byte stream. If the master checksum fails, it doesn't even matter whether the individual table checksums turn out to be correct: there's something wonky about this font. A rough sketch of that check is below.
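For illustration, here's a minimal Python sketch of that verification, assuming a single (non-collection) SFNT already loaded into memory. The table-directory layout and the position of checkSumAdjustment (8 bytes into 'head') are from the OpenType spec; the function names are just mine:

```python
import struct

def calc_checksum(data: bytes) -> int:
    """Sum a byte string as big-endian uint32s, modulo 2**32 (zero-padding the tail)."""
    padded = data + b"\x00" * (-len(data) % 4)
    total = 0
    for (value,) in struct.iter_unpack(">I", padded):
        total = (total + value) & 0xFFFFFFFF
    return total

def verify_checksum_adjustment(font: bytes) -> bool:
    """Check the master checksum of a single (non-collection) SFNT."""
    # numTables lives at offset 4; table records start at offset 12, 16 bytes each.
    (num_tables,) = struct.unpack_from(">H", font, 4)
    head_offset = None
    for i in range(num_tables):
        tag, _checksum, offset, _length = struct.unpack_from(">4sIII", font, 12 + 16 * i)
        if tag in (b"head", b"bhed"):
            head_offset = offset
            break
    if head_offset is None:
        return False  # no 'head'/'bhed' table at all

    # checkSumAdjustment is 8 bytes into the 'head' table.
    adj_offset = head_offset + 8
    (stored,) = struct.unpack_from(">I", font, adj_offset)

    # Sum the whole stream with those four bytes treated as zero; the stored
    # value should be 0xB1B0AFBA minus that sum (mod 2**32).
    zeroed = font[:adj_offset] + b"\x00\x00\x00\x00" + font[adj_offset + 4:]
    expected = (0xB1B0AFBA - calc_checksum(zeroed)) & 0xFFFFFFFF
    return stored == expected
```

Writing the value works the same way in reverse: zero the field, sum the stream, and store 0xB1B0AFBA minus that sum.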
As for the subtraction: because checkSumAdjustment is defined as 0xB1B0AFBA minus the sum of the font computed with the field zeroed, summing the entire file including the stored value comes out to 0xB1B0AFBA (mod 2^32), so a verifier only ever has to compare its running total against that one constant. Why that particular constant? Pretty much just "for historical reasons": OpenType unified several specs rather than starting from scratch, so there's some baggage from older formats left in it (the "OS/2" table, for instance, is a general metadata table and has nothing to do with the OS/2 (Warp) operating system anymore, nor has it for a very long time).