Tags: hash, crc, crc32, adler32

Can a CRC32 engine be used for computing CRC16 hashes?


I'm working with a microcontroller that has native HW functions for calculating CRC32 hashes over chunks of memory, where the polynomial can be freely defined. It turns out that the system has different data links using different CRC bit-lengths, such as 16 and 8 bits, and I intend to use the hardware engine for those as well.

In simple tests with online tools I've concluded that it is possible to find a 32-bit polynomial that produces the same result as an 8-bit CRC. For example:

  • hashing "a sample string" with an 8-bit engine and poly 0xb7 yields the result 0x97
  • hashing "a sample string" with a 16-bit engine and poly 0xb700 yields the result 0x9700
  • ...with a 32-bit engine and poly 0xb7000000 yields the result 0x97000000 (with zero initial value, zero final XOR, and no reflections)

So, padding the polynomial with zeros and right-shifting the result seems to work. But is it always possible to find a set of parameters (poly, final XOR, init value and reflections) that makes a 32-bit engine behave as a 16- or 8-bit one?
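For reference, here is a minimal software-only sketch of the check I did (the generic bit-by-bit routine below is just my own model of a non-reflected, MSB-first CRC engine, not the vendor code), intended to show that the top byte of the padded 32-bit result matches the plain 8-bit result:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Generic MSB-first, non-reflected CRC of configurable width
 * (bit-by-bit, software only) -- a simple model of the engine. */
static uint32_t crc_msb_first(const uint8_t *data, size_t len,
                              unsigned width, uint32_t poly, uint32_t init)
{
    uint32_t topbit = 1UL << (width - 1);
    uint32_t mask   = (width == 32) ? 0xFFFFFFFFUL : ((1UL << width) - 1);
    uint32_t crc    = init & mask;

    for (size_t i = 0; i < len; i++) {
        crc ^= (uint32_t)data[i] << (width - 8);
        for (int b = 0; b < 8; b++)
            crc = (crc & topbit) ? ((crc << 1) ^ poly) & mask
                                 : (crc << 1) & mask;
    }
    return crc;
}

int main(void)
{
    const uint8_t *msg = (const uint8_t *)"a sample string";
    size_t len = strlen("a sample string");

    uint32_t crc8  = crc_msb_first(msg, len, 8,  0xB7,       0);
    uint32_t crc32 = crc_msb_first(msg, len, 32, 0xB7000000, 0);

    printf("8-bit CRC:           0x%02X\n", (unsigned)crc8);
    printf("32-bit CRC (padded): 0x%08X\n", (unsigned)crc32);
    printf("top byte matches:    %s\n", (crc32 >> 24) == crc8 ? "yes" : "no");
    return 0;
}
```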

To provide more context and prevent 'bypass answers' like 'don't use the native engine': I have a scenario in a safety-critical system where it's necessary to prevent a common design error from propagating to redundant processing nodes. One solution for that is having software-based CRC calculation in one node and hardware-based in its pair.


Solution

  • Yes, what you're doing will work in general for CRCs that are not reflected. The pre- and post-conditioning can be done very simply with a little code around the hardware instruction loop (see the sketch below).

    Assuming that the hardware CRC doesn't have an option for this, to do a reflected CRC you would need to reflect each input byte, and then reflect the final result. That may defeat the purpose of using a hardware CRC. (Though if your purpose is just to have a different implementation, then maybe it wouldn't.)
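As an illustration, here is roughly what that wrapper code could look like. `hw_crc32()` is a hypothetical placeholder for your vendor's engine call, assumed to be non-reflected with a programmable polynomial and initial value and no final XOR; the rest is the pre/post conditioning done in software:

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical hardware call: plain MSB-first CRC-32 of 'data' with
 * generator 'poly', starting register value 'init', no final XOR. */
extern uint32_t hw_crc32(const uint8_t *data, size_t len,
                         uint32_t poly, uint32_t init);

/* Non-reflected CRC-16 via the 32-bit engine: pad the polynomial and
 * the init value into the top 16 bits, shift the result back down,
 * then apply the 16-bit final XOR in software. */
static uint16_t crc16_via_crc32(const uint8_t *data, size_t len,
                                uint16_t poly16, uint16_t init16,
                                uint16_t xorout16)
{
    uint32_t r = hw_crc32(data, len,
                          (uint32_t)poly16 << 16,
                          (uint32_t)init16 << 16);
    return (uint16_t)(r >> 16) ^ xorout16;
}

/* For a reflected CRC the engine shifts the wrong way, so each input
 * byte would have to go through reflect8() before the engine and the
 * final 16-bit result through reflect16() afterwards -- which may cost
 * more than the hardware saves. */
static uint8_t reflect8(uint8_t b)
{
    b = (uint8_t)((b & 0xF0) >> 4 | (b & 0x0F) << 4);
    b = (uint8_t)((b & 0xCC) >> 2 | (b & 0x33) << 2);
    b = (uint8_t)((b & 0xAA) >> 1 | (b & 0x55) << 1);
    return b;
}

static uint16_t reflect16(uint16_t v)
{
    return (uint16_t)(reflect8((uint8_t)(v & 0xFF)) << 8 |
                      reflect8((uint8_t)(v >> 8)));
}
```

For example, a non-reflected CRC such as CRC-16/CCITT-FALSE (poly 0x1021, init 0xFFFF, xorout 0x0000) would be `crc16_via_crc32(data, len, 0x1021, 0xFFFF, 0x0000)` under these assumptions; the same padding idea applies to 8-bit CRCs with a 24-bit shift.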