
STM32 HAL_CRC 16 Bit


I'm trying to use the HAL_CRC on my STM32L4 to calculate a 16-bit CRC, but somehow I always get the same result no matter what the input is...

The CRC init

hcrc.Instance = CRC;
hcrc.Init.CRCLength = CRC_POLYLENGTH_16B; //as I have a 16-bit polynomial
hcrc.Init.DefaultPolynomialUse = DEFAULT_POLYNOMIAL_DISABLE;
hcrc.Init.GeneratingPolynomial = 0x1021; //MCRF4xx polynomial
hcrc.Init.DefaultInitValueUse = DEFAULT_INIT_VALUE_ENABLE; //I want to init with 0xFFFF
hcrc.Init.InputDataInversionMode = CRC_INPUTDATA_INVERSION_BYTE; //input inversion
hcrc.Init.OutputDataInversionMode = CRC_OUTPUTDATA_INVERSION_ENABLE; //output inversion
hcrc.InputDataFormat = CRC_INPUTDATA_FORMAT_BYTES; //I have byte input
if (HAL_CRC_Init(&hcrc) != HAL_OK)
{
    Error_Handler();
}

and then the calculation is called with

uint32_t result;
uint8_t pBuffer[3] = {0x33, 0x33, 0x55};
result = HAL_CRC_Calculate(&hcrc, (uint32_t *)pBuffer, 3); //cast needed: HAL_CRC_Calculate takes a uint32_t* buffer

but the result is always 0xe000ed04. I'd expect 0xC91B for this specific case, but at the very least it should change if I change the input. Does anyone spot an issue with this code snippet? I couldn't find any sample code for a 16-bit CRC with the HAL library.

I'm aware that the return value of HAL_CRC_Calculate() is a uint32_t, so my result would be in the two lower bytes, in this case 0xed04. At least that's my interpretation of the function description.
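
For completeness, this is how I'd pull the 16-bit value out of the 32-bit return, assuming the CRC really ends up in the lower half word:

uint16_t crc16 = (uint16_t)(result & 0xFFFFu); //keep only the lower 16 bits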


Solution

  • The documentation indicates that you need to enable the CRC hardware clock with __HAL_RCC_CRC_CLK_ENABLE(). Are you doing that? A sketch of the usual place for it follows below.
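
A minimal sketch, assuming a CubeMX-style project where the clock enable lives in the HAL_CRC_MspInit() callback (names and placement are illustrative):

void HAL_CRC_MspInit(CRC_HandleTypeDef *hcrc)
{
    if (hcrc->Instance == CRC)
    {
        __HAL_RCC_CRC_CLK_ENABLE(); //without the peripheral clock, CRC register accesses do nothing
    }
}

HAL_CRC_Init() invokes this callback automatically, so the clock is running before the polynomial and init value are programmed into the peripheral.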