When reading data from an RFID device you will find a CRC-CCITT over the payload. "The CRC is initialized with 0x3791 instead of the usual value 0xFFFF." How can I define a function that checks that the CRC is OK?
Sample:
data: { 0x02, 0x40, 0x00, 0x00, 0x00, 0x00, 0xA0 }
CRC: { 0x60, 0xE7 }
Another sample:
data: { 0x02, 0x41, 0x00, 0x00, 0x00, 0x00, 0xA4 }
CRC: { 0x6F, 0xA5 }
The only way I could get this to work was to implement the bit-by-bit algorithm from the TMS37157 datasheet (Figure 52).
UINT16 rfid_get_crc(const UINT8 *data, UINT16 size)
{
    const UINT16 RFID_CRC_INIT = 0x3791; /* Preset per datasheet, not 0xFFFF */
    UINT16 crc = RFID_CRC_INIT;
    UINT16 i;
    UINT8 byte;
    UINT8 bits;
    BOOL lsb;

    for (i = 0; i < size; i++)
    {
        byte = data[i];          /* Next byte, processed LSB first */
        bits = 8;
        while (bits-- > 0)
        {
            lsb = crc & 1;       /* Save LSB before shifting       */
            crc >>= 1;           /* Shift right one bit            */
            if (byte & 1)
                crc |= 0x8000;   /* Shift in next data bit         */
            if (lsb)
                crc ^= 0x8000;   /* Feedback: invert MSB           */
            if (crc & 0x8000)    /* If MSB set after feedback,     */
                crc ^= 0x0408;   /* invert bits 3 and 10           */
            byte >>= 1;          /* Advance to next data bit       */
        }
    }
    return crc;
}