While analyzing an X.509 certificate with an ECDSA signature, I found a 0x00 byte I can't explain.
The signature of an X.509 certificate should be given in an ASN.1 BIT STRING structure.
For ECDSA, the signature consists of two integers (r, s) encoded as an ASN.1 SEQUENCE of two ASN.1 INTEGERs.
In my example (generated with openssl) I got this:
    ASN.1 tag for BIT STRING
    |  length
    |  |
    |  |     ASN.1 tag for SEQUENCE
    |  |     |  length
    |  |     |  |
    |  |     |  |  ASN.1 tag for INTEGER            ASN.1 tag for INTEGER
    |  |     |  |  |  length                         |  length
    |  |     |  |  |  |                              |  |
    |  |     |  |  |  |   0x21 bytes integer value   |  |   0x21 bytes integer value
    |  |     |  |  |  |   ____________|____________  |  |   ____________|____________
    v  v     v  v  v  v  /                         \ v  v  /                         \
... 03 49 00 30 46 02 21 00 D5 F4 76 43 ... A2 BD 95 02 21 00 DF 01 30 24 ... 50 12 32
          ^
          |
          why is there a 0x00 byte?
Why is there an extra 0x00 byte before the ASN.1 SEQUENCE tag?
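For reference, this is roughly how I walk those bytes by hand (a quick Python sketch without an ASN.1 library; split_ecdsa_signature and sig_bits are names I made up here, with sig_bits holding the raw contents of the signatureValue BIT STRING, i.e. everything starting at the 0x00 above):

    def split_ecdsa_signature(sig_bits: bytes):
        """Walk the BIT STRING contents: one leading octet, then
        SEQUENCE { INTEGER r, INTEGER s }. Assumes every length fits
        in a single byte, as in the dump above."""
        first_octet = sig_bits[0]                  # <-- the 0x00 in question
        der = sig_bits[1:]                         # SEQUENCE holding the two INTEGERs

        assert der[0] == 0x30                      # SEQUENCE tag
        assert der[1] == len(der) - 2              # SEQUENCE length (0x46 here)

        assert der[2] == 0x02                      # INTEGER tag for r
        r_len = der[3]                             # 0x21 here
        r = int.from_bytes(der[4:4 + r_len], "big")

        off = 4 + r_len
        assert der[off] == 0x02                    # INTEGER tag for s
        s_len = der[off + 1]                       # 0x21 here
        s = int.from_bytes(der[off + 2:off + 2 + s_len], "big")

        return first_octet, r, s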
The first octet of an ASN.1 BIT STRING is the Unused Bits indicator. Bit strings are not always aligned to full octets (1 byte): since the minimum encoded data size is one byte, the Unused Bits octet stores how many bits of the encoded value are not used. For example, if your data is exactly 11 bits long, you need 2 bytes (16 bits) to encode it, and you must indicate that only 11 bits carry data by setting the Unused Bits octet to 5 (16 - 11). Possible values for the Unused Bits octet are 0-7.

In your certificate the BIT STRING contains a DER-encoded SEQUENCE, which is always a whole number of bytes, so no bits are unused and the Unused Bits octet is 0x00. More details: https://learn.microsoft.com/en-us/windows/win32/seccertenroll/about-bit-string
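As a rough illustration (a minimal Python sketch, not any particular library's API; the encode_bit_string helper and the 11-bit sample value are made up for this example), this is how a DER encoder fills in that first octet:

    def encode_bit_string(bits: str) -> bytes:
        """DER-encode a BIT STRING given as a string of '0'/'1' characters.
        Sketch only: assumes the content is short enough for a one-byte length."""
        unused = (8 - len(bits) % 8) % 8           # padding bits in the last octet
        padded = bits + "0" * unused               # pad with zero bits to a full octet
        content = bytes(int(padded[i:i + 8], 2) for i in range(0, len(padded), 8))
        body = bytes([unused]) + content           # Unused Bits octet comes first
        return bytes([0x03, len(body)]) + body     # tag 0x03, length, then body

    # 11-bit example from above: the Unused Bits octet is 5
    print(encode_bit_string("10110011101").hex())  # 030305b3a0

    # A DER-encoded ECDSA signature (SEQUENCE of two INTEGERs) is a whole
    # number of bytes, so its bit length is a multiple of 8 and the
    # Unused Bits octet is 0x00.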