I need to know what encoding the ECDsaCng.SignData Method (Byte[]) (https://msdn.microsoft.com/en-us/library/bb347054(v=vs.110).aspx) uses for its output byte array by default, and how to convert it to the DER format accepted by i2d_ECDSA_SIG (https://www.openssl.org/docs/manmaster/crypto/i2d_ECDSA_SIG.html), so that I can verify my C#-generated ECDSA signature in OpenSSL using the method ECDSA_do_verify.
I know it's SHA-1 and I know how to load that digest; I just don't know what the byte encoding is for the ECDsaCng.SignData method, nor how to convert it to DER format, if I even need to do that.
An ECDSA signature is the value pair (r, s).

Windows CNG emits this as a plain concatenation of the two values, and since .NET does not reinterpret the data, this is the de facto .NET format: r is the first half of the array, s is the second half.
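For illustration, here's a minimal sketch of producing and splitting that raw form on the full framework (the key size and input data are placeholders I've chosen):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Sign with CNG and split the raw r||s output.
using (var ecdsa = new ECDsaCng(256))        // NIST P-256 (placeholder key size)
{
    ecdsa.HashAlgorithm = CngAlgorithm.Sha1; // matches the SHA-1 digest in the question
    byte[] data = Encoding.UTF8.GetBytes("hello"); // placeholder input
    byte[] sig = ecdsa.SignData(data);       // 64 bytes for P-256

    int half = sig.Length / 2;
    byte[] r = new byte[half];               // r = first half
    byte[] s = new byte[half];               // s = second half
    Buffer.BlockCopy(sig, 0, r, 0, half);
    Buffer.BlockCopy(sig, half, s, 0, half);
}
```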
OpenSSL expects a DER-encoded structure of SEQUENCE(INTEGER(r), INTEGER(s)). This loosely means { 0x30, payload_length, 0x02, r_length, r_bytes[0]...r_bytes[r_length-1], 0x02, s_length, s_bytes[0]...s_bytes[s_length-1] }; though it's slightly trickier than that, because DER integers use the minimal encoding (any leading zero bytes in the fixed-width CNG values must be stripped), and a 0x00 padding byte is required when r_bytes[0] or s_bytes[0] >= 0x80 (since that would make the value appear negative).
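Hand-rolling the encoder isn't too bad, though. Here's a minimal sketch (the method names RawToDer and ToDerInteger are mine, not framework APIs; it assumes the encoded payload stays under 128 bytes so single-byte DER lengths suffice, which holds for curves up to P-384 but not P-521):

```csharp
using System;

static class DerSignature
{
    // Convert the CNG r||s concatenation into the DER
    // SEQUENCE(INTEGER(r), INTEGER(s)) form OpenSSL expects.
    public static byte[] RawToDer(byte[] rawSig)
    {
        int half = rawSig.Length / 2;
        byte[] r = ToDerInteger(rawSig, 0, half);
        byte[] s = ToDerInteger(rawSig, half, half);

        byte[] der = new byte[2 + r.Length + s.Length];
        der[0] = 0x30;                            // SEQUENCE tag
        der[1] = (byte)(r.Length + s.Length);     // payload length (short form)
        Buffer.BlockCopy(r, 0, der, 2, r.Length);
        Buffer.BlockCopy(s, 0, der, 2 + r.Length, s.Length);
        return der;
    }

    private static byte[] ToDerInteger(byte[] buf, int offset, int count)
    {
        // DER integers are minimally encoded, so strip the leading zero
        // bytes of the fixed-width CNG value (keeping at least one byte).
        int start = offset;
        int end = offset + count;
        while (start < end - 1 && buf[start] == 0)
            start++;

        // Prepend 0x00 when the high bit is set, so the value stays positive.
        bool pad = (buf[start] & 0x80) != 0;
        int valueLength = (end - start) + (pad ? 1 : 0);

        byte[] result = new byte[2 + valueLength];
        result[0] = 0x02;                         // INTEGER tag
        result[1] = (byte)valueLength;
        if (pad)
            result[2] = 0x00;
        Buffer.BlockCopy(buf, start, result, 2 + (pad ? 1 : 0), end - start);
        return result;
    }
}
```

On the OpenSSL side, d2i_ECDSA_SIG can then parse these bytes into the ECDSA_SIG structure that ECDSA_do_verify expects.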
Unfortunately, there's no general-purpose DER encoder exposed in the framework by default.
If you're trying to run on Linux, you might be better served by using .NET Core, since the ECDSA implementation for Linux already does this data translation.