ec = OpenSSL::PKey::EC.new('secp256k1')
ec.generate_key
signature = ec.dsa_sign_asn1("A" * 64)
refute ec.dsa_verify_asn1("A" * 32, signature) # Fails here
Given the test code above, why do dsa_sign_asn1 and dsa_verify_asn1 only consider the first 32 bytes of the provided data?
Environment:
Ruby 3.0, Ubuntu 21.04 running with multipass on Windows 10. OpenSSL::VERSION is 2.2.0.
When signing, it is not the data itself that is signed, but a hash of the data. This is necessary on the one hand to be able to sign longer messages, and on the other hand for security reasons (see here). dsa_sign_asn1 and dsa_verify_asn1 do not perform this hashing themselves; they treat the passed data as the digest.
For secp256k1, a digest with an output size of 256 bits is typically used (see here), e.g. SHA256.
If a digest with a larger output size is used, only the leftmost n bits of the hash are considered, in accordance with NIST FIPS 186-4 (see here), where n is the key size, i.e. the bit size of the generator order, 256 bits for secp256k1.
This is why the verification in the posted example succeeds: only the first 32 bytes of the passed data are considered, and those are identical for both inputs.
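To make the truncation visible, here is a small self-contained sketch (the repeated-byte inputs are chosen purely for demonstration): any data whose first 32 bytes match the signed input verifies successfully, because everything beyond the leftmost 256 bits is ignored.
require 'openssl'

ec = OpenSSL::PKey::EC.new('secp256k1')
ec.generate_key
signature = ec.dsa_sign_asn1("A" * 64)               # only the leftmost 32 bytes ("A" * 32) enter the signature
ec.dsa_verify_asn1("A" * 32, signature)              # => true, first 32 bytes match
ec.dsa_verify_asn1("A" * 32 + "B" * 32, signature)   # => true, bytes beyond 32 are ignored
ec.dsa_verify_asn1("B" * 32, signature)              # => false, first 32 bytes differ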
If the hashed value of the data is used instead, the verification fails as expected:
signature = ec.dsa_sign_asn1(OpenSSL::Digest::SHA256.digest("A" * 64))
verified = ec.dsa_verify_asn1(OpenSSL::Digest::SHA256.digest("A" * 32), signature) # false
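As a side note (an addition on my part, not from the question), the higher-level OpenSSL::PKey#sign and #verify methods hash the data internally, so the two inputs are no longer conflated; a minimal sketch:
require 'openssl'

ec = OpenSSL::PKey::EC.new('secp256k1')
ec.generate_key
signature = ec.sign(OpenSSL::Digest::SHA256.new, "A" * 64)    # hashes the data with SHA256 before signing
ec.verify(OpenSSL::Digest::SHA256.new, signature, "A" * 64)   # => true
ec.verify(OpenSSL::Digest::SHA256.new, signature, "A" * 32)   # => false, different hash
Here the signature is still an ASN.1 DER-encoded ECDSA signature, but it is computed over the SHA256 hash of the full data, so the 64-byte and 32-byte inputs are correctly distinguished.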