Tags: winapi, signature, pkcs#7, signtool

CryptVerifyMessageSignature returns original message with an additional header, why?


I have used signtool to produce a PKCS#7 bundle that contains:

  • a message
  • a digital signature
  • the signer certificate.

The command that builds this PKCS#7 file is the following:

$signtool sign /f signer_certificate.pfx /p ∙∙∙ /fd sha512 /p7 . /p7ce Embedded /p7co 0 my_file

This outputs a file named my_file.p7 in DER form (ASN.1).
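As a side note, the DER structure of the output can be inspected by dumping it with certutil, which ships with Windows:

$certutil -asn my_file.p7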

I then wrote a C++ program to verify that both the certificate and the message can be extracted from the bundle. For this I call CryptVerifyMessageSignature:

#include <windows.h>
#include <wincrypt.h>
#include <vector>

#pragma comment(lib, "crypt32.lib")

std::vector<BYTE> InputPkcs7Data;
// read input file into InputPkcs7Data (CreateFile, GetFileSizeEx, ReadFile).
// out of scope.

CRYPT_VERIFY_MESSAGE_PARA Parameters = {};
Parameters.cbSize = sizeof(Parameters);
Parameters.dwMsgAndCertEncodingType = PKCS_7_ASN_ENCODING | X509_ASN_ENCODING;
Parameters.hCryptProv = NULL;
Parameters.pfnGetSignerCertificate = NULL;
Parameters.pvGetArg = NULL;

// First call: query the required buffer size only. Pass NULL for the signer
// certificate so we do not receive a context that the second call would
// overwrite (and thereby leak).
DWORD DecodedMessageLength = 0;
PCCERT_CONTEXT SignerCertificate = NULL;
BOOL Result = CryptVerifyMessageSignature(&Parameters, 0, InputPkcs7Data.data(), static_cast<DWORD>(InputPkcs7Data.size()), NULL, &DecodedMessageLength, NULL);

std::vector<BYTE> DecodedMessage(DecodedMessageLength, 0);

// Second call: verify the signature, fill DecodedMessage, and obtain the
// signer certificate context.
Result = CryptVerifyMessageSignature(&Parameters, 0, InputPkcs7Data.data(), static_cast<DWORD>(InputPkcs7Data.size()), DecodedMessage.data(), &DecodedMessageLength, &SignerCertificate);

if (Result == FALSE) {
    wprintf(L"Error: %lx\n", GetLastError());
}
else
{
    // inspect DecodedMessage
    CertFreeCertificateContext(SignerCertificate);
}

My problem is that, even though CryptVerifyMessageSignature succeeds, what I get in DecodedMessage is not the original message. There are always a few leading bytes that look like garbage at first but actually encode the message length. I worked out that DecodedMessage is laid out as follows, depending on the length of the original message (a small parser sketch follows the list):

  • OriginalMessageLength ⩽ 127 bytes:
    • 0x04
    • a byte representing the length of the message
    • the message
  • 128 bytes ⩽ OriginalMessageLength ⩽ 255 bytes:
    • 0x04
    • 0x81
    • a byte representing the length of the message
    • the message
  • 256 bytes ⩽ OriginalMessageLength ⩽ 65535 bytes:
    • 0x04
    • 0x82
    • two bytes representing, in Big Endian, the length of the message
    • the message
  • 65536 bytes ⩽ OriginalMessageLength ⩽ 16777215 bytes:
    • 0x04
    • 0x83
    • three bytes representing, in Big Endian, the length of the message
    • the message
  • OriginalMessageLength ⩾ 16777216 bytes:
    • 0x04
    • 0x84
    • four bytes representing, in Big Endian, the length of the message
    • the message. (I did not try with messages larger than 4GiB).
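For reference, this layout is plain tag-length-value, so it can be undone by hand. Below is a minimal sketch of such a parser; StripDerHeader is a hypothetical helper name, not a CryptoAPI function:

#include <windows.h>   // for BYTE
#include <vector>

// Strip the leading 0x04 tag and the length octets described above,
// returning only the payload. Returns an empty vector on malformed input.
std::vector<BYTE> StripDerHeader(const std::vector<BYTE>& In)
{
    if (In.size() < 2 || In[0] != 0x04)            // 0x04 = OCTET STRING tag
        return {};
    size_t Offset = 2;
    size_t Length = In[1];                         // short form: one length byte
    if (In[1] >= 0x80) {                           // long form: 0x81..0x84 above
        const size_t LengthOctets = In[1] & 0x7F;
        if (In.size() < 2 + LengthOctets)
            return {};
        Length = 0;
        for (size_t i = 0; i < LengthOctets; ++i)  // big-endian length bytes
            Length = (Length << 8) | In[2 + i];
        Offset = 2 + LengthOctets;
    }
    if (In.size() - Offset < Length)
        return {};
    return std::vector<BYTE>(In.begin() + Offset, In.begin() + Offset + Length);
}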

This is really unexpected to me. Where does this “header information” come from, and is there a way to avoid getting it when calling CryptVerifyMessageSignature?

Is it possible that signtool is not the right tool for generating the signed message?


Solution

  • It seems that the decoded message is an OCTET STRING encoded using the Distinguished Encoding Rules, as per https://en.wikipedia.org/wiki/X.690#DER_encoding: 0x04 is the ASN.1 tag for OCTET STRING, and the bytes that follow it are the standard DER length octets, which matches every pattern listed above.
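One way to get rid of the prefix without hand-parsing is to let CryptoAPI decode the OCTET STRING itself, for instance with CryptDecodeObjectEx and the X509_OCTET_STRING structure type. A minimal sketch, assuming DecodedMessage and DecodedMessageLength hold the output of CryptVerifyMessageSignature from the question:

#include <windows.h>
#include <wincrypt.h>
#include <cstdio>

#pragma comment(lib, "crypt32.lib")

// Decode the DER OCTET STRING wrapper around the verified message.
// With CRYPT_DECODE_ALLOC_FLAG, CryptoAPI allocates the resulting
// CRYPT_DATA_BLOB with LocalAlloc; it must be released with LocalFree.
CRYPT_DATA_BLOB* Payload = NULL;
DWORD PayloadSize = 0;
BOOL Decoded = CryptDecodeObjectEx(
    PKCS_7_ASN_ENCODING | X509_ASN_ENCODING,
    X509_OCTET_STRING,                  // decode as an ASN.1 OCTET STRING
    DecodedMessage.data(),
    DecodedMessageLength,
    CRYPT_DECODE_ALLOC_FLAG,
    NULL,
    &Payload,
    &PayloadSize);

if (Decoded) {
    // Payload->pbData / Payload->cbData now reference the raw original message.
    wprintf(L"Original message is %lu bytes\n", Payload->cbData);
    LocalFree(Payload);
}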