I have written a Python script (Python 3.9) that prints the escaped hex bytes of an XOR-encrypted string, and then a C++ program that is supposed to decode those bytes using the same XOR key. My Python script follows.
import itertools

stringMessage = "TEST STRING !@#"
xorKey = "Hello324234523"

def xor(message, key):
    toret = ''
    for c, k in zip(message, itertools.cycle(key)):
        toret += chr(ord(c) ^ ord(k))
    return toret

encrypted = xor(stringMessage, xorKey)
print("".join("\\x{:02x}".format(ord(c)) for c in encrypted))
The final result is \x1c\x20\x3f\x38\x4f\x60\x66\x66\x7b\x7d\x73\x15\x13\x73\x6b, which I then copy manually into the C++ source like so:
#include <cstdio>

int main() {
    char encryptedMessage[] = "\x1c\x20\x3f\x38\x4f\x60\x66\x66\x7b\x7d\x73\x15\x13\x73\x6b";
    char xorKey[] = "Hello324234523";
    char decryptedMessage[sizeof encryptedMessage];
    int j = 0;
    for (int i = 0; i < sizeof encryptedMessage; i++) {
        if (j == sizeof xorKey - 1) j = 0;
        decryptedMessage[i] = encryptedMessage[i] ^ xorKey[j];
        j++;
    }
    printf("-------------------------\n");
    printf(decryptedMessage);
    return 0;
}
But when I compile and execute the C++ program, instead of getting TEST STRING !@# as the result, I get TEST STRING !@#e∟ ?8O`ff{}s§‼sk. I don't understand what the extra e∟ ?8O`ff{}s§‼sk string is.
Changed

for (int i = 0; i < sizeof encryptedMessage; i++) {

to

for (int i = 0; i < sizeof encryptedMessage - 1; i++) {

and it now works! sizeof encryptedMessage counts the string's terminating '\0', so the original loop also XORed that terminator (0x00 ^ 'e' produced the stray e), and decryptedMessage was left without a '\0' of its own, so printf kept reading past the end of the buffer into adjacent memory, which is where the rest of the garbage came from.
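For reference, here is a sketch of the decryption with that fix applied, plus an explicit terminator so the result does not depend on whatever happens to sit after the decrypted bytes. It reuses the names from the question; msgLen, keyLen, and the i % keyLen indexing are just one way to express the key cycling (mirroring Python's itertools.cycle), not the only way to write it.

#include <cstdio>

int main() {
    // Bytes produced by the Python script (15 data bytes plus the implicit '\0').
    char encryptedMessage[] = "\x1c\x20\x3f\x38\x4f\x60\x66\x66\x7b\x7d\x73\x15\x13\x73\x6b";
    char xorKey[] = "Hello324234523";

    // sizeof counts the terminating '\0', so subtract 1 to get the data lengths.
    const size_t msgLen = sizeof encryptedMessage - 1;
    const size_t keyLen = sizeof xorKey - 1;

    char decryptedMessage[sizeof encryptedMessage];

    // XOR each ciphertext byte with the repeating key.
    for (size_t i = 0; i < msgLen; i++) {
        decryptedMessage[i] = encryptedMessage[i] ^ xorKey[i % keyLen];
    }
    decryptedMessage[msgLen] = '\0';  // terminate explicitly so printf stops here

    printf("%s\n", decryptedMessage);  // prints: TEST STRING !@#
    return 0;
}

Writing the '\0' explicitly is what guarantees printf stops at the end of the decrypted text instead of relying on the uninitialized last byte of decryptedMessage.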