I want to build a function to easily convert a string containing hex code (e.g. "0ae34e") into a string containing the equivalent ASCII values, and vice versa. Do I have to cut the hex string into pairs of two characters and glue them together again, or is there a convenient way to do that?
Thanks
Based on the binascii_unhexlify() function from Python:
#include <cctype> // isxdigit, isdigit, isupper, tolower

int to_int(int c) {
    if (not isxdigit(c)) return -1; // error: non-hexadecimal digit found
    if (isdigit(c)) return c - '0';
    if (isupper(c)) c = tolower(c);
    return c - 'a' + 10;
}
template<class InputIterator, class OutputIterator>
int unhexlify(InputIterator first, InputIterator last, OutputIterator ascii) {
    while (first != last) {
        int top = to_int(*first++);
        if (first == last) return -1; // error: odd number of hex digits
        int bot = to_int(*first++);
        if (top == -1 or bot == -1)
            return -1; // error: non-hexadecimal digit
        *ascii++ = (top << 4) + bot; // combine two nibbles into one byte
    }
    return 0;
}
#include <cstddef> // size_t
#include <iostream>

int main() {
    char hex[] = "7B5a7D";
    const size_t len = sizeof(hex) - 1;  // strlen(hex); const makes it a constant expression
    char ascii[len/2 + 1];               // legal array bound because len is a constant expression
    ascii[len/2] = '\0';
    if (unhexlify(hex, hex + len, ascii) < 0) return 1; // error
    std::cout << hex << " -> " << ascii << std::endl;
}
Output:

7B5a7D -> {Z}
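For the "vice versa" direction (bytes to hex), here is a minimal sketch in the same iterator style; the name hexlify and the digit table are my own choices, not taken from the Python source:

#include <iterator> // back_inserter
#include <string>

template<class InputIterator, class OutputIterator>
void hexlify(InputIterator first, InputIterator last, OutputIterator hex) {
    static const char digits[] = "0123456789abcdef";
    while (first != last) {
        unsigned char c = *first++; // one byte becomes two hex digits
        *hex++ = digits[c >> 4];    // high nibble
        *hex++ = digits[c & 0x0F];  // low nibble
    }
}

With std::back_inserter you never have to slice or pre-size the string yourself:

std::string s = "{Z}", out;
hexlify(s.begin(), s.end(), std::back_inserter(out)); // out == "7b5a7d"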
An interesting quote from the comments in the source code:
While I was reading dozens of programs that encode or decode the formats here (documentation? hihi:-) I have formulated Jansen's Observation:
Programs that encode binary data in ASCII are written in such a style that they are as unreadable as possible. Devices used include unnecessary global variables, burying important tables in unrelated sourcefiles, putting functions in include files, using seemingly-descriptive variable names for different purposes, calls to empty subroutines and a host of others.
I have attempted to break with this tradition, but I guess that that does make the performance sub-optimal. Oh well, too bad...
Jack Jansen, CWI, July 1995.