I'm working with some Japanese C code that prints text in Shift-JIS. In the original code, the text is written directly in kana in string literals. In my editor that expects UTF-8, it shows up as nonsense.
In particular, this code likes to use "large" (fullwidth) versions of English letters (see http://www.rikai.com/library/kanjitables/kanji_codes.sjis.shtml), e.g. \x82\x60 = A (a large "A", \x41 in ASCII). I thought I'd write a CPP macro to convert those from ASCII, like:
#define LARGE_LETTER(x) "\x82\x" (x+31)
But obviously, this macro doesn't quite work, and I'm not sure how to make it work, if it's even possible. Can you build string escape sequences like this?
Well, to start off with, why not just have a lot of defines, one per letter:
#define LARGE_LETTER_A "\x82\x60"
#define LARGE_LETTER_B "\x82\x61"
…
#define LARGE_LETTER_Z "\x82\x79"
Usage:
char *str = "foo " LARGE_LETTER_A " baz";
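Putting that together, here's a minimal sketch (assuming whatever consumes the output actually expects Shift-JIS bytes; the defines beyond _A just follow the table linked above):

#include <stdio.h>

#define LARGE_LETTER_A "\x82\x60"
#define LARGE_LETTER_B "\x82\x61"

int main(void)
{
    /* Adjacent string literals are concatenated at compile time,
       so the Shift-JIS bytes land inline in a single string. */
    const char *str = "foo " LARGE_LETTER_A LARGE_LETTER_B " baz";
    printf("%s\n", str); /* only looks right on a Shift-JIS terminal */
    return 0;
}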
Next, you can take it up a notch with a primitive token-pasting macro:
#define PRIMITIVE_CAT(a, ...) a ## __VA_ARGS__
#define LARGE_LETTER(x) PRIMITIVE_CAT(LARGE_LETTER_, x)
Usage:
char *str = "foo " LARGE_LETTER(A) " baz";
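Putting both pieces together, a complete sketch (again assuming a Shift-JIS output target; the _B and _Z values are read off the table linked in the question):

#include <stdio.h>

#define LARGE_LETTER_A "\x82\x60"
#define LARGE_LETTER_B "\x82\x61"
/* ... one define per letter ... */
#define LARGE_LETTER_Z "\x82\x79"

#define PRIMITIVE_CAT(a, ...) a ## __VA_ARGS__
#define LARGE_LETTER(x) PRIMITIVE_CAT(LARGE_LETTER_, x)

int main(void)
{
    /* LARGE_LETTER(A) pastes to the token LARGE_LETTER_A, which the
       preprocessor then expands to "\x82\x60" on rescanning. */
    const char *str = "foo " LARGE_LETTER(A) " " LARGE_LETTER(Z) " baz";
    printf("%s\n", str);
    return 0;
}

Note that the argument has to be a bare letter token (A), not a character constant like 'A' or a runtime value: the preprocessor only pastes tokens, so there is no way to do the +31 arithmetic inside a string escape sequence at preprocessing time. That's why the per-letter defines are needed.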