I am working with Apple's ScriptingBridge framework, and have generated a header file for iTunes that contains several enums like this:
typedef enum {
    iTunesESrcLibrary = 'kLib',
    iTunesESrcIPod = 'kPod',
    iTunesESrcAudioCD = 'kACD',
    iTunesESrcMP3CD = 'kMCD',
    iTunesESrcDevice = 'kDev',
    iTunesESrcRadioTuner = 'kTun',
    iTunesESrcSharedLibrary = 'kShd',
    iTunesESrcUnknown = 'kUnk'
} iTunesESrc;
My understanding was that enum values had to be integer-like, but this definition seems to violate that rule. Furthermore, it seems as though treating these enum values as integers (in an NSPredicate, for example) doesn't do the right thing.
I added the enum declaration above to a C file with an empty main function, and it compiled using i686-apple-darwin9-gcc-4.0.1. So, while these kinds of enums may not conform to the C standard (as Parappa points out below), they are at least being compiled to some type by gcc.

So, what is that type, and how can I use it, for instance, in a format string?
C99, TC3 reads:
6.4.4.4 §2:
An integer character constant is a sequence of one or more multibyte characters enclosed in single-quotes, as in 'x'. [...]
6.4.4.4 §10:
An integer character constant has type int. The value of an integer character constant containing a single character that maps to a single-byte execution character is the numerical value of the representation of the mapped character interpreted as an integer. The value of an integer character constant containing more than one character (e.g., 'ab'), or containing a character or escape sequence that does not map to a single-byte execution character, is implementation-defined. If an integer character constant contains a single character or escape sequence, its value is the one that results when an object with type char whose value is that of the single character or escape sequence is converted to type int.
In most implementations, it's safe to use integer character constants of up to 4 one-byte characters, though the resulting value is implementation-defined and may differ between systems (e.g. due to byte ordering).
This is actually already defined in the ANSI-C89 standard, section 3.1.3.4:
An integer character constant is a sequence of one or more multibyte characters enclosed in single-quotes, as in 'x' or 'ab'. [...]
An integer character constant has type int. The value of an integer character constant containing a single character that maps into a member of the basic execution character set is the numerical value of the representation of the mapped character interpreted as an integer. The value of an integer character constant containing more than one character, or containing a character or escape sequence not represented in the basic execution character set, is implementation-defined. In particular, in an implementation in which type char has the same range of values as signed char, the high-order bit position of a single-character integer character constant is treated as a sign bit.