If I write a #define that performs an operation on other preprocessor constants, is the final value computed at runtime each time the macro is used? Does that depend on compiler optimizations, or is it covered by the standard?
Example:
#define EXTERNAL_CLOCK_FREQUENCY 32768
#define TIMER_1_S EXTERNAL_CLOCK_FREQUENCY
#define TIMER_100_MS TIMER_1_S / 10
Will the operation 32768 / 10 occur at runtime every time I use the TIMER_100_MS macro?
I would like to avoid the following:
#define EXTERNAL_CLOCK_FREQUENCY 32768
#define TIMER_1_S EXTERNAL_CLOCK_FREQUENCY
#define TIMER_100_MS 3276
A compiler must be able to evaluate integer constant expressions at compile time, because they are needed in contexts such as array sizes and static initializers. In ordinary expressions, however, the standard does not require the value to be computed at compile time; it merely permits it. In practice, only a brain-dead compiler would fail to fold such a constant, and if you are working with an unusual compiler, a quick check of the assembly output will settle each case.
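For example, a file-scope array size or static initializer requires an integer constant expression, so the division there has to happen at translation time. A minimal sketch using the macros from the question (the variable names are just for illustration):

#define EXTERNAL_CLOCK_FREQUENCY 32768
#define TIMER_1_S EXTERNAL_CLOCK_FREQUENCY
#define TIMER_100_MS TIMER_1_S / 10

/* Both declarations below need integer constant expressions, so the
   compiler is obliged to compute 32768 / 10 while compiling. */
static const unsigned int timer_reload = TIMER_100_MS; /* = 3276 */
char scratch[TIMER_100_MS];                            /* array of 3276 bytes */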
Macros are simply textual substitution, so in your example writing TIMER_100_MS in a program is a fancy way of writing 32768 / 10.
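You can see the substitution for yourself by looking at the preprocessor output alone (for example with gcc -E or cc -E). A small sketch, assuming the definitions from the question and a hypothetical variable named ticks:

/* source as written */
unsigned int ticks = TIMER_100_MS;

/* what the compiler actually sees after preprocessing */
unsigned int ticks = 32768 / 10;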
Therefore, the question is when the compiler evaluates 32768 / 10, which is an integer constant expression. I don't think the standard mandates any particular behavior here (run-time and compile-time evaluation are indistinguishable in effect), but any halfway decent compiler will fold it at compile time.
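If you want a guarantee from the language rather than from the optimizer, you can bind the value to a name that the standard requires to be a constant, such as an enumerator, and verify it with a static assertion. A sketch, assuming C11 for _Static_assert and the macro definitions shown above:

enum { TIMER_100_MS_TICKS = TIMER_100_MS };  /* enumerators must be integer constant expressions */
_Static_assert(TIMER_100_MS_TICKS == 3276, "unexpected 100 ms tick count");

Using TIMER_100_MS_TICKS in your code then involves no division at all, folded or otherwise.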