I'm having a theoretical (non-deterministic, hard to test, never seen in practice) hardware issue, reported by the hardware vendor, where a double-word write to certain memory ranges may corrupt any future bus transfers.
While I don't have any explicit double-word writes in the C code, I'm worried the compiler is allowed (in current or future implementations) to coalesce multiple adjacent word assignments into a single double-word assignment.
The compiler is not allowed to reorder assignments to volatile objects, but it is unclear (to me) whether coalescing counts as reordering. My gut says it does, but I've been corrected by language lawyers before!
Example:
typedef struct
{
    volatile unsigned reg0;
    volatile unsigned reg1;
} Module;

volatile Module* module = (volatile Module*)0xFF000000u;

// two word stores, or one double-word store?
module->reg0 = 1;
module->reg1 = 2;
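For concreteness, the coalesced form I'm worried about would be morally equivalent to the following. This is purely a hypothetical illustration (it assumes reg0 and reg1 are adjacent with no padding, a little-endian layout, and a bus-visible 64-bit store), not something any compiler is documented to emit:

// Hypothetical: one 64-bit store covering both registers at once.
*(volatile unsigned long long*)&module->reg0 =
    ((unsigned long long)2 << 32) | 1u;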
(I'll ask my compiler vendor about this separately, but I'm curious what the canonical/community interpretation of the standard is.)
The behavior of volatile seems to be up to the implementation, partly because of a curious sentence which says: "What constitutes an access to an object that has volatile-qualified type is implementation-defined".
In ISO C99, section 5.1.2.3, there is also:
3 In the abstract machine, all expressions are evaluated as specified by the semantics. An actual implementation need not evaluate part of an expression if it can deduce that its value is not used and that no needed side effects are produced (including any caused by calling a function or accessing a volatile object).
So although the requirement is that a volatile object must be treated in accordance with the abstract semantics (i.e. not optimized away), curiously, the abstract semantics itself allows for the elimination of dead code and data flows, which are examples of optimizations!
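As a rough illustration of that allowance (my own sketch, not taken from the standard; the names scratch, status_reg and example are made up): a store whose value is never used may be dropped for an ordinary object, while volatile accesses are side effects that must still occur in the abstract machine's order:

static unsigned scratch;              /* ordinary object                        */
static volatile unsigned status_reg;  /* stand-in for a memory-mapped register  */

void example(void)
{
    scratch = 1;      /* dead store: a compiler may legally remove it      */
    scratch = 2;      /* only this value is observable                     */

    status_reg = 1;   /* volatile accesses are side effects: both stores   */
    status_reg = 2;   /* must be performed, and in this order              */
}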
I'm afraid that to know what volatile will and will not do, you have to go by your compiler's documentation.
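For instance, one belt-and-braces approach sometimes used with GCC and Clang (a compiler-specific extension, not anything the C standard guarantees) is to place a compiler barrier between the two stores so they cannot be merged or moved past each other; whether this is actually necessary or sufficient for your part is exactly the kind of thing to confirm with the vendor. A minimal sketch, reusing the Module layout from the question:

typedef struct
{
    volatile unsigned reg0;
    volatile unsigned reg1;
} Module;

#define MODULE ((volatile Module*)0xFF000000u)

void write_regs(void)
{
    MODULE->reg0 = 1;
    /* GCC/Clang compiler barrier: the "memory" clobber tells the compiler
       not to merge or move memory accesses across this point.             */
    __asm__ volatile ("" ::: "memory");
    MODULE->reg1 = 2;
}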