I'm trying to use memset to set all values in an enum array to a single value, but I'm not seeing the correct results. The first memset works, the second does not. My code:
// definitions
#define NUM_THREADS 1
enum ThreadState
{
    INITIALIZING,
    READY_TO_CALCULATE,
    CALCULATED,
    READY_TO_UPDATE,
    UPDATED
};
// Later on, in the code...
ThreadState Thread_States[NUM_THREADS];
// Somehow this works - after this statement, every entry in Thread_States is INITIALIZING
memset(Thread_States, INITIALIZING, NUM_THREADS * sizeof(ThreadState));
// ... later on (or even immediately right after) ...
// Failure - after this statement, every entry in Thread_States is 16843009
memset(Thread_States, READY_TO_CALCULATE, NUM_THREADS * sizeof(ThreadState));
As explained in the comments, the first time I call memset, the values are set to what I expect (INITIALIZING, i.e., 0). When I run the second statement, I don't see the values set to READY_TO_CALCULATE (i.e., 1). Rather, they're set to 16843009 when I check in the debugger.
Is there a reason this relatively simple use of memset is inconsistent in its behavior?
Thank you.
The memset function sets each individual byte of the memory region to its second argument (after that argument is converted to unsigned char). Since an enumeration is (normally) the size of an int, every byte of each element gets the fill value, and you get the wrong result. The only value for which this happens to work is zero, such as INITIALIZING, because setting all bytes to zero produces the int value zero.
If you use e.g. READY_TO_CALCULATE, you will set each byte to 1, which produces int values of 0x01010101 (decimal 16843009) instead of 0x00000001.