I am going through a format spec and encountered the following:
if (flags & 1) { ... }
Now, according to the same documentation flags is:
So flags is 3 bytes. The operation flags & 1 is, I suppose, doing a bitwise AND between flags and... what? 1 expressed as a 24-bit value? Could someone explain this a bit more? Thank you!
The spec should list the available flags and you can then just use the individual bits to toggle them on/off.
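For example, if a spec defined three single-bit flags inside a 3-byte field, testing, setting and clearing them could look like this (the flag names and values below are made up for illustration, not taken from your spec):

```python
# Hypothetical flag values -- a real spec would define these.
FLAG_COMPRESSED = 0x000001  # bit 0
FLAG_ENCRYPTED  = 0x000002  # bit 1
FLAG_SIGNED     = 0x000004  # bit 2

flags = 0x000005  # 3-byte field with bits 0 and 2 set

# Test a bit: a non-zero result means the flag is set.
assert bool(flags & FLAG_COMPRESSED) is True
assert bool(flags & FLAG_ENCRYPTED) is False

# Set a bit with OR, clear a bit with AND of the inverted mask.
flags |= FLAG_ENCRYPTED    # now 0x000007
flags &= ~FLAG_COMPRESSED  # now 0x000006
```

Note that `flags & 1` simply tests bit 0; the field being 3 bytes wide only means there are 24 bits available for such flags.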
For instance, the Python ssl module has multiple constants that can be combined into an options bitmask. (Strictly speaking, the option flags are the ssl.OP_* constants and ssl.PROTOCOL_TLSv1 is a protocol-selection constant, but its value serves here to demonstrate testing bits.) Let's check whether the bits corresponding to its value are set:
>>> import ssl
>>> ctx = ssl.create_default_context()
>>> bin(ctx.options)
'0b10000010010100100000000001010100'
>>> int(ssl.PROTOCOL_TLSv1)
3
>>> bin(ssl.PROTOCOL_TLSv1)
'0b11'
>>> bool(ctx.options & ssl.PROTOCOL_TLSv1)
False
We can see that the lowest two bits (0b11, i.e. 3 in decimal) are not set, so the test is False. Let's set them:
>>> ctx.options |= ssl.PROTOCOL_TLSv1
>>> bin(ctx.options)
'0b10000010010100100000000001010111'
>>> bool(ctx.options & ssl.PROTOCOL_TLSv1)
True
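The same mechanics work with plain small integers, stripped of the ssl context:

```python
options = 0b0100  # bit 2 set
mask    = 0b0011  # bits 0 and 1

# AND tests the bits: zero means none of the masked bits are set.
assert options & mask == 0

# OR turns the masked bits on.
options |= mask
assert options == 0b0111

# Now the same AND test succeeds.
assert bool(options & mask) is True
```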
I guess this has a similar purpose in your case.