What's the difference between `160000000UL` and `(unsigned long) 160000000`?
PROBLEM:

I have a `#define` with the following:

```c
#define SYS_CLK (160000000UL / 1000UL)
```

`SYS_CLK` is different depending on the system clock. I want to know if I can change it to:

```c
#define SYS_CLK ((unsigned long)(RCC_MAX_FREQUENCY/*get freq from the system. It returns e.g. 160000000U*/) / 1000UL)
```

And `RCC_MAX_FREQUENCY` is defined as:

```c
#define RCC_MAX_FREQUENCY 168000000U
```

Are they the same? (I checked this, but it's not my question.)
What's the difference between `160000000UL` and `(unsigned long) 160000000`?

`160000000UL` is unsigned at pre-processing time; `(unsigned long) 160000000` is signed.
When constants are used in a macro at pre-processing time (e.g. `#if` math), the width (`int`, `long`, `long long`) is irrelevant, as the width used is `(u)intmax_t`; a cast is irrelevant too, since a cast cannot even appear in a pre-processor expression (keywords like `unsigned` are treated as ordinary identifiers there). The sign-ness is relevant.
In general, do not force a wider constant width with `L` or `LL` unless needed. Do use `U` when sign-ness is needed.
Casting to `(unsigned long)` or using `L` can force the constant to be 64-bit (on platforms where `long` is 64-bit), when a value around 160,000,000 only needs 32 bits. Using an excessively wide type incurs down-convert warnings. IMO, coders too often ignore/disable such warnings because they are deemed noisy, when in fact they are useful.
Below is OK:

```c
#define SYS_CLK ((unsigned long)(RCC_MAX_FREQUENCY) / 1000UL)
```

Yet I would recommend:

```c
#define SYS_CLK (RCC_MAX_FREQUENCY / 1000u)
```