Tags: c, floating-point, embedded, real-time, integer-overflow

What is the correct way to determine `ticks` from clock `freq` while avoiding overflow in C on a microcontroller?


What is the correct way to multiply and divide integers in C to avoid overflow? I want to determine how many ticks of a timer running at `freq` (in Hz) it will take to produce a delay (in ms). This should be `ticks = freq * delay / 1000`.

But that line looks dangerous to me. If we write `(freq * delay) / 1000` we run the risk of overflow. If instead we write `freq * (delay / 1000)`, we will get into floats - which are unnecessary and error-prone, especially on a microcontroller.

What is the correct way to do this?


Code is running on a Cortex M4 ARM microcontroller. Ticks are from SysTick timer. All vars are uint32_t or volatile uint32_t.


Solution

  • Your suggested:

    freq * (delay / 1000)
    

    is not a floating-point operation. That would require:

    freq * (delay / 1000.0)
    

    But you are correct that it is not necessary. Far better would be:

    (freq / 1000) * delay
    

    So long as freq is a multiple of 1000, that will result in no loss of precision, and avoids at least a "premature" overflow (i.e. an overflow that occurs at lower values due to a poor choice of operation order).

    An overflow of course remains possible, but this expression gives the maximum possible range for delay without resorting to a larger type. For example, if the expression is of type uint32_t and freq is 1 MHz, delay can be up to 2^32/1000 ms, or nearly 72 minutes.

    Critically, the reordering extends the usable range of delay by a factor of 1000: the multiplication now overflows only when the final tick count itself cannot be represented in the type, rather than at a premature intermediate step.

    If 72 minutes is not long enough, you might even then consider a lower-resolution delay (whole seconds, for example) before resorting to a larger data type (which will be less efficient and may add atomicity issues). Long delays seldom require millisecond precision, and your clock may not be millisecond-precise over that length of time in any case - even TCXOs are typically only accurate to 2 ppm, giving a potential 1 ms drift after 500 seconds.