
Mega2560 Timer and uSec


I know there are a bunch of questions on here about timers and how to configure and use them; I have looked through all I could find, but I can't figure out what I am doing wrong.

I need a class that provides essentially the same functionality as the Arduino micros() function, but I want to stay with straight AVR. Here is what I have so far. I am using Timer4 so I don't step on any toes; it is a 16-bit timer, and I am using a prescale of 8, which on a Mega2560 should give me 0.5 µs per timer tick. Wouldn't that mean TCNT4 = 2 equates to 1 µs?

To verify that my timing functions are correct, I created a simple program that contains only the timer and a couple of delays from <util/delay.h>. The resulting output is not what I expected, so here is my issue: I am not sure whether _delay_us is actually delaying for the right amount of time, or whether my timer/math is off.

I realize there are no checks for overflows or anything; I am focusing on simply getting the timer to output the correct values first.

SystemTime:


#include <avr/io.h>        // Timer4 registers: TCCR4B, TIMSK4, TCNT4
#include <avr/interrupt.h> // ISR() macro

class SystemTime{
    unsigned long ovfCount = 1;

    public:
        SystemTime();
        void Overflow();
        uint32_t Micro();
        void Reset();
};

/**
 * Constructor
 */
SystemTime::SystemTime() {

    TCCR4B |= (1 << CS41);  //Set Prescale to 8
    TIMSK4 |= (1 << TOIE4); //Enable the Overflow Interrupt
}

/**
 * Increase the Overflow count
 */
void SystemTime::Overflow(){

    this->ovfCount++;
}

/**
 * Returns the number of Microseconds since start
 */
uint32_t SystemTime::Micro(){
    uint32_t t;

    t = (TCNT4 * 2) * this->ovfCount;

    return t;
}

/**
 * Resets the SystemTimer
 */
void SystemTime::Reset(){
    this->ovfCount = 0;
    TCNT4 = 0;

}

SystemTime sysTime;

ISR(TIMER4_OVF_vect){
    sysTime.Overflow();
}

Main:


#include <inttypes.h>
#include <stdio.h>         // sprintf
#include "USARTSerial.h"
#include "SystemTime.h"
#include <util/delay.h>

#define debugSize 50

void setup(){

    char debug1[debugSize];
    char debug2[debugSize];
    char debug3[debugSize];
    char debug4[debugSize];


    uSerial.Baudrate(57600);
    uSerial.Write("Ready ...");

    uint32_t test;

    sysTime.Reset();

    test = sysTime.Micro();
    sprintf(debug1, "Time 1: %lu", test);
    _delay_us(200);

    test = sysTime.Micro();
    sprintf(debug2, "Time 2: %lu", test);
    _delay_us(200);

    test = sysTime.Micro();
    sprintf(debug3, "Time 3: %lu", test);
    _delay_us(200);

    test = sysTime.Micro();
    sprintf(debug4, "Time 4: %lu", test);

    uSerial.Write(debug1);
    uSerial.Write(debug2);
    uSerial.Write(debug3);
    uSerial.Write(debug4);

}

void loop(){

}

Output:


Ready ...
Time 1: 0
Time 2: 144
Time 3: 306
Time 4: 464

Update:

Thanks for helping me out. I wanted to post the working code in case someone else is having problems or needs to know how this can be done. One thing to keep in mind is the time the Micro() calculation itself takes: at least on my Mega2560, it takes around 36 µs, so either the timer prescale needs to be adjusted or the math needs to be rewritten to eliminate the double multiplications. Nonetheless, this class works as-is, but it is by no means optimized.

#define F_CPU 16000000L

#include <stdio.h>
#include <avr/interrupt.h>

class SystemTime {
    private:
        unsigned long ovfCount = 0;

    public:
        SystemTime();
        void Overflow();
        uint32_t Micro();
        void Reset();

};

/**
 * Constructor, initializes the System Timer for keeping track
 * of the time since start.
 */
SystemTime::SystemTime() {
    TCCR4B |= (1 << CS41);  //Set Prescale to 8
    TIMSK4 |= (1 << TOIE4); //Enable the Overflow Interrupt

    //Enable Interrupts
    sei();
}

/**
 * Increase the Overflow count
 */
void SystemTime::Overflow() {
    this->ovfCount++;
}

/**
 * Resets the SystemTimer
 */
void SystemTime::Reset() {
    this->ovfCount = 0;
    TCNT4 = 0;
}

/**
 * Returns the number of microseconds since start
 */
uint32_t SystemTime::Micro() {
    uint32_t t;

    // Each overflow period spans 65536 ticks; each tick is 0.5 us.
    t = (TCNT4 * 0.5) + ((this->ovfCount * 65536UL) * 0.5);

    return t;
}

SystemTime sysTime;

ISR(TIMER4_OVF_vect) {
    sysTime.Overflow();
}

Solution

  • Assuming that your MCU really runs at 16 MHz, I would change the following things in your code.

    • If one timer increment is 0.5 μs, then you should divide TCNT4's value by 2, not multiply, because the elapsed time is TCNT4 times 0.5 μs.
    • Also the this->ovfCount usage is wrong. The elapsed microseconds since startup equal: TCNT4 * 0.5 + this->ovfCount * 65536 * 0.5. That is, the current count (TCNT4) multiplied by 0.5 μs, plus the overflow count (this->ovfCount) multiplied by the number of ticks in one overflow period (2^16 = 65536) multiplied by 0.5 μs.

      uint32_t SystemTime::Micro(){
          uint32_t t;
      
          t = (TCNT4 * 0.5) + this->ovfCount * 65536UL * 0.5;
      
          return t;
      }
      
    • Finally, I cannot see you enabling global interrupts anywhere with sei(); without it, the overflow ISR never runs and ovfCount never advances.
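To illustrate that last point, here is a hedged sketch (assuming avr-gcc; micros4 is my name, not from the code above) of enabling global interrupts and reading the two values atomically. Reading TCNT4 and ovfCount back-to-back with interrupts enabled can tear: if the overflow fires between the two reads, the result jumps by about 32 ms.

```cpp
#include <avr/io.h>
#include <avr/interrupt.h> // sei(), ISR()
#include <util/atomic.h>   // ATOMIC_BLOCK

volatile uint32_t ovfCount = 0; // written by the ISR, so volatile

ISR(TIMER4_OVF_vect) {
    ovfCount++;
}

uint32_t micros4(void) {
    uint16_t tcnt;
    uint32_t ovf;
    // Snapshot both values with interrupts briefly disabled so an
    // overflow cannot slip in between the two reads.
    ATOMIC_BLOCK(ATOMIC_RESTORESTATE) {
        tcnt = TCNT4;
        ovf  = ovfCount;
    }
    return (uint32_t)(tcnt >> 1) + (ovf << 15); // 0.5 us/tick, 32768 us/overflow
}

int main(void) {
    TCCR4B |= (1 << CS41);  // prescale 8
    TIMSK4 |= (1 << TOIE4); // enable the overflow interrupt
    sei();                  // global interrupt enable -- the missing piece
    for (;;) { /* ... */ }
}
```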