
Self-written micros() for ATtiny13A runs 10x too slow


I am trying to make an analog of the Arduino function micros(). To do this, I have programmed a timer as shown in the code below.


#include <avr/io.h>
#include <avr/interrupt.h>


volatile uint32_t micros = 0; // microseconds since start

volatile uint32_t t = 0;

ISR(TIM0_OVF_vect){
    micros += 60; // ((1/9600000)*8)*(256-184) = 0.00006 seconds = 60 microseconds
    TCNT0 = 184;
}

inline void timer_ini(){
    SREG |= (1<<7);       // global interrupt enable
    TCCR0B |= (1<<CS01);  // prescaler clk/8
    TCNT0 = 184;
    TIMSK0 |= (1<<TOIE0); // enable overflow interrupt
}


int main(void)
{
    timer_ini();
    DDRB = 0b00011000;
    PORTB = 0b00011000;
    while (1)
    {
        if (t < micros) {
            t += 1000000; // 1 second delay
            PORTB ^= 0b00011000;
        }
    }
}

In short, I have a variable micros that holds the number of microseconds since the microcontroller was powered on. Its value is incremented every 60 microseconds by the Timer/Counter overflow interrupt. The clock is the internal 9.6 MHz oscillator (the fuse bits are checked), and the timer prescaler is set to 8. The 60 microseconds per overflow follow from the formula: ((1/9600000) * 8) * (256 - 184) = 0.00006 seconds = 60 microseconds.
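For reference, the same arithmetic can be written out as compile-time constants (just a sketch; F_CPU, PRESCALER and TCNT_START are names I am assuming here to mirror the setup above):

#define F_CPU      9600000UL  // assumed: internal RC oscillator at full speed
#define PRESCALER  8UL        // CS01 set -> clk/8
#define TCNT_START 184UL      // reload value written to TCNT0

// microseconds per overflow = ticks to overflow * prescaler / F_CPU
//                           = (256 - 184) * 8 / 9600000 s = 60 us
#define TICKS_TO_OVF (256UL - TCNT_START)
#define US_PER_OVF   ((TICKS_TO_OVF * PRESCALER * 1000000UL) / F_CPU) // = 60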

But when I program the LED to toggle every 1 second (t += 1000000), in reality it blinks every 10 seconds. And when I use a value of 100 milliseconds (t += 100000), it blinks every second.

I checked the calculations by hand and in Excel, but it still doesn't work. I have written a similar program in the Arduino IDE and it works the way I want it to.


Solution

  • By default, the CKDIV8 fuse was programmed. I set it to unprogrammed and everything worked as it should. With CKDIV8 programmed, the 9.6 MHz internal oscillator is divided by 8, so the chip actually runs at 1.2 MHz and all software timing, including the 60 µs overflow period, comes out roughly 8x (the observed ~10x) longer than calculated. A software alternative is sketched below.
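If reflashing the fuses is inconvenient, the same effect can be obtained in software: on the ATtiny13A the CKDIV8 fuse only sets the initial value of the system clock prescaler, and the CLKPR register can override it at runtime. A minimal sketch (the CLKPS bits must be written within four cycles of setting CLKPCE, hence the interrupt guard):

#include <avr/io.h>
#include <avr/interrupt.h>

static inline void clock_prescaler_off(void)
{
    uint8_t sreg = SREG;
    cli();                 // the timed write sequence must not be interrupted
    CLKPR = (1 << CLKPCE); // enable a prescaler change
    CLKPR = 0;             // division factor 1: run at the full 9.6 MHz
    SREG = sreg;           // restore the previous interrupt state
}

Calling this at the top of main(), before timer_ini(), gives the same 9.6 MHz system clock as unprogramming CKDIV8; avr-libc's <avr/power.h> also offers clock_prescale_set(clock_div_1) for the same purpose.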