Tags: c, avr, adc, atmega16

255 char. instead of 1023, Unable to set reference voltage


I am using an ATmega16 in my project. I want to read 1023 when the input is 5 V and 0 when the input is 0 V. The ReadADC function reads a given channel of the microcontroller's ADC. My clock frequency is 4 MHz. However, the maximum value I get is 255 instead of 1023. Can anyone familiar with AVR programming help?

My code:

#include <util/delay.h>   // avr/delay.h is deprecated in avr-libc
#include <avr/io.h>
#include <avr/interrupt.h>

unsigned char ReadADC(unsigned char ch)
{
   ch = ch&0b00000111;
   ADMUX&=0xF8;
   ADMUX |= ch;
   ADCSRA |= (1<<ADSC);
   while(ADCSRA & (1<<ADSC));   //wait for conversion to complete
   return(ADC);
}

int main(void)
{
   // Seven-segment display port macros, presumably defined elsewhere
   SegDataDDR = 0xFF;
   SegCntrlDDR = 0xF3;
   SegCntrlPort = 0xF3;
   SegDataPort = 0x00;

ADMUX = (1<<REFS0) | (0<<REFS1);
ADCSRA = (1<<ADEN) | (1<<ADPS2) | (1<<ADPS0);   // ADEN set to turn ADC on
// clock frequency divided by 32 for operable 125KHz frequency

while(1)
{
   ADCSRA |= (1<<ADSC);                 // start conversion
   unsigned char value = ReadADC(0);    // read channel 0 of the ADC
}
}

Solution

  • In your code, the line

    unsigned char value = ReadADC(0);
    

    is restricting you to only 8 bits: the 10-bit result 1023 (0x3FF) is truncated to its low byte, which is 255 (0xFF). You need to change the type of value to something larger, such as an unsigned short (or uint16_t), which can hold 16 bits of data. If you make this change, you must also change the return type of ReadADC to match:

    unsigned short ReadADC(unsigned char ch)