I have successfully programmed a T89C51AC2 microcontroller to use its built-in ADC in 8-bit mode. However, I would now like to operate the ADC in 10-bit mode. Below is the working code for 8-bit operation:
<code>
#include <T89C51AC2.h>                      //T89C51AC2 Library

unsigned char ADC(unsigned char channel, unsigned char mask); //Function prototype

unsigned char PAN_DAT[5];                   //Result buffer of 5 characters
unsigned int value_converted;

void main(void)
{
    unsigned int value;
    ADCF = 0x07;                            //Set P1.0, P1.1, P1.2 as ADC inputs
    ADCON = 0x20;                           //Enable ADC function (ADEN=1)
    ADCLK = 0x00;                           //ADC clock prescaler = 0
    value = ADC(0xF9, 0x01);                //Convert channel 1 of P1 (P1.1) using mask
    PAN_DAT[0] = (value/100)|0x30;          //Extract 'units' value and convert to ASCII
    PAN_DAT[1] = '.';                       //Separate 'units' from 'tenths' with '.'
    PAN_DAT[2] = ((value%100)/10)|0x30;     //Extract 'tenths' value and convert to ASCII
    PAN_DAT[3] = (value%10)|0x30;           //Extract 'hundredths' value and convert to ASCII
}

unsigned char ADC(unsigned char channel, unsigned char mask)
{
    unsigned int value;                     //Holds the conversion result
    ADCON = ADCON & channel;                //Clear previous channel selection
    ADCON = ADCON | mask;                   //Select the required channel
    ADCON = ADCON | 0x08;                   //Start conversion in 8-bit mode (ADSST=1)
    while ((ADCON & 0x10) != 0x10);         //Wait until ADEOC (bit 4) = 1
    value = ADDH;                           //Save result in 'value'
    ADCON = ADCON & 0xEF;                   //Clear ADEOC flag
    return (value);                         //Return 'value' to main
}
</code>
I know that in order to use 10-bit mode an interrupt must be used. This is how far I have got:
<code>
#include <T89C51AC2.h>                      //T89C51AC2 Library

unsigned int ADC(unsigned char channel, unsigned char mask); //Function prototype

unsigned char PAN_DAT[5];                   //Result buffer of 5 characters
volatile unsigned int value_converted;      //10-bit result written by the interrupt routine
volatile bit EOC = 0;                       //End-of-Conversion flag set by the interrupt routine

void main(void)
{
    unsigned int value;
    ADCF = 0x07;                            //Set P1.0, P1.1, P1.2 as ADC inputs
    ADCON = 0x20;                           //Enable ADC function (ADEN=1)
    ADCLK = 0x00;                           //ADC clock prescaler = 0
    EA = 1;                                 //Enable interrupts globally
    EADC = 1;                               //Enable the ADC interrupt
    value = ADC(0xF9, 0x01);                //Convert channel 1 of P1 (P1.1) using mask
    PAN_DAT[0] = (value/100)|0x30;          //Extract 'units' value and convert to ASCII
    PAN_DAT[1] = '.';                       //Separate 'units' from 'tenths' with '.'
    PAN_DAT[2] = ((value%100)/10)|0x30;     //Extract 'tenths' value and convert to ASCII
    PAN_DAT[3] = (value%10)|0x30;           //Extract 'hundredths' value and convert to ASCII
}

unsigned int ADC(unsigned char channel, unsigned char mask)
{
    unsigned int value;                     //Holds the conversion result
    ADCON = ADCON & channel;                //Clear previous channel selection
    ADCON = ADCON | mask;                   //Select the required channel
    ADCON = ADCON | 0x48;                   //Start conversion in 10-bit mode (PSIDLE=1, ADSST=1)
    while (!EOC);                           //Wait for end of conversion
    EOC = 0;                                //Clear software flag
    value = value_converted;
    return (value);                         //Return 'value' to main
}

void IT_ADC(void)                           //ADC interrupt routine
{
    ADCON = ADCON & 0xEF;                   //Clear ADEOC (flag=0)
    value_converted = (unsigned int)ADDH << 2; //Save the 8 MSBs in 'value_converted'
    value_converted |= (ADDL & 0x03);       //Add the 2 LSBs from ADDL
    EOC = 1;                                //Set end-of-conversion flag
}
</code>
Can someone please identify where the problem is?
Regarding the ADC interrupt routine IT_ADC: I know it has to be invoked somehow for the program to reach it, but I cannot see where it should be called from. The datasheet says:
"An end-of-conversion interrupt will occur when the bit ADEOC is activated and the bit EADC is set. To re-arm the interrupt, the bit ADEOC must be cleared by software."