I am used to programming PCs and smartphones in high-level languages; microcontrollers are new territory for me. Are they somehow different, less trustworthy, requiring different techniques? Here is a bit of code that writes to and reads from EEPROM, running on an Arduino Mega (an Ethernet Shield is attached, but not used here):
#include <EEPROM.h>

int addr = 0;
int val;
byte value;

void setup()
{
  Serial.begin(9600);
}

void loop()
{
  val = 9;
  EEPROM.write(addr, val);
  delay(500);
  addr = addr + 1;
  if (addr == 20) addr = 0;
  value = EEPROM.read(addr);
  Serial.print(addr);
  Serial.print("\t");
  Serial.print(value);
  Serial.println();
}
Here's what comes out:
1 91
2 91
3 9
4 9
5 9
6 9
7 9
8 9
9 9
10 9
11 9
12 202
13 202
14 202
15 202
16 202
17 202
18 202
19 202
0 9
1 89
2 91
3 9
4 9
5 9
6 9
7 9
8 9
9 9
10 9
11 9
12 9
13 9
14 9
15 9
16 9
..... In general, addresses 1 and 2 are always flaky, and it takes two writes to change memory locations above ~10.
I can swap in a different board and still get similar oddities.
How can I adapt my programming to this seemingly flaky performance?
Simply enough, your code is wrong.
Logically step through it: you write to the EEPROM at address addr, wait 500 ms, increment addr, and then read from the new addr. The addr you read from is therefore not the addr you wrote to.