I have a Python script that sends a stream of data, and a Linux embedded computer receiving it (code written in C++). Most of the time it works; however, the data gets corrupted when I send specific patterns of bytes. I have been struggling with this for a while and don't know how to solve it.
Python script (sender):
import serial
import struct

# use 'ser' as the variable name so the 'serial' module is not shadowed
ser = serial.Serial("COM2", 115200, timeout=5)
all_bytes = [0x63,0x20,0x72,0x69,0x67,0x68,0x74,0x73,0x20,0x61,0x6e,0x64,0x20,0x72,0x65,0x73,0x74,0x72,0x69,0x63,0x74,0x69,0x6f,0x6e,0x73,0x20,0x69,0x6e,0x0a,0x68,0x6f,0x77,0xff,0x20,0xf0,0x8b]
fmt = "B" * len(all_bytes)
byte_array = struct.pack(fmt, *all_bytes)
ser.write(byte_array)
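(As an aside, packing a list of byte values with a per-byte "B" format is equivalent to calling bytes() on the list directly; a quick check with a shortened sample of the payload:)

```python
import struct

all_bytes = [0x63, 0x20, 0x72, 0xff, 0xf0, 0x8b]   # shortened sample of the payload

# per-byte struct.pack, as in the script above
packed = struct.pack("B" * len(all_bytes), *all_bytes)

# bytes() produces the identical byte string
assert packed == bytes(all_bytes)
print(packed.hex())   # -> 632072fff08b
```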
C++ code (receiver):
typedef std::vector<uint8_t> ustring; // ustring = vector containing a bunch of uint8_t elements
// configure the port
int UART::configure_port()
{
    struct termios port_settings;              // structure to store the port settings in

    cfsetispeed(&port_settings, B115200);      // set baud rates
    cfsetospeed(&port_settings, B115200);

    port_settings.c_cflag &= ~PARENB;          // no parity
    port_settings.c_cflag &= ~CSTOPB;          // 1 stop bit
    port_settings.c_cflag &= ~CSIZE;           // clear the data-bits mask
    port_settings.c_cflag |= CS8;              // 8 data bits
    port_settings.c_cflag |= CREAD | CLOCAL;   // turn on READ & ignore ctrl lines

    port_settings.c_cc[VTIME] = 10;            // read timeout: 10 deciseconds = 1 second
    //port_settings.c_cc[VMIN] = 0;

    port_settings.c_iflag &= ~(IXON | IXOFF | IXANY);           // turn off s/w flow ctrl
    port_settings.c_lflag &= ~(ICANON | ECHO | ECHOE | ISIG);   // make raw
    port_settings.c_oflag &= ~OPOST;           // make raw

    tcsetattr(fd, TCSANOW, &port_settings);    // apply the settings to the port
    return fd;
}
int UART::uart_read(ustring *data, int buffer_size)
{
    // Buffer
    uint8_t *buf = new uint8_t[buffer_size];

    // Flush contents of the serial port
    //tcflush(fd, TCIOFLUSH);
    //usleep(1000);

    ustring data_received;

    // Read until the requested number of bytes has arrived
    int n_bytes = 0;
    while (n_bytes < buffer_size)
    {
        int n = read(fd, buf, buffer_size - n_bytes);   // never ask for more than is still missing
        if (n < 0)
        {
            break;   // read error
        }
        // Some bytes were read!
        if (n > 0)
        {
            n_bytes += n;
            // Append the new data
            for (int i = 0; i < n; i++)
            {
                data_received.push_back(buf[i]);
            }
        }
    }

    // String received
    *data = data_received;
    cout << "Data received..." << endl;
    print_ustring(data_received);

    delete[] buf;
    return n_bytes;   // number of bytes actually read
}
int main()
{
    UART uart_connection;

    vector<uint8_t> data;
    vector<uint8_t> *data_ptr = &data;

    int status = uart_connection.uart_read(data_ptr, 36);

    return 0;
}
This is what's happening:
If I send the following bytes (from python):
0x632072696768747320616e64207265737472696374696f6e7320696e0a686f77ff20f08b
This is what I am receiving (in C++ program):
0x632072696768747320616e64207265737472696374696f6e7320696e0a686f77ffff20f0
As you can see, a few bytes at the end (the CRC) are changed; the rest seems fine. It doesn't always happen, though: only when I send certain specific patterns of bytes.
Let's say I send the following for instance (some other pattern):
0x6868686868686868686868686868686868686868686868686868686868686868b18cf5b2
With that pattern I receive exactly what I sent!
Do you think it might be PySerial converting my unsigned bytes to ASCII? I have no clue what's going on, and I have been struggling with this for days!
EDIT
For anyone interested, apparently the problem was that struct termios needs to be initialized right after declaring it.
Here is the code that solved it:
// configure the port
int UART::configure_port()
{
    struct termios port_settings;   // structure to store the port settings in
    tcgetattr(fd, &port_settings);

    // Open ttyS4
    fd = open("/dev/ttyS4", O_RDWR | O_NOCTTY);
    if (fd == -1)   // if open is unsuccessful
    {
        //perror("open_port: Unable to open /dev/ttyS0 - ");
        printf("open_port: Unable to open /dev/ttyS4. \n");
    }
    else
    {
        fcntl(fd, F_SETFL, 0);
        /* get the current options */
        printf("port is open.\n");

        cfsetispeed(&port_settings, B9600);        // set baud rates
        cfsetospeed(&port_settings, B9600);

        port_settings.c_cflag &= ~PARENB;          // no parity
        port_settings.c_cflag &= ~CSTOPB;          // stop bits = 1
        port_settings.c_cflag &= ~CSIZE;           // clear the data-bits mask
        port_settings.c_cflag |= CS8;              // data bits = 8
        port_settings.c_cflag &= ~CRTSCTS;         // turn off hardware flow control
        port_settings.c_cflag |= CREAD | CLOCAL;   // turn on READ & ignore ctrl lines

        port_settings.c_cc[VMIN] = 0;              // read returns with whatever data is available
        // port_settings.c_cc[VTIME] = 10;         // read timeout in deciseconds

        port_settings.c_iflag &= ~(IXON | IXOFF | IXANY);           // turn off s/w flow ctrl
        port_settings.c_lflag &= ~(ICANON | ECHO | ECHOE | ISIG);   // make raw -- non-canonical mode
        // port_settings.c_iflag |= IGNPAR;        // input parity options
        // port_settings.c_oflag &= ~OPOST;        // make raw

        tcsetattr(fd, TCSANOW, &port_settings);    // apply the settings to the port
    }
    return fd;
}
Your program fails to properly perform termios initialization.
1. The struct termios port_settings needs to be initialized by calling tcgetattr() before any modifications.
2. Since you've configured non-canonical mode, VMIN needs to be defined. For whatever reason you have that statement commented out.
As you can see there are a few bytes at the end (the CRC) that are changed,...
Looks like an input byte with value 0xFF was duplicated.
This will occur when PARMRK and INPCK are set and IGNPAR is not set.
Per the termios man page:
PARMRK If this bit is set, input bytes with parity or framing errors are marked when passed to the program.
...
Therefore, a valid byte \377 is passed to the program as two bytes, \377 \377, in this case.
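Applying that to your posted data: the received stream is, byte for byte, the sent stream with its single 0xFF doubled and then cut off at the 36 bytes the reader requested. A quick check (hex strings copied from the question):

```python
sent = bytes.fromhex(
    "632072696768747320616e64207265737472696374696f6e7320696e0a686f77ff20f08b")
recv = bytes.fromhex(
    "632072696768747320616e64207265737472696374696f6e7320696e0a686f77ffff20f0")

i = sent.index(0xff)                              # position of the lone 0xFF byte
doubled = sent[:i + 1] + b"\xff" + sent[i + 1:]   # PARMRK-style 0xFF -> 0xFF 0xFF

# doubling the 0xFF and truncating to the 36 bytes requested
# reproduces the corrupted data exactly
assert recv == doubled[:36]
```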
Since your termios structure is never properly initialized, those flags may well end up set for your program.
ADDENDUM
So I called "tcgetattr(fd, &port_settings);" RIGHT AFTER declaring it, and RIGHT BEFORE opening the port.
No, despite any positive results that is illogical code.
The file descriptor is not valid, and if the return code were checked (as all system calls should be), you would realize that tcgetattr() failed.
The cfmakeraw() function is the guide that indicates the salient values for a proper termios configuration in non-canonical mode:
termios_p->c_iflag &= ~(IGNBRK | BRKINT | PARMRK | ISTRIP | INLCR | IGNCR | ICRNL | IXON);
termios_p->c_oflag &= ~OPOST;
termios_p->c_lflag &= ~(ECHO | ECHONL | ICANON | ISIG | IEXTEN);
termios_p->c_cflag &= ~(CSIZE | PARENB);
termios_p->c_cflag |= CS8;
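For comparison, Python's tty.setraw() applies a cfmakeraw-style configuration. A minimal sketch using a pseudo-terminal pair (so no serial hardware is needed) shows the effect on the flags, including that PARMRK, the flag behind the 0xFF doubling, is left clear:

```python
import os
import termios
import tty

master, slave = os.openpty()   # pseudo-terminal stands in for a serial port
tty.setraw(slave)              # cfmakeraw-style raw configuration

iflag, oflag, cflag, lflag, *_ = termios.tcgetattr(slave)

assert not (lflag & termios.ICANON)              # non-canonical mode
assert not (lflag & termios.ECHO)                # no echo
assert (cflag & termios.CSIZE) == termios.CS8    # 8 data bits
assert not (iflag & termios.PARMRK)              # no parity marking of input bytes

os.close(master)
os.close(slave)
```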
So your program should look like:
...
if (tcgetattr(fd, &port_settings) < 0) {
    printf("Error from tcgetattr: %s\n", strerror(errno));
    return -1;
}

cfsetospeed(&port_settings, B9600);
cfsetispeed(&port_settings, B9600);

port_settings.c_cflag |= (CLOCAL | CREAD);   /* ignore modem controls */
port_settings.c_cflag &= ~CSIZE;
port_settings.c_cflag |= CS8;                /* 8-bit characters */
port_settings.c_cflag &= ~PARENB;            /* no parity bit */
port_settings.c_cflag &= ~CSTOPB;            /* only need 1 stop bit */
port_settings.c_cflag &= ~CRTSCTS;           /* no hardware flowcontrol */

/* setup for non-canonical mode */
port_settings.c_iflag &= ~(IGNBRK | BRKINT | PARMRK | ISTRIP | INLCR | IGNCR | ICRNL | IXON);
port_settings.c_lflag &= ~(ECHO | ECHONL | ICANON | ISIG | IEXTEN);
port_settings.c_oflag &= ~OPOST;

/* fetch bytes as they become available */
port_settings.c_cc[VMIN] = 1;
port_settings.c_cc[VTIME] = 1;

if (tcsetattr(fd, TCSANOW, &port_settings) != 0) {
    printf("Error from tcsetattr: %s\n", strerror(errno));
    return -1;
}