I'm using the DMA (described as PDC in the datasheet) of a SAM4SD16C with the USART0 peripheral. I've set up a timer that generates an interrupt every ms. Every 5 ms a data transfer should be performed via DMA. Another interrupt should occur when the TXEMPTY flag is set.
To see when the transmission starts and ends, I toggle an output and watch it on an oscilloscope. I noticed that the end of reception varies in time by about 20 µs (my main clock is 120 MHz), which is not acceptable in my project. Meanwhile, the start of transmission is precise to 100 ns, so there is no problem on that side.
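For reference, here is a simplified sketch of the arrangement described above (not my exact code: USART/PIO initialization and baud-rate setup are omitted, pin PA0, the 16-byte buffer and SysTick as the 1 ms tick source are just placeholders; register and macro names are taken from the Atmel/Microchip CMSIS headers for SAM4S):

```c
#include "sam.h"   /* device header from the Atmel/Microchip CMSIS pack; name may differ per toolchain */

#define FRAME_LEN  16u                       /* placeholder frame length */
static uint8_t tx_buf[FRAME_LEN];
static volatile uint32_t ms_ticks;

/* 1 ms tick; SysTick stands in here for the timer used in the real project */
void SysTick_Handler(void)
{
    if (++ms_ticks >= 5u) {                  /* every 5 ms, start one PDC transfer */
        ms_ticks = 0u;
        PIOA->PIO_SODR = PIO_PA0;            /* scope marker: transmission start (placeholder pin) */
        USART0->US_TPR  = (uint32_t)tx_buf;  /* PDC transmit pointer */
        USART0->US_TCR  = FRAME_LEN;         /* PDC transmit counter */
        USART0->US_PTCR = US_PTCR_TXTEN;     /* enable the PDC transmitter channel */
        USART0->US_IER  = US_IER_TXEMPTY;    /* interrupt when the shifter is empty */
    }
}

void USART0_Handler(void)
{
    /* Guard on US_TCR == 0 so a TXEMPTY left over from the previous frame
     * (before the PDC has loaded the first byte) is not taken as end of frame. */
    if ((USART0->US_CSR & US_CSR_TXEMPTY) && (USART0->US_TCR == 0u)) {
        USART0->US_IDR = US_IDR_TXEMPTY;     /* one-shot: disable until the next frame */
        PIOA->PIO_CODR = PIO_PA0;            /* scope marker: transmission end */
    }
}
```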
I'm wondering if there is a way to get better control over the DMA transfer timing.
As discussed in the comments above, the imprecision of the end-of-reception instant is due to the baud rate. The jitter is on the order of one baud period, plus probably some additional bus idle time.
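For example, assuming (purely for illustration) a baud rate of 57.6 kBd, one bit period is 1/57600 ≈ 17.4 µs, which is the same order of magnitude as the ~20 µs of jitter observed; at higher baud rates the jitter would shrink proportionally.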