From the perspective of an RTOS task, what do offset and jitter mean? My understanding is that the offset is the maximum time by which a task can be delayed once it is in the READY state, and the jitter is the additional execution time the task takes beyond the expected value. Is this understanding correct? Thanks in advance.
Given a periodic task model in which tasks are released periodically, offset and jitter are often defined as follows: