Other tasks want a definite time delay,
The accuracy of any delay period is limited to a granularity of one tick
period. That is because the next tick interrupt that occurs after the
delay is requested is taken as the first tick of the delay. So, if you
ask for a delay immediately before a tick interrupt occurs, the first
delay period will be a tiny fraction of one tick. Conversely, if you ask
for a delay immediately after a tick interrupt occurs, the first period
of the delay will be almost one complete tick period.
but the time slippage issue affects the accuracy of the delay time. How
can this problem be solved?
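For context on the quoted passage: a task that needs an accurate long-term
period is normally written with FreeRTOS's vTaskDelayUntil() rather than
vTaskDelay(), because the wake time is measured from an absolute reference
that advances by a fixed amount each cycle, so the sub-tick variation
described above does not accumulate. A minimal sketch, assuming a
hypothetical periodic task (the task name, 10 ms period and comments are
illustrative, not code from the book):

#include "FreeRTOS.h"
#include "task.h"

/* Hypothetical periodic task: wakes every 10 ms regardless of exactly when
   within a tick the previous cycle's work finished (provided the work takes
   less than one period). */
static void vPeriodicTask( void *pvParameters )
{
    const TickType_t xPeriod = pdMS_TO_TICKS( 10 );   /* assumed period */
    TickType_t xLastWakeTime = xTaskGetTickCount();   /* absolute reference */

    ( void ) pvParameters;

    for( ;; )
    {
        /* Block until exactly xPeriod ticks after the previous wake time.
           Because the reference is advanced by a fixed amount each cycle,
           any sub-tick variation in when the call is made does not
           accumulate into long-term drift. */
        vTaskDelayUntil( &xLastWakeTime, xPeriod );

        /* Do the periodic work here. */
    }
}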
As per my previous post – unless you are writing interrupt service
routines that take more than one entire tick period to execute (so that
two ticks could potentially occur during the execution of a single
interrupt service routine), there is no time slippage issue. If your
interrupts genuinely do take that long then I would suggest drastically
re-architecting your system, as the goal should be for interrupts to be
as short as feasibly possible – the deferred interrupt processing
description in the book text shows you how to do that.
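A minimal sketch of the deferred interrupt processing pattern referred to
above, assuming a hypothetical peripheral interrupt and handler task (the
names, priorities and set-up are illustrative, not code from the book): the
ISR only clears the interrupt and gives a binary semaphore, and the lengthy
work runs at task level.

#include "FreeRTOS.h"
#include "task.h"
#include "semphr.h"

static SemaphoreHandle_t xEventSemaphore = NULL;

/* Called once at start-up, before the peripheral interrupt is enabled. */
void vSetupDeferredHandling( void )
{
    xEventSemaphore = xSemaphoreCreateBinary();
    /* vHandlerTask would also be created here with xTaskCreate(). */
}

/* Hypothetical peripheral interrupt handler - it does the minimum possible:
   clear the interrupt, then unblock the task that does the real work, so the
   ISR itself executes in far less than one tick period. */
void vAssumedPeripheralISR( void )
{
    BaseType_t xHigherPriorityTaskWoken = pdFALSE;

    /* Clear the interrupt source in the peripheral here (hardware specific). */

    /* Unblock the handler task. */
    xSemaphoreGiveFromISR( xEventSemaphore, &xHigherPriorityTaskWoken );

    /* Request a context switch on exit if the handler task now has a higher
       priority than the task that was interrupted. */
    portYIELD_FROM_ISR( xHigherPriorityTaskWoken );
}

/* The lengthy processing runs here, at task priority, not in the ISR. */
static void vHandlerTask( void *pvParameters )
{
    ( void ) pvParameters;

    for( ;; )
    {
        if( xSemaphoreTake( xEventSemaphore, portMAX_DELAY ) == pdTRUE )
        {
            /* Do the work that would otherwise have made the ISR long. */
        }
    }
}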
Regarding the SysTick priority: it is not explained in the FreeRTOS
tutorial, Cortex-M3 edition. Where can I get more information about the
SysTick interrupt?
I’m not sure exactly what the book says about this; hopefully it is
covered in the code comments.