The RTC benchmark measures the time between two successive calls to the
SIGIO handler.
In my example (RTC freq = 2048 Hz, i.e. 0.48ms periods) the max jitter
is exactly 2 * 0.48ms = 0.96ms.
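For reference, here is a minimal sketch of that kind of measurement
(assuming the standard /dev/rtc interface with RTC_IRQP_SET / RTC_PIE_ON
and SIGIO delivery via O_ASYNC; the variable names and the 10-second run
are just illustrative, this is not the actual benchmark code):

#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <signal.h>
#include <sys/ioctl.h>
#include <sys/time.h>
#include <linux/rtc.h>

#define RTC_FREQ 2048                   /* 2048 Hz -> ~0.48ms period */

static int rtc_fd;
static volatile long count;
static long long last_us, max_delta_us;

static long long now_us(void)
{
        struct timeval tv;
        gettimeofday(&tv, NULL);
        return (long long)tv.tv_sec * 1000000 + tv.tv_usec;
}

static void sigio_handler(int sig)
{
        unsigned long data;
        long long t = now_us();

        read(rtc_fd, &data, sizeof(data));      /* acknowledge the IRQ */
        if (last_us && t - last_us > max_delta_us)
                max_delta_us = t - last_us;
        last_us = t;
        count++;
}

int main(void)
{
        rtc_fd = open("/dev/rtc", O_RDONLY);
        if (rtc_fd < 0) { perror("open /dev/rtc"); return 1; }

        signal(SIGIO, sigio_handler);
        fcntl(rtc_fd, F_SETOWN, getpid());
        fcntl(rtc_fd, F_SETFL, fcntl(rtc_fd, F_GETFL) | O_ASYNC);

        /* rates above 64 Hz normally need root privileges */
        ioctl(rtc_fd, RTC_IRQP_SET, RTC_FREQ);  /* periodic IRQ frequency */
        ioctl(rtc_fd, RTC_PIE_ON, 0);           /* enable periodic IRQs */

        while (count < 10L * RTC_FREQ)          /* ~10 seconds of samples */
                pause();

        ioctl(rtc_fd, RTC_PIE_OFF, 0);
        printf("max delta between SIGIOs: %lld us (period ~%d us)\n",
               max_delta_us, 1000000 / RTC_FREQ);
        close(rtc_fd);
        return 0;
}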
The same thing happens with the audio card:
if I use 1.45ms audio fragments, the max delay between two write() calls
is 2.9ms (2 * 1.45ms).
When I reduce the fragment size to 0.7ms, the max registered peak is
1.4ms (2 * 0.7ms).
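A similarly minimal sketch for the audio side (assuming an OSS /dev/dsp
device; the fragment-size selector, sample rate and loop count are only
illustrative, and the real test program does more than this). It just
times the gaps between successive blocking write()s of one fragment:

#include <stdio.h>
#include <stdlib.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <sys/time.h>
#include <sys/soundcard.h>

int main(void)
{
        int fd, frag, fmt, ch, rate, frag_size, i;
        char *buf;
        long long last = 0, max_delta = 0;

        fd = open("/dev/dsp", O_WRONLY);
        if (fd < 0) { perror("open /dev/dsp"); return 1; }

        /* 0x7fff fragments of 2^7 = 128 bytes each: 128 bytes is 32 frames
         * of 16-bit stereo, roughly 0.7ms at 44.1 kHz.
         * Use 0x7fff0008 (256 bytes) for ~1.45ms fragments. */
        frag = 0x7fff0007;
        ioctl(fd, SNDCTL_DSP_SETFRAGMENT, &frag);

        fmt = AFMT_S16_LE; ch = 2; rate = 44100;
        ioctl(fd, SNDCTL_DSP_SETFMT, &fmt);
        ioctl(fd, SNDCTL_DSP_CHANNELS, &ch);
        ioctl(fd, SNDCTL_DSP_SPEED, &rate);

        ioctl(fd, SNDCTL_DSP_GETBLKSIZE, &frag_size);
        buf = calloc(1, frag_size);             /* one fragment of silence */

        for (i = 0; i < 10000; i++) {
                struct timeval tv;
                long long now;

                gettimeofday(&tv, NULL);
                now = (long long)tv.tv_sec * 1000000 + tv.tv_usec;
                if (last && now - last > max_delta)
                        max_delta = now - last;
                last = now;
                write(fd, buf, frag_size);      /* blocks until a fragment frees up */
        }
        printf("fragment %d bytes, max gap between writes: %lld us\n",
               frag_size, max_delta);
        free(buf);
        close(fd);
        return 0;
}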
Maybe in the audio case the same "lost IRQ" phenomenon occurs, but it's
interesting that the jitter depends on the IRQ frequency (perhaps only at
very low latencies).
Look at the diagrams: you can see very clearly that the peaks are a
multiple of the IRQ period.
Maybe at lower IRQ frequencies (intervals > 5-10ms) these peaks will not
show up, because no IRQs get lost?
But if my HD blocks the IRQs for at most 0.7ms with a 0.7ms IRQ period,
why should it block them for 1.45ms with a 1.45ms IRQ period?
regards,
Benno.