Re: Human timing perception (was: RT patch)

From: Richard B. Johnson
Date: Tue May 31 2005 - 16:11:41 EST


On Tue, 31 May 2005, Lee Revell wrote:

On Tue, 2005-05-31 at 12:59 -0400, Steve Finney wrote:
It takes (IIRC) about a 10 ms or
so difference in the sounded sequence
for someone to be able to report that there's been a change, but
a change in the timing of the person's finger movements occurs
(_immediately_) at perturbations smaller than 10 ms. That is, there
appears to be some dissociation between conscious perception and
perceptual/motor behavior.

Any decent guitar player who has used their computer as an effects unit
could tell you this. I can easily perceive the difference between 1.3 ms
and 2.6 ms latency, and between 2.6 ms and 5 ms. And there's at least one
person (also a guitarist, who I have added to the cc:) who swears he can
perceive the difference between 0.6 and 1.3 ms. Soundcard ADCs typically
add 1.5 ms of latency in each direction, so the actual floor seems to be
around 3-5 ms.
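The 3-5 ms floor Lee describes falls out of simple arithmetic: converter
delay on the way in and again on the way out, plus one audio buffer of
delay in each direction. A minimal sketch, using illustrative figures
(the 1.5 ms converter delay from the message above; the 64-frame period
and 48 kHz rate are assumptions, not measurements):

```python
# Estimate round-trip latency for a computer-as-effects-unit setup.
# Assumed model: one period of buffering in each direction, plus
# converter (ADC/DAC) delay in each direction.

def round_trip_latency_ms(frames_per_period, sample_rate_hz,
                          converter_ms=1.5):
    """Estimate total input-to-output latency in milliseconds."""
    buffer_ms = frames_per_period / sample_rate_hz * 1000.0
    # One buffer of capture delay, one of playback delay, plus the
    # converter delay on the way in and again on the way out.
    return 2 * buffer_ms + 2 * converter_ms

# A 64-frame period at 48 kHz is ~1.33 ms per buffer:
print(round(round_trip_latency_ms(64, 48000), 2))  # ~5.67 ms
```

Shrinking the period size below this point buys little, because the
fixed converter delay (2 x 1.5 ms here) dominates the total.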

Lee


Well, MIDI runs at 31,250 bits/second. With one start and one stop bit
per byte (8-N-1 serial framing), that is 3,125 bytes per second.
After much research by Dave Smith in the early 80s, the MIDI
spec was published in 1983. The data rate was based upon not
being able to hear the difference in the simultaneity of a
6-note chord (a triad in each hand on the piano). That
equates to 6 * 1/3125 = 0.00192 seconds (1.92 ms).

Note that because only one note start or stop can be sent at a
time, this figure was essential for sending and receiving
chords.
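The arithmetic above can be sketched directly. The one-byte-per-note
figure is the optimistic case (running status, with the status byte
already sent); a full Note On message is 3 bytes, which spreads a chord
out further:

```python
# MIDI sends note events one byte at a time over a 31,250-baud serial
# link with 8-N-1 framing (1 start + 8 data + 1 stop bit per byte), so
# the notes of a "simultaneous" chord are necessarily staggered.

MIDI_BAUD = 31250
BITS_PER_BYTE = 10          # 8-N-1 serial framing

def chord_spread_ms(notes, bytes_per_note=1):
    """Wire time from the first to the last note of a chord.

    bytes_per_note=1 assumes the best case under running status;
    a full Note On message (status, note, velocity) is 3 bytes.
    """
    bytes_per_second = MIDI_BAUD / BITS_PER_BYTE   # 3,125 bytes/s
    seconds = (notes * bytes_per_note) / bytes_per_second
    return seconds * 1000.0

print(round(chord_spread_ms(6), 2))                    # 1.92 ms
print(round(chord_spread_ms(6, bytes_per_note=3), 2))  # 5.76 ms
```

With full 3-byte messages, a 6-note chord is smeared over 5.76 ms,
which is already past the perception thresholds discussed above.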

Cheers,
Dick Johnson
Penguin : Linux version 2.6.11.9 on an i686 machine (5537.79 BogoMips).
Notice : All mail here is now cached for review by Dictator Bush.
98.36% of all statistics are fiction.