Bigamp - Sooner or later the folks that don't understand the jitter thing will hear it, or they will eventually upgrade their systems to the point where they hear it. Until then, it is like trying to explain how the Earth looks from space.
Jitter is simply inaccuracy in the timing of the bits that make up the data stream. It is like a clock that ticks every second. If the clock ticks at exactly one-second intervals, it is said to have no jitter. If some ticks come at 0.999 seconds and others at 1.001 seconds, then the overall time will still be accurate over many seconds, but there is jitter in the timing. This is how actual real-time systems work: they all have some amount of jitter.
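The clock analogy above can be sketched in a few lines of Python. The 1 ms jitter amplitude and the number of ticks are illustrative values, not from the post; the point is just that the long-term average stays accurate while individual intervals wander.

```python
import random

# Minimal sketch: a nominal 1-second clock with random timing jitter.
# The +/- 1 ms jitter amplitude here is illustrative only.
nominal_period = 1.0        # seconds between ticks
jitter_amplitude = 0.001    # peak timing error per tick, in seconds

random.seed(0)
tick_times = []
t = 0.0
for _ in range(1000):
    t += nominal_period + random.uniform(-jitter_amplitude, jitter_amplitude)
    tick_times.append(t)

# Over many ticks, the long-term rate stays accurate...
average_period = tick_times[-1] / len(tick_times)
print(round(average_period, 4))

# ...but the individual tick-to-tick intervals deviate from 1.0 s.
intervals = [b - a for a, b in zip([0.0] + tick_times, tick_times)]
print(min(intervals) < 1.0 < max(intervals))
```

Running this shows an average period very close to 1.0 s even though no single interval is exactly one second, which is the sense in which jittered timing is "accurate over many seconds."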
The effect of this jitter on the D/A conversion is to create frequency modulation in the analog output. This means that the point where the peak of a cymbal crash was supposed to occur actually occurs maybe 1 nanosecond late, and then the trailing ring of the cymbal comes maybe 1 ns early. The rate of this change can be anywhere from 10 Hz to 100 kHz or higher. If only one sample occurred at the wrong time, it would never be heard, but a typical jitter signature is a constantly changing time error. This is why it is audible: the brain detects these things just as eyesight detects moving objects.
The time error has both amplitude and frequency (spectral) characteristics. Each CD player or computer audio device has a different amplitude and spectrum for its jitter, so they can sound very different from each other, even though the data is always the same. Data errors are very uncommon for both CD players and computer audio. Jitter is the difference you are hearing, if you hear a difference.
Steve N.
Empirical Audio