The title is: "There's No Such Thing As Digital..."


Subtitled: "A Conversation With Charles Hansen, Gordon Rankin and Steve Silberman". It's an interesting read if you're not yet familiar with this particular topic...or have only considered it briefly. I wouldn't call myself a digital expert, but I can see no reason to quibble with it one bit:

www.audiostream.com/content/draft

Enjoy.
ivan_nosnibor
Hello Johnnyb53. Thanks for the overview.

I didn't think there was any such thing as 100% jitter-free? I thought all timing sources had jitter? Have you also overlooked the fact that some CDPs, even reasonably priced ones, use high-speed drives and buffer into memory for CD playback?

Computers were designed for word processing and have some of their own challenges, like noisy fans, cheap and very nasty RFI/EMI-emitting power supplies, CPU interruptions, etc.

Finally, jitter is a timing error. At what point does it become audible? Can our ears really detect, say, 400 picoseconds of jitter? If not, then even a basic CDP is acceptable.
Kiwi, I suppose if I wanted to clutter my post with unending qualifiers, I should have said "drastically reduces jitter" instead of "eliminates." Still, if you take the time to check the jitter spectrum analyses in several Stereophile test reports, you'll see that asynchronous USB DACs have very low jitter that is barely visible in the graphs.

Second, I didn't know that reasonably priced CD players were speed-reading and buffering the data first. Can you name some? Since the audio press is making a big deal about the just-released $6,000 Parasound CDP featuring this read-and-buffer feature, it doesn't seem the approach has gone mainstream yet.

Third, your summary of what computers are for is narrow and dismissive. I landed in Silicon Valley in 1980 and worked in high-tech computing from then until the end of 2006. Word-processing applications were a relative latecomer. Also, when the industry switched to graphics-based interfaces and displays, desktop-publishing apps were some of the most CPU-intensive applications available, right up there with finite element analysis and solids modeling.

Along with that, my MacBook Pro has a 2 GHz processor and 8 GB of RAM, far more processing power than a typical--or even expensive--CD player or DAC. It has an aluminum housing (well shielded) and no fan. As I mentioned before, the Audirvana software (and several other packages) can be configured to turn off all CPU interruptions. It's called "hog mode." Look it up.

Once a music data file is buffered in RAM, the clock is reset. It doesn't matter how much jitter was in the stream before; for the moment, bits is just bits. When the data enters a new stream to the DAC, it comes with a fresh, reset clock and is not subject to additional jitter from reading a wobbling plastic disc.
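The buffer-and-reclock idea can be sketched in a few lines. This is a toy model of my own, not any player's actual code: samples arrive with arbitrary, jittery timing, are parked in a FIFO that discards that timing, and are then clocked out at uniform intervals paced purely by the local clock.

```python
from collections import deque

# Toy model of buffer-and-reclock (illustrative only, not any player's code).
# The arrival times below are arbitrary, irregular values; the FIFO throws
# them away, and the output side is paced entirely by the local DAC clock.

arrivals = [(0, 0.0031), (1, 0.0007), (2, 0.0052), (3, 0.0019)]  # (sample, jittery arrival time)

fifo = deque()
for sample, _arrival_time in arrivals:
    fifo.append(sample)            # arrival timing is discarded here

TICK = 1 / 44_100                  # local clock period, in seconds
playback = [(fifo.popleft(), i * TICK) for i in range(len(arrivals))]

print([s for s, _ in playback])    # samples emerge in order: [0, 1, 2, 3]
print(playback[1][1] == 1 / 44_100)  # True: spacing set by the local clock alone
```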

Finally, it was Ed Meitner of Museatex who discovered and published about jitter over 20 years ago. At that time, listening tests revealed that jitter became audible around 200 ps. This presented a challenge to the industry, as the most popular receiving chip at the time was only accurate to 20 ns.
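To put figures like 200-400 ps in perspective, a common back-of-the-envelope bound (my own worked example, not from the original posts) says that clock jitter of Δt on a full-scale sine at frequency f produces an error no larger than roughly 2π·f·Δt relative to full scale:

```python
import math

def jitter_error_db(freq_hz, jitter_s):
    """Worst-case error level, in dB relative to a full-scale sine at freq_hz,
    caused by sampling-clock timing error of jitter_s seconds.
    Derivation: error = slope * timing error = 2*pi*f * dt at the steepest point."""
    return 20 * math.log10(2 * math.pi * freq_hz * jitter_s)

# A 20 kHz tone with the 400 ps figure asked about above:
print(round(jitter_error_db(20_000, 400e-12)))   # -86 (dB)
# The ~200 ps audibility threshold quoted above, same tone:
print(round(jitter_error_db(20_000, 200e-12)))   # -92 (dB)
```

So even worst-case, sub-nanosecond jitter sits far below the signal, which is why the argument hinges on how audible such low-level, non-harmonic artifacts are.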

And anyway, at some point arguing the numbers becomes moot. With standard CDPs, not only did I find myself not enjoying the music, I even noticed that the family got more irritable when the music was playing. With my current computer setup, I can actually enjoy digitally sourced music. I still play a lot of records and go to live concerts for reference, but computer-based (especially high-res) digital playback is closing the gap.
"Can our ears really detect say 400 picoseconds of jitter? If not then even a basic CDP is acceptable."

We can detect perhaps as little as 50 ps of jitter, while a basic CDP can exhibit as much as a few nanoseconds.

The reason jitter is so audible, in spite of its small magnitude, is that it creates sidebands that have no harmonic relation to the root frequency. If the jitter comes from 60 Hz noise, these sidebands will be +/- 60 Hz from the root frequency and not that audible, but if the jitter is caused by a higher frequency, the resulting sidebands will be further away and hence more audible. In reality there is some uncorrelated jitter coming from random noise, plus correlated jitter caused by particular interference frequencies. Also, instead of one root frequency we have a whole collection of them (music), and the jitter turns into hash that is proportional to the amplitude of the signal (undetectable without a signal).
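This sideband behaviour is easy to demonstrate numerically. The sketch below is my own illustration; the 10 kHz tone, 3 kHz interference frequency, and 2 ns jitter amplitude are arbitrary values (the jitter is exaggerated to make the sidebands easy to find). The sampling instants wobble sinusoidally, and the spectrum's strongest bins land at the root frequency plus sidebands at f0 ± fj:

```python
import numpy as np

fs = 48_000          # sample rate (Hz)
n = fs               # one second of samples -> 1 Hz FFT bins
t = np.arange(n) / fs

f0 = 10_000          # "root" audio tone (Hz)
fj = 3_000           # frequency of the interference driving the jitter (Hz)
jitter_amp = 2e-9    # 2 ns peak timing error (deliberately exaggerated)

# Sampling instants wobble sinusoidally instead of being uniform:
t_jittered = t + jitter_amp * np.sin(2 * np.pi * fj * t)
x = np.sin(2 * np.pi * f0 * t_jittered)

spectrum = np.abs(np.fft.rfft(x)) / (n / 2)
peaks = np.argsort(spectrum)[-3:]           # three strongest bins
print(sorted(int(p) for p in peaks))        # [7000, 10000, 13000]
```

The sidebands at 7 kHz and 13 kHz are tens of dB below the tone, yet bear no harmonic relation to it, which is the poster's point about why correlated jitter is objectionable.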

Computer data has no jitter because it has no timing. Data is stored on the hard disk without timing and goes through all sorts of buffers before it is sent out. It can be sent out as data when we have a wireless or network-based DAC, but it can also be converted to an asynchronous S/PDIF stream. The very moment of this conversion creates jitter.

I also have quibbles with this article. It seems to concentrate on the DAC clock, which is usually not that bad, while paying less attention to delivery of the signal. It should state that both are equally important.

Assuming perfect buffering of the CD stream, the signal has to be delivered to the DAC with very short transitions to reduce threshold uncertainty. That requires a perfect source/cable/DAC characteristic-impedance match (to avoid reflections at impedance boundaries), or a perfectly quiet system with a perfect shield on the cable to avoid noise-induced jitter when transitions are slow. Since both are system dependent and very difficult to achieve, a possible solution is to reclock the signal just before the DAC. I have a DAC with reclocking built in and it is very clean sounding, but Steve found that external reclocking works better.
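The impedance-matching point can be quantified (my own example, not from the posts; 75 Ω is the nominal S/PDIF coax impedance, and the 50 Ω figure is just an illustrative mismatch): the fraction of an incident edge reflected at an impedance boundary is (Z_load − Z0)/(Z_load + Z0).

```python
def reflection_coefficient(z_load, z0=75.0):
    """Fraction of the incident wave reflected at an impedance boundary.
    z0 is the cable's characteristic impedance (75 ohms nominal for S/PDIF coax)."""
    return (z_load - z0) / (z_load + z0)

# A 75-ohm cable into a mismatched 50-ohm input reflects 20% of each edge
# (the negative sign means the reflection comes back inverted):
print(reflection_coefficient(50.0))    # -0.2
# A properly matched 75-ohm termination reflects nothing:
print(reflection_coefficient(75.0))    # 0.0
```

Those reflections arrive back at the receiver superimposed on later edges, shifting the moment each transition crosses the logic threshold, which is exactly the "threshold uncertainty" described above.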
"I didn't think there was any such thing as 100% jitter free? I thought all timing sources had jitter?"

There is no such thing. Marketing BS and an outright lie.

"At what point does that become audible? Can our ears really detect say 400 pico seconds of jitter?"

Depends on the system. Resolving systems can easily demonstrate the difference between 20 ps and 100 ps of jitter, and it's not subtle. Resolving means an ultra-low noise floor and low distortion. Usually this means no active preamp.

"I also have quibbles with this article. It seems to concentrate on DAC clock, that is usually not that bad, placing less attention to delivery of the signal."

That is interesting, given that many DACs don't have an internal clock, except maybe for the Async USB interface master clocks. These are usually the important ones. This is where the jitter starts in a USB system. It will even have an effect on additional reclocking.

"I have DAC with reclocking built in and it is very clean sounding but Steve found that external reclocking works better."

That is primarily due to the separation of power systems, putting the master clock on its own power system, separate from the DAC circuits. If you can do this effectively inside the DAC, that is fine too. Pretty awkward to have two power cords though...

Steve N.
Empirical Audio
"That is interesting, given that many DACs don't have an internal clock, except maybe for the Async USB interface master clocks"

Steve, you're missing all the network DACs, including Ethernet, FireWire, wireless, etc. You also forgot about asynchronous reclocking DACs.

Typical DACs contain a phase-locked loop (PLL) consisting of an adjustable oscillator and a phase detector. The phase detector compares the average phase of the incoming signal and adjusts the internal oscillator (a clock) to match it. The DAC is clocked from this adjustable clock, not directly from the input signal. The quality (jitter) of this internal clock is very important.
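As a toy model of that mechanism (my own sketch; the gain, step size, and frequencies are arbitrary, and a real DAC PLL is an analog circuit far more refined than this), a first-order loop steers the local oscillator with the detected phase error until it tracks the incoming rate:

```python
def pll(input_freq, nominal_freq, gain=5_000.0, steps=4000, dt=1e-6):
    """First-order PLL toy model: the local oscillator frequency is the
    nominal frequency plus a correction proportional to the phase error."""
    phase_err = 0.0              # input phase minus local phase, in cycles
    local_freq = nominal_freq
    for _ in range(steps):
        phase_err += (input_freq - local_freq) * dt   # phase detector
        local_freq = nominal_freq + gain * phase_err  # oscillator control
    return local_freq

# A local 44,100 Hz clock pulling in to a source running 50 Hz fast:
print(round(pll(44_150.0, 44_100.0)))   # 44150
```

Note that after lock the output timing comes from the local oscillator, which is why the quality of that oscillator (and how well the loop filters incoming jitter) matters so much.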