Digital cable optimal length?


Last time I asked about optimal phono cable length, I got mostly answers like 1.5 m or less. I have experimented since then, using a 6 ft RCA/RCA cable as the phono cable and comparing it with a 1.5 m RCA/RCA cable of the same make, over a long time with all three of my TT setups, and the result was the same:

it does DETERIORATE the sound quality drastically as the cable gets longer (I had tried a 4 ft length also).

Now the question about digital cable.
Would a 3 m long BALANCED/BALANCED digital cable have similar results?
Have you tried?

Thanks,
nilthepill
what sonic differences can one expect to hear when using a cable too short or too long?
Too short means confusion and lack of coherence. Musical phrasing is smeared, timing seems subtly off, the image is out of focus. Instruments are harder to place and lack definition. Soundstage is vague. It's easy to spot the difference if you compare two lengths of the same cable.

I've never heard too long but I imagine the effect must be about the same, since the problem--timing of internal reflections--is the same.
Thanks Tobias, Almarg. I did search under 'length' and found more results. I don't quite understand the technical rationale, other than maintaining a certain impedance. I do believe it would be unwise for me to buy a longer cable just to try it out; chances are it would be a waste of money.

I am not sure whether buying 'cheap' 1.5 m and 3 m digital cables from RadioShack would give me a conclusive result or not. But if it does not cost too much, I might try this.

Theaudiotweak's answer of 1.42 m is rather curious. Would you care to explain, Tom?
I owned both .5 and 1m Kimber Orchids for use between my CAL Delta and either Sigma or Alpha DAC (back when). The .5m was consistently terrible (in a number of audible ways), compared to the 1m (which was wonderful). Rat Shack cables (like everything else from them) would be a total waste of money. These are inexpensive, but highly regarded by everyone who has auditioned them: (http://www.cs1.net/products/canare/LV-77S_digital_audio.htm) They won't break the bank, at any length you choose, either. NO- I've no "connection" with the company. =8^)
I don't quite understand the technical rationale other than maintaining a certain impedance

At high frequencies (much higher than audio frequencies), what are known as transmission line effects come into play for electrical signals travelling through cables. One of those effects is that if the impedances of the cable, the connectors, and the load (destination) device are not precisely the same (and they never are), some fraction (usually a small fraction) of the incoming energy will be reflected back toward the source, instead of being absorbed by the load.
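As a numerical aside (mine, not from Al's post): the fraction of the incident voltage reflected at a mismatched load is given by the standard transmission-line reflection coefficient, (Z_load - Z_cable)/(Z_load + Z_cable). A minimal sketch, with the 75 and 80 ohm values chosen purely for illustration (75 ohms is the nominal S/PDIF coax impedance; the 80 ohm load is a made-up mismatch):

```python
def reflection_coefficient(z_load, z_cable):
    # standard transmission-line reflection coefficient:
    # the fraction of the incident voltage reflected at the load
    return (z_load - z_cable) / (z_load + z_cable)

# e.g. a nominal 75-ohm S/PDIF cable into a load that is actually 80 ohms
gamma = reflection_coefficient(80.0, 75.0)
print(f"{gamma:.3f}")  # about 0.032, i.e. roughly 3% of the voltage reflects
```

Note that a perfect match (equal impedances) gives a coefficient of zero, which is why "they never are" perfectly matched implies there is always some reflection.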

When that reflection arrives at the source, it will again encounter an imperfect impedance match, and so some small fraction of it will be re-reflected back to the original destination.

The length of the cable affects the amount of time that is required for that two-way round-trip. When that re-reflection arrives at the load, it (or most of it, the part that is not re-reflected once again) will sum together with the original waveform, resulting in small but significant distortion of the original waveform.

With a digital signal that is used for clocking, as well as to convey data, what is important is that whichever edge of the signal is used by the destination device for clocking (by "edge" I mean a transition from either low to high or high to low, i.e., 0 to 1 or 1 to 0; some applications actually use both edges) is as "clean" and undistorted as possible, or else jitter results (meaning small fluctuations in the timing of the clock period). Typically the middle area of a transition edge is what the destination device responds to, so the cable length should be such that the re-reflection does not arrive at that time. That time, in turn, will depend on the risetime (or falltime) of the edge (the time it requires to transition from high to low or low to high). Quoting from myself in the thread I linked to above:

If the input impedance of the dac and the impedance of the cable don't match precisely, a portion of the incident signal would be reflected back to the transport. A portion of that reflection would then re-reflect from the transport to the dac. The two-way reflection path, assuming propagation time of roughly 2 nanoseconds per foot, would be 12ns for the 1m cable, and 18ns for the 1.5m cable.

I don't know what the typical risetimes/edge rates are for transport outputs, but it does seem very conceivable that the extra 6ns could move the arrival time of the re-reflection sufficiently away from the middle area of the edge of the original incident waveform so that it would not be responded to by the digital receiver at the dac input.
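To put rough numbers on the two ideas above, here is a sketch (mine, not part of the quoted post): the 2 ns/ft propagation figure is the rough value from the quote, and the 5 ns risetime, 5% reflection amplitude, and 50% decision threshold are illustrative assumptions, not measured values.

```python
FT_PER_M = 3.28084  # feet per meter

def round_trip_ns(length_m, prop_ns_per_ft=2.0):
    # time for a reflection to travel load -> source -> load,
    # i.e. twice the cable length at ~2 ns per foot
    return 2.0 * length_m * FT_PER_M * prop_ns_per_ft

def edge(t_ns, rise_ns=5.0):
    # idealized linear rising edge: 0 before t=0, 1 after rise_ns
    return min(max(t_ns / rise_ns, 0.0), 1.0)

def crossing_time_ns(tau_ns, alpha=0.05, rise_ns=5.0, thresh=0.5):
    # time at which the edge plus an attenuated re-reflection
    # (arriving tau_ns later) first crosses the decision threshold
    t, dt = 0.0, 0.001
    while t < 50.0:
        if edge(t, rise_ns) + alpha * edge(t - tau_ns, rise_ns) >= thresh:
            return t
        t += dt
    return None

print(round(round_trip_ns(1.0), 1))        # ~13 ns for a 1 m cable
print(round(round_trip_ns(1.5), 1))        # ~20 ns for a 1.5 m cable
# a re-reflection landing mid-edge shifts the threshold crossing earlier
print(round(crossing_time_ns(2.0), 2))     # earlier than the clean 2.5 ns
# one arriving after the edge is complete leaves the crossing untouched
print(round(crossing_time_ns(13.0), 2))
```

The point of the second pair of calls is the mechanism Al describes: only when the round trip time places the re-reflection on the active part of the edge does the threshold-crossing instant (and hence the recovered clock) move.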

Hope that clarifies more than it confuses!

Regards,
-- Al