Why do digital coax cables matter?


Could somebody please explain this to me? Why does a $100 cable sound better than a $50 cable? Why is silver coax better than copper coax? Why does the quality of connectors matter in the digital realm?

I'm currently needing a cable for a Stello U3. Some people claim there are no discernible differences between different levels of coax cables. They say the only things that matter are impedance, cable length, and adequate shielding.
robertsong

Showing 6 responses by almarg

Mr_m, yes, inductance and most other cable parameters are proportional to length, including resistance, capacitance, the resistance rise at high frequencies caused by skin effect, the effects of dielectric absorption, and propagation delay.

In the case of analog cables, inductance is most likely to be audibly significant in speaker cable applications, especially if the impedance of the speaker is low at high frequencies (as it generally is in the case of electrostatic speakers), and/or if the cable length is long, and/or if the particular cable has relatively high inductance per unit length. The impedance presented by an inductance is proportional to frequency, and therefore in the upper treble region may become an audibly significant fraction of the impedance of the speaker in those situations.
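To put rough numbers on the speaker cable point, here is a quick sketch in Python. The 0.6 uH/meter figure, the 4 meter length, and the frequencies are illustrative assumptions on my part, not measurements of any particular cable:

```python
import math

def inductive_reactance(freq_hz, inductance_h):
    """Impedance magnitude of a pure inductance: X_L = 2*pi*f*L."""
    return 2 * math.pi * freq_hz * inductance_h

# Hypothetical speaker cable: 4 m at an assumed 0.6 uH per meter
cable_inductance = 4 * 0.6e-6  # 2.4 uH total

for f in (1_000, 10_000, 20_000):
    x = inductive_reactance(f, cable_inductance)
    print(f"{f/1000:>5.0f} kHz: X_L = {x*1000:.1f} milliohms")
```

At 20 kHz this hypothetical cable presents about 0.3 ohms, which is negligible against an 8 ohm load but becomes a meaningful fraction of the impedance of an electrostatic speaker that drops to 1 or 2 ohms in the top octave.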

Inductance (as well as resistance) will usually be unimportant in the case of line-level analog interconnects, since the corresponding impedances will be vastly smaller than the input impedance of the component receiving the signal. In that application capacitance will often be important, particularly if the output impedance of the component providing the signal is high.
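The capacitance point can also be illustrated numerically. The output impedance of the source and the cable capacitance form a first-order low-pass filter; the figures below (2 m of cable at an assumed 100 pF/meter, and the three output impedances) are hypothetical examples, not values for any specific component:

```python
import math

def corner_frequency(source_impedance_ohms, cable_capacitance_f):
    """-3 dB frequency of the low-pass formed by output impedance and cable capacitance."""
    return 1 / (2 * math.pi * source_impedance_ohms * cable_capacitance_f)

# Hypothetical 2 m interconnect at an assumed 100 pF per meter
cap = 2 * 100e-12  # 200 pF total

for r_out in (100, 2_000, 10_000):
    f3 = corner_frequency(r_out, cap)
    print(f"Rout = {r_out:>6} ohms -> -3 dB at {f3/1e6:.2f} MHz")
```

With a 100 ohm output impedance the corner is up near 8 MHz and the cable is sonically invisible; with a 10K output impedance (as some tube preamps may approach at some frequencies) it falls to roughly 80 kHz, close enough to the audio band for phase and amplitude effects in the top octave to become plausible.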

In the case of digital cables, inductance is one of the key determinants of what is called "characteristic impedance," which for coaxial S/PDIF is nominally the 75 ohm figure you are probably used to seeing mentioned, and for AES/EBU is nominally the 110 ohm figure you are probably used to seeing mentioned. The characteristic impedance of a cable is NOT proportional to length. However at the RF frequencies which comprise digital audio signals the less than perfect impedance match that will inevitably exist to some degree between the cable’s characteristic impedance and the nominally 75 or 110 ohm impedances of the connected components will result in some fraction of the signal energy reflecting and re-reflecting back and forth along the cable. The arrival of those reflections and re-reflections at the DAC will result in some degree of distortion of the signal waveform as received by the DAC, which may or may not ultimately affect timing jitter at the point of D/A conversion, depending on the arrival times and also on the design of the particular DAC. And those arrival times will be dependent on the length of the cable as well as on the propagation velocity of the particular cable. That is explained well in the paper by Steve N. of Empirical Audio that was linked to earlier.
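The two quantities at work in that paragraph, the reflected fraction at a mismatch and the round-trip time of a reflection, are easy to compute. The 90 ohm input impedance and the 0.66 velocity factor below are hypothetical examples chosen only to show the arithmetic:

```python
def reflection_coefficient(z_load, z0=75.0):
    """Fraction of the incident voltage reflected at an impedance discontinuity."""
    return (z_load - z0) / (z_load + z0)

def round_trip_delay_ns(length_m, velocity_factor=0.66):
    """Time for a reflection to travel to the far end of the cable and back."""
    c = 299_792_458  # speed of light, m/s
    return 2 * length_m / (velocity_factor * c) * 1e9

# Hypothetical mismatch: a DAC input of 90 ohms on a nominally 75 ohm line
gamma = reflection_coefficient(90)
print(f"reflected fraction of the signal voltage: {gamma:.3f}")
print(f"round trip on a 1.5 m cable: {round_trip_delay_ns(1.5):.1f} ns")
```

Even that modest 20% impedance error reflects about 9% of the signal voltage, and on a 1.5 meter cable the reflection arrives back roughly 15 ns after each edge, which is why the arrival time relative to the signal transitions, and hence the cable length, can matter.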

Regards,
-- Al

Geoffkait 8-30-2017
... so what’s the diff?

The difference is the frequencies that are involved, Geoff, which bring completely different effects into play, such as the one explained in the paper by Steve N. of Empirical Audio that was referenced earlier in the thread, which explains the rationale for 1.5 meter digital cables. That rationale has no relevance whatsoever to analog audio signals, since it involves the effects of signal reflections that result from impedance mismatches at RF frequencies on timing jitter at the point of D/A conversion. Surely you realize that cable effects can be dependent on frequency, especially when both the frequencies and the application are vastly different? If you don’t, any further debate would be pointless, and I’m not going to engage in any.

Also, your last sentence represents a complete misreading of what I have said. I absolutely did not say, and have never said in this thread or any other, that length differences in analog cables won’t be audible. In fact in many other threads, such as this one in which you’ve participated very recently, I’ve said that they certainly can be audible. I have said, however, that in general the shorter an analog cable is the better, assuming the goal is for the signal to be conveyed in as accurate a manner as possible (i.e., for the sonic effects of the cable to be minimized).

Feel free to have the last word, even if it involves asserting that something I have neither said nor implied is incorrect. To others who may be interested in the subject, I would commend the excellent posts earlier in the thread by Kijanki, AudioEngr (Steve N. of Empirical Audio), and others, as well as my own posts.

Regards,
-- Al

Mrblackcrow 8-26-2017
I read somewhere else that all interconnects can benefit from being 1.5m. Somehow that doesn’t sound right to me. So, is it true- will 1.5m RCA interconnects going from my DAC to my pre amp sound better than the 2 ft length I am currently using?
You are correct about that not sounding right. The 1.5 meter length recommendation that is often seen for digital cables has no relevance whatsoever to cables conducting analog audio signals. In general, in the case of analog audio cables the shorter the better, if it makes any difference at all.

Also, regarding S/PDIF and AES/EBU digital cables, as you can see in some of the earlier posts in this thread the optimal length is dependent on a great many component and cable dependent variables, some of which are not usually specified (e.g., risetimes and falltimes of the output of the signal source). So 1.5 meters should be considered as having the best odds of being optimal for a digital cable (unless a very short length is practicable, such as 6 or 8 inches), but other lengths may be better in some cases, and there may be some cases where it won’t matter very much if at all.
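As a rough sketch of why length interacts with risetime, the comparison below uses an assumed 0.66 velocity factor and a 25 ns risetime (a figure mentioned elsewhere in the thread as typical of many transports); whether a reflection lands on or after the critical part of an edge depends on both numbers, which is why no single length is optimal for all component combinations:

```python
def round_trip_ns(length_m, velocity_factor=0.66):
    """Round-trip time of a reflection on a cable, in nanoseconds."""
    c = 0.299792458  # speed of light, m/ns
    return 2 * length_m / (velocity_factor * c)

risetime_ns = 25  # assumed transport output risetime

for length in (0.5, 1.0, 1.5, 2.0):
    rt = round_trip_ns(length)
    where = "during" if rt < risetime_ns else "after"
    print(f"{length} m: reflection returns {rt:.1f} ns after the edge "
          f"({where} a {risetime_ns} ns transition)")
```

With a much faster output stage (3 ns edges, say) the same lengths all put the reflection well after the transition, which is one reason the optimal length is so component-dependent.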

Regards,
-- Al


Steve, thanks for your inputs. Do you feel that the following may also be significant contributors to sonic differences between S/PDIF interconnects, at least in some systems?

6) Differences in noise-induced jitter, due to ground loop effects and/or RFI/EMI pickup, both of which may be sensitive to cable differences.

7) Differences in radiated RFI that may affect circuitry in the system that is not directly related to the S/PDIF interface.

Concerning your no. 3, impedance mismatches, and with respect specifically to the impedance match to the components being connected (as opposed to mismatches between cable and connector, or impedance discontinuities within the cable), I would add the following thought: what is important is not how accurately the impedance of the cable and connectors matches the 75 ohm standard, but how closely it corresponds to the actual output impedance of the component driving the cable, and to the actual input impedance of the component at the receiving end. Everything else being equal, a cable that is less accurate relative to the 75 ohm standard may therefore outperform a more accurate cable in some systems, if it happens to be a closer match to the component impedances.
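A two-line calculation makes the point concrete. The 70 ohm component impedance below is a hypothetical, chosen only to illustrate the principle:

```python
def gamma(z_from, z_into):
    """Magnitude of the voltage reflection coefficient at the junction z_from -> z_into."""
    return abs(z_into - z_from) / (z_into + z_from)

# Hypothetical components whose actual interface impedance is 70 ohms
component_z = 70.0
for cable_z in (75.0, 70.0):
    g = gamma(cable_z, component_z)
    print(f"{cable_z:.0f} ohm cable into a {component_z:.0f} ohm component: |gamma| = {g:.3f}")
```

The nominally "inaccurate" 70 ohm cable produces zero reflection into this particular component, while the cable that measures a perfect 75 ohms reflects about 3.4% of the signal voltage.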

Finally, I would be interested in your take on what degree of correlation can generally be expected between cable performance and cable price, for S/PDIF interconnects, given the many variables and system dependencies that are involved in the effects that have been mentioned.

Regards,
-- Al

P.S.: Re your first question, I am an EE with an extensive background in digital signal transmission (not for audio).
Will a cable of some determinate length not add some measurable, repeatable, non-arbitrary amount of jitter within a particular range of measurement, regardless of any jitter coming from the source component?
No, absolutely not. As implied in some of the preceding posts, the amount of jitter that will result with a given cable in a given system, at the point where D/A conversion is performed within the DAC (which is where it matters) depends on a complex set of relationships and interactions between the parameters of the cable, including length, impedance accuracy, shielding effectiveness, shield resistance, propagation velocity, bandwidth, etc., and the technical characteristics of the components it is connecting, including signal risetimes and falltimes, impedance accuracy, jitter rejection capability, ground loop susceptibility, etc.

Many of the relevant component parameters are usually unspecified, and even if they were specified in great detail predictability of the net result of those interactions would still be limited at best.

Regards,
-- Al

Steve, thanks very much for your comments and insights.

My one comment in response is in regard to:
10-06-12: Audioengr
"Characteristic impedance different than 75 ohm can be very good, as Al mentioned, if it is better match for given system."

Sure, but I would sell that system and get one that meets the specs so I don't have to try to find a wacked-out cable that matches it.
The problem as I see it, in at least most cases, is that there is no practical way for the consumer to know what the transport's output impedance or the DAC's input impedance is. Even JA's measurements don't address those parameters, at least in the reviews I've looked at. And, if I recall correctly, the tolerance defined by the S/PDIF standard is a very loose one, something like +/- 20 ohms or +/- 20%.

Also, as indicated in your paper:
I have never seen impedance control on any Transport or DAC circuit board. Occasionally, the wiring from the circuit board to the connector is impedance-controlled, but this is the exception, not the rule.
It all seems to me to add up to a very hit or miss situation, and even more so given that another key parameter, the risetime and falltime of the transport's output signal, is also usually unspecified, and widely variable (e.g., 25 ns or so in many cases, per your paper; 3 ns or less in some cases, per your statement above).
10-06-12: Dura
... and then there is impedance match which I frankly do not quite understand.
See Steve's paper, linked to above, which explains it all nicely.

Regards,
-- Al