I would only add that it seems logical for the D/A converter clock to be synchronized with the incoming data samples (to avoid losing data). With S/PDIF, the D/A clock is adjusted to the average data rate arriving over the S/PDIF link, while with asynchronous USB the D/A converter runs at the rate of its own independent, fixed, stable clock. The DAC receives data in packets (frames), places it in a buffer, and signals back "Give me more (or less) in the next frame". Synchronous USB is pretty bad, since the D/A converter rate depends on the average rate of the data coming from the computer - a rate that is very uneven. As Jaytor mentioned, a USB connection can also inject either computer noise or ambient electrical noise picked up by the cable.
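To put the asynchronous feedback idea above in concrete terms, here is a minimal Python sketch of the buffering and "give me more/less" loop. It is only a toy model - the class name, frame size, and buffer numbers are made up for illustration and are not the real USB Audio Class protocol - but it shows how the DAC's own fixed clock stays in charge while the feedback merely nudges how many samples the host sends per frame.

```
# Toy model of asynchronous USB audio: the DAC consumes samples at its own
# fixed clock, and the feedback to the host only adjusts how many samples
# arrive in the next frame so the FIFO stays near half full.
# All names and numbers here are illustrative assumptions.

NOMINAL_PER_FRAME = 48      # e.g. 48 samples per 1 ms frame at 48 kHz
BUFFER_TARGET = 512         # keep the FIFO near this fill level (samples)

class AsyncUsbDac:
    def __init__(self):
        self.buffer = BUFFER_TARGET          # current FIFO fill

    def receive_frame(self, n_samples):
        """Host delivers a frame; the samples go into the FIFO."""
        self.buffer += n_samples

    def consume(self, n_samples):
        """DAC clock pulls samples out at its own fixed, stable rate."""
        self.buffer = max(0, self.buffer - n_samples)

    def feedback(self):
        """Ask the host for more or fewer samples in the next frame."""
        if self.buffer < BUFFER_TARGET - 32:
            return NOMINAL_PER_FRAME + 1     # running low: ask for more
        if self.buffer > BUFFER_TARGET + 32:
            return NOMINAL_PER_FRAME - 1     # filling up: ask for fewer
        return NOMINAL_PER_FRAME

dac = AsyncUsbDac()
request = NOMINAL_PER_FRAME
for _ in range(40):
    dac.receive_frame(request + 1)   # host clock runs slightly fast
    dac.consume(NOMINAL_PER_FRAME)   # DAC clock is the fixed reference
    request = dac.feedback()
print(request, dac.buffer)           # settles on asking for fewer samples per frame
```

With the host running slightly fast, the request settles at one sample per frame fewer and the buffer neither overruns nor underruns, which is why the DAC clock never has to chase the computer.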
An optical cable might seem better than coax, since it doesn't inject electrical noise and it has no transmission-line effects (reflections). The problem is that the transitions produced by the transmitting optical diode are slow. Slow transitions crossing the threshold point make the moment of level recognition uneven if the system is noisy (on either end).
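The point about slow optical edges can be put in rough numbers: the timing error at the receiver threshold is approximately the noise amplitude divided by the slew rate of the edge, so a slower edge turns the same noise into more jitter. The rise times and noise level below are assumed example values, not measurements:

```
# Rough estimate: jitter ~= noise voltage / slew rate at the threshold.
# A slower edge (lower slew rate) converts the same noise into more timing error.
# The numbers are illustrative assumptions, not measured values.

signal_swing_v = 0.5    # signal swing at the receiver (V)
noise_v = 0.02          # noise riding on the signal near the threshold (V)

for name, rise_time_ns in [("fast coax edge", 5.0), ("slow Toslink edge", 25.0)]:
    slew_v_per_ns = signal_swing_v / rise_time_ns
    jitter_ns = noise_v / slew_v_per_ns
    print(f"{name:16s}: ~{jitter_ns:.2f} ns of threshold-crossing jitter")
```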
Any optical cable that is lower cost (under $50-60) is going to be made from monofilament plastic. While these are okay, they also come with a loss of high-frequency resolution. If you use optical, you really want a glass fiber cable. These are definitely more expensive. The best for the money is always going to be Lifatec ($120 / 3 feet, or any custom length). There are other more expensive options (Wire World Supernova 7, DH Labs Glass Master Toslink, etc.).
Shorter USB cables will not be a problem. The biggest difference is that the USB clock is not used to generate the DAC sample clock on modern DACs, so any jitter introduced by reflections from the connection will not cause DAC clock jitter.
That's not to say it will have no impact, since reflections can contribute additional noise, but it's likely that other noise transmitted through the USB connection will have a larger impact. And a longer USB cable provides more opportunity for noise to be picked up on this connection.
I was aware of the explanation for S/PDIF. I’m curious if the same applies to USB and other digital implementations.
No, I would not expect the guidelines for optimal S/PDIF cable lengths to apply to USB. For one thing, modern USB implementations (i.e., those operating at 480 megabits per second and higher) have far faster risetimes and falltimes than S/PDIF (or AES/EBU), which operate at vastly slower bit rates. FWIW I don't recall ever seeing meaningful information as to what cable lengths for USB tend to be more optimal than others in audio applications, if it makes any difference at all. Excellent answers by Kijanki and Jaytor to the original question, BTW. Regards, -- Al
I was aware of the explanation for S/PDIF. I’m curious if the same applies to USB and other digital implementations. I would assume yes. Jaytor or Kijanki?
@auxinput Thanks for the info.
I have recently heard definite differences between a 1 meter and a 2 meter digital S/PDIF cable. This was even with a true BNC-to-BNC cable and proper BNC connections on the source and DAC. Using a Nordost Heimdall 2 cable, there was a loss of high-frequency response and detail with the shorter 1 meter cable. The 2 meter cable solved this problem, as well as some other "not quite right" things with the shorter cable.
Who knew, not me. Thanks for the detailed explanations.
@jaytor You snooze, you lose :)
@kijanki - you beat me to the punch. :)
Digital cables carry signals with very fast rise and fall times. To keep reflections from interfering with signal transmission, they are spec'ed with a fixed characteristic impedance (75 ohms in the case of S/PDIF), driven and terminated with that same impedance. Any variation in the impedance along the way will result in a signal reflection.
If everything were ideal, the cable length wouldn't matter. The problem is that there are minor variations in the impedance through the connector, PC traces, etc. at the ends of the connection. This will result in some reflection, although in a well implemented system, it is fairly minimal.
It takes about 5ns for the signal to travel through a 1m cable. A reflection from the destination end propagates back to the source end and then back to the destination, so it arrives about 15ns after the edge was launched, since it travels through 3 lengths of cable.
As I understand it, S/PDIF rise/fall times are typically in the 20ns range, which means that with a 1m cable, the reflection from the start of the square wave is going to arrive around the same time that the primary signal is passing through the threshold where the signal is sampled. This can cause a perturbation in the signal which interferes with the timing - essentially adding jitter.
If you use a really short cable, the reflection reaches the destination early enough in the rise (or fall) of the signal to not interfere with the timing. Likewise, if the cable is long enough, the reflection will arrive after the signal has passed through the threshold voltage so it won't cause a problem.
This is why it's recommended to use either an extremely short cable (such as you'd get inside a CD player), or a cable long enough that the reflections don't interfere. The general guidance I've read is that using a cable of 1.5m to 2m will avoid this issue almost all the time.
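This timing argument is easy to check with simple arithmetic. The sketch below uses the figures from the post above (5ns per meter of cable, 20ns rise time, sampling threshold at mid-level) and the simplifying assumption that the troublesome reflection travels three cable lengths before returning to the DAC:

```
# Worked version of the reflection-timing argument. Assumed figures:
# 5 ns/m propagation delay, 20 ns rise time, threshold at mid-level,
# and the reflection covering three cable lengths
# (source -> load, back to source, back to load).

PROP_NS_PER_M = 5.0
RISE_TIME_NS = 20.0

def timing(cable_m):
    direct_arrival = cable_m * PROP_NS_PER_M             # edge reaches the DAC
    threshold_cross = direct_arrival + RISE_TIME_NS / 2  # mid-level sampling point
    reflection_back = 3 * cable_m * PROP_NS_PER_M        # doubly reflected edge returns
    return threshold_cross, reflection_back

for length in (0.15, 1.0, 2.0):
    cross, refl = timing(length)
    verdict = "lands near the threshold crossing" if abs(refl - cross) < 3 else "safely clear"
    print(f"{length:4.2f} m: threshold at {cross:5.1f} ns, reflection at {refl:5.1f} ns -> {verdict}")
```

With those assumptions, a 1m cable puts the reflection right on top of the threshold crossing, while a very short cable or a 2m cable keeps it well clear - which matches the recommendation above.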
@ericsch We want to avoid reflections from the end of the cable. These reflections appear at characteristic-impedance boundaries (changes). Characteristic impedance is related to the geometry and dielectric of the cable and is roughly equal to SQRT(L/C). We want smooth, fast transitions of the digital signal, and a reflection can alter them by coming back and adding to the signal. A typical transition in an average CDP is about 25-30ns, while the threshold (point of level recognition) lies at about half of that (the mid-voltage point). So we don't want the reflection back within 30ns/2 = 15ns. The beginning of the transition launches the reflection; it has to travel to the end of the cable, reflect, and come back. 15ns is equivalent to about 3m at 5ns/m signal speed, which means 1.5m each way.
Now, the case for the short cable. Reflections start to matter when the travel time is longer than about 1/8 of the transition time (the cable becomes a transmission line). In our case that would be 30ns/8 = 3.75ns - equivalent to about 0.75m. For a 25ns transition it would be closer to 0.5m. This distance includes the internal wiring of the transport and the receiver (DAC), so I would not go longer than a 1ft cable. 1/8 is a rule of thumb; normally you would have to find the fastest transition, calculate the corresponding frequency, then the wavelength, and take 1/10 of it.
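For reference, the two rules of thumb above reduce to a pair of formulas: the cable still acts as a lumped wire while its one-way delay is under roughly 1/8 of the transition time, and the reflection misses the mid-level threshold once the round trip exceeds half the transition time. A quick calculation under those assumptions (5ns/m propagation delay assumed):

```
# Kijanki's two rules of thumb as formulas (5 ns/m assumed propagation delay):
#  - "short enough": one-way delay < transition_time / 8  (still a lumped wire)
#  - "long enough":  round-trip delay > transition_time / 2 (reflection misses threshold)
# Note: the post above rounds the short limit down further to allow for the
# internal wiring of the transport and DAC.

PROP_NS_PER_M = 5.0

def cable_limits(transition_ns):
    short_max_m = (transition_ns / 8) / PROP_NS_PER_M        # one-way trip
    long_min_m = (transition_ns / 2) / (2 * PROP_NS_PER_M)   # there and back
    return short_max_m, long_min_m

for t in (25.0, 30.0):
    short_max, long_min = cable_limits(t)
    print(f"{t:.0f} ns transition: shorter than ~{short_max:.2f} m "
          f"or longer than ~{long_min:.2f} m")
```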
Not sure I understand your answer. Yes, I can use a short cable (0.5 meter). Why is 2m safer?
It might sound counterintuitive, but if you cannot use a short cable (<1ft), then you have to use a long cable (>1.5m). Most of the time 1.5m is the minimum, so 2m would be safer.