USB sucks


USB really isn't the right connection between DAC and server: depending on the cables used you get very different sound quality, if the server manages to recognise the DAC at all. Some time ago I replaced my highly tuned Mac Mini (by the now-defunct Mach2mini, running Puremusic via USB) with an Innuos Zenith Mk3. For starters I couldn't get the DAC (an Antelope Zodiac Gold) and the server to recognise each other: transmission from the server under USB 2.0 wasn't possible because the server is Linux based (mind you, both allegedly support the USB 2.0 standard), and when I finally got them to talk to each other (by using Artisansilvercables, pure silver) the sound quality was ho-hum. While I understand the conceptual attraction of having the master clock near the converter under asynchronous USB, the connection's vagaries (the need for exact 90 Ohm impedance, proneness to RFI, the need to properly shield the 5V power line, short cable runs) make one wonder why one wouldn't do better to update I2S or S/PDIF, or at the higher end use AES/EBU. After more than 20 years of digital playback, the wide variety of outcomes from minor changes seems unacceptable.

Since then, and after a lot of playing around, I have replaced the silver cables with Uptone USPCB rigid connectors and inserted an Intona Isolator 2.0 and a Schiit EITR converting USB to S/PDIF. The connection to the DAC is via an Acoustic Revive DSIX powered by a Kingrex LPS.

The amount of back and forth needed to make all this work is mind-boggling. The dependence on the choice of USB cables (with and without a separate 5V connection, short, thick, and God knows what else) is hard to believe for something called a standard interface, and the differences in sound quality make any review of USB products arbitrary, verging on meaningless.

Obviously S/PDIF gives you no native DSD or high-rate PCM but, hey, most recordings are still Redbook anyway.
Conversely, it is plug and play (although the quality of the cable still matters), and it finally got me the sound quality I was looking for. It may not be the future, but neither should USB be, given all its shortcomings. Why is the industry promoting a standard that clearly isn't fit for purpose?

Finally, I invite the bits-are-bits naysayers to go on a similar journey; it just might prove educational.
antigrunge2

Showing 4 responses by jaytor

The idea of a USB "reclocker" is kind of hilarious. The only thing the USB clock is used for is to transfer the data. A $5 USB peripheral can reliably read USB data. USB can tolerate a LOT of jitter and still reliably transfer the data. The USB clock has nothing to do with the timing of the DAC.
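
Just to put a rough number on "a LOT of jitter", here's a toy model (nothing like the real USB PHY, which adds NRZI coding, bit stuffing and CRC-checked packets on top, and the 0.8 ns figure is just an illustrative assumption): a receiver sampling each bit at its nominal centre decodes everything correctly as long as the edge timing error stays under half of the 2.08 ns bit period.

```python
import random

BIT_NS = 1e9 / 480e6   # one bit period at USB 2.0 high speed (480 Mbit/s): ~2.08 ns

def bit_errors(n_bits, edge_jitter_ns):
    """Toy serial link: every bit edge leaves the transmitter with some timing
    error, and the receiver samples at the nominal bit centre. A bit survives
    as long as its edge still lands before the sampling instant, i.e. the
    jitter stays under half a bit period."""
    errors = 0
    for i in range(n_bits):
        edge = i * BIT_NS + random.uniform(-edge_jitter_ns, edge_jitter_ns)
        sample_point = i * BIT_NS + BIT_NS / 2
        if edge >= sample_point:       # edge arrived after we already sampled
            errors += 1
    return errors

# 0.8 ns of edge jitter -- enormous by audio-clock standards -- corrupts nothing.
print("bit errors:", bit_errors(100_000, edge_jitter_ns=0.8))   # prints 0
```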

I can see how a DAC with poor EMI/RFI filtering on the USB input could possibly benefit from some external filtering, but even if this was accomplished by reclocking the USB signal (like any USB hub does), this does not have to be expensive. 


Of the various inputs supported by DACs on the market today (SPDIF, AES3, Optical, I2S, USB, Ethernet), only USB and Ethernet provide an asynchronous input.

Using an interface that requires the clock signal to be generated by an external device (server, streamer, reclocker, etc.) and then passed over a cable with connectors (which even in the best case will have impedance irregularities causing reflections) makes no sense with today's technology. The DAC needs to be responsible for the clock in order to minimize jitter.

Ethernet might ultimately be a better electrical interface than USB, but until the industry standardizes on a single, simple protocol over Ethernet, USB is our best option. 

I'm sure there are otherwise good DACs that don't do a very good job with USB, but that is changing. With the rapid increase in popularity of streaming, most DAC vendors are recognizing the importance of USB as an interface, and I think it won't be long before this is the preferred interface for most users.

@steakster - I think what @rixthetrick was getting at is that any signal on a cable is inherently analog. While it represents digital data, the signal does not instantaneously transition from one voltage to another to represent 0s and 1s. The interface and cable must maintain good signal integrity to properly convey the digital data.

However, USB is a fairly robust interface. It’s designed to transfer digital data reliably in very low cost implementations. It can start to have problems with long cable lengths, but within reasonable limits, it does a great job of reliably transferring data and can easily handle the requirements of high resolution audio.
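
Some back-of-envelope arithmetic (nominal format rates, nothing measured) shows just how much headroom there is: even DSD256 or 32-bit/384 kHz PCM uses only a few percent of USB 2.0 high speed's 480 Mbit/s.

```python
# Back-of-envelope payload rates for stereo audio vs. USB 2.0 high speed.
USB2_HS_MBPS = 480   # bus signalling rate in Mbit/s

formats = {
    "Redbook 16-bit / 44.1 kHz":   44_100 * 16 * 2,
    "Hi-res 24-bit / 192 kHz":    192_000 * 24 * 2,
    "PCM 32-bit / 384 kHz":       384_000 * 32 * 2,
    "DSD256 (1-bit / 11.29 MHz)": 11_289_600 * 1 * 2,
}

for name, bits_per_sec in formats.items():
    mbps = bits_per_sec / 1e6
    print(f"{name:28s} {mbps:5.1f} Mbit/s  ({mbps / USB2_HS_MBPS:.1%} of the bus)")
```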

As has been pointed out, it is not optimized for minimum noise transfer between devices. It’s designed to be a reliable, inexpensive interface between digital devices. So care must be taken in the design and implementation of the server/streamer, interface cable, and DAC to minimize the effects of any electrical noise generated by the source or picked up along the way.

That doesn’t make it a bad interface. In most regards, all other digital audio interfaces have the exact same issues. It’s true that USB carries a power connection, but this can easily be ignored/dealt with by the DAC. The big advantage of USB (and Ethernet) over older digital audio interfaces (spdif/optical/AES3) is that they are asynchronous and therefore won’t introduce audio sample jitter into the mix.
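
Here's a minimal sketch of what asynchronous transfer buys you. It's a toy model, not any real DAC's firmware: the packet size and the ±300 µs of arrival slop are made-up numbers, and it leaves out the feedback endpoint that lets the host track the DAC's consumption rate (the two rates are simply assumed to match). Packets arrive from the source with plenty of timing slop, they land in a FIFO, and every output sample is clocked out by the DAC's own oscillator, so the arrival jitter never shows up in the output timing.

```python
import random
from collections import deque

SAMPLE_US = 1_000_000 / 44_100      # one tick of the DAC's local oscillator (~22.7 us)
PKT_SAMPLES = 44                    # samples delivered per packet in this toy model
PKT_US = PKT_SAMPLES * SAMPLE_US    # nominal packet interval (~997.7 us)

# Source side: each packet arrives near its nominal slot but with +/-300 us of
# timing slop -- this stands in for server / cable / "transport" jitter.
arrivals = deque(k * PKT_US + random.uniform(-300, 300) for k in range(1100))

fifo = deque([0] * PKT_SAMPLES * 4)   # a few packets of head start in the buffer
out_times, underruns = [], 0

t = 0.0
while len(out_times) < 44_100:                   # one second of playback
    while arrivals and t >= arrivals[0]:         # a jittered packet lands in the FIFO
        arrivals.popleft()
        fifo.extend([0] * PKT_SAMPLES)
    if fifo:
        fifo.popleft()
        out_times.append(t)      # each output instant is set by the DAC clock alone
    else:
        underruns += 1           # would be an audible glitch; buffering prevents it
    t += SAMPLE_US               # advance by exactly one DAC clock period

gaps = {round(b - a, 3) for a, b in zip(out_times, out_times[1:])}
print("underruns:", underruns)                   # 0: the arrival jitter was absorbed
print("distinct output intervals (us):", gaps)   # one value: the DAC clock period
```

The only way the source timing can matter in this model is if the buffer runs dry or overflows, and that's a gross failure (dropouts), not jitter.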

Sure, you can put a lot of engineering effort and cost into the digital source to reduce jitter on these older interfaces (spdif, etc.), but the interfaces themselves make it impossible to achieve results as good as the same effort/cost applied in the DAC itself, where clocks and transmission-line impedances are much easier to control.

There is certainly value in reducing the noise that is conveyed on the USB interface, since this just makes the DAC's job easier. Using a USB source device that has a good low-noise power supply and using a good cable can often help. This doesn't (or certainly shouldn't) have any effect on the actual data that arrives at the DAC, but it can reduce the amount of electrical noise that the DAC has to deal with.

As for jitter from servers and transports... Asynchronous transfer is designed to eliminate that. The DAC asks for data packets at its own pace and clocks them on down the line using its internal clocks. Any jitter from the source is therefore disregarded, so if you hear a difference using devices like those above, in theory it is not because they reduced jitter. That said, I don't doubt you hear something. I'm just saying that attributing it to jitter flies in the face of everything we know about how this stuff works.
+1.  It's quite possible that a device that purports to be a USB reclocker (any device that receives and retransmits USB is a reclocker) is reducing noise on the USB signal to an audible degree. But the notion that this is because it has less jitter on the USB connection makes no sense. 

I believe that there are still a lot of DACs on the market that sound better using legacy interfaces (spdif, toslink, aes3) but I contend that this is either because the source of these signals (transport/streamer) has a better clock than the DAC, or the DAC has a particularly poor USB implementation. 

There is no technical reason why a DAC can't be implemented with a USB interface that outperforms legacy interfaces. SPDIF, Toslink, and AES3 all have an inherent flaw in that they are prone to jitter because the clock is embedded with the data. No matter how much you spend on cables, the connectors themselves introduce impedance discontinuities that create reflections, which interfere with the waveform.
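
To see why the embedded clock is the problem, here's a toy sketch of biphase-mark coding, the line code S/PDIF and AES3 use (hugely simplified: no preambles or subframe structure, the 3 ns of edge noise is an illustrative assumption, and the "receiver" is handed the boundary edges instead of finding them with a PLL). Every bit cell starts with a transition and a '1' adds one mid-cell, so the only timing reference the receiver has is the edges themselves; smear those edges by a few nanoseconds and the recovered clock jitters by the same few nanoseconds.

```python
import random

CELL_NS = 354.0      # one bit cell at S/PDIF's 2.8224 Mbit/s biphase rate, roughly

def biphase_mark(bits):
    """Toy biphase-mark encoder: every bit cell begins with a transition, and a
    '1' adds a second transition mid-cell. Returns (edge_time_ns, is_boundary).
    There is no separate clock line -- the edges are the clock."""
    edges = []
    for i, b in enumerate(bits):
        edges.append((i * CELL_NS, True))                     # cell-boundary edge
        if b:
            edges.append((i * CELL_NS + CELL_NS / 2, False))  # mid-cell edge for a '1'
    return edges

bits = [random.randint(0, 1) for _ in range(5000)]

# Reflections and bandwidth limits smear every received edge by a few nanoseconds.
received = [(t + random.gauss(0, 3.0), boundary) for t, boundary in biphase_mark(bits)]

# A (very naive) receiver rebuilds its clock straight from the cell-boundary edges.
recovered_clock = [t for t, boundary in received if boundary]
intervals = [b - a for a, b in zip(recovered_clock, recovered_clock[1:])]

print(f"nominal cell period: {CELL_NS} ns")
print(f"recovered clock periods: {min(intervals):.1f} .. {max(intervals):.1f} ns")
# The spread around 354 ns is the edge noise showing up directly as clock jitter.
```

A real receiver's PLL filters some of that, but whatever gets through lands on the conversion clock, which is exactly what money spent on the DAC's own local clock avoids.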

I'm not saying that legacy interfaces can't deliver excellent results. But from an engineering perspective, the cost to do so exceeds (perhaps significantly) the cost to achieve similar performance from USB (or Ethernet). I think it's only a matter of time before the industry drops the legacy interfaces in favor of USB (or some future asynchronous digital transport interface).