USB signal timing goes mainstream. Just an FYI


Now, to be fair, the issue there is data rates that strain the "eye" and the receiver's ability to recognize the 1-0-1 transitions. But in the analog domain the precision of those transitions shows up as jitter, and jitter affects fully half of the Cartesian plot that is PAM (the time axis). 'later

https://www.electronicdesign.com/industrial-automation/article/21177252/kandou-11-myths-about-usb-re...
itsjustme
Properly designed DACs are, for all intents and purposes, considered "transparent": in other words, they deal with distortion and noise no matter where it comes from. Not to say there aren't still DACs used as distortion generators, with tubes and other nonsense stacked on.
I almost regret posting this. Don't take much from that article except that timing (not just bit recovery) can be a real issue, and that when the bit transition is used in part to determine fully half of the reconstructed waveform (the "digital" interface is quasi-analog in this regard), imagine how much more impactful it is for those of us who care about this backwater.
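To make that "timing is half the information" point concrete, here is a toy simulation of my own (not from the article, and the 1 ns jitter figure is purely hypothetical): sample a sine wave once with an ideal clock and once with a jittered clock, and the timing error comes out as an amplitude error in the reconstructed waveform.

```python
import math
import random

# Toy model: a 1 kHz full-scale sine sampled at 44.1 kHz, with and
# without Gaussian timing jitter on the sample clock. The jitter value
# is a hypothetical illustration, not a measured figure.
fs = 44_100.0        # sample rate, Hz
f = 1_000.0          # test tone, Hz
jitter_rms = 1e-9    # 1 ns RMS clock jitter (assumed for illustration)

random.seed(0)
errs = []
for n in range(44_100):  # one second of samples
    t_ideal = n / fs
    t_real = t_ideal + random.gauss(0.0, jitter_rms)
    # a timing error on the sample instant becomes an amplitude error
    errs.append(math.sin(2 * math.pi * f * t_real)
                - math.sin(2 * math.pi * f * t_ideal))

rms_err = math.sqrt(sum(e * e for e in errs) / len(errs))
# theory: RMS error ~ 2*pi*f*jitter_rms / sqrt(2) for a full-scale sine
print(f"RMS amplitude error: {rms_err:.2e}")
print(f"relative to sine RMS: {20 * math.log10(rms_err / 0.7071):.1f} dB")
```

With these numbers the jitter-induced error lands a bit over 100 dB below the signal; scale the frequency or the jitter up and the error scales with it, which is the whole argument in miniature.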


The article is neither directly relevant to audio nor particularly leading edge. It was simply in my morning feed and an interesting counterpoint to some tirades. Not worth seeking out. Want a better discussion of why jitter matters? See the blog at Sonogy Research (it's in there somewhere).
Man, I just clicked the link you posted and couldn't get past all the pop-ups. I had to wait ten seconds for one to go away before the next one popped up!

Sheesh
I am not going to argue with you. I sat with MPEG in the 90s and learned a great deal about what we cannot measure well. Mostly I learned that there is much I don't know, and that it is best to begin with empirical data rather than accepted rules.


I actually agree that we cannot hear distortions at -155 dB. In fact, I think we are strained at -90 dB. That said, we also seem to be very sensitive to some distortions.


I don't think you really want to hear a long explanation why, and I certainly don't feel like typing it. Correct me if I'm wrong.
I'll close with a quote from one of the pioneers of audio and sound from way back:

"If it measures good, and sounds bad, it is bad. If it sounds good, and measures bad, you've measured the wrong thing." – Daniel von Recklinghausen, Chief Engineer of H.H. Scott, to the Boston Audio Society, 1954



The deeper meaning of any of this post continues to elude me. I can, though, report that adding a 10 MHz clock to my DAC (Antelope Zodiac Platinum), connected via USB to an Innuos Zenith Mk3, results in noticeable improvements in resolution, impulse rendition and spatial imaging, presumably as a result of better timing on the USB connection as well as the actual d-to-a conversion. (This differs from USB reclocking, since it actually addresses the DAC's own clock and thereby, indirectly, the USB link in asynchronous mode.) Timing accuracy in any case has nothing to do with EMI/RFI and ground-level noise rejection by the DAC. That in turn seems to be best achieved by passive filtering and galvanic isolation. And by the way: the choice of the clock's power supply and power cord, as well as the clock cables, has a major impact on SQ.
The hearing range of humans isn't opinion. If a DAC can produce a SINAD of 115 dB, which even some $150 DACs can manage, it would be interesting to know of someone who can hear the distortion or noise of the DAC over that of the amp and speakers. That would truly be amazing. Just sayin'
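For what the dB figures being thrown around actually mean in linear terms, here is a quick sketch (the amp and speaker figures below are hypothetical round numbers I picked for comparison, not measurements of any specific gear):

```python
def db_to_ratio(db_below_full_scale):
    """Convert a level in dB below full scale to a linear amplitude ratio."""
    return 10 ** (-db_below_full_scale / 20)

# A 115 dB SINAD puts the DAC's combined noise+distortion residual here:
dac = db_to_ratio(115)       # roughly 1.8e-6 of full scale
# A very good power amp at ~0.001% THD (hypothetical figure):
amp = db_to_ratio(100)       # 1e-5
# Typical loudspeaker distortion is orders of magnitude higher, e.g. ~0.3%:
speaker = db_to_ratio(50)    # roughly 3.2e-3

print(f"DAC residual:     {dac:.2e}")
print(f"Amp residual:     {amp:.2e}")
print(f"Speaker residual: {speaker:.2e}")
```

The arithmetic is the core of the "you can't hear it" position: each 20 dB is a factor of 10 in amplitude, so the DAC's residual sits far below the downstream chain's.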
Your basic argument is "you can't hear it". This is quite different from "it makes no difference", and the audibility is debated by many, including me. I cannot explain it (and noted that above); I understand precisely what you are arguing, but neither you nor I, in fact, know what is audible. We have our opinions. Just sayin'.

Any well-designed DAC's USB interface is asynchronous and uses the DAC's own clock. Retimers are different from reclockers, and are useful for high speeds and long runs, nothing home audio needs to be concerned about. Noise and jitter are not really a problem anymore, as even well-engineered DACs for a few hundred bucks keep these well below human audibility.
I should clarify; I presume you mean the DAC as master clock, with the ability to terminate, buffer, and re-clock, which is what I do and what I believe Meitner does.
The best USB DACs for music all use asynchronous mode, which eliminates this as an issue. I could go into detail, but do your own homework. EMM does, Wavelength Audio (Gordon Rankin) does, as do many others.
You are correct, both in the fact that they do, and that it ought to eliminate any difference. And the EMM Labs unit is excellent. My prototypes also work that way*. The fly in the ointment is: ground and power noise still impact it, and I do not fully understand exactly how or why. But they do. And Ed has commented on this in the past too (I can't recall where).

At the very least, though, a noisy ground and/or power supply, without isolation, can pollute the analog power supplies in any DAC. And those power supplies are, frankly, more important than most of the issues that audiophiles debate endlessly, like specific chips.



* So far I have in fact only prototyped this part of the DAC: the USB interface, powering, isolation, clocking, and then spitting out either S/PDIF or, if possible, LRBW or I2S to existing commercial back-ends (the DAC chip itself, reconstruction filter). These things take a lot of time to optimize.
The best USB DACs for music all use asynchronous mode, which eliminates this as an issue. I could go into detail, but do your own homework. EMM does, Wavelength Audio (Gordon Rankin) does, as do many others.
Just sayin', none of that technical gibberish (and I'm a 30+ year EE in both fields) has the slightest thing to do with clock recovery and timing. The fundamentals persist regardless of format or content: the bits must be recovered, and in digital music protocols the specific timing must be recovered as well, to reconstruct fully half of the Cartesian coordinate information needed for PAM or PAM-like representations.

Over and out.
Missing the point. Don't worry about USB2 vs 3 vs 4; it's all just speed, and as the speed rises, timing becomes more critical. Re-clockers have been a debated thing in audio over USB for some time. Here's an example of the same approach being used for signal integrity in the mainstream. Nothing more, nothing less.
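The "speed rises, timing gets more critical" point is just arithmetic on the unit interval (the time slot for one bit). A sketch, using the nominal per-lane signaling rates and an assumed fixed 10 ps of jitter purely for comparison:

```python
# As the signaling rate rises, the unit interval (UI) shrinks, so a
# fixed amount of clock jitter eats a growing fraction of the timing
# budget. Rates are nominal per-lane signaling rates in bits/s.
rates = {
    "USB 2.0 High Speed": 480e6,
    "USB 3.2 Gen 1":      5e9,
    "USB 3.2 Gen 2":      10e9,
    "USB4 Gen 3":         20e9,
}
jitter_ps = 10.0  # assumed fixed jitter, for illustration only

for name, rate in rates.items():
    ui_ps = 1e12 / rate  # unit interval in picoseconds
    print(f"{name:>18}: UI = {ui_ps:7.1f} ps, "
          f"10 ps jitter = {100 * jitter_ps / ui_ps:5.1f}% of UI")
```

The same 10 ps that is under half a percent of a USB 2.0 bit slot is a fifth of a USB4 Gen 3 one, which is why retimers show up at the high end of the spec and not in home audio.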
@antigrunge2 +1

This isn't audio relevant; it's only computer related, for computer hardware and networking geeks with USB-C connectors.

The promoter companies having employees that participated in the USB4 Specification technical work group were: Apple Inc., Hewlett-Packard, Intel, Microsoft, Renesas Electronics, STMicroelectronics, and Texas Instruments.

USB4 by itself does not provide any generic data transfer mechanism or device classes like USB 3.x, but serves mostly as a way to tunnel other protocols like USB 3.2, DisplayPort, and optionally PCIe. While it does provide a native Host-to-Host protocol, as the name implies it is only available between two connected hosts; it is used to implement Host IP Networking.

This must qualify as the most obscure post in history: what exactly are you saying? I am not aware of any audio device employing USB4.