As for jitter from servers and transports: asynchronous USB transfer is designed to eliminate it. The DAC requests data packets at its own pace and clocks them out using its internal clock. Any jitter from the source is therefore irrelevant, so if you hear a difference using devices like the ones above, in theory it is not because they reduced jitter. That said, I don’t doubt you hear something. I’m just saying that attributing it to jitter flies in the face of everything we know about how this stuff works.
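To make that concrete, here's a toy simulation (purely illustrative, not real USB code) of the decoupling I'm describing: the source delivers packets with heavy timing jitter, but the "DAC" clocks samples out at fixed intervals from its own internal clock, so source-side timing variation never reaches the output.

```python
import random

def source_packets(n, nominal_us=125.0, jitter_us=20.0, seed=0):
    """Packet arrival times with heavy jitter (USB microframes are nominally 125 us)."""
    rng = random.Random(seed)
    t = 0.0
    times = []
    for _ in range(n):
        t += nominal_us + rng.uniform(-jitter_us, jitter_us)
        times.append(t)
    return times

def dac_output_times(n, dac_period_us=125.0):
    """The DAC clocks data out strictly on its own internal clock."""
    return [i * dac_period_us for i in range(1, n + 1)]

arrivals = source_packets(1000)
outputs = dac_output_times(1000)

# Interval spread: the source's intervals vary wildly, the DAC's do not.
src_intervals = [b - a for a, b in zip(arrivals, arrivals[1:])]
out_intervals = [b - a for a, b in zip(outputs, outputs[1:])]
print(max(src_intervals) - min(src_intervals))  # large spread (source jitter)
print(max(out_intervals) - min(out_intervals))  # 0.0: output timing is clean
```

As long as the buffer between the two never runs empty or overflows, the only clock that matters for conversion timing is the DAC's own.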
+1. It's quite possible that a device that purports to be a USB reclocker (any device that receives and retransmits USB is a reclocker) is reducing noise on the USB signal to an audible degree. But the notion that this is because it has less jitter on the USB connection makes no sense.
I believe there are still a lot of DACs on the market that sound better over the legacy interfaces (SPDIF, Toslink, AES3), but I contend that this is either because the source of those signals (the transport/streamer) has a better clock than the DAC, or because the DAC has a particularly poor USB implementation.
There is no technical reason why a DAC can't be implemented with a USB interface that outperforms the legacy interfaces. SPDIF, Toslink, and AES3 share an inherent flaw: they are prone to jitter because the clock is embedded with the data. No matter how much you spend on cables, the connectors themselves introduce impedance discontinuities, which create reflections that interfere with the waveform.
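For anyone curious what "the clock is embedded with the data" means, here's a minimal sketch of biphase-mark coding, the line code SPDIF and AES3 use (simplified; real frames also carry preambles and status bits). Every bit cell starts with a level transition, and a 1 adds a mid-cell transition; the receiver recovers its clock from those edge timings, which is exactly why anything that shifts an edge (noise, reflections) shifts the recovered clock.

```python
def biphase_mark_encode(bits, level=0):
    """Encode bits as a sequence of half-bit-cell levels."""
    out = []
    for b in bits:
        level ^= 1          # transition at every cell boundary (this carries the clock)
        out.append(level)
        if b:
            level ^= 1      # extra mid-cell transition encodes a 1
        out.append(level)
    return out

def biphase_mark_decode(halves):
    """A 1 is a cell whose two halves differ; a 0 is a cell that stays flat."""
    return [1 if halves[i] != halves[i + 1] else 0
            for i in range(0, len(halves), 2)]

data = [1, 0, 1, 1, 0, 0, 1]
encoded = biphase_mark_encode(data)
assert biphase_mark_decode(encoded) == data  # round-trips cleanly
```

Note that even an all-zeros stream still toggles at every cell boundary, so the signal is never free of edges and the receiver's clock is always slaved to the incoming waveform. Asynchronous USB has no equivalent coupling.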
I'm not saying that legacy interfaces can't deliver excellent results. But from an engineering perspective, the cost to do so exceeds (perhaps significantly) the cost of achieving similar performance from USB (or Ethernet). I think it's only a matter of time before the industry drops the legacy interfaces in favor of USB (or some future asynchronous digital transport interface).