Axle, an asynchronous DAC controls the timing. Data coming from the computer is placed into a buffer. Every frame, the computer adjusts the number of samples in the frame based on the buffer under/overflow feedback signal from the DAC. The DAC takes data from the buffer and writes it into the D/A converter using its own stable internal clock. Because of that, computer-side jitter doesn't even apply here. It is possible, though, that ambient or computer electrical noise can enter the DAC through the cable. USB cables carry power that is not needed and can be a source of such contamination. Ethernet is pretty much the same story - data comes without timing, in packets, so the cable should not matter, but people have reported improvements when moving to better-shielded cables. I suspect the same thing takes place: the cable picks up ambient electrical noise and injects it into the DAC, affecting the internal clock and thus jitter. Jitter converts to noise in the frequency domain.
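To make the feedback idea concrete, here is a toy simulation I sketched (my own illustration, not any real driver or DAC firmware): the DAC drains the buffer at its own fixed clock rate, and the computer nudges the number of samples it sends each frame up or down based on the reported buffer fill. The numbers (frame size, target fill, clock mismatch) are made up for illustration.

```python
# Toy model of asynchronous-mode rate feedback (illustrative only).
# The DAC consumes samples at its own stable clock; the computer
# adds or drops one sample per frame to keep the buffer near a
# target fill level, so the DAC clock - not the computer - rules.

def run(frames=1000, nominal=44, dac_rate=44.09, target_fill=220):
    buffer_fill = float(target_fill)   # samples currently buffered
    samples_per_frame = nominal        # what the computer sends next
    fills = []
    for _ in range(frames):
        buffer_fill += samples_per_frame  # computer delivers a frame
        buffer_fill -= dac_rate           # DAC drains at its stable clock
        # Feedback: send one extra/fewer sample depending on drift.
        if buffer_fill < target_fill:
            samples_per_frame = nominal + 1
        elif buffer_fill > target_fill:
            samples_per_frame = nominal - 1
        else:
            samples_per_frame = nominal
        fills.append(buffer_fill)
    return fills

fills = run()
# Despite the clock-rate mismatch, the fill level hovers around the
# target, so no samples are ever dropped or repeated.
print(min(fills[100:]), max(fills[100:]))
```

The point of the sketch: the computer's (jittery) delivery timing only has to be right on average; the conversion instants are set entirely by the DAC's local clock.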
I don't have a USB DAC, so my observations are only theoretical. I assumed the DAC is asynchronous. Synchronous DACs, where the computer controls the timing, are supposed to be pretty bad, since the computer's clock is very jittery.
The main problem is jitter. Clock jitter is a form of signal modulation (similar to FM) that produces sidebands. These sidebands are at a very low level but are still quite audible, since they are not harmonically related to the root frequency. With many frequencies present (music), jitter produces a lot of sidebands, resulting in noise that is proportional to the signal level and hence undetectable without a signal.
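You can see the sideband mechanism numerically. Here is a small sketch (my own toy model, not a measurement of any real DAC): sample a 1 kHz tone at instants that wobble sinusoidally at 100 Hz with a few nanoseconds of amplitude, then look at the spectrum. Sidebands appear at 1 kHz plus/minus 100 Hz, at a level set by the jitter amplitude, and they are not harmonically related to the tone.

```python
import numpy as np

# Toy demonstration of jitter-induced sidebands (illustrative numbers).
fs = 48000                         # sample rate, Hz
n = np.arange(fs)                  # 1 second of samples -> 1 Hz FFT bins
tone = 1000.0                      # test tone frequency, Hz
fj, tj = 100.0, 5e-9               # jitter frequency (Hz) and amplitude (s)

t_ideal = n / fs
# Sampling instants wobble around their ideal positions:
t_jittered = t_ideal + tj * np.sin(2 * np.pi * fj * t_ideal)
x = np.sin(2 * np.pi * tone * t_jittered)

spec = np.abs(np.fft.rfft(x * np.hanning(len(x)))) / len(x)
db = 20 * np.log10(spec + 1e-20)

# Bin index equals frequency in Hz here; sidebands sit at tone +/- fj,
# roughly beta/2 below the tone where beta = 2*pi*tone*tj (about -96 dBc
# for these numbers) - far below the tone, but not harmonically related.
print("tone           %.1f dB" % db[1000])
print("upper sideband %.1f dB" % db[1100])
print("lower sideband %.1f dB" % db[900])
```

Doubling the jitter amplitude `tj` raises the sidebands by about 6 dB, and raising the tone frequency raises them too, which matches the "noise proportional to signal" behavior described above.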