People assign analog attributes to digital cables: warmer, colder, brighter, etc. I just wonder which bits are added, changed, or dropped entirely when the sound turns warmer or brighter? If bits were dropped or added, the result would likely be nasty noises (unless corrected by the error-correction mechanism at the receiving end), not a subtle shift in tonal character.

Try transferring a high-resolution camera picture over a reasonably priced ($15) USB cable and look at the result. Any distortion? Hardly ever. Or transfer a large software package to your computer via a USB drive. Any errors? Hardly ever. So why would you need such a special USB cable for music?

I agree that if an S/PDIF interface is used, the receiving DAC can introduce distortion when jitter is high, so there the cable (as well as the source equipment) has to be of high quality to minimize that. However, with asynchronous USB there is no jitter induced by the interface (no PLL is needed), so the quality of the cable is not critical as long as it is well shielded against noise.

I've tried many USB cables in my system, expensive and cheap, and could never detect differences (and the golden-eared audiophiles listening with me agreed). So if you think you detect differences in your system, it is likely a psychological effect IMO. Needless to say, the cable industry will not agree.
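For anyone who wants to check the "bits are bits" claim themselves, here is a minimal sketch (my own illustration, not from the post above) that compares SHA-256 hashes of a file before and after copying it over a USB connection; the file names are just placeholders:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: the original file and the same file after a USB transfer.
original = sha256_of("photo_original.raw")
copied   = sha256_of("photo_copied_via_usb.raw")

# Matching digests mean every bit survived the transfer unchanged --
# there is nothing left for the cable to make "warmer" or "brighter".
print("bit-perfect" if original == copied else "transfer corrupted")
```

If the cheap cable and the expensive one both produce identical hashes, they delivered exactly the same bits.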