@thyname I have 50 years of experience in audio: professional, home theater, and 2-channel. I am formally trained in architectural acoustics, and several of my professional speaker designs have been commercial successes. I also have 40 years of experience with computers and digital audio, including 20 years as a principal network architect for AT&T.
I have never said analog cables do not make a difference, but the claims made for digital cables simply have no plausible or viable basis. The nature of digital signal transmission and the protocols used do not support them. Whatever audible differences may exist arise in the ADC/DAC process (quantization error, sample rates), in filter algorithms, or from improperly filtered noise on the analog outputs. None are attributable to the digital data transmission itself. A quick study of the layered model of data communications, like the 4-layer TCP/IP model, shows why this is the case.
The TCP/IP reference model has four layers:
4) Application Layer - Formats messages, provides User Interface, and App Services
3) Transport Layer or Host to Host Layer - Ensures data delivery and sequencing
2) Internet Layer - Provides addressing and routing
1) Network Access Layer or Link Layer - Provides physical connectivity and transport of raw bits.
Layer 1 - The Link Layer is where the physical connection lives, be it copper, glass, or radio. Layer One serves only to transport raw data bits: low voltages (0s) and high voltages (1s), or in the case of fiber, relative light and dark levels. This is where all digital cables operate, and it is the only physical component. Everything above this is software. The actual data rides above Layer One, in the case of Ethernet and USB, in packets.
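To make the "data rides in packets" point concrete, here's a minimal Python sketch (my own illustration, not any specific protocol's exact framing) of a payload carried with a CRC-32 trailer. Flip even a single bit in transit and the check fails, so the receiver knows the packet is bad and asks for it again, rather than passing along subtly "degraded" audio:

```python
import zlib

def make_packet(payload: bytes) -> bytes:
    # Append a CRC-32 of the payload, as real link-layer framing does
    crc = zlib.crc32(payload)
    return payload + crc.to_bytes(4, "big")

def check_packet(packet: bytes) -> bool:
    # Recompute the CRC and compare against the transmitted trailer
    payload, crc = packet[:-4], int.from_bytes(packet[-4:], "big")
    return zlib.crc32(payload) == crc

audio_chunk = bytes(range(32))          # pretend these are PCM sample bytes
pkt = make_packet(audio_chunk)
print(check_packet(pkt))                # intact packet passes -> True

corrupted = bytearray(pkt)
corrupted[5] ^= 0x01                    # flip one bit "on the wire"
print(check_packet(bytes(corrupted)))   # corruption is detected -> False
```

The point: a cable either delivers the packet intact, or the error is detected and handled. There is no mechanism for it to deliver the bits "with less air" or "a wider soundstage."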
Transmitted signal levels are 0.0–0.3 V for a logical low and 2.8–3.6 V for a logical high. This means that any 'noise' below 300 mV is simply not recognized. Any noise above that, and that would be a VERY noisy circuit, would trigger a CRC or parity error and, for TCP/IP, a packet retransmission. It's also easy to measure with an oscilloscope.
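The noise-margin argument is easy to demonstrate numerically. Here's a toy receiver (my own sketch, using the 0.3 V / 2.8 V decision thresholds above and a nominal 3.3 V high level) that slices each received voltage back into a bit. Add a quarter volt of random noise to every single sample and the recovered bits are still identical to what was sent:

```python
import random

V_LOW, V_HIGH = 0.0, 3.3      # nominal transmit levels
TH_LOW, TH_HIGH = 0.3, 2.8    # receiver decision thresholds

def transmit(bits):
    return [V_HIGH if b else V_LOW for b in bits]

def receive(voltages):
    # At or below 0.3 V is a 0, at or above 2.8 V is a 1;
    # anything in between is an invalid level caught by error detection.
    out = []
    for v in voltages:
        if v <= TH_LOW:
            out.append(0)
        elif v >= TH_HIGH:
            out.append(1)
        else:
            raise ValueError("indeterminate level -> CRC error, retransmit")
    return out

random.seed(1)
sent = [random.randint(0, 1) for _ in range(10_000)]
noisy = [v + random.uniform(-0.25, 0.25) for v in transmit(sent)]  # 250 mV noise
print(receive(noisy) == sent)   # -> True: bit-perfect despite the noise
```

Until the noise actually crosses the threshold, the output is not "a little worse": it is bit-for-bit identical. That is the whole point of digital transmission.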
You should also be aware that USB 3.0 (the standard since 2008) and above operate at 5 Gbit/s, and 10 Gbit/s for 3.1 — hundreds to thousands of times faster than even the most aggressive audio requirements. Discussions of isochronicity in relation to audio signals are academic at best for any realistic implementation. So if you want to run a 100 m or 200 m USB cable, yeah, maybe expect issues. At 5 m or less, it's just not a big deal, not even a little deal.
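The bandwidth arithmetic is easy to check yourself. A back-of-the-envelope sketch, using raw uncompressed stereo PCM rates (bits per sample × sample rate × channels):

```python
# Raw bit rates for uncompressed stereo PCM
cd_audio = 16 * 44_100 * 2     # CD audio: ~1.4 Mbit/s
hires    = 32 * 384_000 * 2    # 32-bit/384 kHz: ~24.6 Mbit/s, about as
                               # aggressive as any real-world audio stream gets

usb3 = 5_000_000_000           # USB 3.0 signaling rate, bit/s

print(f"USB 3.0 vs CD audio: {usb3 / cd_audio:,.0f}x headroom")   # 3,543x
print(f"USB 3.0 vs 32/384:   {usb3 / hires:,.0f}x headroom")      # 203x
```

Even before protocol overhead, the link has orders of magnitude more capacity than the audio needs, which is why timing and throughput hand-wringing over short consumer cables is academic.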
You can listen all you want and perceive all the differences you want, but they aren’t in the digital cable.