Why does my DAC sound so much better after upgrading my digital S/PDIF cable?


I like my Playback Designs MPS-5 SACD/CD player, but I also use it as a DAC so that I can use my OPPO as a transport to play 24/96 and other high-res files I burn to DVD-Audio discs.

I had been using a Nordost Silver Shadow digital S/PDIF cable between the transport and my DAC, as I felt it was more transparent and had better treble than a higher-priced AudioQuest digital cable a dealer had me audition.

I recently received the new Synergistic Research Galileo SX UEF digital cable. Immediately I recognized that I was hearing far better bass, soundstage, and instrument separation than I had ever heard with high-res files (non-SACD).

While I am obviously impressed with this high-end digital cable and strongly encourage others to audition it, I am puzzled as to how the cable carrying digital information from my transport to my DAC makes such a big difference.

The DAC takes the digital information and shapes the sound, so why should the cable feeding it that information be so important? I would think any competently built digital cable would be adequate. I understand that the cables from the DAC to the preamp and from the preamp to the amp matter, but I would have thought the cable to the DAC would be much less important.

I will now experiment to see whether using the external transport to send Red Book CD data to my Playback Designs MPS-5 sounds better than using the transport inside the MPS-5 itself.

The MPS-5 sounds pretty great for CD and awesome with SACD, so I doubt an external transport will be an improvement for Red Book CDs.


karmapolice
I love threads like this. People telling you what you can and cannot hear, even though they have never listened to your system. And then they quote a test that they didn't hear either. That is followed by the phrase "Placebo Effect." So when you throw out that rationale, it's called the "Bullshit Effect." Sometimes that's followed by the "Butthurt Effect," which leads to the "Get Even Effect." Think I'll go outside and play...
As one person noted, a digital cable actually carries an analog signal. However, the magic is in the software. When a digital signal is sent, it is sent with what are called stop bits and a checksum. The hardware at the other end recalculates these values and compares them. If they don't match, it requests the sender to re-send. This way it is VERY rare, and I mean VERY rare, for an incorrect packet to get past this protocol.

At both ends the data is buffered (stored) to accommodate a fair number of error/resend cycles. After all, the transmission speed is MUCH higher than required for high-definition audio or even HD video. If you have a LOT of errors, then you will run out of buffered data and get "skipping" or some other sort of artifact.
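The check-and-resend mechanism this post describes can be sketched generically. This is a minimal illustration of checksummed block transfer in the abstract (the function names are mine, and this is not a claim about how S/PDIF itself frames data):

```python
import hashlib

# Hypothetical sketch of checksummed transfer: the sender appends a digest,
# the receiver recomputes it, and a mismatch flags the block for a resend.
def frame(payload: bytes) -> bytes:
    return payload + hashlib.sha256(payload).digest()

def unframe(data: bytes):
    payload, digest = data[:-32], data[-32:]
    if hashlib.sha256(payload).digest() != digest:
        return None  # corrupted in transit: request a resend
    return payload

good = frame(b"one block of audio samples")
assert unframe(good) == b"one block of audio samples"

bad = bytes([good[0] ^ 0xFF]) + good[1:]  # flip bits in transit
assert unframe(bad) is None  # corruption detected
```

Note that this models bidirectional, packet-style protocols; the later replies in the thread turn on whether the real transport behaves this way.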

If you find that cables made a difference, then you either had defective cables or ones that were insufficiently shielded, allowing enough errors to empty the buffer.

We are not talking about dropping bytes or getting bit-errors here. This is about timing inaccuracies. The timing of the digital signal must be extremely accurate, from word to word, in order for the D/A to reproduce a low-distortion waveform.

Steve N.

Empirical Audio
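The timing point above can be sketched numerically. This is my illustration, not anything posted in the thread: even with bit-perfect sample values, if the clock edges at which conversion happens are displaced by jitter, the reconstructed waveform deviates from the ideal one, and the error grows with the amount of jitter:

```python
import numpy as np

fs = 48_000          # sample rate, Hz
f = 10_000           # test tone, Hz (jitter error scales with signal slope)
n = np.arange(8192)
t_ideal = n / fs     # perfectly regular clock edges
rng = np.random.default_rng(0)

def rms_error(jitter_rms_seconds):
    # Sample the "analog" sine at clock edges displaced by Gaussian jitter
    # and compare against sampling at the ideal, jitter-free edges.
    t_jittered = t_ideal + rng.normal(0.0, jitter_rms_seconds, size=n.size)
    err = np.sin(2 * np.pi * f * t_jittered) - np.sin(2 * np.pi * f * t_ideal)
    return np.sqrt(np.mean(err ** 2))

e_small = rms_error(100e-12)  # ~100 ps RMS: a clean clock
e_large = rms_error(10e-9)    # 10 ns RMS: a sloppy clock
# e_large is roughly 100x e_small: error scales ~linearly with jitter here
```

For small displacements the RMS error is approximately 2*pi*f*sigma, which is why jitter matters more at high frequencies and why buffering alone (which only protects the data values) does not remove it.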


rocknss wrote (11-24-2018, 5:34pm):
"Are there any studies showing the cable improvements by placing an instrumentation microphone at the listening position?"
I am not aware of such results, but I too strongly believe it would make things a lot clearer for everyone if there were a way to identify a type of "distortion" related to jitter, whose amplitude could be measured from the resulting acoustic signal and compared across different components.
In the end, audibility is not voodoo or placebo; it refers to the sensitivity of our sensory apparatus and processing abilities, which have finite bandwidths and thresholds for the auditory illusion to occur when listening to sound reproduced by a stereophonic audio system.
In my experience, reducing jitter in digital audio systems lets us experience reproduced music in a way that more closely resembles the output of a turntable (whatever the words to describe this subjective effect are).
As is the case with nearly all threads here, contributors come in a variety of forms: confirmation-bias seekers, naysayers, impartialists, fiction writers, humorists, trolls, truth-seekers, and educators.

@audioengr should be recognized as a patient saint-educator on this one. 
We are not talking about dropping bytes or getting bit-errors here. This is about timing inaccuracies. The timing of the digital signal must be extremely accurate, from word to word, in order for the D/A to reproduce a low-distortion waveform.

How does timing affect things if there is buffering? I still don't quite get this.