Why does my DAC sound so much better after upgrading my digital S/PDIF cable?
I like my Playback Designs MPS-5 SACD/CD player, but I also use it as a DAC so that I can use my OPPO as a transport to play 24/96 and other high-res files I burn to DVD-Audio discs.
I was using a Nordost Silver Shadow digital S/PDIF cable between the transport and my DAC, as I felt it was more transparent and had better treble than a higher-priced AudioQuest digital cable a dealer had me audition.
I recently received the new Synergistic Research Galileo SX UEF digital cable. Immediately I recognized that I was hearing far better bass, soundstage, and instrument separation than I had ever heard with high-res files (non-SACD).
While I am obviously impressed with this high-end digital cable and strongly encourage others to audition it, I am puzzled how the cable transporting digital information from my transport to my DAC makes such a big difference.
The DAC takes the digital information and shapes the sound, so why should the cable feeding it the info be so important? I would think any competently built digital cable would be adequate. I get that the cables from the DAC to the preamp and from the preamp to the amp matter, but I would think the cable to the DAC would be much less important.
I will now experiment to see whether using the external transport to send Red Book CD files to my Playback Designs MPS-5 sounds better than using the transport inside the MPS-5 itself.
The MPS-5 sounds pretty great for CD and awesome with SACD, so I doubt an external transport will be an improvement for Red Book CDs.
I don’t have a golden ear, that is very true. However, not a single person tested professionally/scientifically has been able to hear jitter below a nanosecond, so to say that 0.022ns (22psec) of jitter is audible just isn’t true.
Jitter introduced by Toslink is not an issue, even without a DAC reducing it. With a DAC reducing it, the problem is non-existent: when doing a J-Test (pretty much the worst-case scenario for jitter), the $100 AudioQuest DragonFly Black kept jitter artifacts no worse than -105dBFS, meaning you could play back from 35dB (the average noise floor in a treated room) up to 140dB and have no audible jitter present.
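For anyone curious what a J-Test actually is: it is a stimulus designed to provoke worst-case data-correlated jitter, an Fs/4 tone with a low-rate square wave toggling the LSB underneath it. Here is a minimal sketch following the commonly published 48 kHz / 16-bit description; the tone level is an assumption, as published variants differ:

```python
import numpy as np

# Minimal sketch of a Dunn-style J-Test stimulus: an undithered sine at
# Fs/4 plus a square wave toggling one 16-bit LSB at Fs/192.
FS = 48_000                                          # J-Test is usually given at 48 kHz
n = np.arange(FS)                                    # one second of samples

tone = 0.5 * np.sin(2 * np.pi * (FS / 4) * n / FS)   # Fs/4 sine (level assumed)
lsb = 1.0 / 2**15                                    # one 16-bit LSB
square = lsb * (1 - 2 * ((n // 96) % 2))             # Fs/192 square (period = 192 samples)

jtest = tone + square
# Play jtest through the DAC and FFT the analog output: data-correlated
# jitter shows up as symmetric sidebands around the Fs/4 tone.
```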
Forgot to add: in order for any jitter to even register at 16-bit (not factoring in jitter reduction by the DAC or dithering), it needs to exceed 346psec ((1,000,000,000,000/44,100)/2^16).
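A quick sanity check of that arithmetic; this is just the figure from the post recomputed, nothing more:

```python
# One 44.1 kHz sample period divided into 2**16 steps: the "one LSB worth"
# of timing error behind the 346 ps figure quoted above.
SAMPLE_RATE_HZ = 44_100
BITS = 16

period_ps = 1e12 / SAMPLE_RATE_HZ        # sample period in picoseconds (~22,676,000 ps)
threshold_ps = period_ps / (2 ** BITS)   # ~346 ps

print(f"Jitter threshold for {BITS}-bit at {SAMPLE_RATE_HZ} Hz: {threshold_ps:.0f} ps")
```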
So, since anything under 346psec literally cannot register at 16-bit, without reduction or dither, the claim that going from 22psec to 7psec was drastically audible should now be clearly evident as bogus (placebo).
And yes, there is no benefit from going higher than 16-bit, which can cover, say, 35dB to 126dB without dither and 25dB to 145dB with noise-shaped dither.
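Those dynamic-range figures line up with the usual quantization-SNR rule of thumb, sketched below; the extra headroom credited to noise-shaped dither is the commonly cited range, not a measurement of any particular DAC:

```python
# Quantization SNR for an N-bit PCM sine: SNR ~= 6.02*N + 1.76 dB.
def pcm_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit: ~{pcm_snr_db(bits):.0f} dB theoretical SNR")
# 16-bit: ~98 dB; noise-shaped dither adds roughly 15-20 dB of perceived
# dynamic range in the band where hearing is most sensitive.
```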
+1, @nonoise, for acute perception of the discourse here. -1, @audioengr, for making a stupid political analogy that adversely affects my opinion of you. -2, @mzkmxcv, for being a nerdy troll here.
Question for you: do you purchase everything based on specs alone, or do you trust your senses? If the former, then you are definitely not only in the minority but most likely not getting your best bang for the buck. If the latter (which sounds unlikely), then you need to get your hearing checked.
If specs show me that any further reduction in jitter would be inaudible, then I wouldn’t spend time or money looking at a re-clocker or anything, as it’s not merely theoretical: it is actually impossible for there to be any audible improvement if the current jitter is below the noise floor.
For speakers, say, I’ll look for ones that measure close to ideal and then try to demo them (if my room is overly reflective or narrow, I wouldn’t want a speaker with a super-wide soundstage, for instance, as that would cause too many reflections; the same goes for a narrower soundstage in a wide room).
It’s not a question of whether jitter, per se, is inaudible. The question is whether jitter manifests itself in the audio reproduction of a digital source in ways that are audible in terms of the dimensionality and realism of the instruments and vocalists localized therein. Countless companies, as confirmed by critical listening sessions, have demonstrated that reducing jitter to psec levels in digital audio streams greatly improves the sound quality in high-end audio systems. This is largely incontrovertible.
“I wouldn’t want a speaker with a super wide soundstage for instance, as that would cause too many reflections...”
The above statement clearly shows we’re dealing with a newbie to high-end sound reproduction. I take back my comment about his being a nerdy troll. My apologies.
Stereophile has several publications that explain the nuances of sound reproduction in high-end audio systems. I recommend mzkmxcv check out those sources.
I have shown what psec level of jitter is actually needed before it can register, and it’s already >300psec for 16-bit.
Also, no, jitter doesn’t alter the realism of instruments/vocals; the type of jitter we are talking about manifests as a rise in the noise floor, or as low-level distortion-like peaks. Meaning, it’s like a bit of static added in. And again, this whole conversation was about how a new Toslink cable supposedly made a drastic reduction in jitter. Keep in mind how I showed that even the budget-ish DragonFly DAC can reduce even immense jitter to well below the noise floor anyway.
For the last 13 years I've had a media PC hooked up to my TV and Denon surround amp. I'm using a TOSLINK cable from the PC motherboard to the amp. So this is a red light flashing on and off on the motherboard, going down about 1m of some sort of glass tube (whatever an optical cable is made of), with the amp reading the pulses of light. I only play MP3s, and it's mainly dance/pop stuff (so nothing highbrow); it all sounds truly amazing. I've never heard music sound better than it does in my house.
Am I to understand that the cable is adding jitter to a flashing pulse of light, and that the digital info being received by the amp is not totally correct? Or is jitter to do with extra devices, connections, and distance between the source and the amp?
The only thing that I have ever found that affects the quality is to use some MP3 gain software that stops MP3 files from clipping. Before lowering the gain to remove clipping they sound harsh; get rid of the clipping and they are sweet as anything. I do find that any YouTube or Spotify stuff directly over the internet generally tends to sound a bit rough in comparison. I assume it's down to the gain, and that internet stuff is designed to sound good through phones and laptop speakers, not a couple of thousand quid of amp and speakers.
I’ve got a comparably modest digital source setup:
Allo DigiOne (RPi) to Schiit Modi 2 Uber to NAD C316BEE to Tekton Lore speakers.
Did an A/B test with each cable upgrade: no-name cheapo RCA analog interconnect to Zu Audio Mission from DAC to amp = huge improvement.
Then the same with digital RCA: Monoprice 1.5 meter to Amplifier Surgery 1.5 meter from the DigiOne RPi to the DAC = huge improvement.
In both cases, or at each stage, I was beyond satisfied, to the point of giddy, weird smiles that kind of worried my wife (however, she also noticed the improvements). The degree of improved detail and stage was more than palpable.
Just wanted to share that it was even noticed with a far lesser setup than most possess here. Happy Thanksgiving! 🍁
Maybe the paper referenced below will shed some light on the complexity of jitter for those who believe that a simple value in some clock or DAC datasheet can trump this effect for good. The audibility of some forms of jitter in DACs and ADCs has been investigated, but I believe that the improvements most of us hear when jitter is reduced tend to indicate that audibility thresholds are not so easy to define and depend greatly not only on the technology used inside the chips, but also on the circuitry implemented around those chips (e.g. power supply management). My 2 cts worth, /patrick
1. The power of suggestion.
2. Noise: one cable may be better at rejecting noise than another.
3. You had a dirty plug and swapping cables cleaned it out.
4. Better impedance matching; not all cables are up to spec (see the sketch below).
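On point 4: S/PDIF coax is nominally a 75-ohm system, and a cable whose characteristic impedance misses that spec reflects part of the signal at each interface. Here is a minimal sketch of the standard transmission-line arithmetic; the 68-ohm cable value is just an assumed example:

```python
import math

# Voltage reflection coefficient and return loss for an impedance mismatch.
Z0 = 75.0   # nominal S/PDIF system impedance, ohms
Zc = 68.0   # assumed out-of-spec cable impedance, ohms

gamma = (Zc - Z0) / (Zc + Z0)                    # reflection coefficient
return_loss_db = -20 * math.log10(abs(gamma))

print(f"Reflection coefficient: {gamma:+.3f}")   # about -0.049
print(f"Return loss: {return_loss_db:.1f} dB")   # about 26 dB
# Reflections arriving back at the receiver's slicer can shift edge timing,
# one mechanism by which an out-of-spec cable can influence jitter.
```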
While lots of people pooh-pooh optical cables, they are ideal for PC outputs, as they utterly reject ground loops as well as EMI/RFI noise.
One hidden issue in digital transmission is ground loops, which can and do occur and can raise jitter levels high enough to cause audible degradation. In an ideal world, all DACs are isolated from this, but not all are. It is also made much worse by switching power supplies like a laptop's wall wart or a PC's. Sadly, DACs do not have any indicator for this. There's no light that says "ground loop" or "excessive jitter," and honestly I'd feel a little comforted if there were.
There's no reason to believe that any one coaxial cable would be better than another at reducing ground loops, though; my comment was just about how and why optical can be far superior.
A thought: who praises the significant, if not impressive, improvements in sound quality that can be achieved by buying very expensive "high-end" cables?
Two groups. One: those who manufacture, distribute, and sell these products at a serious profit. Two: those who were talked into drinking the Kool-Aid and would NEVER fess up to being fleeced. Actually, there is a third group: consumers who gulped the Kool-Aid and are victims of the insidious audio placebo effect that causes you to believe you hear the advertised, albeit impossible, sound-quality enhancements.
Since virtually no consumer does, or has the capability to perform, an instantaneous A/B test (the only kind that is really valid) on the old/new equipment, his (now biased) expectation will allow him the pleasure of experiencing superior performance, even when there is none.
@dynaquest4: very quick A/B switching lets you pick up changes, but it may be advisable to validate that by switching again after longer periods of time, typically several days, with interruptions and sleep to get a fresh ear on it.
Another thought: if heavily processed electronica music is used to optimize your system, you will never end up with an accurate result. It is sometimes shocking to realize that loved albums are really not that good...
Good post, dyna. There is actually a third group, of which I was one. When I started in this hobby I bought into all the cabling nonsense, but as I've learned more about how things work, I now realize 90% of it is BS. I'm not sure why people find it so difficult to admit they made mistakes.
I cue everything up so I can switch between the CD and the Sonos source playing the same track. The added clarity and detail via Steve's rig is easy, obvious to hear. It just sounds better.
I find that Tidal HiFi sounds different than CD. I've grown to prefer it. But I would think having an all-in-one CD player would minimize jitter. I had a modified Sonos Connect streaming Tidal, and I could never tame the splashy treble. It was running on a switching power supply.
Yet again, superhumans have taken over this site, claiming they can hear electrons grinding against each other as they race down a wire trying to reach the other end :). No need to argue with super beings. They will never admit that spending thousands on a DIGITAL cable is a waste of money. It would make them sub-superhuman otherwise.
What a bunch of fools a whole bunch of posters are. Is it arrogance maybe?
Denying others’ experiences smacks of ’holier than thou.’ "I don’t experience what you do, so I must be right." 99% of the time though, it’s "I haven’t heard the particular equipment but I’m going to express my biased opinion anyway."
Placebo is a word that is thrown around whenever anyone expresses their subjective experience of equipment, upgrades, or tweaks. It's used more as an accusation, blatantly questioning intelligence.
This year I have upgraded speakers, cables, DAC/Power supply, fuses, power cables, mains filters, and a few other pieces/items. On most of them, I have noticed a change in my musical experience for the good.
People who then state as fact that my experiences are false are, in my mind, shallow, judgemental, and not worth taking seriously.
The term “placebo” is often used by those who actually know the difference between two or more substances, of which one causes a “non-effect” and another that causes an “effect.”
It seems to me that one cannot have a placebo unless one acknowledges something actually does bring about an “effect.” So one must first acknowledge the existence of a cable, for example, as having a bona fide change in SQ, before a placebo can even exist.
So it’s more than arrogance. It’s stupidity to suggest another’s experience is false or that another is being deceived when the poster typically has no experience with the product(s) in question, let alone the context in which the product(s) resides.
And there are others who base their arguments on test results that they would likely admit they don't fully comprehend well enough to critically adjudge the legitimacy of the test construct. Even more amazing are those who believe that not hearing such differences when listening for them on an iPhone proves their point as universally applicable to all audio systems... well, don't get me started.
Including a freebie one, an intentionally poorly made one, an all-glass one, etc.
Guess what? They are 99.9% identical, with jitter all below -100dBFS. So yeah, I stand by my claim all the more: the poster simply fell prone to placebo.
Worthless measurements. Why? Because they are woefully insufficient to characterize the system, much less the jitter difference.
Most classical analog measurements are insufficient to show small differences in dynamics or soundstage. This is a fact supported by hundreds of reviews in Stereophile where the measurements were very poor and yet the review with music was stellar.
The ONLY accurate way to make jitter measurements on a digital source is to do it directly, not through a DAC or analog system. This requires a 5-10GHz B/W measurement system, not an AP.
Am I to understand that the cable is adding jitter to a flashing pulse of light, and that the digital info being received by the amp is not totally correct? Or is jitter to do with extra devices, connections, and distance between the source and the amp?
The info is correct, but the timing of the info is not optimum. Everything adds a little to the jitter: the optical-to-electrical converters, the cable, and every active device inside the components. If the cable delivers a less-than-optimum signal, this will affect jitter because the receiver will see a slower risetime in response to the optical signal transitioning states.
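To put rough numbers on that mechanism: with a finite rise time, any voltage noise at the receiver's decision threshold converts into timing error at roughly sigma_t = sigma_v / slew rate. A toy calculation under assumed values follows; none of these numbers describe any particular receiver:

```python
# Noise-to-jitter conversion at a receiver's threshold crossing:
# timing error ~= voltage noise / slew rate of the edge.
swing_v = 0.5         # assumed signal swing at the receiver, volts
rise_time_s = 10e-9   # assumed 10%-90% rise time: 10 ns (a slow edge)
noise_v_rms = 5e-3    # assumed 5 mV RMS noise at the threshold

slew = 0.8 * swing_v / rise_time_s    # V/s across the 10%-90% window
jitter_s_rms = noise_v_rms / slew

print(f"Slew rate:  {slew / 1e6:.0f} V/us")
print(f"RMS jitter: {jitter_s_rms * 1e12:.0f} ps")   # ~125 ps here
# Halving the rise time halves the jitter from this mechanism, which is
# one way a cable's bandwidth can matter even though the bits arrive intact.
```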
Why don't you just try this excellent inexpensive cable and hear the difference:
Maybe the paper referenced below will shed some light on the complexity of jitter for those who believe that a simple value in some clock or DAC datasheet can trump this effect for good. The audibility of some forms of jitter in DACs and ADCs has been investigated, but I believe that the improvements most of us hear when jitter is reduced tend to indicate that audibility thresholds are not so easy to define and depend greatly not only on the technology used inside the chips, but also on the circuitry implemented around those chips (e.g. power supply management). My 2 cts worth, /patrick
"It follows that specs such as "Jitter 200 ps RMS" are
practically meaningless. Jitter specs should always
identify what measure of jitter they are referring to,
as in "Period jitter 200 ps RMS" for example."
All of my jitter measurements are direct and of the period.
"Period jitter was introduced in section 3.1.2. Unlike
wideband jitter and baseband jitter, it can be measured
directly in the time domain, i.e. without filter hardware.
You simply use a scope, and examine the waveform
one period after the trigger point. Many scopes can plot
period jitter histograms and extract RMS values."
I do not agree with this, however:
"We saw in section 3.1.2 that period jitter is entirely
appropriate for some purposes. We see here that it is
entirely inappropriate as a general measure [14]. This
is because it is basically blind to low-frequency jitter."
This depends on the measurement system and how it measures the jitter. Mine measures the jitter of the data, not the clock, so it factors in the fact that the period changes. It selects one period and locks onto this.
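For readers following along, here is what a scope-style period-jitter measurement boils down to: collect edge times, difference them into periods, and take the RMS deviation. A minimal synthetic sketch follows; the clock frequency and injected jitter are assumed values, not anyone's measurement:

```python
import numpy as np

# Scope-style period jitter on synthetic edge times.
rng = np.random.default_rng(0)
f0 = 6.144e6               # assumed nominal clock frequency, Hz
jitter_rms = 50e-12        # assumed per-edge random jitter, 50 ps RMS
n_edges = 10_000

ideal = np.arange(n_edges) / f0                        # ideal edge times, s
edges = ideal + rng.normal(0.0, jitter_rms, n_edges)   # jittered edge times

periods = np.diff(edges)
print(f"Mean period:   {periods.mean() * 1e9:.3f} ns")
print(f"Period jitter: {periods.std() * 1e12:.1f} ps RMS")
# Note: differencing two independently jittered edges inflates the RMS by
# sqrt(2) (~71 ps here), which is exactly why a bare "200 ps RMS" spec is
# meaningless without saying what was measured.
```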
"it can be useful to make N-period jitter
measurements with very large N. Modern digital scopes
are excellent for such measurements."
I do not believe my measurement system can do this easily, but it is important.
"A key point is that it is not just the basic audio signal
that gets modulated. It is everything that crosses the
boundary between the continuous-time domain and
sampled-signal domain. This can include out-of-band
interference (in ADCs), incompletely attenuated images
(in DACs), and "zero-input" internal signals such as
shaped quantization noise and class-D carriers."
"Even low-level components can
cause problems if they are up at high frequencies."
"Jitter bites equipment designers most deeply when it
causes a converter that should have more than 100 dB of
dynamic range to deliver e.g. only 80 dB. In such cases
the jitter is interacting not with the audio signal but with
an internal signal such as shaped quantization noise.
Early one-bit DACs were particularly sensitive to this.
More-recently the inclusion of switched-capacitor filters
and the move to multi-bit designs has eased things.
Above ~200 kHz, the quantization noise is largely white
at its point of injection. When you factor in the DAC's
sin(x)/x frequency response and the effect of the internal
switched-capacitor filter stage, its spectrum becomes
more like the upper trace in figure 10 (taken from [17]).
By applying the already-mentioned 6dB/octave tilt,
one can estimate the region of greatest jitter sensitivity.
It is typically somewhere around ~0.5 or ~1 MHz for
DACs that use high-order noise shaping."
"The jitter performance differences that we have seen
relate entirely to signal components that are above the
audio band."
So as you can see, the DAC itself is sensitive to jitter that is way out of the audio band.
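A common rule of thumb makes the out-of-band point concrete: for a full-scale sine at frequency f, random clock jitter of t_rms limits the achievable SNR to about -20*log10(2*pi*f*t_rms). A short sketch is below; the frequencies and the 100 ps figure are assumed examples:

```python
import math

# Jitter-limited SNR for a full-scale sine: SNR ~= -20*log10(2*pi*f*t_rms).
def jitter_limited_snr_db(f_hz: float, t_rms_s: float) -> float:
    return -20 * math.log10(2 * math.pi * f_hz * t_rms_s)

for f_hz in (20e3, 1e6):   # a 20 kHz tone vs. a 1 MHz out-of-band component
    snr = jitter_limited_snr_db(f_hz, 100e-12)
    print(f"f = {f_hz / 1e3:>6.0f} kHz, 100 ps RMS jitter -> SNR limit ~ {snr:.0f} dB")
# The same 100 ps costs the 1 MHz component ~34 dB more than the 20 kHz tone,
# which is why shaped quantization noise and class-D carriers far above the
# audio band dominate a DAC's jitter sensitivity.
```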
Am I right to think that gold connectors for digital cables are pointless?
The contacts are the important part of a BNC or RCA. If both the shield and center-conductor contacts are gold-plated, that is good enough. Any non-oxidizing contact material will do. The shield contact is usually not gold-plated, nor 360-degree, unless you get a high-end connector like the Neutrik BNC.
All this noise about digital cables. As one person noted, a digital cable actually carries an analog signal. However, the magic is in the software. When a digital signal is sent, it is sent with what are called stop bits and a checksum. The hardware at the other end recalculates these values and compares them. If they don't match, it requests the sender to re-send. This way it is VERY rare, and I mean VERY rare, for an incorrect packet to get past this protocol.
At both ends the data is buffered (stored) to accommodate a fair number of error/re-send cycles. After all, the transmission speed is MUCH higher than required for high-definition audio or even HD video. If you have a LOT of errors, then you will run out of buffered data and get 'skipping' or some other sort of artifact.
If you find that cables made a difference, then you either had defective cables or ones that were insufficiently shielded, allowing enough errors to empty the buffer.
Truly analog signals, like those from your preamp to your amp, are a different story, of course. Those signals are sent in real time. No buffering. No error correction. If you have extra money to spend on your digital sources, spend it on the DAC, because the output from your DAC is analog, and there are lots of ways to screw that up.
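For what it's worth, here is a toy sketch of the checksum-verify-and-resend scheme described above, as it exists in packet-based transports. The function names and the simple additive checksum are illustrative assumptions; real links use CRCs, and note that plain S/PDIF/TOSLINK is one-way and has no back-channel to request a resend at all:

```python
# Toy checksum-verify-and-resend loop for a packet transport (illustrative;
# real protocols use CRCs, and S/PDIF itself has no resend mechanism).
def checksum(payload: bytes) -> int:
    return sum(payload) & 0xFFFF

def receive(packet):
    payload, sent_sum = packet
    return payload if checksum(payload) == sent_sum else None  # None => resend

def transfer(payload: bytes, channel, max_retries: int = 3) -> bytes:
    for _ in range(max_retries):
        data = receive(channel((payload, checksum(payload))))
        if data is not None:
            return data                      # verified intact
    raise IOError("too many corrupt packets: buffer would underrun")

# Usage: an identity channel delivers the packet intact on the first try.
print(transfer(b"PCM frame", channel=lambda pkt: pkt))
```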
“This is a fact supported by hundreds of reviews in Stereophile where the measurements were very poor and yet the review with music was stellar.”
Because the review was good, the measurements must not show the whole picture (even though JA's measurements admittedly don't cover everything, e.g. no distortion measurements)? The reviews are done sighted, with knowledge of the company and the price. To suggest that a positive review can only mean the product performs better than the measurements suggest is just silly.
Also, saying the Toslink measurements aren't valid because they are taken after the DAC shows you agree the DAC can reduce jitter to below audibility. And no, it is valid for my argument, as any differences that did show up would indicate that the differences between the cables were large enough that the DAC couldn't reduce the jitter to the same level.
Consider me biased, but the opinion that one can't discern differences in SQ between audio streams with different levels of jitter, based upon listening for such differences on an iPhone, does not carry much weight toward the conclusion that such differences would not be readily heard on a decent audio system in an open-air environment.
You again misinterpret what I said. I said that even on my phone, all the jitter values with the test tone were audible except one. So I invited Steve to take the test with the music samples to see if he could hear it. I never suggested that if I couldn't hear it on my phone, it wasn't audible.
Yep, I spend my evenings enjoying test tones on my audio system.
I correctly understood you as stating that you could not discern the jitter-level differences with authentic music in the stream. And you then drew the conclusion I described in my previous post.
I love threads like this. People telling you what you can and cannot hear, even though they have never listened to your system. And then they quote a test that they didn't hear either. That is followed by the phrase "Placebo Effect." So when you throw out that rationale, it's called the "Bullshit Effect." Sometimes that's followed by the "Butthurt Effect," which leads to the "Get Even Effect." Think I'll go outside and play...
As one person noted, a digital cable actually carries an analog signal. However, the magic is in the software. When a digital signal is sent, it is sent with what are called stop bits and a checksum. The hardware at the other end recalculates these values and compares them. If they don't match, it requests the sender to re-send. This way it is VERY rare, and I mean VERY rare, for an incorrect packet to get past this protocol.
At both ends the data is buffered (stored) to accommodate a fair number of error/re-send cycles. After all, the transmission speed is MUCH higher than required for high-definition audio or even HD video. If you have a LOT of errors, then you will run out of buffered data and get 'skipping' or some other sort of artifact.
If you find that cables made a difference, then you either had defective cables or ones that were insufficiently shielded, allowing enough errors to empty the buffer.
We are not talking about dropping bytes or getting bit errors here. This is about timing inaccuracies. The timing of the digital signal must be extremely accurate, from word to word, in order for the D/A to reproduce a low-distortion waveform.
rocknss: "Are there any studies showing the cable improvements by placing an instrumentation microphone at the listening position?"
I am not aware of such results, but I too strongly believe that it would make things a lot clearer for everyone if there were a way to identify a type of "distortion" related to jitter whose amplitude could be measured from the resulting acoustic signal and compared across different components. In the end, audibility is no voodoo or placebo; it refers to the sensitivity of our sensory apparatus and processing abilities, which have finite bandwidths and thresholds, for the auditory illusion to happen when listening to sounds reproduced by a stereophonic audio system. In my experience, reducing jitter in digital audio systems lets us experience reproduced music in a way that more resembles the output of a turntable (whatever the words to describe this subjective effect are).
As is the case with nearly all threads here, contributors come in a variety of forms: bias confirmationers, naysayers, impartialists, fiction writers, humorists, trolls, truth-seekers and educators.
@audioengr should be recognized as a patient saint-educator on this one.
We are not talking about dropping bytes or getting bit errors here. This is about timing inaccuracies. The timing of the digital signal must be extremely accurate, from word to word, in order for the D/A to reproduce a low-distortion waveform.
How does timing affect things if there is buffering? I still don't quite get this.