jitter


I am pretty sure I understand jitter generated by streamers and/or DACs. My question is: when a digital recording is created, can there already be jitter in the digital data itself from the ADC? If so, can this ever be corrected during playback, either by the streamer or the DAC?

jw944ts

Jitter is created in D/A or A/D conversion by an uneven (jittery) conversion clock. Jitter produces added noise, and once conversion is completed it becomes permanent. We can prevent D/A conversion jitter a few ways before conversion, but in the case of A/D conversion the clock has to be accurate to avoid jitter artifacts. Analog recordings digitized with a jittery A/D clock cannot be fixed; the only remedy is to digitize again, if the analog tapes still exist.
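For a rough sense of scale, a common back-of-the-envelope formula says random clock jitter caps the achievable SNR of a full-scale sine at about -20·log10(2πfσ), where f is the signal frequency and σ is the RMS jitter. A minimal Python sketch; the tone frequency and jitter values are illustrative assumptions, not from any specific converter:

```python
# Back-of-the-envelope SNR ceiling from random sampling-clock jitter,
# for a full-scale sine wave: SNR_dB = -20 * log10(2 * pi * f * sigma).
# All numbers below are illustrative assumptions.
import math

def jitter_snr_db(signal_freq_hz: float, rms_jitter_s: float) -> float:
    """SNR limit (dB) set by clock jitter alone, ignoring quantization."""
    return -20.0 * math.log10(2.0 * math.pi * signal_freq_hz * rms_jitter_s)

print(f"{jitter_snr_db(10_000, 1e-9):.1f} dB")     # 10 kHz tone, 1 ns jitter   -> ~84 dB
print(f"{jitter_snr_db(10_000, 100e-12):.1f} dB")  # same tone, 100 ps jitter   -> ~104 dB
```

Note how the limit tightens with signal frequency: the same clock error costs more SNR on high-frequency content, because the signal is changing faster at the sampling instant.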

Jitter during recording is different. It shows up as sample errors, and those get baked into the recording. I imagine it would show up as harmonic distortion or, based on a quick reading, a reduced signal-to-noise ratio.

What I mean by sampling error is that the value that gets written down won't be correct. For instance, if the ADC would record 2.0 V at T0 with a perfect clock, with ADC jitter it might record 1.99997 V or 2.0004 V instead.
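To make that mechanism concrete: a timing error of dt turns into an amplitude error of roughly slope(t)·dt. A small sketch, where the 1 kHz tone, 2 V amplitude, and 250 ps clock error are all made-up illustrative values:

```python
# Sketch of the sample-error mechanism: a clock error of dt becomes an
# amplitude error of roughly slope(t) * dt. The 1 kHz / 2 V signal and
# the 250 ps timing error are made-up illustrative values.
import math

def signal(t: float, freq_hz: float = 1_000.0, amp_v: float = 2.0) -> float:
    """The analog input: a 1 kHz sine with 2 V peak amplitude."""
    return amp_v * math.sin(2.0 * math.pi * freq_hz * t)

t0 = 100e-6        # intended sample instant (100 us)
dt = 250e-12       # assumed clock error at this sample (250 ps)

ideal = signal(t0)
jittered = signal(t0 + dt)
print(f"ideal sample:    {ideal:.8f} V")
print(f"jittered sample: {jittered:.8f} V")
print(f"error:           {(jittered - ideal) * 1e6:.3f} uV")
```

The error is tiny for any one sample, but it is random from sample to sample, which is why it behaves like added noise rather than something a later device can undo.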

As far as I know, it doesn't produce the same noise sidebands that jitter in playback does.
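One way to visualize that difference, with the big assumption that recording-side jitter is random (uncorrelated from sample to sample): random jitter mostly raises the broadband noise floor, whereas a periodic timing error, the kind usually blamed on playback clocks, concentrates energy into discrete sidebands around the tone. A rough numpy sketch with made-up jitter magnitudes:

```python
# Compare the spectrum of a tone sampled with random vs. periodic clock
# jitter. Assumptions: 48 kHz rate, ~1 kHz bin-centered tone, 2 ns jitter.
import numpy as np

fs, n = 48_000.0, 1 << 16
bin_hz = fs / n
f0 = round(1_000.0 / bin_hz) * bin_hz          # bin-centered to minimize leakage
t = np.arange(n) / fs

rng = np.random.default_rng(0)
t_random = t + rng.normal(0.0, 2e-9, n)                # 2 ns RMS random jitter
t_periodic = t + 2e-9 * np.sin(2 * np.pi * 100.0 * t)  # 2 ns, 100 Hz periodic jitter

freqs = np.fft.rfftfreq(n, 1.0 / fs)
away_from_tone = np.abs(freqs - f0) > 50.0     # ignore the tone itself

for name, tj in (("random", t_random), ("periodic", t_periodic)):
    spec = np.abs(np.fft.rfft(np.sin(2 * np.pi * f0 * tj) * np.hanning(n)))
    db = 20 * np.log10(spec / spec.max() + 1e-30)
    worst = db[away_from_tone].argmax()
    print(f"{name:8s} worst off-tone component: {db[away_from_tone][worst]:.1f} dBc "
          f"at {freqs[away_from_tone][worst]:.0f} Hz")
```

With these values the periodic case should show a distinct spur near the tone ±100 Hz, while the random case leaves only a featureless noise floor well below it.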

Still, both should be minimized for the best musical recording experience. :)

Jitter is a non-issue solved decades ago! Present-day DACs and streamers are immune to jitter. No need for re-clocking devices, despite what the neurotically obsessed will claim.