Think about how the waveform output of a D/A converter will be distorted if the sampling rate is not consistent: you get both amplitude distortion and phase distortion. I too was skeptical, but after I heard the difference I thought about it a little more, and it does make sense.
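Here's a toy numerical sketch of that point (my own illustration, not from the paper linked below, and all the numbers are assumptions): sample a pure tone at clock instants perturbed by random jitter and see how the timing error turns into an amplitude error on the output waveform.

```python
import math
import random

# Assumed values for illustration only.
fs = 48_000.0        # nominal sample rate, Hz
f = 1_000.0          # test-tone frequency, Hz
jitter_rms = 2e-9    # assumed 2 ns RMS clock jitter
n = 1000             # number of samples to simulate

random.seed(0)
max_err = 0.0        # largest amplitude error seen
max_dt = 0.0         # largest timing error seen
for k in range(n):
    t_ideal = k / fs
    dt = random.gauss(0.0, jitter_rms)
    # The converter latches the sample at the wrong instant, so the
    # timing error is converted into an amplitude error on the waveform.
    err = abs(math.sin(2 * math.pi * f * (t_ideal + dt))
              - math.sin(2 * math.pi * f * t_ideal))
    max_err = max(max_err, err)
    max_dt = max(max_dt, abs(dt))

# Slew-rate bound: |sin(a) - sin(b)| <= |a - b|, so the amplitude error
# can never exceed 2*pi*f times the timing error.
print(f"worst amplitude error: {max_err:.3e}")
print(f"slew-rate bound:       {2 * math.pi * f * max_dt:.3e}")
```

Note that the error scales with signal frequency (the faster the waveform is slewing, the more a given timing error matters), which is part of why jitter isn't just a uniform noise floor.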
Also check out
http://www.dcsltd.co.uk/papers/jitter.pdf
which explains it better than I ever could.