This is beginning to sound like the old "bits is bits" discussion. To a naive engineer (such as myself), digitally encoded music signals would seem to avoid all the mechanical vibration issues that dominate analog turntable playback. Yet there is a shocking difference in the sound of various CD players, error correction or not. The degree of vibration isolation in CD transports and players is even marketed as a distinguishing product feature. In my experience the sound of a CD player can be affected by vibration coupling or isolation, and many people in these forums have reported the same.
From my perspective, the effects of vibration on an audio system are undeniable and can be profound. What I would like is a "model" that correctly describes this. (Perhaps that was the question that started this post.) Vibration can clearly affect the analog circuits in a CD player, but how does it affect the digital stream or the D/A conversion? Perhaps there is a mechanism by which vibration modulates the timing of the converted signal, i.e., clock jitter? Or are there also outright data errors involved?
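As a rough way to see how much a timing mechanism could matter, here is a minimal Python sketch that adds random jitter to the sample clock of a 1 kHz test tone and reports the resulting error level. All the numbers in it (1 ns RMS jitter, the tone frequency) are my own illustrative assumptions, not measurements of any real player:

import numpy as np

# Illustrative simulation of sampling-clock jitter. All numbers are
# assumptions chosen for demonstration, not measurements of any player.
fs = 44_100          # CD sample rate, Hz
f_sig = 1_000.0      # test tone frequency, Hz
jitter_rms = 1e-9    # assumed 1 ns RMS random clock jitter
n = fs               # one second of samples

t_ideal = np.arange(n) / fs
t_jittered = t_ideal + np.random.normal(0.0, jitter_rms, n)

ideal = np.sin(2 * np.pi * f_sig * t_ideal)
jittered = np.sin(2 * np.pi * f_sig * t_jittered)

# Error of the jittered samples relative to the ideal ones,
# expressed in dB relative to the tone's RMS level (1/sqrt(2)).
err = jittered - ideal
err_db = 20 * np.log10(np.sqrt(np.mean(err**2)) / np.sqrt(0.5))
print(f"jitter-induced error: {err_db:.1f} dB relative to the tone")

# Small-signal theory predicts the error ratio is about 2*pi*f_sig*jitter_rms,
# i.e. 20*log10(2*pi*1000*1e-9) ~ -104 dB for these assumed numbers.

Even this crude model shows the error scales with both the signal frequency and the jitter amplitude (roughly 2*pi*f*jitter), which is at least a starting point for the kind of model I'm asking about, though it says nothing about whether real-world vibration actually produces jitter of this magnitude, or jitter that is correlated with the music.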