One digital copy may be a bit-for-bit duplicate of another, but you still have to get that digital file from one place to another and then convert it into a convincing analog audio presentation.
The server just has to get the correct series of zeros and ones fed into the DAC with timing that's accurate enough to keep jitter-induced noise and distortion to a minimum, assuming it's a synchronous connection. If it's asynchronous, it just needs to get the ones and zeros into the DAC's buffer in a reasonably timely manner. A lot of server software can do that, with jitter-induced noise more than 100 dB below the signal.

There may be something going on that's causing it not to deliver the correct stream of ones and zeros at least some of the time, or perhaps it's accidentally passing analog noise along with the stream of ones and zeros that somehow ends up in the analog output stage of the DAC, or causes the DAC to otherwise not function correctly. So I know there are sonic differences sometimes; I've experienced them. What I'm not getting a good explanation for is how the server software can possibly make the sound coming out of the DAC different if the system is actually working to spec and not suffering a problem that could be readily identified with a little analysis. At the very least, the effect should be measurable at the DAC's output, even if the root cause might be hard to track down.
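For reference on that "100 dB below the signal" figure: the usual rule-of-thumb for the jitter-limited SNR of a full-scale sine is about -20·log10(2πfσ_t), where σ_t is the RMS clock jitter. A rough sketch of the arithmetic (my own illustration with made-up jitter numbers, not a measurement of any particular DAC):

```python
import math

def jitter_limited_snr_db(signal_freq_hz: float, rms_jitter_s: float) -> float:
    """Approximate jitter-limited SNR for a full-scale sine wave.

    Uses the standard small-jitter approximation
    SNR ~ -20*log10(2*pi*f*sigma_t).
    """
    return -20.0 * math.log10(2.0 * math.pi * signal_freq_hz * rms_jitter_s)

# Example: 100 ps of RMS clock jitter
print(f"{jitter_limited_snr_db(10_000, 100e-12):.1f} dB")  # ~104 dB for a 10 kHz tone
print(f"{jitter_limited_snr_db(20_000, 100e-12):.1f} dB")  # ~98 dB for a 20 kHz tone
```

So even a fairly ordinary clock keeps jitter noise around the 100 dB-down level across the audio band, which is why I don't see timing as the obvious culprit.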
It's much harder to bring two different cartridge designs into uniformity of output from the same groove. A cartridge is an electro-mechanical device, and making tiny mechanical assemblies with various connected parts vibrate identically enough to be indistinguishable in sound output is a much taller order. I have seen some double-blind tests showing that listeners struggled to tell certain cartridge designs apart, while others were successfully distinguished.
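As an aside, the usual way those double-blind (ABX) results get judged is a one-sided binomial test on the number of correct identifications. A quick sketch with invented numbers, just to show the math:

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided binomial p-value: probability of scoring at least
    `correct` out of `trials` purely by guessing (p = 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2**trials

# Example: 12 correct out of 16 trials
print(f"p = {abx_p_value(12, 16):.3f}")  # ~0.038, below the usual 0.05 threshold
```

That's why a test where listeners only get 9 or 10 out of 16 reads as "couldn't reliably tell them apart," while a run of 13 or more is taken as a real audible difference.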