A thought experiment


Some time ago an OP was advised to avoid digital room correction inserted downstream of the DAC, the rationale being the artifacts created by the additional A/D and D/A conversions. That got me thinking. Suppose a digital music file goes through a DAC, then an ADC. Both units would be generally accepted as high quality. Would the final file be bit perfect to the original? I am not talking about simply sending the original file through both chips, but rather through the output stages of the DAC as well. Assuming the answer is yes, now imagine the original digital signal passes through a chain of 50 D/A and 49 A/D conversions. The 50 DACs would all be different, to avoid precision-versus-accuracy issues. How would the final file now compare to the original?
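
Here is a minimal numerical sketch of the thought experiment; it is a toy model with made-up numbers, not a measurement of any real converter. Each D/A -> A/D round trip is modeled as: reconstruct the 16-bit samples to floating point, apply a small random gain error plus a little additive noise standing in for the DAC's analog output stage, then re-quantize.

import numpy as np

rng = np.random.default_rng(0)

def da_ad_pass(samples_int16, gain_error=1e-4, noise_rms_lsb=0.5):
    """One hypothetical D/A -> A/D round trip on 16-bit PCM."""
    analog = samples_int16.astype(np.float64)               # "D/A": ideal reconstruction
    analog *= 1.0 + rng.normal(0, gain_error)               # output-stage gain error
    analog += rng.normal(0, noise_rms_lsb, analog.shape)    # analog noise, ~0.5 LSB rms
    return np.clip(np.round(analog), -32768, 32767).astype(np.int16)  # "A/D": re-quantize

# a 1 kHz test tone at 48 kHz, 16 bits
original = (10000 * np.sin(2 * np.pi * 1000 * np.arange(48000) / 48000)).astype(np.int16)

signal = original.copy()
for n in range(50):                                          # 50 round trips through the chain
    signal = da_ad_pass(signal)
    if n == 0:
        print("bit perfect after 1 round trip?", np.array_equal(signal, original))

err = signal.astype(np.int32) - original.astype(np.int32)
print("after 50 round trips: max error %d LSB, rms error %.2f LSB"
      % (abs(err).max(), np.sqrt(np.mean(err.astype(float) ** 2))))

With these assumed numbers the file already stops being bit perfect after the first round trip, and the error keeps growing with each additional conversion.
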
dbrewer12345
The D/A has jitter in it, and the A/D adds more jitter, so it would degrade the result. You can test this with the various DSP devices sold for crossovers, room correction, and equalization.
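
A minimal sketch of how sampling-clock jitter turns into amplitude error; the 10 kHz tone and the picosecond-to-nanosecond jitter figures are just assumed for illustration, and every extra D/A and A/D stage in a chain contributes its own share.

import numpy as np

fs = 96_000                      # sample rate, Hz
f = 10_000                       # test tone, Hz
n = np.arange(fs)                # one second of samples
rng = np.random.default_rng(1)

ideal = np.sin(2 * np.pi * f * n / fs)                    # samples taken at the ideal instants

for jitter_rms_s in (10e-12, 1e-9, 100e-9):
    t = n / fs + rng.normal(0, jitter_rms_s, n.shape)     # jittered sampling instants
    err = np.sin(2 * np.pi * f * t) - ideal
    snr_db = 10 * np.log10(np.mean(ideal ** 2) / np.mean(err ** 2))
    print(f"jitter {jitter_rms_s * 1e12:9.0f} ps rms -> SNR limited to about {snr_db:5.1f} dB")

The timing error becomes a level error in proportion to how fast the signal is changing, which is why jitter matters more at high frequencies.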

It is a much better idea to apply the DSP in the digital domain and then just do the D/A. Even then, the DSP has to be extremely good code, such as Sonic Studio's, to avoid artifacts and compression.

Steve N.
Empirical Audio
If you take the analog output from your DAC and then convert it back to digital for room correction, it's like bypassing your DAC. Because you converted the signal to analog with your DAC first, it will still have some effect on sound quality, but the DAC inside your room-correction unit will have a much bigger effect. Personally, I feel that doing something like that would be very foolish; it pretty much makes your DAC a waste of money.

If you keep the signal digital, you will just be using whatever DAC is last in the chain (the one that actually converts the signal to analog). This is the better option. Putting the room-correction unit between your transport and your DAC lets you make your room adjustments first and then send the digital signal to your DAC for conversion. That allows you to use both products as they were originally intended.
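
As a sketch of that configuration, the correction can be applied as an ordinary digital filter on the PCM stream before it ever reaches the DAC, so the only conversion left in the chain is the final D/A. The 55 Hz room-mode notch and its Q below are made-up values purely for illustration; the filtered stream is of course no longer bit identical to the source, but no extra A/D/A round trips are added.

import numpy as np
from scipy.signal import iirnotch, lfilter

fs = 44_100
b, a = iirnotch(w0=55.0, Q=4.0, fs=fs)        # cut a hypothetical 55 Hz room-mode peak

def room_correct(pcm_int16):
    """Filter 16-bit PCM entirely in the digital domain; the output is still digital."""
    y = lfilter(b, a, pcm_int16.astype(np.float64))
    return np.clip(np.round(y), -32768, 32767).astype(np.int16)

# e.g. corrected = room_correct(samples)  # 'corrected' is then sent on to the DAC
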
Suppose a digital music file goes through a DAC, then an ADC. Both units would be generally accepted as high quality. Would the final file be bit perfect to the original?
Extremely unlikely. No DAC is perfect, and no ADC is perfect. The only way to maintain bit perfect accuracy is to keep the data entirely in the digital domain, while applying no processing to it that might change the value of any of the bits. Even then, timing fluctuations ("jitter") caused by electrical noise or any of many other possible effects, as well as DAC inaccuracies, may audibly degrade the quality of the signal when it is ultimately converted to analog.
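
A quick illustration of what "bit perfect" tolerates: a pure pass-through compares equal bit for bit, while even a tiny digital volume trim (the -0.1 dB figure is just an assumed example of "processing") already changes the bits.

import numpy as np

rng = np.random.default_rng(2)
original = rng.integers(-32768, 32768, size=1_000, dtype=np.int16)    # stand-in for a track

passthrough = original.copy()                                         # no processing at all
trimmed = np.round(original * 10 ** (-0.1 / 20)).astype(np.int16)     # a -0.1 dB volume trim

print("pass-through bit perfect?", np.array_equal(passthrough, original))   # True
print("-0.1 dB trim bit perfect? ", np.array_equal(trimmed, original))      # False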

Regards,
-- Al