Why does better power = better sound?


Why does improving power quality improve sound quality?

I’m not asking to start an argument about power cords or wall outlets. Please let’s not go there. I’m asking because I’m hoping to learn some technical explanations for the effects of power quality on sound quality. I think I already understand how…

1. greater current availability = greater dynamic range
2. reduction of RFI/EMI = better signal to noise ratio

…but what about these…

3. ???????? = greater perceived resolution
4. ???????? = more realistic instrument timbres
5. ???????? = more precise imaging

Are differences in resolution, instrument timbres, imaging, etc. somehow reducible to current availability and/or powerline noise? If so, HOW are they reducible?

Again, I’m hoping to get into technical specifics, not polemical generalities.

Thanks in advance.

Bryon
bryoncunningham

Showing 1 response by bombaywalla

…but what about these…

3. ???????? = greater perceived resolution
4. ???????? = more realistic instrument timbres
5. ???????? = more precise imaging
I believe that several before me have already hinted strongly at it: lower distortion equals better resolution, timbres & imaging. So "better power" really means better-quality power. It's all about the linearity of the power amp & the ability of the power supply to provide sufficient voltage headroom (so that voltage excursions do not clip) & sufficient dynamic current into the load (so that the voltage swings do not clip under load & enough voltage reaches the drivers for them to move pistonically, assuming the speakers can handle the SPL at that volume).
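To make the clipping-to-timbre link concrete, here is a minimal numerical sketch (my own illustration, not from anyone's post, with hypothetical voltage numbers): a pure 1 kHz tone that would swing to 10 V peak is clipped by a supply rail that can only deliver 8 V. Symmetric clipping adds odd harmonics that were not in the original signal, which is exactly the kind of distortion that alters instrument timbre.

```python
import numpy as np

fs = 48_000                       # sample rate, Hz
t = np.arange(fs) / fs            # one second of samples
tone = 10.0 * np.sin(2 * np.pi * 1_000 * t)   # 1 kHz tone, 10 V peak

rail = 8.0                        # sagging supply can only swing +/- 8 V
clipped = np.clip(tone, -rail, rail)

def harmonic_level(signal, harmonic, fund=1_000):
    """Magnitude of a given harmonic of the fundamental, via FFT.
    With a 1-second window the FFT bin index equals frequency in Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    return spectrum[harmonic * fund]

# The clean tone has essentially no 3rd harmonic; the clipped one does.
clean_h3 = harmonic_level(tone, 3)
clip_h3 = harmonic_level(clipped, 3)
```

Running this, `clean_h3` is at numerical-noise level while `clip_h3` is a clearly measurable harmonic component, i.e. energy at 3 kHz that the source never contained. That added spectral content is one technical mechanism by which insufficient headroom or current delivery could degrade perceived timbre and resolution.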