What is the Sound of Impedance Mismatch?


As I understand it, you want your power amplifier to have an input impedance much higher than the output impedance of your preamp, at least 10x. Can anyone tell me what the sonic symptoms of a mismatch are? If I'm hovering around 10x, what might I hear that would indicate an impedance mismatch as opposed to, say, a preamp that is simply too bright or whatever?
drubin
The impedance mismatch is only critical when it drives the preamp into distortion. If the preamp has to source too much current into a low-impedance load, it can go into clipping, much like an amp driving low-impedance speakers at high volume. So the sound is ordinary distortion, IM and THD; a harsh sound would be the first clue. Some preamps can take a ratio of 1:1, while others need an amp input impedance of 20 times their output impedance because they can't "drive" any current.
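To put a number on the 10x rule of thumb: if both impedances are purely resistive, the preamp output and amp input form a simple voltage divider, and the only effect is a small, frequency-flat level loss. A minimal sketch (the 600-ohm/6k-ohm figures are illustrative assumptions, not from the thread):

```python
import math

def level_loss_db(z_out, z_in):
    # Level at the amp input relative to the preamp's open-circuit
    # output, for a purely resistive voltage divider.
    return 20 * math.log10(z_in / (z_in + z_out))

# Hypothetical 600-ohm preamp output into a 6k amp input (10:1 ratio)
print(round(level_loss_db(600, 6000), 2))   # -0.83 dB, flat across frequency
```

The point being that a resistive 10:1 ratio costs well under 1 dB and has no tonal signature by itself; audible trouble needs either clipping (as above) or a reactive load.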
Keis are you familiar with the concept of LCR cable-reactance-induced frequency response rolloff? I cannot at all agree with your statement above.
In the recording and broadcast industry, virtually all audio wiring is balanced, and it is normal practice for studio equipment to have an output (line) amplifier with a low output impedance, say about 50 ohms. This amplifier is generally not capable of feeding a 50-ohm load but is designed to feed equipment having an input impedance of 600 ohms or greater, very often on the order of 10k ohms. The same applies to stereo hi-fi equipment, where you can always feed a low-impedance source into equipment having a high input impedance. This holds for relatively short audio runs inside a studio complex.

For long runs, say 50 meters or more, or into phone cables going outside the studio, proper line-driver amplifiers are used; these are always balanced and capable of delivering audio into a 600-ohm load. Early practice was to build the output impedance up to 600 ohms by adding resistors in series with the output, and to make the input impedance of the equipment being fed 600 ohms. Modern practice is to feed the line directly from the low-impedance output of the line amplifier. Most equipment inherently has a high input impedance, made low by terminating the input with a resistor, usually 600 ohms. In most cases, the accuracy of the 600-ohm termination resistor is not particularly critical.

For stereo hi-fi applications, it should in virtually all cases be OK to feed a low impedance into a high impedance. Hi-fi equipment line outputs generally have a low output impedance, but they cannot cope with low-impedance loads and will distort if required to drive one.
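The difference between the early 600-ohm matched practice and the modern low-impedance "bridging" practice is easy to quantify with the same voltage-divider math. A rough sketch (the specific impedance values are the ones mentioned above, used illustratively):

```python
import math

def db(ratio):
    # Convert a voltage ratio to decibels.
    return 20 * math.log10(ratio)

# Early practice: 600-ohm build-out resistor into a 600-ohm termination.
# Matched impedances halve the voltage at the receiving end.
print(round(db(600 / (600 + 600)), 1))      # -6.0 dB

# Modern practice: ~50-ohm line amp bridging a 10k input.
print(round(db(10_000 / (10_000 + 50)), 2)) # -0.04 dB, essentially lossless
```

This is why the accuracy of the termination resistor barely matters: the loss is dominated by the ratio, and small errors in a 600-ohm resistor shift the level by fractions of a dB.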
Poulkirk, your treatise above is very nicely explained and prompts me to clarify. Loading and clipping are of course valid considerations. But they are not *the only* manifestation of impedance mismatch, which more often shows up as frequency-response problems than as clipping distortion.
Yes Bob, I am familiar with LCR effects in cables. What's your point? I was answering the question about preamp/amp interaction without considering cables. Poulkirk brings up the professional 600-ohm standard, which, while correct, likely doesn't answer the question, which was probably posed from an RCA-connection perspective. I was addressing the loading issue, not a potential frequency rolloff due to cable reactance. You might want to explain the issue of frequency rolloff to Drubin, and how the R in the LCR circuit is partly the output impedance of the preamp, since you are so keen on it.
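For Drubin's benefit, the rolloff mechanism being argued about works like this: the preamp's output impedance (the R) and the interconnect's capacitance form a first-order low-pass filter, with a -3 dB corner at 1/(2*pi*R*C). A quick sketch with assumed, illustrative values (600-ohm and 10k outputs, 100 pF/m cable, 3 m run; none of these figures come from the thread):

```python
import math

def corner_hz(r_out, c_farads):
    # -3 dB point of the low-pass formed by the source's output
    # impedance and the cable's shunt capacitance.
    return 1 / (2 * math.pi * r_out * c_farads)

cable_c = 3 * 100e-12   # assumed: 3 m of 100 pF/m interconnect

print(round(corner_hz(600, cable_c)))     # ~884 kHz: far above audio
print(round(corner_hz(10_000, cable_c)))  # ~53 kHz: edging toward audibility
```

So with a typical low-impedance solid-state output the corner is way out of band, but a very high output impedance into a long, capacitive cable can start to shave the top octave, which would sound like a dull or rolled-off treble rather than harshness.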