Yes, the 10X rule of thumb is commonly cited as a minimum acceptable ratio of amp input impedance to preamp output impedance. A considerably higher number (such as 20 or 30X) is preferable, though, if possible (and it is possible more often than not).
You may have heard of Ohm's Law, E = I x R. The voltage (E, measured in volts) across a resistance is equal to the current (I, measured in amperes, often referred to as amps) flowing through that resistance, times the amount of resistance (R, measured in ohms).
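As a quick worked illustration (the numbers here are just made up for the example):

```python
# Ohm's Law: E = I x R
# Illustrative values only: 2 mA flowing through a 10,000 ohm resistance
current_amperes = 0.002
resistance_ohms = 10_000
voltage = current_amperes * resistance_ohms
print(voltage)  # 20 volts
```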
The output circuit of the preamp may be thought of as an ideal voltage source (meaning one with zero output impedance) connected to the output jack through (in series with) a resistance (corresponding to the output impedance).
The input circuit of the power amp may be thought of essentially as a resistance between its "hot" input terminal (the center pin of the RCA connector) and ground (the outer part of the RCA connector), in parallel with an ideal input circuit (having infinite input impedance, and therefore not drawing any current).
Based on that model, and per Ohm's Law, the current flowing will be equal to the voltage being generated by the preamp output stage, divided by the sum of the two resistances (the preamp's output impedance, and the power amp's input impedance). A voltage will appear across the preamp's output impedance equal to that current times that output impedance (per Ohm's Law). Therefore the voltage appearing across the power amp's input impedance will be LESS than the voltage the preamp is trying to generate, by an amount equal to the voltage dropped across the preamp's output impedance.
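To put some numbers on that voltage divider (the impedance values below are illustrative assumptions, not figures from any particular components), the loss can be sketched as:

```python
import math

def divider_loss_db(preamp_zout_ohms, poweramp_zin_ohms):
    """Signal level reaching the power amp input, relative to the
    voltage the preamp's ideal internal source generates, in dB
    (a negative number, i.e. attenuation)."""
    fraction = poweramp_zin_ohms / (preamp_zout_ohms + poweramp_zin_ohms)
    return 20 * math.log10(fraction)

# Hypothetical example: a 600 ohm preamp output into a 10k power amp
# input (about a 17X ratio) loses roughly half a dB:
print(round(divider_loss_db(600, 10_000), 3))    # about -0.506 dB
# A 10X ratio loses a bit more:
print(round(divider_loss_db(1_000, 10_000), 3))  # about -0.828 dB
```

Note that the absolute loss is small in both cases; as the post says, the real concern is that the loss varies with frequency if the output impedance does.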
So the result will be some attenuation (reduction) of the signal provided to the power amp, the amount of attenuation increasing as the power amp input impedance goes down, or the preamp output impedance goes up. That in itself is not a problem, unless it is unusually extreme, but the major problem is that the preamp's output impedance is unlikely to be the same at all frequencies. The result will be different amounts of attenuation at different frequencies, which will affect the tonal balance of the music.
Preamp output impedance which is too high can cause other problems as well, such as increased sensitivity to capacitance of the interconnect cables. Capacitance represents an impedance which decreases as frequency increases. Therefore, a high preamp output impedance combined with a cable that has high capacitance per unit length, combined with a long cable length, will result in attenuation of the upper treble.
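As a rough sketch of that effect, the preamp output impedance and the cable's total capacitance can be treated as a simple first-order RC low-pass filter. The component values below are hypothetical, chosen only to illustrate the worst case described above:

```python
import math

def rc_corner_hz(source_impedance_ohms, cable_capacitance_farads):
    """-3 dB corner frequency of the low-pass filter formed by the
    source (preamp output) impedance and total cable capacitance."""
    return 1 / (2 * math.pi * source_impedance_ohms * cable_capacitance_farads)

# Hypothetical worst case: 3k ohm output impedance driving 6 meters of
# high-capacitance cable at 300 pF per meter.
total_capacitance = 6 * 300e-12  # 1800 pF
print(round(rc_corner_hz(3_000, total_capacitance)))  # about 29473 Hz
```

A corner near 29 kHz means the response is already starting to shelve down within the top audible octave; a low output impedance or a shorter, lower-capacitance cable pushes that corner far above the audio band.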
A preamp design that has extremely low output impedance avoids all of those problems, but constrains the choice of output stage device (eliminating choices that may provide better sonics); may require significant feedback to achieve the low output impedance (resulting in various adverse side-effects); may increase the likelihood of damage to the output stage if the output is accidentally short-circuited to ground (since greater current will flow into the short); and may increase circuit complexity and cost.
There are always lots of tradeoffs involved in any aspect of a design.
Hope that clarifies things somewhat.
Regards,
-- Al