Exactly, muffinhead123. Not that problems WILL occur because these impedances vary with frequency, but that they can. For instance, a tube pre-amp may have an output impedance of 600 ohms at 1kHz, which is probably fine for an amp with an input impedance of, say, 20k ohms. But the pre's output impedance may jump to 2,000 ohms at frequencies above or below that 1kHz figure. A rise at the low end (typically the work of the pre's output coupling capacitor) would be only marginally acceptable into the amp's 20k ohm input impedance and could cause a noticeable roll-off in the bass. And 100k is about as high an input impedance as I've seen in a power amp. By the way, when I said the ratio "should be" 10 to 1 or 100 to 1, depending on whom you ask, I inadvertently left out the words "at least": it should be at least 10 to 1 or 100 to 1.
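Just to put rough numbers on that, here's a minimal sketch using the figures above. It treats the interface as the simple voltage divider formed by the pre's output impedance and the amp's input impedance; the specific values are only the ones from my example, not any particular gear.

```python
import math

def divider_loss_db(z_out_ohms, z_in_ohms):
    """Level loss (dB) of the voltage divider formed by a preamp's
    output impedance driving a power amp's input impedance."""
    return 20 * math.log10(z_in_ohms / (z_in_ohms + z_out_ohms))

Z_IN = 20_000  # power amp input impedance from the example above

# Midband (1 kHz) vs. the frequency extreme where the pre's output
# impedance has risen to 2,000 ohms:
mid = divider_loss_db(600, Z_IN)       # about -0.26 dB
extreme = divider_loss_db(2_000, Z_IN)  # about -0.83 dB

print(f"Loss at 1 kHz:            {mid:.2f} dB")
print(f"Loss at the extreme:      {extreme:.2f} dB")
print(f"Frequency-response error: {extreme - mid:.2f} dB")

# Ratio check against the 10:1 / 100:1 rule of thumb:
print(f"Ratio at 1 kHz:      {Z_IN / 600:.0f}:1")    # ~33:1
print(f"Ratio at the extreme: {Z_IN / 2_000:.0f}:1")  # 10:1, borderline
```

So the flat-ratio loss itself is small; the point is the half-dB or so of frequency-response error you pick up when the ratio sags toward 10:1 at one end of the band.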
In a lot of amps, the input impedance the designer settles on, which could fall anywhere within a range of workable values, is a balance between competing factors: a couple of dB of noise sacrificed to get a higher input impedance, for example. I think it was Nelson Pass who said that a lower input impedance (10k to 20k) makes a better amp possible, but that a figure that low places greater demands on the drive capability of the pre feeding the amp. Perhaps Audiogoners atmasphere or Al, who are experts in these technical matters, will provide more complete info.
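To illustrate that drive-capability point, here's another rough sketch. It assumes a hypothetical 2 uF output coupling cap and a hypothetical 2 V rms signal level, plus the 600 ohm midband output impedance from my example; the corner formula is just the standard high-pass formed by a series cap driving a resistive load.

```python
import math

def bass_corner_hz(c_farads, r_out_ohms, z_in_ohms):
    """-3 dB high-pass corner of a cap-coupled preamp output
    driving a power amp's input impedance."""
    return 1.0 / (2 * math.pi * c_farads * (r_out_ohms + z_in_ohms))

C_OUT = 2e-6  # assumed 2 uF output coupling cap (hypothetical value)
R_OUT = 600   # pre's midband output impedance, per the example above

for z_in in (100_000, 20_000, 10_000):
    f3 = bass_corner_hz(C_OUT, R_OUT, z_in)
    i_ma = 2.0 / z_in * 1000  # rough current for a 2 V rms signal
    print(f"{z_in / 1000:>5.0f}k input: -3 dB at {f3:4.1f} Hz, "
          f"about {i_ma:.2f} mA for 2 V rms")
```

With those assumptions the bass corner sits under 1 Hz into a 100k amp but climbs to several Hz into 10k-20k, and the pre has to source roughly ten times the current into 10k that it does into 100k, which is why the lower figure asks more of the pre.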