09-14-11: Atmasphere
The reason power cords make a difference despite the limitations described in this statement has to do with voltage drop in the power cord. It also has to do with how DC power supplies work.
These effects can be quite measurable! For example, I have seen a 3 volt drop across a 6 foot power cord cost a tube amp about 35% of its total output power. If you want a reason to look for, that one is pretty basic!
i'm still not convinced of the significance of upmarket power cords, but at least your comments give me something to work with beyond just "i believe". at least you articulate mechanisms that can be discussed. here are my comments, to which i would be interested in reading your reaction.
first, i will address the issue of voltage drop across the power cord. while you didn't state the current draw that produced this 3 volt drop, i will assume a draw of around 30 amps, a reasonable figure for a practical system in a residential setting. in that case, the cord presents about 0.1 ohms of resistance; for a 6 ft power cord that translates to roughly 50-60 ohms/km. that seems to me like a realistic resistance for a wire.
the thing is, even if you went to an upmarket power cord, that resistance is not going to go to zero. so even if an upmarket power cord improves the wire resistance by 10%, that amounts to only about a 0.3 volt difference.
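to make the arithmetic explicit, here is the back-of-envelope calculation in code (the 3 volt drop is your figure; the 30 amp draw and the 10% improvement are my assumptions):

```python
# back-of-envelope ohm's-law check of the power cord numbers.
# assumed values, not measurements.
FEET_PER_KM = 3280.84

drop_v = 3.0      # claimed voltage drop across the cord
current_a = 30.0  # my assumed current draw
cord_ft = 6.0     # cord length

r_cord = drop_v / current_a                  # total cord resistance (ohms)
r_per_km = r_cord / (cord_ft / FEET_PER_KM)  # normalized to ohms/km

print(f"cord resistance: {r_cord:.2f} ohm ({r_per_km:.0f} ohm/km)")

# even a cord with 10% less resistance only changes the drop by:
print(f"10% better cord: {0.10 * drop_v:.1f} V less drop")
```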
if your amplifier is sensitive to that small a change, there are probably a number of problems with that amplifier. first, it would be sensitive to your turning on lights, or appliances, or any of a range of devices that draw current, because the voltage coming out of your wall can be influenced by this kind of stuff. second, i would suspect that a bigger (but more difficult to pin down) contributor to voltage loss would be the resistances in the transformer and the diodes/active components within the power supply circuit.
but real circuits don't operate under ideal conditions. that is why power supply regulation is so important. while your comments suggest that you are aware of the mechanisms for power supply regulation, the question in my mind is why wouldn't a designer of audio equipment have the same awareness? if you really are observing the dramatic changes in output power that you are reporting, that would lead me to suspect that you have a real power supply regulation problem, since it would appear that you have an extremely sensitive amplifier. if true, it just seems to me that your amplifier wasn't designed for the real world, in which case you would probably still have problems after you bought an upmarket power cord.
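to illustrate why i lean on regulation here, a toy calculation (the line-regulation spec of 10 mV output shift per volt of input shift is purely an assumed number, not any real supply's datasheet):

```python
# toy line-regulation sketch with an assumed spec: a regulated supply
# specifies how much its output rail moves per volt of input change.
LINE_REGULATION_V_PER_V = 0.010  # assumed: 10 mV out per 1 V in

for delta_in_v in (3.0, 0.3):  # full cord drop vs. the 10% cord improvement
    delta_out_v = LINE_REGULATION_V_PER_V * delta_in_v
    print(f"{delta_in_v:4.1f} V line change -> "
          f"{delta_out_v * 1e3:.0f} mV at the regulated rail")
```

with even that modest assumed spec, the 0.3 V difference an upmarket cord might make turns into millivolts at the rail.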
09-14-11: Atmasphere
But there is more. Most DC power supplies have a power transformer, a set of rectifiers and a bank of filter capacitors. The circuit draws its power from the filter caps, which are replenished by the transformer and rectifiers. Now it's a simple fact that the filter caps are not seriously drained in between cycles, else the amplifier will not work very well. But the rectifiers will only turn on at a certain time- whenever the voltage from the transformer is higher than that of the filter caps.
This only happens at the peaks of the incoming AC power. IOW, the power supply is only doing its work in very short bursts of energy. Now in normal operation what this means is that the diodes are doing some fairly high frequency service; they may only be on for a few milliseconds per cycle. This is called commutation- the turning on and off of the rectifiers, and the current that might occur at these times can be quite prodigious depending on the circuitry of the audio device.
Meanwhile the power cord may be doing double duty, especially if the amplifier has a filament circuit.
Consequently you have two effects: voltage drop at 60Hz, and current delivery at a fairly high frequency. The greater the demand on the cord, the greater the likelihood that its effects will be audible on this basis; OTOH the lower the current and the more regulation employed by the audio device, the less audible it might be.
i get the part about the diode switching on and off, and i get the part about the on period being very short. but for a 60Hz ac line voltage, that on/off cycle should only happen once per half-cycle, i.e. 120 times/second for a full-wave rectifier (60 times/second for half-wave). so i don't see where the "fairly high frequency" stuff is coming from that you described. as i see it, for the diode on/off cycles to occur more often than that would imply that while the line voltage is in its declining phase, the capacitor is discharging faster than the line voltage is decreasing; that sounds like extremely bad circuit design.
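to put rough numbers on the conduction time, here's a crude simulation of a full-wave rectifier feeding a filter cap and a constant-current load. everything is idealized (no transformer impedance, zero diode drop) and the component values are arbitrary assumptions on my part, but it shows the "short bursts, twice per cycle" behavior:

```python
import math

# crude idealized full-wave rectifier + filter cap sketch.
# all component values are illustrative assumptions.
F_LINE = 60.0    # line frequency, Hz
V_PEAK = 50.0    # secondary peak voltage (arbitrary)
C = 10_000e-6    # filter capacitance, farads (arbitrary)
I_LOAD = 1.0     # constant load current, amps (arbitrary)
DT = 1e-6        # simulation time step, seconds

period = 1.0 / (2 * F_LINE)  # full-wave: one charging event per half-cycle
v_cap = 0.95 * V_PEAK        # assumed starting ripple, near steady state
t = 0.0
on_time = 0.0

# simulate one half-cycle and measure how long the diodes conduct
while t < period:
    v_in = abs(V_PEAK * math.sin(2 * math.pi * F_LINE * t))
    if v_in > v_cap:
        v_cap = v_in                 # diode on: cap tracks rectified input
        on_time += DT
    else:
        v_cap -= (I_LOAD / C) * DT   # diode off: cap discharges into load
    t += DT

print(f"diodes conduct ~{on_time * 1e3:.2f} ms "
      f"of each {period * 1e3:.2f} ms half-cycle")
```

with these assumed values the diodes conduct for on the order of a millisecond of each 8.33 ms half-cycle, which does support the "short bursts" part of your description even though the bursts repeat at only 120 Hz.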
as far as the amount of current that is pumped through the diode to charge the capacitor, that depends on how tightly you need to limit ripple in the dc voltage. but even still, the current pumped into the capacitor is not the current drawn from the wall. i mean, it's not like the power cord is jammed into the circuitry straight-on; it goes through a transformer. i would expect that the transformer is going to do something for you, such that the current drawn from the wall is somewhat less prodigious than the current through the diodes. which would mean that the current through the power cord would be less than the current through the diodes. so if you used diode current as the basis for an estimate of the voltage drop across the power cord, you would get an exaggerated figure.
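the transformer bookkeeping i'm appealing to looks like this for an ideal (lossless) transformer; to be fair, the scaling only reduces the wall-side current when the winding steps the voltage down, so the direction depends on the actual supply (all numbers illustrative):

```python
# ideal-transformer bookkeeping: power in = power out, losses ignored.
# whether the wall current is smaller or larger than the diode (secondary)
# current depends entirely on the turns ratio.
def primary_current(secondary_current_a: float,
                    v_primary: float, v_secondary: float) -> float:
    """Primary (wall-side) current for an ideal transformer."""
    return secondary_current_a * v_secondary / v_primary

# step-down winding (e.g. 120 V -> 12 V): wall current is 1/10 the diode current
print(primary_current(10.0, 120.0, 12.0))   # -> 1.0 A

# step-up winding (e.g. 120 V -> 360 V): wall current is 3x the diode current
print(primary_current(2.0, 120.0, 360.0))   # -> 6.0 A
```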
09-14-11: Atmasphere
Meanwhile the power cord may be doing double duty, especially if the amplifier has a filament circuit.
i don't understand the "double duty" comment. this might be a concept related to tube amplifier designs, but i don't know much about tube circuits. i'm old enough to remember how great it was when they came out with transistor radios, so for me it's ridiculous to go back to tube devices. maybe younger people have a different perspective...