Amplifier heat depending on power cable?


Hello,

I have an ML No 27 and I've noticed that when using one of my power cables it heats up MUCH more than when using another cable I have.

Both cables are manufactured by small European companies, probably little known in the US (Hifi Cable & Cie from France, Actinote from Belgium).

Does this make sense?

BTW, the sound is better with the cable that heats up less.

Thanks.
amuseb
Do you mean your power cable heats up, or the amp heats up more when using one of the power cables?
The link below may help. It may help to think of the power cord as an extension of your house wiring.

http://ecmweb.com/nec/code-basics/electric_conductor_size_matters/
I've noticed that my older Burson integrated, with its two 500-watt toroids, never got 'hot' no matter how long I left it on. And that was with a run-of-the-mill (but better than stock) power cord from GTT Audio.
The newer Burson, with 'only' a 380-watt toroid and a Zu Mission PC, gets very warm to kinda hot if left on for more than 4-5 hours. Having no venting and using the entire casing as a heat sink would lead one to conclude that this is why, but the older model was designed that way as well.
Tis a puzzlement!
Many years ago I remember being very surprised when a dealer friend replaced the original Krell power cord with one of the first Shunyatas (maybe the Python) on the normally very hot Krell preamp/CD player (KPS-25, if I remember correctly). The cord cooled it down dramatically. We repeated the experiment several times, with the same results. My friend was just as flabbergasted as I was.
A quick Google search revealed some known power supply issues with the Mark Levinson No 27 power amp. (It's not my amp, so I didn't read further.) The first thing I would check is the amp's current draw with one cable, and again after the cables are swapped. Get a clamp-on ammeter and an AC line splitter (Sears 81066).
A few things to consider:

1. Thinner wires have more resistance per foot. The longer the cable, the more total resistance, and the bigger the proportional voltage drop. More resistance also means more heat.
2. As voltage decreases, current increases.
3. Conductors and devices heat up as current rises (e.g., fuses melt when their current ratings are exceeded). The sketch after this list puts rough numbers on points 1 and 3.
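Here's a minimal Python sketch of that voltage-drop and heat arithmetic. The ohms-per-foot figures are approximate values for copper from a standard AWG table; the 6-foot length and 8-amp draw are made-up examples, not measurements from the No 27:

# Approximate resistance of copper wire, ohms per foot (standard AWG table).
OHMS_PER_FOOT = {12: 0.00159, 14: 0.00253, 18: 0.00639}

def cord_drop(awg, length_ft, amps):
    # Current flows out on the hot and back on the neutral,
    # so the effective conductor length is doubled.
    r = OHMS_PER_FOOT[awg] * length_ft * 2
    v_drop = amps * r        # Ohm's law: V = I * R
    heat_w = amps ** 2 * r   # power dissipated in the cord: P = I^2 * R
    return v_drop, heat_w

for awg in (12, 14, 18):
    v, w = cord_drop(awg, 6, 8)  # 6 ft cord, 8 A draw (illustrative)
    print(f"{awg} AWG: {v:.2f} V drop, {w:.2f} W of heat in the cord")

Swap in your own cord lengths and the amp's measured current draw to estimate the difference between the two cables.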

For whatever reason, my guess is that your two power cables are not at all alike. Perhaps the conductors, insulation, or connectors in the bad cable are causing a voltage drop at the amplifier. That Sears AC line splitter also has voltage test points, making it possible to measure the voltage at the plug while under load.
One other possibility I can think of: they may have the ground and neutral tied together in the cord. If the outlet is on a long 14 gauge run, it might have a lot of voltage drop. If the ground and neutral are tied together, the feed can have a lot less voltage drop and deliver more voltage to the amp. This would not be the best thing, if that's the case. The electrical codes (if any) of other countries differ from ours, and the makers may not know our system.
If this system is in France, my post most likely wouldn't apply, since it would be on a 220/230 volt supply.
A Kill-A-Watt plug-in meter will allow measurements as well: voltage, current, watts, VA, and of course, power factor.

Cheap and should be in every kit.
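For anyone new to the meter, the readings relate to each other like this; a minimal sketch with made-up example numbers (the 120 V line, 2.5 A draw, and 210 W reading are assumptions, not measurements):

# How Kill-A-Watt readings relate (example numbers, not real measurements).
volts = 120.0              # line voltage
amps = 2.5                 # measured current
va = volts * amps          # apparent power the line must supply
watts = 210.0              # real power consumed (example reading)
power_factor = watts / va  # PF = W / VA; 1.0 for a purely resistive load
print(f"VA = {va:.0f}, PF = {power_factor:.2f}")

A power factor below 1 is normal for an amplifier's transformer-and-rectifier supply; it just means the load doesn't draw current like a pure resistor.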
2. As voltage decreases, current increases.
01-23-11: Heyraz
Motors, yes; transformers, no. A voltage drop on the primary will cause a voltage drop on the secondary proportional to the winding turns ratio.
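To put numbers on that point, here is a sketch of the ideal-transformer relation; the turns ratio and voltages are hypothetical, not the No 27's actual transformer:

# Ideal transformer: secondary voltage tracks the primary by the turns ratio,
# so a sag on the primary sags the secondary proportionally.
turns_ratio = 120 / 50  # primary:secondary turns (hypothetical)
for v_primary in (120.0, 114.0):  # nominal line vs. a 5% sag from cable drop
    v_secondary = v_primary / turns_ratio
    print(f"{v_primary:.0f} V in -> {v_secondary:.1f} V out")

In other words, a lossy cord lowers the amp's supply rails rather than forcing more current through them.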