Dedicated 20 amp circuit - Electrician laughed!


I brought my electrician out to my house today to show him where I would like to install a dedicated 20A circuit for my system.  He laughed and said that's the stupidest thing he's heard, and that he laughs whenever people talk about it.  He said if you're going to do it, you have to have it separately grounded (driving a new 8-foot rod into the ground), but even then, he sees no way there can be an audible improvement.

Now, he's not just an electrician, though. He rebuilds tube amps on the side and tears apart amps all the time, so he's quite well versed in audio electronics and how they operate.

He basically said anyone who thinks they hear a difference is fooling themselves.  

Personally, I'm still not sure. I'm no engineer, my room's not perfect, and I can't spend hours on end doing critical listening...  But he does kinda pull me farther toward the "snake oil" side and the "suggestive hearing" side (i.e., you hear an improvement because you want to hear it).

I'm not taking a side here, but I thought it was interesting how definitive he was that this not only WILL NOT make a difference but ALMOST CANNOT make a difference. 
dtximages

Showing 4 responses by heaudio123

I am going to call BS on this, or at least poor measurement. A 10-foot 18 AWG cord at 15 amps (well over its rated current) would drop about 2.5V. A much more typical connection for 15A would be 14 AWG and, say, 6 feet, or about a 0.6V drop and 9 watts if running flat out. Even class-A amps at 140W would not draw 15A continuously. If you had 40W of losses, either your cable or your outlet would get quite warm.
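The arithmetic behind those figures is just I·R drop over the round-trip conductor length. A minimal sketch, using nominal solid-copper resistance values per 1000 feet (stranded cord resistance runs somewhat higher, which is one reason real-world measurements can come in above these numbers):

```python
# Hypothetical back-of-envelope check of cord losses.
# Resistance values are nominal solid-copper figures (ohms per 1000 ft,
# ~20 C); actual stranded power cords measure a bit higher.
RES_PER_1000FT = {18: 6.385, 14: 2.525}

def cord_drop(awg, length_ft, amps):
    # Round trip: current flows out on the hot conductor and back on neutral,
    # so the effective conductor length is twice the cord length.
    r = RES_PER_1000FT[awg] * (2 * length_ft) / 1000.0
    v = r * amps          # voltage dropped across the cord
    w = v * amps          # power dissipated as heat in the cord
    return r, v, w

# 10 ft of 18 AWG at 15 A -- the "well over rated current" case
r, v, w = cord_drop(18, 10, 15)
print(f"18 AWG, 10 ft, 15 A: {v:.2f} V drop, {w:.1f} W of heat")

# 6 ft of 14 AWG at 15 A -- the more typical case
r, v, w = cord_drop(14, 6, 15)
print(f"14 AWG,  6 ft, 15 A: {v:.2f} V drop, {w:.1f} W of heat")
```

Either way, the point stands: a sane cord at realistic current drops a fraction of a volt and sheds single-digit watts, nowhere near 40W.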

AC voltage drop is the voltage dropped from the wall to the input of the equipment in use. I’ve measured a loss of 40 watts on an amp that makes 140 watts, so no one should be surprised that it might be audible as well. I used a 3 1/2 digit DVM to measure the voltage drop and it showed around 3 volts. This was a pretty standard but inexpensive Belden cord. A more expensive Belden cord with a heavier gauge showed a lesser drop and more power out of the amp. So no mystery here.


Can't agree with this either. The main limitation in high-frequency power delivery is not going to be the AC cord from the wall, but the power transformer, either in the equipment or outside your house. Add resistance (or inductance) in series and you will soften those diode switching spikes and reduce the high-frequency output noise of the power supply, which is likely a bigger issue for most supplies than low-frequency noise, which feedback can usually negate. There is a reason why Pass amps and other good-quality amplifiers have inductance to slow down current delivery into the power supply. If you have enough capacitance and are not nearing the voltage peaks of your amp in operation, I would guarantee most amplifiers have less distortion with a bit of resistance/inductance on the line.

If the power cord limits current during this period, the performance of the circuit using the power supply might suffer, possibly due to increased IMD since the DC might have a bit more of a sawtooth on it than if the current was not limited.



Where the dedicated line mainly comes in is noise rejection from other things that could be on the line, and preventing signal injection via a chassis-ground voltage that varies relative to signal ground with the draw on the AC line.
Unless you are clipping bass, your power supply is more stable with some resistance, not less. However, a neutral-to-ground differential can induce noise in unexpected places as chassis and signal ground vary with load.
I don't remember ever saying I have an audio equipment company, which is not to say I have not been involved in the research, design, measurement, and professional evaluation (and I don't mean magazine reviews) of acoustic, electro-acoustic, and electronic products in the audio sphere.
I don't have a problem with that. If I post something similar, I hope he does the same to me.

- If you are seeing 40W of losses on a 140W output amplifier, then either you are using a power cord far too small for the application (and unsafe), you have some serious abnormal contact resistance that is again generating unsafe losses that could heat enough to melt the plastic holding the contacts, or there was measurement error, or some combination thereof. I don't think many people are using 10-foot, 18 AWG cables with massive Class-A amplifiers drawing 15 amps continuous, so this does not sound like a "real-world" condition, as I noted. With an adequate cord for the maximum draw of the amp, 1/2V or less would be more typical.


- If you were measuring a 3V RMS drop, the drop during current transfer would be 3x that or more if the capacitor bank is bigger. The bigger the capacitor bank, the worse the THD on the AC line (barring other circuitry to improve it). A reduction in output power would be expected.
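The 3x figure follows from the pulsed shape of the charging current: a true-RMS meter averages the drop over the whole cycle, while the actual drop happens only during the brief diode-conduction window. A hedged sketch, modeling the drop as a rectangular pulse (the pulse model and the duty-cycle values below are my illustration, not measurements from the thread):

```python
import math

# For a rectangular pulse of amplitude V_peak and duty cycle d,
# the RMS value is V_peak * sqrt(d), so V_peak = V_rms / sqrt(d).
v_rms = 3.0  # the 3 V reading reported on the DVM

for duty in (0.30, 0.10, 0.05):
    v_peak = v_rms / math.sqrt(duty)
    print(f"duty {duty:.0%}: implied peak drop ~ {v_peak:.1f} V "
          f"({v_peak / v_rms:.1f}x the RMS reading)")
```

With a big capacitor bank the conduction window narrows, the duty cycle shrinks, and the implied peak drop grows, which is the "3x or more" point above. (A cheap average-responding DVM would muddy this further.)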

- It is most definitely not a given that increased resistance on the AC line results in more IMD, especially if the amp is not driven into clipping:
since the DC might have a bit more of a sawtooth on it than if the current was not limited.

This is not what happens. The exact opposite happens, assuming the amp is not into clipping as I noted above. Adding resistance will smooth the voltage on the bulk DC capacitance because it increases the conduction angle from the AC line. This filters out high frequencies on the power supply rail, which is beneficial, and it reduces radiated noise on the AC lines by reducing the peak current draw and frequencies. Power supply ripple generally presents itself as THD, as you get components of the power supply frequency modulating with the audio signal. You may get IMD products from other non-linearities, but again, as the power supply rail is more stable, these will also be less if you are not clipping.
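The conduction-angle claim can be checked with a crude numeric model: a full-wave rectifier charging a capacitor bank through a series resistance standing in for the cord and transformer. All component values below are illustrative assumptions, not taken from the thread; the point is only the trend, i.e. more series resistance gives a lower peak charging current and a wider conduction angle:

```python
import math

def simulate(r_series, c=0.02, v_peak=40.0, i_load=2.0, f=60.0, cycles=20):
    """Crude time-step model of a full-wave rectifier feeding a cap bank.

    r_series: total series resistance (cord + transformer), ohms -- assumed
    c:        bulk capacitance, farads          -- assumed
    v_peak:   rectified AC peak voltage, volts  -- assumed
    i_load:   constant DC load current, amps    -- assumed
    """
    dt = 1e-6
    v_cap = v_peak - 5.0              # start near steady state
    steps = int(cycles / f / dt)
    settle = steps // 2               # ignore the start-up transient
    peak_i = 0.0
    conducting = 0
    for n in range(steps):
        v_ac = v_peak * abs(math.sin(2 * math.pi * f * n * dt))
        if v_ac > v_cap:              # diodes conduct; series R limits current
            i = (v_ac - v_cap) / r_series
        else:
            i = 0.0
        v_cap += (i - i_load) * dt / c
        if n >= settle:
            peak_i = max(peak_i, i)
            conducting += (i > 0)
    return peak_i, conducting / (steps - settle)

for r in (0.1, 0.5):
    pk, frac = simulate(r)
    print(f"series R = {r:.1f} ohm: peak charge current {pk:.1f} A, "
          f"conducting {frac:.1%} of the time")
```

The larger series resistance trades a tall, narrow current spike for a lower, wider one: less peak current, a longer conduction angle, and therefore less high-frequency content on both the rail and the AC line, which is the mechanism described above.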