XLR to RCA Adapter Dilemma


I know that use of adapters should degrade the sound. However, when I use the XLR output of my BAT preamp into the single ended input of my power amp, the sound is much fuller than when going from RCA to RCA. Could this be because the XLR output has a higher gain than the RCA? Your thoughts would be appreciated. Thanks.
rlb61
I have some really nice Cardas XLR-to-RCA adapters that I have had to use numerous times. I could never hear any degradation in sound from using them, but I never noticed better sound either.

Balanced outputs do usually have higher gain, so yes, maybe that's what you are hearing.
You are simply not gaining the real benefits from an XLR balanced connection. It is certainly possible that you still prefer the sound using the XLR output. If it is due to the gain, I think what you are hearing has more to do with a relative volume shift than an actual quality difference.

Are you able to connect both outputs in parallel, switch back and forth, and use a dB meter to check the relative volumes? People generally prefer the louder source.
You might be seeing an impedance issue. It would be difficult to say which configuration is showing frequency loss or gain.

Run a test-sweep CD and use a sound meter to see whether either configuration has frequency dips or rises, to decide which is correct. After all that, use the one you like better.
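
If you want to put a number on the level difference rather than judging by ear, here is a minimal sketch (Python with numpy, assuming you can record a short clip of the same passage through each hookup at the same volume setting; the WAV file names are placeholders, not real files):

```python
# Minimal sketch: compare the RMS level of the same passage recorded
# through each hookup. File names are placeholders for your own recordings.
import math
import wave

import numpy as np

def rms_db(path):
    """Return the RMS level of a mono 16-bit WAV file in dBFS."""
    with wave.open(path, "rb") as w:
        frames = w.readframes(w.getnframes())
    samples = np.frombuffer(frames, dtype=np.int16).astype(np.float64)
    samples /= 32768.0                      # normalize to -1.0 .. 1.0
    rms = math.sqrt(np.mean(samples ** 2))
    return 20 * math.log10(rms)

xlr = rms_db("xlr_adapter.wav")   # hypothetical recording via XLR + adapter
rca = rms_db("rca_direct.wav")    # hypothetical recording via RCA direct
print(f"XLR+adapter: {xlr:.2f} dBFS, RCA: {rca:.2f} dBFS, "
      f"difference: {xlr - rca:+.2f} dB")
```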
There is no way to provide a definitive answer without knowing how the output circuits of the preamp are designed. But it is certainly plausible that the sound could be significantly different for the two connection arrangements, for reasons that have nothing to do with the adapter itself.

I would expect that in many and probably the majority of designs there would not be a significant difference between the two configurations in the voltage seen by the power amp, because when using the preamp's balanced output in conjunction with an XLR-to-RCA adapter you are only utilizing one of the two signals in the balanced signal pair. In many (although certainly not all) situations where a volume increase such as 6 dB occurs balanced vs. unbalanced, the voltage of each of the two signals in the balanced signal pair is the same as the voltage of the unbalanced output. But since the two signals in the balanced signal pair are inverted relative to each other (i.e., 180 degrees out of phase) the DIFFERENCE between them at any instant of time, which is what a balanced receiver circuit would respond to, would be twice as much as the voltage of the single unbalanced output signal of the same component.
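
As a quick check of that arithmetic (this is just the standard relationship between a voltage ratio and decibels, nothing specific to any particular preamp):

```python
import math

v_single = 1.0                # unbalanced (single-ended) signal voltage, arbitrary units
v_hot = v_single              # one leg of the balanced pair, same voltage as unbalanced
v_cold = -v_single            # the other leg, inverted (180 degrees out of phase)

v_balanced = v_hot - v_cold   # what a balanced receiver responds to: the difference
print(v_balanced / v_single)                   # 2.0: twice the single-ended voltage
print(20 * math.log10(v_balanced / v_single))  # ~6.02 dB level increase
```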

Do you tend to set the volume control at about the same positions in the two configurations? If so, differences in voltage are probably not the explanation.

Another possibility is simply that the output stage which drives the two signals that are supplied to the XLR connector may be independent of the output stage which drives the signal that is supplied to the RCA connector, and may have significantly different design characteristics, including output impedance among many other possible differences. As I say, the question can't be definitively answered without being familiar with the internal design of the preamp.

In saying all of this, btw, I'm assuming that you are using the same RCA-to-RCA cable in both cases, and that when you utilize the XLR output of the preamp you are using an XLR-to-RCA adapter at the preamp's output, as opposed to running an XLR cable from the preamp output and using an adapter at the power amp input, which would introduce additional variables into the mix.

Regards,
-- Al
Liz, I believe that the "less stuff in the signal path" rule is pretty much accepted as a truth. I realize that it represents a theory, or tenet, of a widely held belief system that many audio people subscribe to. I also know about your "tube buffer" workaround, which flies in the face of this sentiment.
"I believe that the less stuff in the signal path rule is pretty much accepted as a truth"

Well, the simple old game of rope pulling debunks this myth: the more people pulling, the greater the result.

Thanks, Al, for always supplying enlightened and reasoned answers to this forum. We need more guys like you.

Good Listening

Peter
Thanks for the responses, folks. Yes, Al, I am using the adapter at the preamp output only, and then plugging into the power amp directly with the RCA. Also, I am using the same RCA-to-RCA cable. I have A/B'd this ad nauseam, and my ears continue to conclude that using the adapters through the balanced output of the preamp results in greater clarity and volume over the RCA-to-RCA connection. To be certain that it just wasn't "me," I had my wife listen to the system, and she came to the same conclusion.
Also, to Elizabeth ... thanks for validating that practical application can trump theory. For a minute there, I was beginning to doubt myself because the theorists maintained that what I was hearing was simply impossible. I feel much better now.
... the sound is much fuller than when going from RCA to RCA. Could this be because the XLR output has a higher gain than the RCA?

Yes, that is normal. The sound quality has nothing to do with the RCA or XLR design per se; when a unit offers both, it depends on component matching and on the technical quality the designer has implemented.
RCA/XLR adapters normally degrade the sound because their internal wiring is very poor. Open them and you will find, even in very expensive adapters, super-thin, ultra-cheap wire. If you want a good one, contact SignalCable; they do custom orders with silver wire, not expensive and very, very good. You can also order better connectors (for signal transfer).
Rlb61,

You should be able to check your manual for the specified output via XLR vs. RCA and see what the manufacturer says it's doing. It is not uncommon to see different output voltages for the different outputs in the specs.

My old Esoteric X03 CD player played great via RCA into my preamp, but the balanced output overloaded the preamp's input due to its higher output voltage. It sounded better until it clipped (distorted). That is not uncommon. Balanced was designed for longer cable runs, so it makes sense that it has a more robust signal for traveling farther.
Agree with Elizabeth's point; actual use (practical application) is the final determinant. Rlb61, you were right to trust what you heard and not what theory would predict as the outcome. In general I subscribe to "simpler is better" but readily acknowledge there are situational exceptions.
Charles,
Peter (Pbnaudio), thanks very much for your comment, which I reciprocate :-)
11-30-13: Rlb61
Also, to Elizabeth ... thanks for validating that practical application can trump theory. For a minute there, I was beginning to doubt myself because the theorists maintained that what I was hearing was simply impossible.

12-01-13: Charles1dad
Agree with Elizabeth's point, actual use (practical application) is the final determinant. Rlb61 you were right to trust what you heard and not what theory would predict as the outcome.
While I certainly agree that in general actual use is the ultimate determinant, and supersedes theory if the two are in conflict, in this case it seems to me that it is not theory that is wrong, it is the THEORISTS that are being referred to who are wrong, or perhaps are being misinterpreted.

Specifically, it would appear that the alleged theory overlooks the fact that using the adapter may call into play a significantly different output stage design in the preamp. Which in addition to having different intrinsic sonic characteristics may also be interacting differently with the power amp's input impedance.

As I said in my earlier post, the OP's findings are certainly plausible from a technical standpoint, but a precise explanation cannot be provided without knowing more about the specific designs.
12-01-13: Cerrot
It is not uncommon to see different output voltages for the different outputs in specs.
Often, though, the difference in the two output voltages will be a factor of two, and that difference will be negated when an adapter is used, since, as I indicated earlier, the use of an adapter results in only one of the two signals in the balanced signal pair being utilized.

Regards,
-- Al
XLR adapters will work and in most cases should work quite well, but you might want to experiment with Jensen transformers for a true balanced to single ended conversion. Their downside is that you will need two pairs of interconnects to go between the components. The upside is potentially better sound.
Unfortunately, my amp is not at all balanced, and I'm not looking to replace it since I like it just fine, and would rather save the dough. So, for now, it's adapter city for me. I may just contact Signal Cable and inquire about custom adapters. Thanks.
Hi - check out the Stereophile review of the BAT 3iX. Although this pre is different, the balanced output has 6 dB higher gain than the single-ended output.
Furthermore, the output impedance of the preamp varies significantly between balanced and unbalanced and across frequencies, and in the testing there was a significant HF rolloff in balanced mode with low-input-impedance power amps. Your power amp has a 31k input impedance, and the VK3iX's output impedance is high, rising to 14k at some frequencies.
If the 3i is similar to the 3iX, then you probably have an HF rolloff issue in balanced mode with the Musical Fidelity amp.
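
To put rough numbers on that interaction, here is a minimal sketch of the voltage divider formed by the preamp's output impedance and the power amp's input impedance (the 31k and 14k figures are the ones quoted above; the 1k midband value is just an assumption for comparison, not a measured spec):

```python
import math

def divider_loss_db(z_out, z_in):
    """Loss in dB from the voltage divider formed by the preamp's
    output impedance and the power amp's input impedance."""
    return 20 * math.log10(z_in / (z_out + z_in))

z_in = 31_000  # power amp input impedance, ohms (quoted above)

# Illustrative output impedances: an assumed nominal midband value
# vs. the worst-case 14k figure quoted from the measurements.
for label, z_out in [("midband (~1k, assumed)", 1_000),
                     ("worst case (14k, quoted)", 14_000)]:
    print(f"{label}: {divider_loss_db(z_out, z_in):+.2f} dB")
    # prints roughly -0.28 dB for 1k and -3.24 dB for 14k:
    # enough to be audible at the frequencies where the impedance rises
```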
A lot of amps have an op-amp on the input to handle the XLR input signals, then convert back to single-ended after this op-amp, whereas the RCA input bypasses it.

I have found this on some big high-end amps. You can imagine that the RCA inputs on these amps sound better than the XLR inputs when the levels are the same, but because there is an op-amp on the XLR input, some have a small amount of extra gain, especially if the op-amp used is not unity-gain stable.
These kinds of pseudo-XLR inputs are more common than you think.

Cheers George