DIY balanced interconnects


I want to build some balanced interconnects.
1. Has anyone compared Switchcraft, Vampire and Neutrik XLR plugs?
2. Any comments on Mogami Neglex 2534 vs Vampire CCC-II vs Oyaide PA-02 cables?
3. Should the ground shield on these twinax cables be connected on both ends, only on the source ends, or on the preamp ends?
Thanks for your comments.
oldears

Showing 9 responses by atmasphere

Oldears, connect the shield at both ends, which will always be pin 1. Any other connection (or lack of connection) is not standard and could cause trouble. The way it is supposed to work is that there will be no signal current in the ground- all the signal will be in the twisted pair.

The Neglex 2534 is excellent cable; I saw a set beat a pair of $15,000 cables in a system just a few days ago. You will find also that the Neutrik connectors are excellent. A major advantage compared to a lot of high end connectors is that they are built to the right dimensions and with excellent materials. I have seen several 'high end' XLR connectors that had fit problems, and damaged the connectors that they were plugged into because the fit was too tight.

Balanced lines allow longer cable runs with lower noise and lower HF losses, so they still have advantages over single ended, even without termination. However you may well hear differences between cables. To eliminate the audible differences, cable termination is a must. The industry standard for decades has been 600 ohms and is still common today. Most high end audio products do not support driving such a low value without bass response losses or overall output losses, due to high output impedance issues. But without the termination, you will hear differences between cables, essentially defeating the main reason for using balanced lines in the first place!

So if you really want to get everything balanced line has to offer (read: no more expensive cables, lengths as long as you want), your equipment should support all aspects of the balanced line standard.
Tvad, you got that right. I've not watched the solid state part of the market, but I know that Roland did support the 600 ohm standard. Generally it's fairly easy for semiconductors. Wadia supported it in the old days- I've not kept up with them though.

As far as tubes- in high end, to the best of my knowledge we are the only ones that support the standard. There are others that can drive 600 ohms, but there will be a loss of bass response- the output impedance issue I mentioned earlier, IOW the 600 ohm load combined with the output coupling cap will produce a rolloff. There are certain transformer-coupled tube products, like Modwright, that may well support the standard too (if they have a 600 ohm secondary on the output transformer); you would have to check with them to be sure.
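The cap-plus-load rolloff described above is just the standard RC high-pass corner, f = 1/(2πRC). A quick sketch of how the corner frequency moves when the load drops from a typical high-end input impedance to 600 ohms (the 2 µF coupling cap is an assumed illustrative value, not a figure from any product mentioned here):

```python
from math import pi

def highpass_corner_hz(r_ohms: float, c_farads: float) -> float:
    """-3 dB corner of an RC high-pass: a coupling cap into a resistive load."""
    return 1.0 / (2.0 * pi * r_ohms * c_farads)

cap = 2e-6  # assumed 2 uF output coupling cap, for illustration only
for load in (100_000, 600):
    print(f"{load:>7} ohm load -> -3 dB corner at {highpass_corner_hz(load, cap):6.1f} Hz")
```

With the 100K load the corner sits below 1 Hz; the same cap into 600 ohms rolls off above 130 Hz, which is exactly the bass loss being described.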

The reason we support the standard is we had a lot of exposure to recording/broadcast gear prior to the introduction of the MP-1, and we saw the results. I remember seeing a local recording engineer set up the mics a good 150 feet from the recorder, and the recording was fabulous. Years later I had that event in mind in setting the goal to support the same operation. I think a lot of high end companies got in for other reasons and without that exposure.
Oldears, I don't think I've seen any cable wherein the barrel of the XLR was tied to anything. I would ignore that connection. The only ground you need to be concerned about is pin 1, which is always the shield at **both** ends of the cable.

Tvad, it's not really a VHS/Beta thing- the point is that if you want to get off the cable merry-go-round, using the low impedance standard is the ticket. Otherwise feel free to pay whatever you have to to get an unterminated cable to do the same thing :) The way I see it, you either pay for the technology to make it happen, or you pay for the cable to make it happen. Seems to me using the technology to advantage is cheaper.
To sort of sum up what I have been trying to say here: if you want to have the cables not be a part of the overall system sound, the old 600 ohm standard is the way to do it.

The audio world in general has seen a shift from what I call the Power Paradigm to the Voltage Paradigm (see http://www.atma-sphere.com/papers/paradigm_paper2.html for more info)
and the dilution of the 600 ohm standard in balanced operation is an example. Under the old power rule, the effect of the cable could be neglected. Under the newer Voltage rule as Kirkus has mentioned, the cables have an increasingly audible effect as the load impedance is increased.

The question is whether an audiophile would want the cable to have an audible artifact or not. I would prefer that it not, thus I use the 600 ohm standard as that is the technology that was created to solve the problem decades ago. Building more expensive cables to me does not look like a solution.
Tvad, what you are missing is that the output impedance is not the spec. For example, the Modwright might have a 100 ohm output impedance- if the output winding is designed to drive 600 ohms then it would work fine. Ditto Roland and Wadia. The output impedance is something very different from what load the circuit will drive.

It's a good idea for an electronic circuit to have an output impedance of about 1/10th the load it has to drive; however, there are some exceptions where the output impedance can appear to be much higher, yet it will drive the load just fine. For example, my Neumann microphones are set up to drive 150 ohms. What this means is that if you don't load them at 150 ohms (if instead you have a load of 1000 ohms or higher), the output transformer will express the inter-winding capacitance rather than the turns ratio, and you will get coloration and no bass.
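The 1/10th rule of thumb comes straight from the voltage divider formed by the source's output impedance and the load: the fraction of signal that survives is Zload/(Zload + Zout). A small sketch of the level loss in dB (the impedance values are arbitrary illustrations, not measurements of any unit named here):

```python
from math import log10

def divider_loss_db(z_out: float, z_load: float) -> float:
    """Level loss (dB) when a source with output impedance z_out drives z_load."""
    return 20.0 * log10(z_load / (z_load + z_out))

print(divider_loss_db(60, 600))    # ~1/10 ratio: loss is under 1 dB
print(divider_loss_db(600, 600))   # matched impedances: a 6 dB hit
```

This is only the resistive part of the story; as noted above, a transformer-coupled output that is designed for a particular load can behave well even when the simple ratio looks unfavorable.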

If we take the example of the Modwright, a similar situation exists- its measured output impedance is one thing, the load it drives (and is optimized for) is another. I suspect it has that load built-in, much like the old Ampex tape machines did, so that their output transformers would be properly loaded.

The Cary has a sufficiently low output impedance to drive 600 ohms, but if it employs a coupling cap, it's likely that you will get a low frequency roll-off if you try to do it. IOW the only tube units that will drive 600 ohms properly will have:
1) a low output impedance (well within the range of several of the units already mentioned) and
2) will either employ
a) an output transformer, or
b) a very large coupling cap, or
c) be direct-coupled.

It turns out a large coupling cap is impractical, so you can see how the realities of trying to do this limit the field.
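To see why the coupling cap option is impractical, solve the same RC corner formula for C. Assuming you want the corner down around 2 Hz so the rolloff stays well clear of the audio band (the target corner here is my assumption, not a spec from the thread):

```python
from math import pi

def coupling_cap_uf(corner_hz: float, load_ohms: float) -> float:
    """Capacitance (uF) needed to put the high-pass corner at corner_hz into a load."""
    return 1e6 / (2.0 * pi * load_ohms * corner_hz)

print(coupling_cap_uf(2, 100_000))  # under 1 uF into a typical 100K input
print(coupling_cap_uf(2, 600))      # well over 100 uF into a 600 ohm load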

In the world of transistors, it's quite easy to get semiconductors to drive 600 ohms, so there should be lots of examples there.
Right, Kirkus, my Neumanns are U67s. After doing some measurements with them, we found that the transformer was more linear when driving a lower impedance- even 600 ohms was too high.

The 600 ohm standard arose from the characteristic impedance that was created by spaced telegraph cables, and was later adopted by the phone companies, later still by recording and broadcast equipment manufacturers.

These days that standard is considered obsolete; 1K and the like are common input impedances, and as a result the standard is somewhat diluted. We wanted to be assured that our gear would support existing hardware that it might get connected to, as there is still quite a bit of collectible tube equipment out there, like Ampex tape machines, that is on the same standard.
Kirkus, if you are referring to characteristic impedance, we in fact examined that issue about 20 years ago. The fact of the matter is that if you can determine the actual characteristic impedance of the cable and then terminate it correctly, the result is quite spectacular.

The problem is, you need a time-domain reflectometer or the like to make the determination- in practice a bit impractical. So there is no standard termination value, as characteristic impedance can vary quite a bit due to minor changes in the construction and materials of the cable. This is likely why the industry settled on 600 ohms decades ago. It was clearly no accident.
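For reference, the lossless-line approximation of characteristic impedance is Z0 = sqrt(L/C), taken from the cable's per-unit-length inductance and capacitance. A sketch with assumed per-meter values in the ballpark of a shielded twisted pair (the specific L and C figures are illustrative, not measurements of any cable in this thread):

```python
from math import sqrt

def char_impedance_ohms(l_h_per_m: float, c_f_per_m: float) -> float:
    """Lossless-line characteristic impedance: Z0 = sqrt(L/C)."""
    return sqrt(l_h_per_m / c_f_per_m)

# Assumed values: 0.5 uH/m inductance, 100 pF/m capacitance
print(char_impedance_ohms(0.5e-6, 100e-12))  # roughly 70 ohms
```

Small changes in spacing or dielectric shift L and C, and Z0 moves with them, which is why every cable really needs its own measurement if you want an exact termination.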

In practice, the 600 ohm standard works quite well and since it was already in place, it seemed to be a matter of picking one's battles. As it is, the fact that our preamp is balanced has been its single biggest marketing problem, so in retrospect I'm glad it's not been any more complicated than it already has been :)

I would have loved to have had more input when we were setting a lot of this up, but at the time we were the only ones that cared about balanced operation.

I can find plenty of historical evidence to support my paper BTW; the Radiotron Designer's Handbook, published by RCA, is a good place to start. If you wish to discuss this further, that would be a topic for another thread or email. In fact I would love to do that; the fact of the matter is I've been looking for someone who can rebut the document ever since it was written (about 3 years ago). No-one has been able to do that so far. BTW, it's not about having a tube amp with a high output impedance, it's about whether you use feedback or not- IOW obeying the rules of human hearing or not.
Well, Kirkus, the issue is actually quite simple. What are the rules of human hearing? Once those are understood, we simply apply physics to create the means to obey those rules as closely as possible.

The most important rule is how we perceive loudness, which is done by listening for the 5th, 7th and 9th harmonics. Our ears are so sensitive to these harmonics that we can easily detect alterations of only hundredths of a percent. Audiophiles have words for this alteration: harsh, bright, clinical, brittle... -so for starters the design would have to honor this rule, as it is the most important.
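To put "hundredths of a percent" in perspective, converting a harmonic's amplitude (expressed as a percentage of the fundamental) to dB relative to the fundamental:

```python
from math import log10

def harmonic_level_db(percent_of_fundamental: float) -> float:
    """Level of a harmonic, in dB relative to the fundamental."""
    return 20.0 * log10(percent_of_fundamental / 100.0)

print(harmonic_level_db(0.01))  # a hundredth of a percent sits at -80 dB
```

So the claim is that the ear picks out higher-ordered harmonics some 80 dB below the fundamental, which is why tiny amounts of them register as harshness.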

With regards to cables, transmission-line effects do affect audio cables, both interconnects and speaker cables. The effects can be measured and correlate to listening as well.

Conductor spacing, size, geometry, purity of materials and the choice of materials are the issues that affect both what we hear and characteristic impedance. In interconnects it does seem that these issues are less important than they are in speaker cables (interconnects that we measured had characteristic impedances between 40 and 200 ohms- speaker cables varied from 4 to 60 ohms). I have to admit I was quite surprised to discover that characteristic impedance had any artifact at audio frequencies!

I've stayed well away from speaker design. It's easy to put drivers in a box and make them do something; it can be very difficult to make them sound like real music. There are a host of variables that can be quite vexing. I know enough excellent speaker designers who get fabulous results- I doubt I could do as well.

Plenty of material here for another thread...
Hi Kirkus, actually the input impedance of our amps is 100K single-ended, 200K balanced. On the bigger amps there is an input termination switch that allows for 600 ohms between pin 2 and 3 of the XLR. Although I am a fan of the standard, common sense dictates that not everyone is going to have our preamps, so we have to make the amps easy to drive.

We've often done the manual termination exactly as you describe.

I've been through the back thing- good luck with it!