Phase inverting problem


Hello,

I have a Conrad Johnson PV-12A pre-amp. It is phase correct for the phono stage, phase inverting for the line stage.

My power amplifier is a Conrad Johnson MF2100. It is phase correct.

So my first idea was to connect the speakers to the power amplifier the wrong way (black to red, red to black) and then connect the cartridge the wrong way around as well (R: + and - reversed, L: + and - reversed). Then the phase should be correct for everything.

But there lies the problem. When I switch the connections on the cartridge, I get a really loud hum that makes the music barely audible. Perhaps it has something to do with the fact that R- is connected to the cartridge body, as some kind of earth? Anyway, switching the connections on the cartridge is not an option. So, what to do?

My only idea so far, seeing that I only have one line input (CD), is to cut open the RCA cable and swap + and - of that line, and connect the speakers to the power amp the correct way. So, I'll do just that. But maybe there's a better solution that I'm missing. Any ideas?
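
For anyone keeping score, the arithmetic behind all this swapping is just counting sign flips: an even number of polarity inversions between the cartridge and the speakers leaves absolute phase correct, an odd number leaves it inverted. Here's a minimal Python sketch of that bookkeeping -- the stage names and True/False flags are illustrative assumptions only, not a statement of what the PV-12A or MF2100 actually do:

# Each stage is (name, inverts_polarity). The flags are assumptions for
# illustration -- substitute what your own chain actually does.
chain = [
    ("cartridge wiring", False),  # True if + and - are reversed at the cartridge
    ("phono stage",      False),
    ("line stage",       True),   # a line stage that inverts
    ("power amp",        False),
    ("speaker wiring",   False),  # True if red and black are swapped at the amp
]

inversions = sum(inverts for _, inverts in chain)
print("net polarity:", "inverted" if inversions % 2 else "correct")

Swapping + and - anywhere along the chain (at the cartridge, inside an interconnect, or at the speaker terminals) just toggles one flag, so any single swap corrects a single net inversion.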
swaf

Showing 3 responses by kirkus

Actually, the question of absolute phase is a very big consideration in recording and mastering.

First, every professional microphone specifies its wiring polarity referenced to sound-pressure polarity - this is crucial for any kind of consistency in the application of microphone techniques. Also, keep in mind that a large percentage of microphone models are used in both a recording and a sound-reinforcement context, and in the latter case it's extremely important to keep track of absolute phase, as the sound of the instruments (or instrument amps) themselves interacts directly with the front-of-house and monitor loudspeakers. (Anybody who's worked with older JBL stuff should be familiar with these phasing issues . . . the driver labelling is reversed.)

Second, in either the live or studio context a great number of sound sources and equipment processing loops are ultimately mixed together, so at the very least the relative phase is inarguably critical. In practice, then, the correct connection of all equipment, wiring, and patchbays (observing individual TRS and XLR pinouts, etc.) is a cornerstone of good professional workmanship.

So unless somebody's made a mistake, absolute phase should indeed be preserved all the way through the studio recording chain, and also through the chain at the mastering studio, especially if it's digital. For record lathes, I know that the Neumann and Ortofon cutting amplifiers are very clearly specified as to their absolute input phase, and of course the cutting head MUST be properly phased to the amplifier or it will oscillate and destroy itself.

Now there are generally three places in the recording chain where phase can be deliberately manipulated - when tracking, on mixdown, or in mastering . . . through the use of a phase-inversion switch on the mic preamp, console channel strip, or mastering console. In practice, all of the switches start out "non-inverted", and the phase of a particular channel/microphone is inverted only when necessary for specific interactions . . . for example, when two mics are used for the top and bottom of a snare drum. Inverting a single microphone during tracking is generally frowned upon; keeping the audio as un-molested as possible until mixdown is the usual goal. For the overall absolute phase, the mastering engineer usually makes a final decision.

As for Al's question, most of the guys I've met who record on-location with minimal microphone techniques pay very close attention to absolute phase, especially with M/S and Decca Tree configurations. If there are any distant "hall" omnis, they'll usually phase these to preference, while keeping the primary mics uninverted. They also almost always track directly to digital, making it easy to keep everything the same through mastering.

Now after all this blathering, I agree with Atmasphere as to the general utility of an absolute-phase switch in a home reproduction context, if maybe for different reasons. I feel that there are very few reasons not to design equipment or wire a system so that absolute phase is maintained. But how much it matters is one of those old, unsolvable audio debates.

The absolute phase of any loudspeaker through the midrange and treble can be thought of as arbitrary, as connection polarity, driver response, and crossover designs vary considerably. And at lower frequencies, the interaction between the cabinet/driver/port tuning and the room trumps everything else. Maybe for headphone listening there's some validity to phase absolutism, but through loudspeakers, it's a "flip to your taste" kind of thing, if you care to take the time . . . which I personally don't.

"Some equipment uses a modification of the original balanced standard, in which pin 2 is non-inverting. Sometime in the 70s or 80s, European equipment went to pin 3 non-inverting. This practice has shown up in some Japanese equipment as well. This stuff is all over the industry! Unless someone has taken the time to make special cables that convert from the pin 2 convention to the pin 3 convention, the result is there is simply no way to know what is up."
Atmasphere raises a very important point here, namely the difference between the "American" (pin 2 hot) and "European" (pin 3 hot) XLR pinouts. The "American" pinout is the EIA/AES-specified professional standard, but of course this isn't applied consistently across different types of gear . . . microphones, for instance, are virtually always wired with pin 2 hot.
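
If it helps to visualize the "special cables" fix Atmasphere mentions, converting between the two conventions is just a cable that carries pin 1 (shield) straight through and swaps pins 2 and 3, so hot lands on hot at the other end. A trivial sketch, purely for illustration:

# Hypothetical XLR adapter cable for converting between pin-2-hot and
# pin-3-hot conventions: shield straight through, signal pins swapped.
adapter = {1: 1, 2: 3, 3: 2}  # source pin -> destination pin

for source_pin, destination_pin in adapter.items():
    print(f"source pin {source_pin} -> destination pin {destination_pin}")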

I didn't mean to imply that recording and mastering engineers in general give any special attention to absolute phase, but as others have pointed out, relative phase is absolutely critical. So in practice, when outboard gear in the studio (e.g. a compressor, mic preamp, or effects unit) is wired to the patchbay or console, any potential polarity reversals *should* be corrected. Otherwise, this can cause some really weird issues when the output is brought back into the console for mixdown, or routed to headphones while tracking.

So thorough attention to relative phase is a real necessity, and much of the time absolute phase just comes along for the ride. This is especially true given the ubiquity of Pro Tools . . . when one purchases a CD that's digitally recorded, mixed, and mastered, it's a pretty good bet that a positive pulse at the vast majority of the input channels' (microphones') ADCs corresponds to a positive pulse at the output of your CD player.

"Tbg - dunno how many or which players invert polarity. I dunno why either, but I don't think they do it on purpose. Maybe they are like recording engineers and just don't think about it."

There are three reasons why a consumer audio product would invert phase:

1. Ambiguous or no standardization on hookup of balanced interconnections (i.e. the XLR pinout conventions that Atmasphere mentioned)

2. Error in design, manufacturing, or nomenclature

3. Preference for circuit design topologies/combinations for which inversion is a side-effect. An example would be a two-tube line preamp where the first stage is a plate-loaded voltage amp, and the second is a cathode follower . . . the signal will come out inverted, unless another stage is added to flip it back around. These types of circuit designs are increasingly eccentric and anachronistic in modern equipment . . . and pretty unusual even in the field of enthusiast audio, where eccentricity and anachronism are widely respected.
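
To put a toy number on reason #3: a plate-loaded voltage-amplifier stage has negative (inverting) gain, while a cathode follower has positive gain just under unity, so the product of the two is negative. The gain figures below are made up purely for illustration:

# Hypothetical small-signal voltage gains; the sign carries the polarity.
plate_loaded_stage = -20.0  # plate-loaded triode voltage amp: inverting
cathode_follower = +0.95    # cathode follower: non-inverting, gain just under 1

overall_gain = plate_loaded_stage * cathode_follower
print(f"overall gain: {overall_gain:+.1f}")  # negative sign => the preamp inverts

Adding another inverting stage would multiply in a further negative gain and flip the sign back positive, which is the "flip it back around" mentioned above.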

Reason #3 is especially uncommon in CD players and DACs, given that the overwhelming majority of high-quality DAC chips have balanced current outputs and/or on-chip signal inversion capability . . . meaning that the designer can just as easily preserve signal polarity regardless of the design of the output stage. That's why I used a CD produced from a Pro Tools recording/mastering process as my "good bet" example.