To bi-wire or not to bi-wire?


I have 2 pairs of floorstanders that have bi-wire capability: Dali Ikon 6 as FL & FR in my 7.1 a/v system; Polk M50 in my 2.1 PC system.

The manual for the Ikons shows how to bi-wire but makes no recommendation that it be done. The manual for the M50 doesn't say much about anything. So, no guidance from the manufacturers.

I have read both pros and cons regarding bi-wiring. There appears to be some consensus that success with bi-wiring depends on the particular speakers and the amps they are paired with.

In a previous 5.1 system, I had Wilson Cubs for the front three. I had the L and R Cubs bi-wired, and I could not tell any difference in sound compared to the single-wired center Cub. They all sounded equally great.

I would be grateful for any advice.
mmarvin19

Showing 4 responses by musicnoise

Sounds like you already have the proper approach and are a rational consumer, as evidenced by your comment about the Wilson Cubs. Biwiring does not make any sense from an engineering standpoint. The theories used to support biwiring are pretty much junk science, particularly the idea that your amplifier sees a different impedance when driving the same speakers biwired vs. non-biwired. I have not heard a difference between the same pair of speakers biwired or not. Biamping, on the other hand, makes a good deal of engineering sense and is something to consider if you really want to tweak your system for a noticeable effect. It is, of course, more expensive. More importantly, biamping with off-the-shelf units requires a good deal of research beforehand and some experimenting once tentative choices are made. The ideal way to biamp is to design the amplifier specifically for the driver. Speaker wire is not all that expensive (unless you subscribe to the idea that expensive speaker wire is better), so trying biwiring vs. non-biwiring seems reasonable. Even if you do subscribe to the theory that expensive speaker cable is better, experimenting with dollar-per-foot speaker cable on 15-foot runs with decent connectors is inexpensive, and the results of that test should sufficiently inform you as to which choice to make.
As to the idea of the amplifier seeing a different impedance from biwiring vs. not biwiring: any way you draw the circuit, the amplifier sees the same impedance. Likewise, biwired or not, both drivers see the same signal from the amplifier. Shardone's transmission-line theory offers a plausible explanation for changing the impedance as seen by the amplifier (at high frequencies), but it does not explain any difference in the individual speaker circuits as seen by the amplifier; in other words, the change is macro, affecting everything. The induced-noise theory, on the other hand, based on coupling of low-frequency noise (probably 60 Hz), does offer a plausible explanation for different effects on the respective drivers, assuming the coupling is different for each set of wires, because that theory essentially inserts a different source in each leg. (Nice theories, by the way.) I don't think that is the intended goal of biwiring, though.

I checked out all of the online explanations of biwire effects that I could find but could not find one that demonstrated the effects through an actual analysis. If there is an explanation, why not show it with a step-by-step circuit analysis using reasonable lumped components, i.e., not assuming the speaker wires are ideal conductors? That is how the rest of the engineering community explains such things. It seems easier and more convincing. It is hard to argue with math (actual math, that is, not references to math terms).
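To make that concrete, here is a minimal sketch of the kind of lumped-component analysis I mean. Every value is an illustrative assumption (roughly a 15-foot run of ordinary 14 AWG zip cord feeding a textbook first-order two-way crossover), not a measurement of any real speaker:

    import numpy as np

    # Illustrative lumped values - assumptions, not measurements:
    R_CABLE = 0.08     # ohms: loop resistance of ~15 ft of 14 AWG zip cord
    L_CABLE = 3e-6     # henries: typical loop inductance for that run
    R_DRV   = 8.0      # ohms: nominal resistance of each driver
    L_XOVER = 0.5e-3   # henries: series inductor, 1st-order low-pass (woofer)
    C_XOVER = 8e-6     # farads: series capacitor, 1st-order high-pass (tweeter)

    def par(a, b):
        return a * b / (a + b)   # parallel impedance combination

    print(f"{'freq (Hz)':>10} {'|Z| single':>12} {'|Z| biwire':>12}")
    for f in (50, 200, 1000, 5000, 20000):
        s = 2j * np.pi * f
        z_cab = R_CABLE + s * L_CABLE
        z_woof = s * L_XOVER + R_DRV            # woofer leg of the crossover
        z_tweet = 1.0 / (s * C_XOVER) + R_DRV   # tweeter leg of the crossover
        z_single = z_cab + par(z_woof, z_tweet)          # one shared cable
        z_biwire = par(z_cab + z_woof, z_cab + z_tweet)  # separate cables
        print(f"{f:>10} {abs(z_single):>12.4f} {abs(z_biwire):>12.4f}")

The only difference between the two configurations is whether the fraction-of-an-ohm cable impedance is shared or duplicated, so the two columns come out essentially identical; nothing in the analysis produces a change the drivers could respond to differently.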
Actually, our "hearing instrument" does not have an incredible range and is not all that accurate. A 3 dB SPL change is barely perceptible to the average person. While the audio range is defined as 20 Hz to 20 kHz, most people cannot hear frequencies near 20 kHz, and most people over 50 cannot hear 15 kHz. Many years ago I repaired TVs as a part-time job. I was in my 20s and worked with two guys in their 50s. I could hear the horizontal oscillator on some CRT sets (more likely some mechanical vibration derived from it, but a high frequency regardless), which runs at 15,750 Hz. The other guys in the shop could not, though their hearing otherwise appeared normal, i.e., in normal conversation and so on. This is not merely anecdotal: high-frequency hearing loss with age, presbycusis, is very common, and standard audiology testing does not go beyond 8 kHz; the frequency losses are evident at 4 and 8 kHz, with mild cases showing 30 dB of attenuation from normal.

That is my point in a lot of my posts: the instruments we have available to measure everything that has to do with hearing (as distinct from measuring our tastes in what we hear, or what the sounds mean to us, which is where the art comes in) are orders of magnitude more sensitive, accurate, and resolving than our ears. So the values measured with such instruments should be the baseline from which to evaluate the effects of many of these quasi-technological solutions.
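For scale, those dB figures translate into ratios as follows (standard definitions, nothing specific to any speaker or listener):

    # dB SPL: 20*log10(p/p_ref) for pressure, 10*log10(P/P_ref) for power,
    # so a change of X dB is a pressure ratio of 10**(X/20) and a power
    # ratio of 10**(X/10).
    for change_db in (3, 30):
        p_ratio = 10 ** (change_db / 20)   # sound pressure ratio
        w_ratio = 10 ** (change_db / 10)   # acoustic power ratio
        print(f"{change_db:>2} dB -> pressure x{p_ratio:.2f}, power x{w_ratio:.0f}")
    # 3 dB: the acoustic power doubles, yet the change is barely perceptible;
    # 30 dB of presbycusis loss: the ear receives 1/1000 of the power.

In other words, a barely perceptible change corresponds to a doubling of acoustic power, which says something about how coarse the ear is as a measuring instrument.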
Shardone: I agree with you on the point of wide-band amplifiers. A high-frequency cutoff at 200 kHz leaves little phase change and attenuation a decade down, at 20 kHz. This allows for less attention to the characteristics of that 200 kHz filter. However, well-engineered band limiting will achieve the same effect and avoid any problems resulting from the extra energy contained in frequencies outside the audio range. Two ways, perhaps, to achieve similar results, but I would opt for the more band-limited approach.
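Putting rough numbers on that, assuming a simple first-order roll-off (an assumption for illustration; real amplifiers roll off with higher-order behavior):

    import math

    FC = 200e3   # assumed high-frequency cutoff, Hz
    F  = 20e3    # top of the audio band, one decade below FC

    # 1st-order low-pass: H(f) = 1 / (1 + j*f/FC)
    ratio = F / FC
    mag_db = -10 * math.log10(1 + ratio**2)
    phase_deg = -math.degrees(math.atan(ratio))
    print(f"at {F/1e3:.0f} kHz: {mag_db:.3f} dB, {phase_deg:.1f} deg phase")
    # -> about -0.04 dB and -5.7 degrees at 20 kHz

So a decade of separation buys you roughly -0.04 dB and under 6 degrees of phase lag at the top of the audio band, which is why the exact character of the 200 kHz filter matters so little.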