In defense of ABX testing


We audiophiles need to get ourselves out of the Stone Age, reject mythology, and say goodbye to superstition. Especially the reviewers, who do us a disservice by endlessly writing articles claiming that the latest tweak or gadget revolutionized the sound of their system. Likewise, any reviewer who claims that ABX testing is not applicable to high-end audio needs to find a new career path. As with anything, there is a right way and many wrong ways. Hail Science!

Here's an interesting thread on the hydrogenaudio website:

http://www.hydrogenaud.io/forums/index.php?showtopic=108062

This caught my eye in particular:

"The problem with sighted evaluations is very visible in consumer high end audio, where all sorts of very poorly trained listeners claim that they have heard differences that, in technical terms are impossibly small or non existent.

The corresponding problem is that blind tests deal with this problem of false positives very effectively, but can easily produce false negatives."
psag
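
Both halves of that quote are easy to see in numbers. Here is a minimal sketch, purely an illustration and not anything from the thread, of how an ABX session is commonly scored with a one-sided binomial test; the 16-trial session length and the 0.05 cutoff are illustrative choices, not requirements.

    # Scoring an ABX session with a one-sided binomial test (standard library only).
    # If the listener truly cannot hear a difference, each trial is a coin flip (p = 0.5).
    from math import comb

    def abx_p_value(correct, trials):
        """Probability of getting at least `correct` right out of `trials` by guessing."""
        return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

    # False positives are controlled: 12 of 16 correct is unlikely to be luck.
    print(f"12/16: p = {abx_p_value(12, 16):.3f}")   # ~0.038, below a 0.05 cutoff

    # False negatives are the catch: a listener who genuinely hears a subtle difference
    # and gets about 70% of trials right can still miss the cutoff in a short session.
    print(f"11/16: p = {abx_p_value(11, 16):.3f}")   # ~0.105, reads as "no difference"

The remedy for the false-negative side is more trials and practiced listeners, not abandoning blind testing.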

Showing 2 responses by minorl

This to me is very indicative of the people in power, or the ones who are the "experts," wanting to stay that way. Remember the attitude in the sixties and early seventies regarding wines, and how the "experts" continuously stated that French wines were the best and everyone else's were not very good? It wasn't until the "Judgment of Paris" happened that the world realized how dramatically opinions change when the tasting is done blind. There is absolutely no scientifically logical explanation for why blind testing isn't the best comparison method.

Of course, it has to be an apples-to-apples comparison. To me this means price-point testing, just like cars: pick a price point, get the equipment that falls within that price range, and go at it. But tube lovers will pick tube equipment most of the time based on knowing what they are hearing ahead of time, and the same is true for solid-state lovers. But blind testing? Within price points? Let's see what the experts say then. The "experts" don't want to do that, because it would show people that many of them (absolutely not all of them) are frauds.

If tests are not done scientifically and are based only on "opinions," they really aren't real to me. How does one measure whether the equipment accurately reproduced the soundstage depth? Dimensionality? Etc. I hear many opinions from the reviewers, but based on what? What criteria? Are you going by memory in your opinions and comparisons, or did you listen intently and then switch out that amp for another (without changing anything else) and listen again?

I have read some reviews that do exactly that, and the equipment under review is compared to similar equipment at the same price point. That is all right with me. But I still prefer a blind A/B comparison to really identify the sonic differences in an unbiased way.

enjoy
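
The mechanics of the blind A/B comparison described above come down to a randomized answer key that the listener never sees until the end. A minimal sketch, assuming a helper does the switching; the ten-trial count and the even A/B balance are illustrative choices, not anything prescribed in the thread:

    # Generating a randomized answer key for a blind A/B session.
    # A third party keeps this sheet, switches the equipment to match it,
    # and only reveals it after the listener has scored every trial.
    import random

    def make_trial_sheet(trials=10, seed=None):
        """Return a random sequence of 'A'/'B' assignments, balanced where possible."""
        rng = random.Random(seed)
        sheet = ["A", "B"] * (trials // 2) + (["A"] if trials % 2 else [])
        rng.shuffle(sheet)
        return sheet

    print(make_trial_sheet(10))   # e.g. ['B', 'A', 'A', 'B', 'B', 'A', 'B', 'A', 'A', 'B']

Scoring the listener's answers against that key with a binomial test, as sketched earlier, closes the loop.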
Zd542, I get what you are saying in response to my post. The closest I have seen is what I mentioned: reviewers listening to one piece of equipment with some music, then swapping it out for another piece, without changing anything else, and listening again. My point earlier, in using the wine example, was that blind A/B testing would show that most reviewers (not all) have no clothes, and they can't have that. So the best I can hope for these days is what I mentioned earlier.

However, companies respond to letters, and not so much to phone calls and posts on chat boards. So maybe more letters to the magazines requesting blind A/B testing would help.

enjoy