Bluesound Node 2 vs. Auralic Aries Mini


Anyone have experience with both the Bluesound and the Auralic? Specifically, the user apps and their capabilities. I'm thinking about replacing my Squeezebox Touch to gain support for higher bit rates and better WiFi. My main concern is the WiFi and management capabilities, not the DAC, as I'll use the digital out to drive an external DAC. I just want something that cleanly moves files from my PC library of primarily Apple Lossless rips and plays internet radio. I'd also like to be able to dabble with hi-rez formats, although I haven't taken much of a plunge there yet. Streaming is not a big deal for me, but if the capability is there, that's no loss.

Seems to me that the Node 2 and Aries Mini would both fit the bill, so it probably comes down to user experience with the control apps. Appreciate any thoughts!
dogmcd
Really like the Auralic after using it for a while. I was using a Squeezebox Touch before, and the Auralic just seems to do a better job, probably due to newer wireless and data-transfer components inside, and I do like the Lightning DS app. Like any software, you have to get used to it and figure out its little idiosyncrasies, but the Lightning app is intuitive and works very well.

On the subject of USB cables, I still have some skepticism about one "sounding better" than another. Not to say that can't happen, but if and when it does, are we actually hearing something "better" or just "different"? Keep in mind that I spent over 30 years in the computer industry, so I'm not just shooting blindly here.... USB, whether 1.0, 2.0, 3.0, or the new USB-C, is an industry standard that was created and defined by the computing industry starting back in the '90s. And since it is a standard, cables must conform to it: no matter the materials used or the construction techniques applied, that base standard must be adhered to or it can't be called USB. The original intent of USB was to carry data, NOT music signals. However, we've transformed music signals into data via digital technologies (another debate all on its own) and made access to music faster, more ubiquitous, and cheaper. Since USB was the standard available on the computers everyone was using for computer audio, it made sense for the audio industry to embrace it for moving data from point A to point B.

IF (and it's a big IF) a USB cable properly transfers the data stream, USB power, and clocking signals from one device to another (here's where the audiophiliac claws will come out...!) THERE IS NO WAY A DIFFERENT USB CABLE WILL IMPROVE THE SOUND. Remember, I said "IF" everything is correct, and in a USB data stream from device to device that level of precision is rarely met. So the standard includes lots of error correction to "rebuild" the data packets crossing the interface to an acceptable level of usability. With music, especially hi-rez, data-rich packets, that correction, along with proper clocking, becomes critical. Plus, the "bits is bits" argument is not really true, because I can tell you from experience that some bits can track their voltage too high and some too low, and in each case that will cause anomalies in the bit stream that must be corrected.

And here's the rub with audiophile USB cables - most do the same thing as some of the generic brands out there, because they MUST be built to the standard to be called USB! Remember, the standard says you have to be able to move the data stream from point A to point B within an acceptable level of error and loss; the rest of the magic happens downstream of the USB world, in receiver chips, the DAC, and the output amplifiers within the digital device. So my suspicion is that when people claim they hear "differences" between USB cables, what they are hearing is a product of the data transfer and correction that each of those cables requires. In other words, some cables are transferring data and clock signals, all at the proper voltages, better than others, because that's the only thing that can make a difference. And since cables must be built to a standard, once a cable's signal transfer makes everything correct, you're done.

I fear that what we hear as "better" may just be "different" - possibly "worse" - if a cable is not up to par and causes more error correction or clocking errors than another. Getting that data stream right can be achieved for a pretty nominal cost, but it's very easy to succumb to hearing improvements when we're told we will and we've put down money for them, whether they're there or not. Our brains will trick us into believing the angels are singing louder in direct proportion to the amount of money we spend, regardless of actual performance differences. IF A USB STANDARD CABLE GETS IT RIGHT, IT GETS IT RIGHT, PERIOD. It's about a data stream that gets transformed into something else, and if that data is right, it's right. No amount of fooling around with wire type or insulation or connectors, or whatever marketing hype or pseudo-science you want to add, will make a difference, because it can't - UNLESS it can actually improve the data transfer parameters.
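
To make the "if the data is right, it's right" point concrete, here's a minimal sketch in Python (file names are hypothetical placeholders) of how anyone can verify a bit-perfect transfer: hash the source file and the bytes that arrive at the far end of the link. If the digests match, the cable delivered every bit intact, and there is nothing left for a "better" cable to improve:

import hashlib

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large hi-rez files don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: the original rip on the PC, and the same file
# as captured on the receiving device after crossing the USB link.
source = sha256_of("library/track.m4a")
received = sha256_of("capture/track.m4a")
print("bit-perfect transfer" if source == received else "bits were altered in transit")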

So, my personal sensibility regarding USB cables is to tread with caution. I find it hard to believe that a USB cable costing hundreds, even thousands, of dollars is going to change the data stream in a way that makes it worth that much more than, say, a Belkin Gold or the like. And I have yet to see any demonstrable science or evidence showing how one cable compares to another in actual data-transfer characteristics. I also think the snake-oil quotient is extremely high with USB cables, because if a manufacturer sells more than one cable and their highest-priced cable is the "best" - one that meets 100% of the data-transfer requirements, let's say - that means they are producing inferior products below that level, some of which may not even conform to the standard. It's also possible that what we hear after a USB cable change is more pleasing to the ear because the cable is actually diminishing the signal rather than improving it, like a filtering effect at some frequency extreme.
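
For what it's worth, here's the kind of evidence I'd want to see before believing any cable claim - a rough Python sketch (the transfer step is a placeholder, assuming you can capture the received bytes on the far side) that sends the same reference file across each cable many times and counts how often the received copy differs from the source. If two cables both score zero over many trials, they are doing the same job, whatever their price tags:

import hashlib
import shutil

def digest(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def count_transfer_errors(src: str, dst: str, trials: int = 100) -> int:
    """Repeat the transfer and count how many copies arrive altered."""
    reference = digest(src)
    errors = 0
    for _ in range(trials):
        # Placeholder for one pass over the cable under test; in a real
        # rig this would be a capture of the bytes on the receiving device.
        shutil.copyfile(src, dst)
        if digest(dst) != reference:
            errors += 1
    return errors

# e.g. count_transfer_errors("reference.m4a", "received_cable_a.m4a")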

All that said, I am at the end of the day a red-blooded capitalist, and if someone hears a difference, perceived or real, and wants to spend their dough to get that difference, then by all means the manufacturers should take their money! I for one remain skeptical and tend to be wary of whatever hype is poured out, especially by the high-end rags, on the cable of the moment. In my experience, cables do make a difference in a high-quality system, but the law of diminishing returns hits harder with cables than with any other piece of the audio chain. Oh, and don't give me that junk about "well, you just don't have a high enough resolution system to hear the differences." The arrogance of that makes my blood boil. Yes, some folks have systems that may not be as highly resolving as our own, but let's take the stance of trying to help rather than diminishing someone else to make our own egos happier. I will continue to experiment with USB cables and see what comes of it. And, just to muddy the waters further, our friends at Apple have gone to USB-C in their products, and that new standard will slowly but surely become ubiquitous, making all of our expensive audiophile USB cables useless and enabling all of the big-boy cable companies to... sell us all new cables! :)

Happy listening!
Let us not forget that in computer-to-computer data transfers, data is checked and replaced if faulty. In audio streaming, it is a one-way street. I have played more with Ethernet cables recently, and some are sold as "the best," but I would give up the hobby if I had to actually keep them in my system. So, until you try a few, I'd suggest you really don't know.
Hmmmmm... in audio streaming, it is not a one-way street. USB protocol is USB protocol, and there's just as much error correction whether it's music or a Microsoft Office document. Computers don't know the difference. Now, if you're talking about S/PDIF or Ethernet network signals (and with the latter, there's still a standard error-correction protocol), that's a different matter, but with USB it's ultimately a matter of correct data arriving at the receiving end, clocked in at the proper time. Once that is achieved, a cable cannot "improve" the data. It may change it and make it different, but it won't improve it.
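
To illustrate why the protocol can't care what the payload "means," here's a small Python sketch of the CRC-16 variant used on USB data packets (polynomial 0x8005, bit-reflected, what the CRC catalogs call CRC-16/USB). The receiver recomputes this checksum over the raw bytes of every packet; whether those bytes came from a lossless rip or a Word document never enters into it, and a single flipped bit changes the result, which is how corruption gets detected:

def crc16_usb(data: bytes) -> int:
    """CRC-16/USB: poly 0x8005 (reflected 0xA001), init 0xFFFF, final XOR 0xFFFF."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc ^ 0xFFFF

music = bytes([0x12, 0x34, 0x56, 0x78])      # four bytes of PCM samples
text = b"Quarterly report"                   # bytes from a document
print(hex(crc16_usb(music)), hex(crc16_usb(text)))  # same math, any payload

corrupted = bytes([0x12, 0x34, 0x56, 0x79])  # one bit flipped in transit
assert crc16_usb(corrupted) != crc16_usb(music)  # the link notices the damage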
Hi wgutz: Which Ethernet cables have you auditioned/compared? What were their sonic characteristics?  Thanks in advance, Jeff