Bluesound Node 2 vs. Auralic Aries Mini


Anyone have experience with both the Bluesound and the Auralic? Specifically, the user apps and their capabilities. I'm thinking about replacing my Squeezebox Touch to gain some newer technologies that allow higher bit rates and improved WiFi technology. My main concern is the WiFi and management capabilities, not the DAC as I'll use the digital out to drive an external DAC. I just want something that cleanly moves files from my PC library of primarily Apple Lossless ripped music and plays internet radio. I'd also like to be able to dabble with hi-rez formats, although I haven't taken much of a plunge there, yet. Streaming is not a big deal for me, but if the abilities are there that's no loss.

So, seems to me that the Node 2 and Aries Mini would both fit the bill. So it probably comes down to a matter of user experience with the control apps. Appreciate any thoughts!

Showing 6 responses by dogmcd

Thanks to all for the great feedback. Seems my initial thought was correct in that either may be a good choice, might just come down to which I can get the best deal on! :) Sounds like both have decent control apps.

On the subject of "just using a computer", that's not an untenable idea, just not my cup of tea. In my line of thinking, a streamer like the Node 2 or Aries Mini actually becomes an extension of my computer that's already set up and running. All I need that streamer for is to get the data I want from the computing environment and out to my DAC. Plus, if I were to go computer only, I'd get constrained by disk space (I have about 3 Terabytes in a RAID 5 set on my current PC/Server) because most of the pc's that I would deem quiet enough to be close to my system have solid state drives that aren't quite adequate in size. On top of that, I'd get sucked into the entire USB cabling debacle, which probably contains more snake oil than anything in the history of audio.... oh, and all of those silly, megabuck USB cables have potentially been rendered obsolete now that Apple has released all of their latest PC's with USB-C ports. Rest assured, the rest of the industry will follow.
Great info, folks, and many thanks for the comments on the direct comparison. I'm not sure I understand the mechanism by which one would sound better than the other via their digital outputs rather than their DACs, unless one actually has some sort of anomaly in how it transfers data, which is entirely possible. I have a Nuprime as well, the DAC10H, so I'm guessing my experience may be similar. Also interested in the other streamer suggestions, so I may investigate them as well... happy listening!
Was able to pick up an Auralic Aries Mini for an exceptional price. Does exactly what I was looking for, and the Lightning DS interface works exceptionally well. Had a couple of hiccups with initial setup, but that was not due to the Aries Mini; it was an issue with some file protection settings back on my PC (I know, just get an Apple for my file server and let it do the work :) !). I am, begrudgingly, trying some USB cables to connect to my Nuprime DAC10H, but none are fancy, high-end varieties - my snake oil meter goes off at low levels - and I plan on comparing those to my usual Acoustic Zen Absolute Digital. By the way, the Auralic absolutely contains a DAC. I have no plans on using it in my setup, but it's definitely there.

That said, I'm very impressed with the Auralic. Does what it's supposed to and sounds exceptional.
Really like the Auralic after using it for a while. I was using a Squeezebox Touch before, and the Auralic just seems to do a better job, probably due to newer wireless and data-transfer components internally, and I do like the Lightning DS app. Like any software, you have to get used to using it and figure out its little idiosyncrasies, but the Lightning app is intuitive and works very well.

On the subject of USB cables, I still have some skepticism about one "sounding better" than others. Not to say that can't happen, but if and when it does, are we actually hearing something "better" or just "different"? Keep in mind that I spent over 30 years in the computer industry, so I'm not just shooting blindly here... USB, whether 1.0, 2.0, 3.0, or the new USB-C, is an industry standard that was created and defined by the computing industry starting back in the '90s. Now, since it is a standard, cables must conform to it, so no matter the materials used or the construction techniques applied, that base standard must be adhered to or it can't be called USB. The original intent of USB was to carry data, NOT music signals. However, we've transformed music signals into data via digital technologies (another debate all on its own) and made access to music faster, more ubiquitous, and cheaper. Since USB was the standard available on the computers everyone was using for computer audio, it made sense for the audio industry to embrace it for transfer of data from point A to point B.

IF (and it's a big IF) a USB cable properly transfers a data stream, USB power, and clocking signals from one device to another (here's where the audiophiliac claws will come out...!) THERE IS NO WAY A DIFFERENT USB CABLE WILL IMPROVE THE SOUND. Remember, I said "IF" everything is correct, and in a USB data stream from device to device that level of precision is rarely met. So, the standard includes lots of error correction to "rebuild" the data packets crossing the interface to an acceptable level of usability. With music, especially hi-rez, data-rich packets, that correction, along with proper clocking, becomes critical. Plus, the "bits is bits" argument is not really true, because I can tell you from experience that some bits can track their voltage too high and some too low, and in each case that will cause anomalies in the bit stream that must be corrected.

And here's the rub with audiophile USB cables - most do the same thing as some of the generic brands out there because they MUST be built to the standard to be called USB! Remember, the standard says you have to be able to move the data stream within an acceptable level of error and loss from point A to point B, and the rest of the magic happens downstream from the USB world, in the receiver chips, DAC, and output amplifiers within the digital device. So, my suspicion is that when people claim they hear "differences" in USB cables, what they are hearing is a product of the data transfer and correction needed with each of those different cables. What that says is that some cables are transferring data and clock signals, all at the proper voltages, better than others, because that's the only thing that can make a difference. So, considering that cables must be built to a standard, once a cable's signal transfer characteristics make everything correct, you're done. I fear that what we hear as "better" may just be "different" - possibly "worse" - if a cable is not up to par and causes more error correction or clocking errors than another.

However, getting that data stream to be correct can be achieved for a pretty nominal cost, but it's very easy for us to succumb to hearing improvements when we're told we will and we put down money for that improvement, whether it's there or not. Our brains will trick us into believing the angels are singing louder in direct proportion to the amount of money we spend, regardless of actual performance differences. IF A USB-STANDARD CABLE GETS IT RIGHT, IT GETS IT RIGHT, PERIOD. It's about a data stream that gets transformed into something else, and if that data is right, it's right. No amount of fooling around with wire type or insulation or connectors, or whatever marketing hype or pseudo-science you want to add, will make a difference, because it can't, UNLESS it can actually improve the data transfer parameters.
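To make that concrete, here's a rough sketch (Python, purely for illustration) of the CRC-16 integrity check that USB data packets carry. The point isn't the code; it's that the check runs over raw bytes - the protocol has no idea, and doesn't care, whether those bytes are hi-rez PCM or a spreadsheet. The packet framing is simplified here and the sample payloads are made up:

    def crc16_usb(data: bytes) -> int:
        """CRC-16 as used on USB data packets (poly 0x8005, reflected, init/xorout 0xFFFF)."""
        crc = 0xFFFF
        for byte in data:
            crc ^= byte
            for _ in range(8):
                # Shift right one bit; fold in the reflected polynomial on carry-out.
                crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
        return crc ^ 0xFFFF  # standard check value: crc16_usb(b"123456789") == 0xB4C8

    # The checksum is computed over bytes, period - audio gets no special treatment.
    audio_packet = bytes(512)               # stand-in for 512 bytes of PCM samples
    doc_packet = b"quarterly sales report"  # stand-in for part of an Office file
    print(hex(crc16_usb(audio_packet)), hex(crc16_usb(doc_packet)))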

So, my personal sensibility regarding USB cables is to tread with caution. I find it hard to believe that a USB cable costing hundreds, even thousands, of dollars is going to change the data stream in a way that makes it actually worth that much more than, say, a Belkin Gold or the like. And I have yet to see any kind of demonstrable science or evidence showing how one cable compares to another in actual data transfer characteristics. I also think to myself that the snake oil quotient is extremely high with USB cables, because if a manufacturer sells more than one cable and its highest-priced cable is the "best" - one that hits all of the needs of the data transfer 100%, let's say - that means they are producing inferior products below that level, some that may not even conform to the standard. It's also possible that what we hear with USB cable changes may be more pleasing to the ear because the cable is actually diminishing the signal rather than improving it, like a filtering effect at some frequency extreme or the like.
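On the "demonstrable evidence" point, here's one cheap check anyone can run - a sketch, with hypothetical file names: copy the same album file across each cable under test and compare checksums. Matching digests mean both cables delivered identical bits. (Caveat: this exercises a bulk file copy, so it says nothing about clocking or timing during real-time playback, only about the data itself.)

    import hashlib

    def file_digest(path: str) -> str:
        """SHA-256 over a file's raw bytes - identical digests mean identical bits."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MB chunks
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical names: the same file copied over two different cables.
    print(file_digest("album_via_belkin_gold.flac"))
    print(file_digest("album_via_boutique_cable.flac"))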

All that said, I am at the end of the day a red-blooded capitalist, and if someone hears a difference, perceived or real, and wants to spend their dough to get that difference, then by all means the manufacturers should take their money! I for one remain skeptical and tend to be wary of whatever hype is poured out, especially by the high-end rags, on the cable of the moment. In my experience, cables do make a difference in a high-quality system, but the law of diminishing returns is steeper with cables than with any other piece of the audio chain. Oh, and don't give me that junk about, "well, you just don't have a high enough resolution system to hear the differences." The arrogance of that makes my blood boil. Yes, some folks have systems that may not be as highly resolving as our own, but let's take the stance of trying to help rather than trying to diminish someone else to make our own egos happier. I will continue to experiment with USB cables and see what comes of it. And, just to muddy the waters further, our friends at Apple have gone to USB-C in their products, and that new standard will slowly but surely become ubiquitous, making all of our expensive audiophile USB cables useless and enabling all of the big-boy cable companies to... sell us all new cables! :)

Happy listening!
Hmmmmm... in audio streaming, it is not a one-way street. USB protocol is USB protocol, and there's just as much error correction happening whether it's music or a Microsoft Office document. Computers don't know the difference. Now, if you're talking about S/PDIF or Ethernet network signals (and with the latter, there's still standard error correction protocol), that's a different matter, but with USB it's ultimately a matter of correct data at the receiving end, clocked in at the proper time. Once that is achieved, a cable cannot "improve" the data. It may change it and make it different, but it won't improve it.
Celo,

I've found that the Aries Mini does seem to sound better, more detailed and more extended. My guess is that it's due to advances in network chips and clocking causing fewer errors and better data throughput. The SBT was a cool product that Logitech decided to abandon, so it never was updated any further. Still perfectly viable, but I think newer technology like the Aries Mini is better. The only constant in the digital world is change, and that change happens much faster than we usually like or expect!

dlcockrum,

Let's be clear: I am not discounting the fact that different cables can sound different; what I'm trying to discern is why that would be the case. Keep in mind that the operative word is "IF"... IF the data structure is exact at both ends of a cable, IF the cable allows said data to clock in at the proper time, IF the cable ensures that data voltage levels are consistent and correct, then no other cable can improve on the audible results. If the data is correct, it's correct. Differences can be had past the receiver/transmitter chip in whatever device you use, like in a digital filter, DAC, oversampling system, etc., but a digital cable can't change the sonics UNLESS it's changing the content of the data in some way. That said, it's very possible that is happening, but it's interesting that there are so many different manufacturers with different USB cables at different price points; how are they manipulating that data stream to get different results? That's my question. And how do we know when a cable does get the data right versus when it doesn't? Are we really hearing "improvement" or just something "different"?

By the way, this question should be the same with any digital transmission standard, be it USB, CAT5, CAT6, S/PDIF, whatever. A digital signal being transmitted must ultimately be converted back to an analog output at some point when we're talking about audio. Each of those has an industry-accepted standard that must be adhered to. Now, are the standards themselves not able to pass muster with the data they are transmitting, or are we playing with bit streams within cables to "filter" for a particular sound profile? In practice, no digital transmission is without error; that's why there's error correction all over it. But, in the end, how do we ensure that a digital data stream is as bit-perfect as it can be at the input of a DAC? That's what matters.
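For what it's worth, the closest thing I know to an answer is a loopback test: capture the stream at the DAC input with a recording interface and compare it sample for sample against the source. A rough sketch, with hypothetical file names - in practice the captures would first have to be trimmed so they're sample-aligned:

    import wave

    def bit_perfect(path_a: str, path_b: str) -> bool:
        """Compare two WAV files frame by frame; True only if every sample matches."""
        with wave.open(path_a, "rb") as a, wave.open(path_b, "rb") as b:
            if a.getparams()[:4] != b.getparams()[:4]:  # channels, width, rate, frames
                return False
            while True:
                fa, fb = a.readframes(4096), b.readframes(4096)
                if fa != fb:
                    return False  # any single flipped bit lands here
                if not fa:
                    return True   # both streams exhausted with no mismatch

    # Hypothetical captures: the source file vs. the digital loopback recording.
    print(bit_perfect("source.wav", "dac_input_capture.wav"))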