16 bit vs 24 bit vs 35 bit vs 36 bit vs 64 bit DAC sampling


I have limited knowledge about DACs, but as I understand it, a typical CD player used 16 bit sampling, and supposedly no one is able to hear the difference with anything more than 16 bits; however, I recently purchased an Esoteric K-01X, which has 35 bit sampling (why 35 bits? no doubt only to differentiate it from their then top-of-the-line 36 bit Grandioso series).  

Now I can hear a big difference between my old Musical Fidelity kW DM25 DAC with 24 bit sampling (circa 2005) and the newer Esoteric DAC with 35 bit sampling, although I'm not supposed to be able to; then again, maybe there is some other electrical processing shaping the sound besides the number of bits.  

Now there are 64 bit sampling DACs, and I'm wondering how much of that the ear actually hears, or whether it's something else entirely that's making the digital sound better?  

Any insightful opinions or perspectives?  

Thanks.
drbond

An audiophile friend gave me a book for Christmas, written by Robert Harley, "The Complete Guide to High-End Audio".  The quote below, which explains what I hear quite adeptly, is from that book, pg. 231, from the section titled "16 Bits, 20 Bits, 24 Bits":

"...the benefits of the increase in word length from 16 to 18, 20, and even 24 bits are not in dispute.  As I mentioned earlier, word length is the number of bits used to encode the audio signal's amplitude at each sample. Assigning a number--called a word--to represent the audio signal's amplitude is called quantization.  The word length determines the system's resolution, dynamic range, distortion, and signal/noise ratio.  We also call the word length resolution. . .

....the greater the number of bits in each digital word, the more precisely the analog signal's amplitude is encoded. . . The longer the quantization word, the more steps, and thus the finer the resolution.  

. . . .the resolution of a digital audio signal isn't defined by the maximum number of bits available, but by the number of bits being used at any given moment.  

The advent of 20- and 24-bit digital audio not only expands the dynamic range, but also increases the resolution of low-level detail.  This low-level detail can be fine nuances of an instrument's timbre, which enhance the sense of realism.  It can also be subtle spatial cues, such as discrete acoustic reflections and reverberation decay, which the ear interprets as a more convincing reproduction of the original recording venue. 

Longer word lengths also contribute to better sound because the postproduction (mixing, equalization, signal processing) common in the recording or mastering studio can be performed with much greater mathematical precision.  Moreover, any noise added by these processes is spread out over a wider bandwidth, which makes it less audible.

The combination of higher sampling rate and longer word length results in greatly improved sound quality.  (A digital system's resolution can also be increased by adding to the signal a small amount of noise, called dither...)"
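To put some rough numbers on what Harley describes, here is a small Python sketch of my own (not from the book) that quantizes a 1 kHz test tone at a few word lengths and prints the number of quantization steps, the textbook dynamic-range approximation of roughly 6.02 x N + 1.76 dB for an ideal N-bit converter, and the measured quantization-error level.  It also includes the TPDF dither he mentions in passing.  The tone level and sample rate are arbitrary choices for illustration.

import numpy as np

def quantize(signal, bits, dither=False):
    # Quantize a signal in the range [-1, 1] onto an N-bit grid.
    step = 2.0 / (2 ** bits)                      # size of one quantization step
    x = signal.copy()
    if dither:
        # TPDF dither: sum of two uniform random values, scaled to the step size
        x = x + (np.random.uniform(-0.5, 0.5, x.shape)
                 + np.random.uniform(-0.5, 0.5, x.shape)) * step
    return np.clip(np.round(x / step) * step, -1.0, 1.0 - step)

fs = 48000                                        # arbitrary sample rate
t = np.arange(fs) / fs                            # one second of samples
tone = 0.5 * np.sin(2 * np.pi * 1000 * t)         # 1 kHz tone at -6 dBFS

for bits in (16, 20, 24):
    q = quantize(tone, bits)
    err = q - tone
    err_dbfs = 20 * np.log10(np.sqrt(np.mean(err ** 2)))
    print(f"{bits}-bit: {2 ** bits:,} steps, "
          f"ideal dynamic range ~{6.02 * bits + 1.76:.0f} dB, "
          f"quantization error ~{err_dbfs:.0f} dBFS")

Running it shows the error floor dropping about 6 dB for each extra bit, which is the "finer resolution" the quote is talking about.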

He then proceeds to outline the limitations of 20+ bit processing, often calling the additional bits simply "marketing bits" if they're not done right, and noting that for most systems any information over 20 bits is "rarely capable of delivering real audio information".
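That 20 bit ceiling can be sanity-checked with the same 6 dB-per-bit rule run in reverse: the effective number of bits a converter actually delivers follows from its measured signal-to-noise ratio, ENOB = (SNR_dB - 1.76) / 6.02.  A quick sketch in Python, using hypothetical round-number SNR figures rather than measurements of any particular DAC:

# ENOB = (SNR_dB - 1.76) / 6.02 for a full-scale sine; the SNR values
# below are hypothetical round numbers, not measurements of any real DAC.
examples = [("a very good analog output stage", 120.0),
            ("a state-of-the-art converter", 130.0),
            ("a theoretically perfect 24-bit chain", 146.2)]
for label, snr_db in examples:
    enob = (snr_db - 1.76) / 6.02
    print(f"{label}: {snr_db:.0f} dB SNR -> roughly {enob:.1f} effective bits")

Which is more or less Harley's point: past about 20-21 bits the extra resolution tends to disappear into the analog noise floor, whatever the spec sheet says about 32, 35, or 64 bits.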

So, needless to say, it appears simple at first, but it quickly becomes a complex issue.