The CD format is 16 bits with a 44.1 kHz sample rate per channel, period. You cannot get more resolution than this from a CD.
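If you want the textbook numbers behind that statement, here's a quick back-of-the-envelope sketch in Python. Nothing in it is specific to any player or DAC; it's just the standard formulas for an ideal 16 bit / 44.1 kHz system:

```python
# Standard back-of-the-envelope math for the CD format (16 bit / 44.1 kHz).

bits = 16
sample_rate = 44_100  # Hz, per channel

# Nyquist: the highest frequency the format can represent.
nyquist = sample_rate / 2                  # 22,050 Hz

# Quantization levels and theoretical dynamic range of an ideal 16 bit system
# (6.02*N + 1.76 dB for a full-scale sine).
levels = 2 ** bits                         # 65,536 discrete levels
dynamic_range_db = 6.02 * bits + 1.76      # ~98 dB

print(f"Nyquist frequency  : {nyquist:.0f} Hz")
print(f"Quantization levels: {levels}")
print(f"Dynamic range      : {dynamic_range_db:.1f} dB")
```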
In theory, a high quality 16 bit DAC is fully sufficient to extract all the information available from a CD. Making a high quality 16 bit DAC is not an easy task, though. As digital audio evolved, it turned out that a DAC capable of decoding more than 16 bits could deliver improved performance even on pure 16 bit datastreams, because of the better linearity and precision required of higher resolution DACs in general.
Oversampling and dithering operations also add extra bits to the 16 bit data stream, and those bits can conveniently be fed to a higher resolution DAC. These extra bits don't carry extra information, but they can allow the DAC to reveal more of the information contained in the original 16 bit data stream (the "how" is a long story...).
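To give a rough flavor of the dithering part, here's a toy NumPy sketch. The signal, the gain step, and the dither amounts are all made up purely for illustration; the point is just that digital processing grows the word length past 16 bits, and dither is used when shortening it again:

```python
import numpy as np

rng = np.random.default_rng(0)

# A 16 bit signal: integers in [-32768, 32767] (a 1 kHz tone, for example).
x16 = (32767 * np.sin(2 * np.pi * 1000 * np.arange(4410) / 44100)).astype(np.int32)

# Any digital operation (oversampling filter, gain, EQ, ...) produces results
# with more precision than 16 bits -- those are the "extra bits".
gain = 0.5
y = x16 * gain                     # now has a fractional part: >16 bit precision

# Reducing back to 16 bits: add TPDF dither (two uniform random values) before
# rounding, so the quantization error becomes benign noise instead of distortion
# correlated with the signal.
dither = rng.uniform(-0.5, 0.5, y.shape) + rng.uniform(-0.5, 0.5, y.shape)
y16_dithered = np.round(y + dither).astype(np.int16)

# Compare with plain rounding, which ties the error to the signal.
y16_rounded = np.round(y).astype(np.int16)
```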
In a lot of ways bits beyond 16 are "marketing bits". Manufacturers engaged in marketing wars over the number of bits supported by the DAC.
In the beginning all DACs were multibit. For each input word (16, 18, 20, 22, 24, etc. bits) they output a corresponding analog voltage (or current, in many architectures). It was hard to ensure linearity across all bits, and many such DACs required complex calibration during both manufacture and assembly. This kept their costs relatively high.
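Here's a toy model in Python/NumPy of why per-bit weighting errors hurt multibit linearity. The mismatch numbers are invented purely to illustrate the point; real devices have their own error budgets:

```python
import numpy as np

BITS = 16
VREF = 1.0

# Ideal binary-weighted contributions of each bit to the output voltage.
ideal_weights = VREF * 2.0 ** np.arange(BITS - 1, -1, -1) / 2 ** BITS

# Real resistor or current-source networks have small errors in each bit weight;
# errors in the most significant bits dominate overall linearity.
rng = np.random.default_rng(1)
actual_weights = ideal_weights * (1 + rng.normal(0, 0.001, BITS))  # ~0.1% mismatch

def dac(code: int, weights: np.ndarray) -> float:
    """Convert an unsigned 16 bit code to a voltage using the given bit weights."""
    bit_values = [(code >> (BITS - 1 - i)) & 1 for i in range(BITS)]
    return float(np.dot(bit_values, weights))

# The worst-case error shows up at major transitions like 0x7FFF -> 0x8000,
# where every bit changes at once. Ideally this step is exactly one LSB.
step_ideal = dac(0x8000, ideal_weights) - dac(0x7FFF, ideal_weights)
step_real  = dac(0x8000, actual_weights) - dac(0x7FFF, actual_weights)
print(f"ideal step: {step_ideal:.6e} V, with mismatch: {step_real:.6e} V")
```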
As fast logic became cheaper to manufacture and use, single bit DAC architectures became feasible. A single bit DAC effectively has only one voltage (or current) reference instead of 16 or more. The input datastream is oversampled (64x or higher) and converted into a single bit data stream, which in turn becomes a high speed pulse train. There are variations on the theme, but basically the width of the pulses in the train corresponds to the levels of the incoming sample words. This pulse train can be filtered to produce an analog signal.
Of course I'm oversimplifying greatly. This process generates lots of switching noise, which is typically relocated above the audio band by a digital process called noise shaping. Clock stability also becomes much more critical than with multibit converters, and it is one of the limiting factors on single bit converter resolution.
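For anyone who wants to see the single bit idea in code, here's a minimal first-order delta-sigma sketch in Python. Real converters use higher-order loops, much more careful filtering, and serious analog design, so treat this strictly as an illustration of the principle: a 1 bit pulse stream whose average tracks the signal, with the quantization error fed back so the noise gets pushed up in frequency.

```python
import numpy as np

def delta_sigma_1bit(x: np.ndarray) -> np.ndarray:
    """x: oversampled signal in [-1, 1]. Returns a stream of +1/-1 pulses."""
    out = np.empty_like(x)
    integrator = 0.0
    feedback = 0.0
    for n, sample in enumerate(x):
        integrator += sample - feedback               # accumulate the error
        feedback = 1.0 if integrator >= 0 else -1.0   # 1 bit quantizer
        out[n] = feedback
    return out

# Example: a 1 kHz tone oversampled 64x relative to 44.1 kHz.
fs = 44_100 * 64
t = np.arange(fs // 100) / fs                 # 10 ms of signal
x = 0.5 * np.sin(2 * np.pi * 1000 * t)
bitstream = delta_sigma_1bit(x)

# Low-pass filtering the +/-1 pulse train (a crude moving average here, standing
# in for the analog reconstruction filter) recovers an approximation of the
# original waveform.
recovered = np.convolve(bitstream, np.ones(64) / 64, mode="same")
```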
Single bit converters usually specify an equivalent multibit resolution, and many now claim to provide 24 bit resolution. There was another marketing war in the '90s over multibit vs. single bit converters: multibit proponents claimed better bass, and single bit proponents claimed better highs. That has pretty much died out, and now there are excellent examples of both implementations on the market.
DAC performance depends on many factors, including power supply, filter implementation, DAC architecture, and analog output stage. The artistry of design is the balance of these elements.
So you can't conclude that a 1 bit architecture is inferior to a 20 bit architecture or that an 18 bit architecture is inferior to a 24 bit architecture. You have to listen.
The CD has come a long way in 24 years. Consider that when it was first standardized it had to be theoretically capable of providing 10 Hz-20 kHz frequency response with better dynamic range than analog tape, and it had to be implementable with 1982 technology. It's an amazing achievement.
Again, this discussion applies to CD only. With higher resolution formats (SACD, DVD-A, or HDCD) we are dealing with data streams of greater than 16 bit resolution, but that's a topic for another reply.