My understanding is that the extra bits are there to improve the accuracy of the most significant bits, since a real-world 16-bit converter cannot achieve perfectly linear conversion across its full range.
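To see why the extra bits don't create new source information, here is a minimal sketch (my own illustration, not any specific DAC's interface) assuming the converter takes left-justified codes, so a 16-bit CD sample is simply shifted up into a 24-bit word:

```python
# Sketch: mapping a 16-bit CD sample onto a hypothetical 24-bit DAC input.
# Assumption for illustration: the DAC accepts left-justified signed codes,
# so the 16-bit value is shifted up and the low 8 bits start out as zeros.

def to_24bit(sample16: int) -> int:
    """Left-justify a signed 16-bit sample in a 24-bit word."""
    assert -32768 <= sample16 <= 32767
    return sample16 << 8  # no new information is created; low bits are zero

# Adjacent 16-bit levels are still 256 DAC steps apart:
a = to_24bit(1000)
b = to_24bit(1001)
print(b - a)  # 256
```

The payoff of the extra bits shows up in processing, not in the source data: for example, halving a sample for digital volume control would discard the odd LSB in a 16-bit word, but in the 24-bit word the result (`to_24bit(1001) // 2 == 128128`) is still represented exactly.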
How is more than 16 bits better for CD?
A response to Roadcykler's question made me wonder about a related topic... If the data on standard CDs is encoded as 16-bit, how can an 18-, 20-, or 24-bit DAC improve anything? That is, if a 16-bit waveform is built from 65,536 possible levels, where do the extra bits come in? Does the DAC 'guess' them?
- 10 posts total