Audioengr--
The Nyquist sampling theorem requires a reconstruction filter to remove the ultrasonic images of the audio signal, no matter what DAC implementation one uses. This filtering can be implemented either digitally or with analog circuitry.
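To make the image problem concrete, here's a minimal numpy/scipy sketch (the tone frequency, 4x oversampling ratio, and filter length are my own assumed values, not taken from any real player): zero-stuffing a 44.1 kHz signal to a higher rate leaves spectral images above the original Nyquist frequency, and a digital low-pass (interpolation) filter removes them, which is the reconstruction step the theorem calls for.

```python
# Sketch only: show images created by upsampling and their removal
# by a digital reconstruction (anti-image) filter.
import numpy as np
from scipy import signal

fs = 44_100          # original sample rate
L = 4                # assumed oversampling factor
f0 = 1_000           # assumed test tone, Hz
n = np.arange(4096)
x = np.sin(2 * np.pi * f0 * n / fs)

# Zero-stuff to the higher rate: images of the tone appear around
# k*fs +/- f0 for k = 1, 2, ... inside the new Nyquist band.
x_up = np.zeros(len(x) * L)
x_up[::L] = x

# Linear-phase FIR low-pass at the original Nyquist (22.05 kHz)
# acts as the digital reconstruction / anti-image filter.
h = signal.firwin(255, cutoff=fs / 2, fs=fs * L)
y = signal.lfilter(h, 1.0, x_up) * L   # gain of L restores amplitude

# Compare ultrasonic (above 22.05 kHz) energy before and after filtering.
f, P_up = signal.periodogram(x_up, fs=fs * L)
_, P_y = signal.periodogram(y, fs=fs * L)
img_band = f > fs / 2
print("image energy before filter:", P_up[img_band].sum())
print("image energy after filter :", P_y[img_band].sum())
```

Whether the filter runs in the digital domain (as above) or as an analog brick wall after the DAC, the job is the same: get rid of those images.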
Digital filters were usually used in conjunction with ladder DACs and most NOS DAC chips. Back in 1982 the first Sony CD player (the CDP-101) used no oversampling and an analog brick-wall filter at about 22 kHz. The Philips players introduced oversampling and digital filtering (and used 14-bit DACs!). The controversy of the time was the frequency-dependent phase shift caused by the analog brick-wall filters.
By 1985 or so everyone used oversampling and digital filters along with ladder DACs.
In the late '80s, single-bit DACs were introduced which required high oversampling rates (typically 64x or better), digital filtering, and noise shaping just to function. They were really cheap to build since they didn't require the labor-intensive calibration that multibit DACs needed on the low-order bits, and they became standard in consumer gear.
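The noise-shaping part is the key trick. Here's a toy first-order error-feedback sketch (my own illustration, not any particular chip's topology): the quantization error of each 1-bit decision is fed into the next sample, which pushes the noise up in frequency where the oversampled band has room for it and the later low-pass filtering throws it away.

```python
# Toy first-order noise shaper: quantize a 64x-oversampled signal to +/-1.
import numpy as np

def first_order_noise_shaper(x):
    """Quantize x (values in [-1, 1]) to +/-1 with first-order error feedback."""
    y = np.empty_like(x)
    err = 0.0
    for i, s in enumerate(x):
        v = s + err              # add back the previous quantization error
        y[i] = 1.0 if v >= 0 else -1.0
        err = v - y[i]           # error carried into the next sample
    return y

# Assumed test case: 1 kHz tone at 64 * 44.1 kHz, as in late-'80s 1-bit DACs.
fs = 64 * 44_100
n = np.arange(1 << 16)
x = 0.5 * np.sin(2 * np.pi * 1_000 * n / fs)

bits = first_order_noise_shaper(x)

# Most of the quantization error ends up above the audio band.
spec = np.abs(np.fft.rfft(bits - x)) ** 2
freqs = np.fft.rfftfreq(len(bits), d=1 / fs)
in_band = freqs < 20_000
print("fraction of error power below 20 kHz:",
      spec[in_band].sum() / spec.sum())
```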
The single-bit approach launched the multibit vs. single-bit wars of the '90s.
A few high-end manufacturers returned to the early days of no oversampling. Some even chose little or no filtering, which dumps lots of ultrasonic energy into downstream components. These manufacturers also favor NOS ladder DAC chips. The choice to forgo (digital) filtering (yikes!) has nothing to do with the DAC chip.