A/D input levels with 24-bit - need to max out?


Hi,

I use a USB Pre A/D converter with a Grado PH1 phono stage to record vinyl records in 24-bit/44.1 kHz stereo. I'm using Bias Peak to record, and Cool Edit and ClickFix to process. Back in Peak again, I raise the gain to the highest unclipped level before dithering to 16-bit to make CDs or FLAC archives.

I record a lot of records these days, so to avoid re-recording due to clipped levels on the USB Pre, I set the levels conservatively. As a result, the peaks for some records might hit 80% or so instead of 95% or more. I do try to record the loud records in a set together and adjust the level accordingly, but the Pre's level knobs are tricky to set evenly by eye (the markers are widely spaced), so I may adjust downward more than I otherwise would.

My rationale has been that raising the input level doesn't help much, since I'd also be raising the noise of the phono stage and any noise generated by the analog portion of the converter, so the signal-to-noise ratio would not change.
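
Just to put a toy number on that rationale (all the figures here are invented for illustration, not measurements of the USB Pre or the Grado): gaining up the signal and the analog hiss together leaves their ratio unchanged.

```python
import numpy as np

# Toy sketch: more analog gain ahead of the converter scales the signal and
# the phono-stage noise together, so the analog SNR stays the same.
rng = np.random.default_rng(0)
fs = 44100
t = np.arange(fs) / fs
signal = 0.1 * np.sin(2 * np.pi * 1000 * t)      # a quiet 1 kHz test tone
noise = 1e-4 * rng.standard_normal(fs)           # stand-in for analog hiss

def snr_db(sig, noi):
    return 10 * np.log10(np.mean(sig ** 2) / np.mean(noi ** 2))

for gain in (1.0, 4.0):                          # "knob low" vs "knob high"
    print(f"gain x{gain}: analog SNR = {snr_db(gain * signal, gain * noise):.1f} dB")
```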

Now, I may be answering my own question here, but if I understand digital audio correctly, more bits are given to louder passages, so by keeping the level lower, I'm getting fewer bits for the quiet parts than I should be getting.
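
Here's my back-of-envelope attempt at that, assuming the usual ~6.02 dB-per-bit rule of thumb (the 80% and 95% figures are the ones from my post above):

```python
import math

# Back-of-envelope: how many bits of the converter a given peak level
# actually exercises, using the ~6.02 dB-per-bit rule of thumb.
def bits_exercised(bit_depth, peak_fraction):
    peak_dbfs = 20 * math.log10(peak_fraction)   # peak expressed in dBFS (negative)
    return bit_depth + peak_dbfs / 6.02

for frac in (0.95, 0.80):
    print(f"peaks at {frac:.0%} of full scale: "
          f"~{bits_exercised(24, frac):.2f} of 24 bits in use")
```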

But since I'm recording at 24-bit anyway, before editing and then dithering to 16 bits after raising the gain, does it really matter that much? The 16-bit result ends up at the highest level; it's just the 24-bit initial recording that is lower.
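
Here's how I've been thinking about it, as a sketch with a hypothetical peak level of -14 dBFS (more conservative than I actually use): even then, the 24-bit capture's quantization floor sits far further below the peaks than the final 16-bit file can represent.

```python
# Sketch: compare the quantization floor of a 24-bit capture made with
# conservative levels against the ~96 dB ceiling of the final 16-bit file.
# The -14 dBFS peak level is a hypothetical example, not my actual setting.
PEAK_DBFS = -14.0

range_24bit_below_peak = 24 * 6.02 + PEAK_DBFS   # dB from peak down to the floor
range_16bit_total = 16 * 6.02                    # total range of the dithered file

print(f"24-bit capture, peaks at {PEAK_DBFS} dBFS: "
      f"~{range_24bit_below_peak:.0f} dB between peaks and quantization floor")
print(f"16-bit target: ~{range_16bit_total:.0f} dB total")
```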

Thanks in advance for reading this and any advice you might have.

gritingrooves
Gritingrooves,

It is not the dynamic range you need to worry about. Even the best and cleanest vinyl has only about 45 dB of dynamic range, well within the 96 dB Red Book capability. But if you record at too low a level, you will start losing resolution: you lose 1 bit of resolution for every 6 dB you drop in level. So for optimum recording, you should adjust the input level such that the loudest passage reaches 0 dB (full scale) or just below it. If that is too difficult to achieve, then make sure the loudest passage is within the top 6 dB so that you won't lose more than 1 bit of resolution.
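
A quick sketch of that arithmetic, taking the ~45 dB vinyl figure at face value: even in the final 16-bit file, with peaks anywhere in that top 6 dB, the quietest program material still sits well above the quantization floor.

```python
# Sketch: dB of margin between the quietest vinyl program material and the
# 16-bit quantization floor, for peaks at full scale and at -6 dBFS.
VINYL_RANGE_DB = 45.0   # the rough figure quoted above

def margin_db(bit_depth, peak_dbfs):
    floor_below_peak = bit_depth * 6.02 + peak_dbfs
    return floor_below_peak - VINYL_RANGE_DB

for peak in (0.0, -6.0):
    print(f"16-bit, peaks at {peak} dBFS: ~{margin_db(16, peak):.0f} dB of margin")
```
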
You could be using only the lowest 10 bits of your 24-bit hardware, and that would certainly degrade a 16-bit source.

It would only degrade if the signal has more dynamic range than can be properly represented by 10 bits, or if playing the signal louder from the source raises it further above the noise floor. To me, the risk of clipping (flat-topping) by trying to position the peaks in the music too close to the maximum signal of the ADC is a much greater one.
Shadorne... As mentioned in another thread, I have found that EVERY CD, at some peak signal point, comes within 2 or 3 dB of maximum, but never more (i.e., no digital clipping). I'm told that this is no accident, and that when CDs are mastered the signal level is adjusted to achieve this end. So every CD uses all 16 bits, and representing the signal using fewer bits (e.g., 10) results in a loss of resolution. In other words, the least significant bit would represent more analog voltage at your speaker.
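
If anyone wants to check that on their own rips, here is one quick way to read the peak of a 16-bit WAV ("example.wav" is just a placeholder filename):

```python
import wave
import numpy as np

# Report how close a 16-bit WAV's peak sample comes to digital full scale.
with wave.open("example.wav", "rb") as w:
    assert w.getsampwidth() == 2, "this sketch expects 16-bit samples"
    frames = w.readframes(w.getnframes())

samples = np.frombuffer(frames, dtype=np.int16).astype(np.float64)
peak = np.max(np.abs(samples)) / 32768.0
print(f"peak: {20 * np.log10(peak):.2f} dBFS")
```
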
Thanks, a variety of opinions here, basically all over the map, I think.

On one hand, with 24-bit I can afford not to worry about maxing out the gain, but on the other hand I could be losing a lot.

Just to underline: I'm recording at 24-bit. I know that at 16-bit I'd be crazy not to try to get the highest gain, but don't I have enough bits at 24 to get by? If I could record at 96 kHz, would that be even better?