At what point (min or max value) does AV/preamp output voltage not make a difference?


Hoping my question is not viewed as 'stupid'. I will attempt to phrase it so it makes sense. I have a basic understanding of EE and AV electronics, but I am still trying to understand preamp output specifications and at what value the output voltage of an AV preamp's output stage actually matters ...

Meaning: is a preamp with a 4V max output ... 'better' suited to drive an external amplifier than one with a 2V output? Is an 8V output 'better' than 4V in real-world use?

I understand that a higher output voltage is better ... but at what point does it stop translating into better performance?

Reason I ask: I am looking to replace an aging AVR that has an 8V preamp output, and I have yet to find one (under $3k for an AVR or pre-pro) with anything close to this. I am looking at the Anthem AVRs, the Marantz 770x pre-pro, etc., and none have preamp outputs remotely close to the 8V of my 20-year-old Denon.

Hope someone can shed some light on this.  

I am looking at using Parasound Halo, ATI, Rotel, Musical Fidelity, etc. for external amplification (if that makes a difference).

Thanks in advance.


lightfighter2018
Hi lightfighter,

I am not an electronics person at all, but from what I read, the two important specs for preamps are output impedance (lower is better, and usually only an issue with tube preamps) and gain. A high-gain preamp, 20 dB and up, may cause you to keep the volume control below its optimal level. A preamp gain of 6 dB should be plenty in most applications.
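For what it's worth, here is a rough back-of-the-envelope sketch (assumed, typical numbers, not anything from a spec sheet) of what those gain figures mean as voltage ratios:

```python
# Rough sketch with assumed numbers: converting preamp gain in dB to a
# voltage ratio, to show why a high-gain preamp pushes the volume
# control toward the bottom of its range.

def db_to_voltage_ratio(gain_db: float) -> float:
    """Voltage ratio corresponding to a gain expressed in dB."""
    return 10 ** (gain_db / 20)

for gain_db in (6, 12, 20):
    ratio = db_to_voltage_ratio(gain_db)
    # What a typical 2 V source becomes at the preamp output with the
    # volume control wide open.
    print(f"{gain_db:>2} dB gain -> x{ratio:.1f} -> a 2 V source becomes {2 * ratio:.1f} V")
```

So a 20 dB preamp turns a typical 2 V source into roughly 20 V with the volume wide open, which is far more than any power amp needs, and that is why the control ends up living near the bottom of its travel.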

I know that's not what you asked, but somebody will probably respond to tell me I'm full of it and maybe a discussion will ensue.

I agree with Tom’s comments.

Also, while I haven’t researched the AVRs and pre-pros you referred to, I suspect that the reason for their relatively low maximum outputs is that those outputs are generated internally by a DAC circuit. So, as in the case of CD players, those components are presumably designed so that their maximum output, produced when the digital data is at its maximum possible value ("full scale"), is modestly above the voltage required to drive most power amps to full power.

A preamp or line stage having an entirely analog signal path is a different story. In that case, to assure best performance the component should provide a substantial margin between its maximum output voltage capability and the highest output it would be called upon to generate under reasonable usage conditions. If that were done with a DAC-based circuit, though, the added margin would compromise usable dynamic range, noise performance, and low-level resolution. And in many cases it would force the listener to use the volume control at undesirably low settings.

The bottom line: While there may be some exceptions, in general that spec should be ignored IMO.

Regards,
-- Al

Thanks for the feedback. I was speaking specifically of preamp pre-outs used to drive external amplifiers. It's my understanding that the higher the preamp output voltage, the more fully the preamp can take advantage of all the power amplifier X is capable of providing. Under-volting the amp/preamp relationship - as I understand it - puts a bottleneck on the amp.

Just clarifying the thing I am trying to wrap my head around. So, what I think you both are saying is that ... pre-out voltage ... won't impact amp performance as much as I think it will, even if the preamp has a low-voltage output??

Maybe I am overthinking it, but I am trying to figure out where that sweet spot is - so I get the most out of amplifier X. At the same time, I don't want to obsess unnecessarily over the spec (i.e., 2V vs. 4V+, etc.) on the output stage.

Example: there is no sense in buying a multichannel 7x200W amplifier if my AVR or AV preamp can't output a signal strong enough to give me those 200W, etc.
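To put rough numbers on the kind of math I've been trying to do (assumed figures here, not the specs of any particular amplifier):

```python
# Back-of-the-envelope sketch with assumed numbers (200 W into 8 ohms,
# 29 dB amplifier gain); not the specs of any particular amplifier.
import math

rated_power_w = 200.0   # assumed per-channel rating
load_ohms = 8.0         # assumed speaker load
amp_gain_db = 29.0      # a fairly common power-amp gain figure

# Output voltage the amp must swing to deliver rated power: P = V^2 / R
output_v = math.sqrt(rated_power_w * load_ohms)

# Input voltage needed to produce that swing, given the amp's gain
gain_ratio = 10 ** (amp_gain_db / 20)
input_v_for_full_power = output_v / gain_ratio

print(f"Output swing at {rated_power_w:.0f} W into {load_ohms:.0f} ohms: {output_v:.1f} V")
print(f"Preamp voltage needed for full power: {input_v_for_full_power:.2f} V")
```

If numbers like those are anywhere near right, even a 2V pre-out looks like more than enough, which is part of why I can't tell whether the 8V spec on my old Denon ever actually mattered.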

Or... are you both saying that the spec I am looking at has no real bearing on what amplifier X can deliver?

Thanks.

For a signal path in a preamp (or the preamp section of an AVR) which includes a digital-to-analog conversion, the maximum output spec should be (and in the majority of cases will be) higher than the voltage required to drive a power amp to full power.

For a signal path in the preamp that is entirely in the analog domain, the specified maximum output voltage is usually irrelevant. (Although in theory, everything else being equal, and everything else is almost never equal, a higher maximum output voltage might correlate with better preamp sonics than a lower one.) What is relevant in that situation is that the maximum output voltage of the source component, when increased by the gain of the preamp, should preferably be high enough to drive the power amp to its maximum power capability, while not being so high that the user is forced to set the volume control near the bottom of its range.

In most cases involving an analog signal path through a preamp, if the source component is a CD player or other digitally based component and an active preamp is being used, there will be no problem driving a power amp to full power. In many of those cases there will also be no problem if a passive preamp (providing no gain) is used. That issue is likeliest to arise with a phono source, especially one used in conjunction with a passive preamp.
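To put illustrative numbers on that (assumed, fairly typical values, not measurements of any specific components):

```python
# Illustrative sketch with assumed, typical values; not measurements of
# any specific components.

def db_to_ratio(db: float) -> float:
    """Voltage ratio corresponding to a gain in dB."""
    return 10 ** (db / 20)

amp_input_for_full_power_v = 1.4          # assumed power-amp input sensitivity
cd_source_v = 2.0                         # typical full-scale CD/DAC output
phono_source_v = 0.005 * db_to_ratio(40)  # 5 mV MM cartridge after a 40 dB phono stage

for source_name, source_v in (("CD/DAC source", cd_source_v),
                              ("Phono source", phono_source_v)):
    for preamp_name, preamp_gain_db in (("passive (0 dB)", 0.0),
                                        ("active (12 dB)", 12.0)):
        v_at_amp = source_v * db_to_ratio(preamp_gain_db)
        verdict = "enough" if v_at_amp >= amp_input_for_full_power_v else "NOT enough"
        print(f"{source_name} + {preamp_name} preamp: {v_at_amp:.2f} V at the amp -> {verdict} for full power")
```

With numbers like those, the digital source drives the amp to full power even through a passive preamp, while the phono source falls short until an active preamp adds some gain, which is why that is the combination to watch.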

Regards,
-- Al