Amp experience


I have no doubt that more power is better. But for low-volume listening, when the power output is UNDER 1 W, how much difference can you hear between a 200 W SS amp and a same-brand 300-450 W SS amp? I'd just like to hear from people who have actually owned both amps, please. Not "obviously" or guessing, please.
kfz03110

Showing 3 responses by spatialking

Well, the additional power means those music peaks won't be clipped.  For example, if your speakers are pulling 20 W and a 10:1 peak comes along, a 200 W amp can handle it, and with its dynamic headroom probably a somewhat bigger peak than that.  A 100 W amplifier will clip that peak, since you only have 5:1 headroom.
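That headroom arithmetic can be sketched in a few lines of Python (the function name and the 20 W / 10:1 figures are just the example numbers from above, not anything standardized):

```python
def clips(avg_power_w, peak_ratio, amp_power_w):
    """Return True if a transient peak would exceed the amp's rated power.

    avg_power_w: average power the speakers are drawing (e.g. 20 W)
    peak_ratio:  peak-to-average power ratio of the transient (e.g. 10:1)
    amp_power_w: amplifier's rated continuous power
    """
    return avg_power_w * peak_ratio > amp_power_w

# A 10:1 peak on a 20 W average needs 200 W:
print(clips(20, 10, 200))  # False: a 200 W amp just covers it
print(clips(20, 10, 100))  # True:  a 100 W amp clips, only 5:1 headroom
```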

If you are biamping or even triamping, then each amplifier doesn't have to be as large, since that peak is divided up among several amplifiers.

However, more power is only better up to a point.  A while back I was consulting for a high-end audio company, developing a 100 W and a 200 W power amp.  What I found was that, given the semiconductor devices available today for a Class AB amplifier, there is a sweet spot around 70 to 125 W/ch.  Above the 125 W/ch level, cost begins to increase significantly to maintain sound quality, high-current capability, and power reserve.

If you are really only using 1W of power, I'd look at a good SET amplifier!  If you are not into tubes and prefer solid state, any good audiophile quality amplifier in the 70 to 125W range is worthy of a good listen.  At least you have options for other speakers at higher power levels, whereas the SET locks you into very low power applications.  

That being said, I don't see a problem using a 200W/ch amp and I personally would look for one with very high bias since you are using it at low levels.   That is, Class A operation up to 15 to 20 Watts then Class AB beyond that.   I've seen ads where this concept is called "heavy Class AB bias".  There is no point in brute forcing it when you can do it with finesse.  :-)
kosst_amojan - I know how logs work; what I don't know is whether kfz03110 does.  Consider this: 10*log10(20) is 13 dB, and 10*log10(200) is 23 dB.  That's an increase of 10:1 in power.

Another way to look at it: if the music is running at 20 W at one point in time, and at the next point in time it demands 200 W, that is still a 10:1 change in power that the amplifier has to produce. I purposely used those numbers so I could say "a 10:1 power change."  I never said 180 Watts is 10 dB.
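For anyone following along, the dB figures above come straight from the standard power-ratio formula, 10*log10(P/Pref). A quick sketch (the helper name is mine, just for illustration):

```python
import math

def power_db(watts, ref_watts=1.0):
    # Decibels relative to a reference power: 10 * log10(P / Pref)
    return 10 * math.log10(watts / ref_watts)

print(round(power_db(20)))                   # 13 dB re 1 W
print(round(power_db(200)))                  # 23 dB re 1 W
print(round(power_db(200) - power_db(20)))   # 10 dB, i.e. the 10:1 jump
```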

If I had a sound system that only used 1W typical, I wouldn't be looking at a 200W amp, as I mentioned above, as well as the reasons you stated.


@kfz03110:  After rereading your original post, I realize what you are really asking about is the crest factor of music.   Crest factor is a term used in physics for the ratio of peak voltage to RMS voltage.  For example, for a pure sine wave the crest factor is the square root of 2, or 1.414.   If you have a sine wave, the peak will never be higher than that.

For music, the crest factor is around 20 for uncompressed recordings.  So if you had music that typically ran 2.83 Volts RMS, then the max should be around 56.6 V peak.   With an 8 Ohm speaker, that works out to 1 Watt RMS and 400 Watts peak. 
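The numbers above can be checked with a short sketch. Note the crest factor here is a voltage ratio, so peak power scales with its square; the function name is just for illustration:

```python
import math

def peak_power(rms_volts, crest_factor, load_ohms):
    """Instantaneous peak power implied by a voltage crest factor.

    crest_factor is the peak/RMS voltage ratio, so the peak power
    goes up with its square.
    """
    peak_volts = rms_volts * crest_factor
    return peak_volts ** 2 / load_ohms

# Sine wave: crest factor sqrt(2), so peak power is 2x the RMS power.
print(peak_power(2.83, math.sqrt(2), 8))  # ~2 W peak for 1 W RMS
# Uncompressed music with a crest factor around 20:
print(peak_power(2.83, 20, 8))            # ~400 W peak for 1 W RMS
```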

Now, that being said, if the composer added other things into the music, such as uncompressed gunshots, the crest factor would rise to 30 or more.  Cannons, as in the 1812 Overture, obviously push the crest factor well above 30.

It isn't just brute force, as I mentioned earlier.  That amplifier has to be very linear in its transition from 1W to 400W and it should, obviously, be very musical at the same time. 

You can read a bit more about crest factor here (scroll down to crest factor): http://www.aes.org/par/c/#cps