Class D is just Dandy!


I thought it was time we had a pro-Class D thread. There are plenty of threads comparing Class D to other topologies, or detracting from it.

That's fine; you don't have to like Class D amps, and if you don't, please go participate in one of those threads.

For those of us who are happy and excited about having musical, capable amps that we can afford to keep on 24/7 and that don't require much space, this thread is for you.

Please share your experiences with class D amps!
erik_squires
@autre I won't pretend to be smarter than I am... I don't understand why it matters either, but preamps have a characteristic output impedance and amps have a characteristic input impedance. If memory serves me right, the amp's input impedance should be 10x (or more) the preamp's output impedance. Having them too close can apparently make the system sound off or increase noise.
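
A quick back-of-the-envelope sketch of why the ratio matters (the numbers below are made up for illustration): the preamp's output impedance and the amp's input impedance form a voltage divider, so some fraction of the signal is lost across the preamp's output impedance.

```python
from math import log10

# Voltage divider formed by the preamp output impedance (z_out)
# and the amp input impedance (z_in). Illustrative values only.
def divider_loss_db(z_out, z_in):
    """Signal level lost across the divider, in dB."""
    return 20 * log10(z_in / (z_out + z_in))

print(divider_loss_db(1_000, 10_000))  # 10x ratio: about -0.8 dB
print(divider_loss_db(1_000, 1_000))   # 1x ratio: -6 dB, half the voltage gone
```

A flat loss by itself is harmless, but if the output impedance varies with frequency (as it does with many tube preamps), the loss varies with it, and that shows up as a frequency response error.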

Anyone with more detailed knowledge, PLEASE amend or add!
Hi Guys,

The biggest issue with input/output impedance mismatch is the change in frequency response. This is especially bad with tube preamps, as they usually have a high output impedance. Driving an amp with a low input impedance can alter the overall response and deviate from ideal. With purist tube preamps, even the volume control setting can affect things, because they lack the additional buffer stages that would prevent this.
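
To put rough numbers on that: a capacitively coupled output and the amp's input impedance form a first-order high-pass filter, so the bass corner climbs as the load impedance drops. A sketch with an assumed 1 uF coupling cap (a made-up but plausible value):

```python
from math import pi

C = 1e-6  # assumed coupling capacitor, 1 uF (illustrative value)

# First-order high-pass: f_-3dB = 1 / (2 * pi * R_load * C)
for r_load in (100_000, 10_000, 600):
    f3 = 1 / (2 * pi * r_load * C)
    print(f"{r_load:>7,} ohm load -> -3 dB at about {f3:.1f} Hz")

# 100,000 ohms -> ~1.6 Hz   (inaudible)
#  10,000 ohms -> ~15.9 Hz  (grazing the bottom octave)
#     600 ohms -> ~265.3 Hz (the bass is simply gone)
```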

A good way to see this is to look through the Stereophile archives at tube preamp reviews. In the measurements section you will see the frequency response while driving different simulated loads. Here is a link:

http://www.stereophile.com/category/tube-preamp-reviews

If you look at this particular preamp, you'll see the FR changes a great deal when driving a 100k Ohm load vs. 600 Ohms.

http://www.stereophile.com/content/vtl-tl65-series-ii-signature-line-preamplifier-measurements#aVQTI...

High impedance inputs also tend to be noisier (I think).

Solid state or op-amp based preamps can easily stay completely flat into loads down to 600 Ohms.
So, I wanted to flesh things out a little more.

One of the specs commonly cited is S/N (signal-to-noise ratio), or the related THD+N (Total Harmonic Distortion + Noise). The problem with signal-to-noise is that it is often cited at full power, and not at a more reasonable level, like 1 watt.

So a 1,000 watt amplifier can claim 10 dB more S/N than, say, a 100 watt amplifier that sounds exactly as noisy. The full-power S/N ratio becomes useless for comparing how quiet amps actually are. What I wish reviewers and manufacturers did more consistently is rate the S/N at 1 watt or 10 watts: something small, so we can compare what we'll actually hear.
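
If the measurement conditions are stated, you can do the normalization yourself: referencing a full-power S/N figure to 1 watt is just subtracting 10 * log10 of the rated power. A small sketch (the two amps below are hypothetical):

```python
from math import log10

def snr_ref_1w(snr_full_power_db, rated_watts):
    """Re-reference a full-power S/N spec to 1 watt output."""
    return snr_full_power_db - 10 * log10(rated_watts)

# Two hypothetical amps with identical noise floors:
print(snr_ref_1w(120, 1000))  # 1000 W amp, "120 dB" spec -> 90 dB at 1 W
print(snr_ref_1w(110, 100))   # 100 W amp, "110 dB" spec  -> 90 dB at 1 W
```

Same noise at the listening seat, despite the 10 dB gap on the spec sheets.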

Distortion also tends to go down as power goes up, until the limit of the amp is reached, which is why single-number figures can be misleading. This is something Stereophile does a little better, in showing graphs of THD vs. power output. With those charts we can at least compare noise and distortion at more usable power outputs.
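
Part of the reason those curves slope down is that below clipping, THD+N is often dominated by a roughly constant noise floor, so the ratio improves simply because the signal gets larger. A toy example with an assumed fixed noise floor:

```python
from math import sqrt

NOISE_V = 100e-6  # assumed constant output noise, 100 uV RMS (illustrative)
LOAD_OHMS = 8

for watts in (0.1, 1, 10, 100):
    signal_v = sqrt(watts * LOAD_OHMS)    # RMS voltage at that power
    thd_n_pct = 100 * NOISE_V / signal_v  # noise-dominated THD+N
    print(f"{watts:>5} W -> THD+N ~ {thd_n_pct:.4f} %")

# 0.1 W -> ~0.0112 %, 1 W -> ~0.0035 %, 10 W -> ~0.0011 %, 100 W -> ~0.0004 %
```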

The other thing: I think the amplifier industry is having a hard time keeping its prices high enough to make boutique manufacturing viable, which makes big differentiation necessary. For instance, having a 1,400 watt amplifier is cool and all, but in my living room, at my listening levels, does it mean my sound is better than with my 250 W amps?

I don't really know the answers to these questions. I'm just sharing where I start to scratch my head. :)

Best,

E
Good comments by Erik.

In the situation Autre described in his post dated 4-29-2017, it is very unlikely that impedance compatibility issues were present. The Rogue Sphinx has a very high input impedance (more than 100K ohms, according to Stereophile's measurements). And although I don't know the input impedance of the Peachtree Nova 125 SE, it is most likely much higher than the output impedance of the solid state Onix CD player that was driving it.

Impedance incompatibilities between line-level source components, such as CD players, and integrated amplifiers such as those, or between preamps and power amps, are likeliest to arise when a tube-based output stage is driving a solid state input stage.

Regarding the 10x rule of thumb that Todd alluded to, I would state it as follows:

The input impedance of the amp (or other component receiving a line-level input signal) should be at least 10 times the output impedance of the preamp or line-level source component driving it, at the frequency within the audible range for which that output impedance is highest. In the case of preamps or source components having capacitively coupled outputs (such as the majority of tube preamps), that frequency will usually be 20 Hz. And the output impedance at that frequency will often be far higher than the specified output impedance (which is usually based on a mid-range frequency such as 1 kHz), because the impedance of a capacitor rises as frequency decreases.
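
To illustrate with assumed numbers (a 600 ohm specified output impedance and a 2.2 uF output coupling capacitor, both hypothetical): the capacitor's reactance, 1/(2 * pi * f * C), adds in series with the output impedance, and at 20 Hz it can dominate.

```python
from math import pi

R_OUT = 600   # assumed mid-band output impedance, ohms (hypothetical)
C = 2.2e-6    # assumed output coupling capacitor, farads (hypothetical)

for f in (1_000, 100, 20):
    xc = 1 / (2 * pi * f * C)          # capacitor reactance at f
    z_mag = (R_OUT**2 + xc**2) ** 0.5  # series R-C impedance magnitude
    print(f"{f:>5} Hz -> |Zout| ~ {z_mag:,.0f} ohms")

# 1,000 Hz -> ~604 ohms; 100 Hz -> ~940 ohms; 20 Hz -> ~3,666 ohms
```

With those values, applying the 10x rule at 20 Hz calls for roughly 37K ohms of input impedance, about 60 times the 600 ohm specification, which is consistent with the 50x to 75x margin against a nominal figure suggested below.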

That doesn’t mean that there will necessarily be a problem if the guideline is not met. It depends on how the output impedance **varies** as a function of frequency. What it means is that there **won’t** be an impedance compatibility problem if the guideline **is** met.

If Stereophile has reviewed the preamp or source component, the measurements section of the review will usually indicate the output impedance at 20 Hz as well as at other frequencies.  But if only a nominal impedance can be determined, such as a manufacturer's specification that is presumably at a mid-range frequency, to be safe I would suggest a ratio of 50x or preferably even 75x.

Regards,
-- Al
 
Whew. Thanks Todd, Erik and Al. That is a lot to digest, but I think I understand it a bit better now. This does a lot to explain the basis for terms like "system synergy" and "system compatibility".

And now, back to our regularly scheduled topic: Class D amps are Dandy!