So I should assume nearly all power (90%+) drawn from the wall will end up as heat, and I should be looking to minimize the wattage at the speaker terminals. To clarify, I'm looking at room temperature exclusively; an amplifier's ability to cool itself has no impact on its thermal contribution (as I've learned from upgrading heatsinks in my computer).
I'd say I don't understand how to calculate the power draw of amplifiers outside of Class A. I was looking at an average Class AB amplifier last night with power consumption rated 25–250 W. Does that mean it idles at 25 W and pulls 250 W whenever it's in use, or does the power usage scale with the volume and the demands of the speaker? How about Class D? Can a 50–100 W AB amp or a 250 W Class D amp play music using less power than 5 W of Class A? Do they all output the same wattage to the speaker, all else being equal?
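To make my question concrete, here's how I'd guess the math works, as a rough sketch: Class A draws roughly constant power regardless of volume, while AB and D draw an idle amount plus the output divided by their efficiency. Every number here (the 20%/55%/90% efficiencies, the 25 W and 10 W idle figures) is an assumption I've pulled from general reading, not a measurement — please correct me if the model itself is wrong:

```python
# Rough model of wall-power draw vs. audio output for the three classes.
# All efficiency and idle-draw numbers are illustrative assumptions.

def wall_power(amp_class, p_out, p_max):
    """Estimate wall draw in watts for audio output p_out (W),
    given the amp's rated maximum output p_max (W)."""
    if amp_class == "A":
        # Class A bias current flows constantly, so the draw is roughly
        # fixed near its maximum regardless of volume (assume ~20%
        # efficiency at full output).
        return p_max / 0.20
    if amp_class == "AB":
        # Assume ~25 W idle draw, plus output scaled by ~55% efficiency.
        return 25 + p_out / 0.55
    if amp_class == "D":
        # Assume ~10 W idle draw, plus output scaled by ~90% efficiency.
        return 10 + p_out / 0.90
    raise ValueError(f"unknown class: {amp_class}")

# Typical music listening averages only a few watts at the speaker:
for cls, p_max in [("A", 5), ("AB", 100), ("D", 250)]:
    print(cls, round(wall_power(cls, p_out=3, p_max=p_max), 1))
```

If that model is right, then at a 3 W average listening level even a tiny 5 W Class A amp draws about as much from the wall as a 100 W AB amp, and a 250 W Class D amp draws far less than either — which is really the heart of my question.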