Stupid speaker test question...please help a n00b


Why aren't speakers tested by measuring the output sound waves against the input signal? Wouldn't that be the easiest way of measuring the distortion a speaker introduces? Assuming you control all the other parameters of the test, of course...

Thanks for the help!
spartanmorning
Because most speakers have high distortion compared to electronic equipment. Over 1% THD is normal for most speakers. It would be nice if manufacturers would publish that spec. The only place I have seen it measured was in some reviews by Secrets of Home Theater online.
THD below 50 Hz at highish SPL (90 dB+, anechoic or quasi-anechoic) is off the charts for most speakers. Per Sarcher's post, this test is sometimes run by HT websites, especially for subwoofer testing. Subwoofers are purpose-designed for clean response in this range, and most of them still suck.

No one wants to see that number for a typical full-range speaker.

Marty
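For anyone curious what that output-vs-input test actually looks like in practice: the usual approach is to play a pure tone, record the speaker's output, and compare the energy at the harmonics to the energy at the fundamental. Here's a minimal Python sketch of the math. The "recording" is simulated (the harmonic levels are made up for the demo, not real driver data), but the THD calculation is the same one you'd run on a mic capture.

```python
import numpy as np

# Simulate a "recorded" speaker output: a 50 Hz test tone plus 2nd and 3rd
# harmonics standing in for driver distortion (levels are invented for the demo).
fs = 48000                      # sample rate in Hz
f0 = 50                         # fundamental of the test tone
t = np.arange(fs) / fs          # one second of samples
sig = (np.sin(2 * np.pi * f0 * t)
       + 0.02 * np.sin(2 * np.pi * 2 * f0 * t)    # 2nd harmonic, about -34 dB
       + 0.01 * np.sin(2 * np.pi * 3 * f0 * t))   # 3rd harmonic, -40 dB

# FFT with a Hann window; with a 1-second capture each bin is exactly 1 Hz
# wide, so the fundamental and every harmonic land on their own bins.
spec = np.abs(np.fft.rfft(sig * np.hanning(len(sig))))

def bin_amp(freq):
    """Spectrum magnitude at the bin nearest `freq`."""
    return spec[int(round(freq * len(sig) / fs))]

fundamental = bin_amp(f0)
harmonics = np.sqrt(sum(bin_amp(k * f0) ** 2 for k in range(2, 6)))
thd_percent = 100 * harmonics / fundamental
print(f"THD: {thd_percent:.2f}%")   # roughly 2.24% for the levels injected above
```

Run that at a bunch of frequencies and levels and you get the THD-vs-frequency plots the HT sites publish, which is exactly why the low-bass numbers look so ugly: the harmonic terms blow up down there.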
Exactly. We kill ourselves to get our electronics purer than Ivory soap, all the while our speakers are distorting far more. It's the dirty little secret of hi-fi; no one speaks of it much. The emperor has no clothes. BTW, woofers can distort up to 10%.
Ah, specs. I learned a while ago, especially with speakers, to only look at the sensitivity spec. The most recent reason: my Dynaudio C1's are rated 45 Hz-22 kHz. My brother-in-law has Vienna Acoustics speakers (forget which model) rated 32 Hz-25 kHz, and we both agree my Dyns go lower. It could be because the VAs distort more at the lower frequencies (just a guess). Besides, if I were a manufacturer, I would look for the frequency band where distortion is lowest and publish that one. For me the same holds true for amps and their watt ratings (but that is a long story).

Bottom line is let your ears decide what sounds best ;-)