Why does better power = better sound?


Why does improving power quality improve sound quality?

I’m not asking to start an argument about power cords or wall outlets. Please let’s not go there. I’m asking because I’m hoping to learn some technical explanations for the effects of power quality on sound quality. I think I already understand how…

1. greater current availability = greater dynamic range
2. reduction of RFI/EMI = better signal to noise ratio

…but what about these…

3. ???????? = greater perceived resolution
4. ???????? = more realistic instrument timbres
5. ???????? = more precise imaging

Are differences in resolution, instrument timbres, imaging, etc. somehow reducible to current availability and/or powerline noise? If so, HOW are they reducible?

Again, I’m hoping to get into technical specifics, not polemical generalities.

Thanks in advance.

Bryon
bryoncunningham
Bifwynne, new Elite 20PFi is $1019 at Audio Advisor. I bought used for $600. New Elite 15PFi is about $699, used currently listed at Audiogon for $350.

Al, you know your Beethoven - I'm impressed.
Al and Kijanki, I have the complete Gardiner Beethoven symphony collection, recorded on the Archiv label. Playback is quite good. Gardiner uses a taser to keep the orchestra really hopping! One of the better redbook CDs in my collection.
Al and Kijanki, assuming I go forward with installing 3 or 4 dedicated lines, will I need a power conditioner for each line? Just thinking out loud here, but I wonder if there's a single device I could install at the circuit box that would condition and filter the AC power for multiple dedicated lines.
Hi Bruce,

I have no particular knowledge of an audio-oriented conditioner that would handle multiple lines at the panel. But in any event I would expect that installing conditioners at the system end of the runs would be preferable, because they would then be able to filter out RFI that may be picked up by the wiring between the panel and the outlets.

It would probably make sense to purchase one conditioner initially, and try it out on each of the different lines.

Not familiar with the Gardiner "Pastorale"; thanks for mentioning it. My "go to" version is an imported Japanese CBS/Sony remastering, on LP, of Bruno Walter's famous 1958(!) performance with the Columbia Symphony. I purchased it during the 1980s. Wonderful performance, of course, and remarkably pleasing sonics aside from a bit of steeliness in the strings at times.

Best regards,
-- Al
Thanks to everyone. Some very helpful comments.

I understand that the DC power provided by a component’s power supply is the same power that constitutes the component’s signal, and that therefore noise or distortion on the AC power line, if insufficiently filtered by the component's power supply, will become part of the signal.
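To put a rough number on "insufficiently filtered": a power supply's reservoir/filter stage only attenuates line-frequency ripple by a finite amount. Here is a minimal Python sketch of the attenuation of a first-order RC filter at the 120 Hz full-wave ripple frequency; the 10 Ω / 4700 µF values are purely hypothetical, not taken from any real component.

```python
import math

def rc_attenuation(f_hz, r_ohms, c_farads):
    """Magnitude response of a first-order RC low-pass at frequency f."""
    return 1.0 / math.sqrt(1.0 + (2.0 * math.pi * f_hz * r_ohms * c_farads) ** 2)

# Hypothetical supply filter: 10 ohm series resistance, 4700 uF reservoir cap.
ripple_in_v = 1.0  # assume 1 V of 120 Hz ripple before filtering
att = rc_attenuation(120.0, 10.0, 4700e-6)
print(f"120 Hz ripple attenuated to {ripple_in_v * att * 1000:.1f} mV")  # ~28 mV survives
```

The point of the sketch is simply that filtering is attenuation, not elimination: some fraction of whatever rides on the AC line survives onto the DC rails, and higher-frequency noise can bypass the filter capacitors entirely via parasitic paths.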

What I’m unclear about is how SPECIFIC audible characteristics correlate with SPECIFIC AC/DC powerline anomalies. Put simply, HOW does bad power result in bad timbre, or bad imaging, or less resolution, etc.?

04-24-12: Almarg
…any and all of those numerous frequency components could, to some small extent, intermodulate with the audio signal, resulting in new spectral components at frequencies equal to both the sum of and the difference between the frequencies of any or all of the spectral components of the music and the frequencies of any or all of the spectral components of the noise or distortion.

This was extremely helpful, Al. I wasn’t really thinking in terms of frequency intermodulation, but when I do, it’s easier for me to understand how bad power results in less realistic instrument timbres. It's something like...

AC power frequency anomalies -> DC power anomalies -> INTERMODULATION of DC power and signal -> distortion of harmonic content -> less realistic instrument timbres

Because accurate harmonic content is essential to realistic instrument timbres, anything that distorts harmonic content, like the intermodulation of DC power and signal, will make instrument timbres less realistic. Sounds plausible to me.
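The sum-and-difference products Almarg describes can be shown numerically. Below is a minimal Python/NumPy sketch in which the gain of a hypothetical amplifying stage is modulated by 120 Hz rail ripple (a multiplicative mix — the frequencies, amplitudes, and the gain model are all illustrative assumptions, not a model of any particular component). The spectrum of the output shows new components at 1000 − 120 = 880 Hz and 1000 + 120 = 1120 Hz that were in neither the tone nor the ripple.

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs                        # 1 second of samples
tone = np.sin(2 * np.pi * 1000 * t)           # 1 kHz "music" component
ripple = 0.1 * np.sin(2 * np.pi * 120 * t)    # hypothetical 120 Hz supply ripple

# Gain stage whose gain wobbles with the rail ripple (multiplicative mix):
out = (1.0 + ripple) * tone

spectrum = np.abs(np.fft.rfft(out)) / len(t)  # normalized so a sine of amplitude A reads A/2
freqs = np.fft.rfftfreq(len(t), 1 / fs)
for f in (880, 1000, 1120):
    print(f"{f} Hz level: {spectrum[np.argmin(np.abs(freqs - f))]:.3f}")
# -> 880 and 1120 Hz sidebands appear at 0.025, flanking the 1 kHz tone at 0.500
```

Those 880/1120 Hz sidebands are not harmonically related to the 1 kHz tone, which is one candidate mechanism for the "less realistic timbre" in the chain above: the added components don't fall on the instrument's natural overtone series.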

So filling in the question marks to #3 in the OP is…

3. DC power/signal frequency intermodulation = less realistic instrument timbre

Assuming all this is correct, I’m still unclear about the explanation at the level of voltage and current. In particular, I'm unclear about the concept of "frequency intermodulation" with respect to DC power. Some dumb questions...

--Does "frequency intermodulation" basically mean that there are FLUCTUATIONS to DC voltage/current that are UNRELATED to the signal?

--Why are DC fluctuations described in terms of "frequencies" at all? Is it simply because the fluctuations occur at a certain rate per second? Or does the use of "frequency" to describe fluctuations in DC voltage/current also imply that DC can be understood as a WAVE, just like AC?
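On the second question, the standard answer is Fourier analysis: any time-varying waveform, including small fluctuations riding on a nominally constant DC rail, can be decomposed into sinusoids, and "frequency" just names the rate of each fluctuation component. A minimal Python/NumPy sketch (the 12 V rail and the 120 Hz / 3 kHz fluctuations are invented for illustration):

```python
import numpy as np

fs = 10000
t = np.arange(fs) / fs
# A nominally 12 V DC rail with small fluctuations riding on it:
rail = (12.0
        + 0.05 * np.sin(2 * np.pi * 120 * t)    # rectifier-ripple-like component
        + 0.02 * np.sin(2 * np.pi * 3000 * t))  # higher-frequency noise component

spectrum = np.abs(np.fft.rfft(rail)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
# The 0 Hz bin is the DC level itself; every other nonzero bin is a
# fluctuation occurring at a definite rate per second, i.e. a frequency.
print(freqs[spectrum > 0.005])  # -> [0., 120., 3000.]
```

So it isn't that DC "is" a wave; it's that the deviations from a constant level are waves, and describing them by frequency is what lets one talk about which of them the power supply filters well and which it doesn't.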

I have lots of additional thoughts/questions about resolution and imaging, but it would be helpful to stick to instrument timbres for the moment, or my head might explode.

Thanks,
Bryon