Why Power Cables Affect Sound


I just bought a new CD player and was underwhelmed with it compared to my cheaper, lower-quality CD player. That's when it hit me that my cheaper CD player was using an upgraded power cable. When I put an upgraded power cable on my new CD player, the sound was instantly transformed: the treble was tamed, the music was more dynamic and lifelike, and the overall presentation was more musical.

This got me thinking about how in the world a power cable can affect sound. I want to hear all of your ideas. Here's one of mine:

I have heard from many sources that a good power cable is made of conductors of several gauges, from large to small. The electrons in a power cable are like a train, with each electron acting as a train car. When a treble note is played, for example, the small-gauge wires can react quickly because that "train" has much less mass than a large-gauge conductor. If you only had one large-gauge conductor, you would need to accelerate a very large train for a small, quick treble note, and this leads to poor dynamics. A similar analogy might be water in a pipe: a small pipe can react much more quickly to higher frequencies than a large pipe, due to the decreased mass/momentum of the water in the pipe.

That’s one of my ideas. Now I want to hear your thoughts and have a general discussion of why power cables matter. 

If you don't think power cables matter at all, please refrain from derailing the conversation with antagonism. There's a time and place for that, but not in this thread, please.
mkgus

Showing 4 responses by atmasphere

I'm just generally curious what is going on, scientifically speaking. Furthermore, once we understand the phenomenon, we can design and tweak to produce the results we want. For example, maybe the gauge of the copper fully explains the effect. If that's the case, then we can tweak that one variable and ignore the other ones that could use up our time and money.

It's easiest to measure the effect the cord has on the equipment with which it's used.

I've seen a power cord make a difference of nearly 30% in the output power of a power amplifier. I could also see that the difference was caused by a voltage drop across the power cord. This stuff isn't voodoo, nor is it rocket science; it's just Ohm's Law.

You can measure differences in output power, output impedance, and distortion on many power amps just by changing the power cord, and many of these differences are simply caused by voltage drop. IMO tube amps are more susceptible, as they have an unregulated filament circuit, and the heat of the cathode affects the transconductance of the tube. But a class A transistor amp is likely pretty susceptible too.
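To put rough numbers on the Ohm's Law part, here is a minimal sketch. The wall voltage, cord length, and current draw are all assumed figures for illustration, not measurements of any particular amp or cord:

```python
import math

# Back-of-the-envelope Ohm's Law sketch: how much voltage a power cord drops
# at a given current draw, and what that could mean for output power if the
# amp's supply rails sag with the line. All numbers are assumptions.

RESISTIVITY_CU = 1.724e-8  # ohm*m, annealed copper at 20 C

def awg_area_m2(awg: int) -> float:
    """Cross-sectional area of a solid conductor of the given AWG, in m^2."""
    diameter_m = 0.127e-3 * 92 ** ((36 - awg) / 39)  # standard AWG formula
    return math.pi / 4 * diameter_m ** 2

def cord_resistance(awg: int, length_m: float) -> float:
    """Round-trip (hot plus neutral) resistance of a cord, in ohms."""
    return RESISTIVITY_CU * (2 * length_m) / awg_area_m2(awg)

V_WALL = 120.0   # assumed RMS line voltage
I_DRAW = 10.0    # assumed average current near clipping

for awg in (18, 14):
    r = cord_resistance(awg, 2.0)          # assumed 2 m cord
    v_amp = V_WALL - I_DRAW * r            # Ohm's Law drop at the amp's inlet
    p_ratio = (v_amp / V_WALL) ** 2        # if power tracks voltage squared
    print(f"{awg} AWG: {r * 1000:.0f} mohm, {V_WALL - v_amp:.2f} V drop, "
          f"~{(1 - p_ratio) * 100:.1f}% power lost in the cord")
```

On these assumptions the steady-state loss is only a percent or two, which is arguably why the commutation behavior described next matters: the rectifier draws its current in brief pulses that can be many times the average, multiplying the instantaneous drop.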


There is more to it than voltage drop, though. It also has to do with the bandwidth of the power cord: the cord has to be able to provide current at high frequencies, because the rectifiers in power supplies often commutate (switch on and off) for very short periods of time, and if the current is limited during those brief pulses, the power supply can't charge completely. A noisy, improperly charged power supply can affect how a circuit performs. So it should be no surprise that power cords can have an audible effect.
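One way to see the commutation point is a toy simulation of a full-wave, capacitor-input supply fed through some total series resistance (cord, transformer, and rectifier lumped together, with resistance standing in for the cord's total impedance). The component values are assumptions; the qualitative result is that more series impedance lowers the DC rail and blunts the charging pulses:

```python
import math

# Toy simulation of a full-wave capacitor-input supply, to show why short
# rectifier conduction pulses make cord impedance matter more than the
# average current suggests. All component values are assumed.

V_PEAK = 120.0 * math.sqrt(2)   # peak of a 120 V RMS line
FREQ = 60.0                     # line frequency, Hz
C = 10_000e-6                   # reservoir capacitor, farads
I_LOAD = 2.0                    # steady current drawn by the amp, amperes
DT = 1e-6                       # simulation time step, seconds

def simulate(r_series: float, t_end: float = 0.5):
    """Return (average DC rail, peak charging current) for a given total
    series resistance."""
    v_cap = 0.0
    peak_i = 0.0
    v_sum = 0.0
    n = 0
    t = 0.0
    while t < t_end:
        v_line = abs(V_PEAK * math.sin(2 * math.pi * FREQ * t))  # full-wave
        # The rectifier conducts only while the line exceeds the cap voltage,
        # so all charging happens in brief pulses near each line peak.
        i_charge = max(0.0, (v_line - v_cap) / r_series)
        v_cap += (i_charge - I_LOAD) * DT / C
        v_cap = max(v_cap, 0.0)
        if t > t_end / 2:  # measure only after start-up settles
            peak_i = max(peak_i, i_charge)
            v_sum += v_cap
            n += 1
        t += DT
    return v_sum / n, peak_i

for r in (0.1, 0.5):  # ohms: a stout cord vs. a thin, long one (assumed)
    v_dc, i_pk = simulate(r)
    print(f"R = {r:.1f} ohm: DC rail ~{v_dc:.1f} V, charging peaks ~{i_pk:.1f} A")
```

The average load current never changes between the two runs; only the series impedance seen during the charging pulses does. Since those pulses have fast edges, their harmonic content extends well above the line frequency, which is the sense in which the cord's bandwidth matters.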


The usual saw is "why doesn't the wiring in the house make a difference?" and the answer is that it does. Romex, though, is generally pretty high performance compared to flexible cable; it's just illegal to make power cords out of it.


@mitch2
1. What characteristics of a power cord would affect voltage (i.e., cause a voltage drop) - is that primarily a function of resistance and wire gauge, or something else, and
2. What characteristics of a power cord would affect bandwidth?  
Voltage drop can be caused by the connectors or by the wire itself; both respond to Ohm's Law. If you find that either is getting warm, you know for sure that a voltage drop is occurring.
The geometry and materials affect bandwidth, in addition to gauge.
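As a rough sketch of how those two answers map onto numbers: gauge and material set the series resistance, while the geometry (mainly the spacing between hot and neutral) largely sets the loop inductance, which is one of the things that limits a cord at high frequencies. The formulas below are the standard wire-resistance and parallel-wire approximations; the 5 mm conductor spacing is an assumed figure, not any specific cord:

```python
import math

# Gauge/material -> series resistance; geometry -> loop inductance.
# Example dimensions are assumptions for illustration.

MU0 = 4 * math.pi * 1e-7       # H/m, permeability of free space
RESISTIVITY_CU = 1.724e-8      # ohm*m, copper

def awg_diameter_m(awg: int) -> float:
    return 0.127e-3 * 92 ** ((36 - awg) / 39)

def resistance_per_m(awg: int, resistivity: float = RESISTIVITY_CU) -> float:
    """Series resistance of one conductor, ohms per meter."""
    area = math.pi / 4 * awg_diameter_m(awg) ** 2
    return resistivity / area

def loop_inductance_per_m(awg: int, spacing_m: float) -> float:
    """Inductance of a hot/neutral pair, henries per meter, using the
    parallel-wire approximation L = (mu0/pi) * (ln(D/a) + 1/4)."""
    a = awg_diameter_m(awg) / 2
    return MU0 / math.pi * (math.log(spacing_m / a) + 0.25)

for awg in (18, 14):
    r = resistance_per_m(awg)
    ind = loop_inductance_per_m(awg, spacing_m=5e-3)  # assumed 5 mm spacing
    print(f"{awg} AWG: {r * 1000:.1f} mohm/m, {ind * 1e9:.0f} nH/m loop inductance")
```

On those assumptions, a heavier gauge buys a lot less resistance but only slightly less inductance (the spacing sits inside a logarithm), which fits the point that geometry, not just gauge, governs bandwidth.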

This stuff makes NO technical sense
@kosst_amojan

If you have any test equipment on hand, run your power amp with an 18ga. power cord up to clipping and note the power output. Then swap in a 14ga. power cord and repeat the test. Do you see any difference?

Can someone explain this, or has anyone else experienced it? Could it have something to do with the power grid?

Judging by our customers over the last 20 years, this is a common occurrence. The grid is apparently a lot cleaner (and possibly higher in voltage) at night.

So it only makes a difference if I drive my amp to the physical limits of its power supply, which nobody ever actually does?

This is an exercise that shows that the effects of a power cord can be measured. In addition to total output power, distortion and output impedance change as well.


It is also interesting to measure the voltage drop across the power cord from one end to the other. This can be correlated with the performance changes in the amplifier.
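For what that bookkeeping might look like: log the drop measured across each cord under load alongside the amp's output at clipping, then see how the two track. A minimal sketch with placeholder readings (the numbers below are hypothetical, not measurements):

```python
# (cord label, measured drop across the cord in volts, output power in watts)
# -- hypothetical readings for illustration only
readings = [
    ("18 AWG, 3 m", 1.9, 92.0),
    ("14 AWG, 2 m", 0.6, 104.0),
    ("10 AWG, 1 m", 0.2, 109.0),
]

# Express each cord's output relative to the best one, so the voltage drop
# can be eyeballed against the power penalty it correlates with.
baseline = max(p for _, _, p in readings)
for label, v_drop, p_out in readings:
    loss_pct = (1 - p_out / baseline) * 100
    print(f"{label}: {v_drop:.1f} V drop -> {loss_pct:.1f}% below the best cord")
```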