What is the significance of the output voltage level of a given cartridge with regard to audio fidelity? For example, the Lyra Delos cartridge puts out 0.6 mV, while some MC cartridges I have seen put out less than half that. The apparent benefit is that a higher output level needs less gain in the phono stage, which likely means a higher S/N ratio compared to lower-output MC cartridges, where more gain is required. I guess my question is: are there disadvantages to a higher output voltage level relative to cartridges that output lower voltage levels?
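To make sure I'm thinking about the gain/noise trade-off correctly, here is my own rough back-of-the-envelope sketch. The 300 mV target level is just a number I picked for illustration, not anything from Lyra's documentation:

```python
# Rough comparison of the phono gain needed for two MC output levels
# (0.6 mV vs. 0.25 mV), assuming a nominal 300 mV target at the phono
# stage output. Figures are illustrative only.
import math

TARGET_OUTPUT_V = 0.300  # assumed nominal output target, volts

def required_gain_db(cart_output_mv: float) -> float:
    """Voltage gain (dB) needed to bring the cartridge output up to the target."""
    return 20 * math.log10(TARGET_OUTPUT_V / (cart_output_mv / 1000))

for mv in (0.6, 0.25):
    print(f"{mv} mV cartridge -> about {required_gain_db(mv):.1f} dB of gain")

# With the phono stage's input-referred noise roughly fixed, the ~7.6 dB of
# extra gain the 0.25 mV cartridge needs shows up more or less directly as a
# ~7.6 dB worse signal-to-noise ratio, all else being equal.
```

If that reasoning is right, it only captures the noise side of the picture, which is why I'm asking what else factors into the choice of output level.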
I appreciate that the ultimate determination of whether one cartridge sounds better than another can only be made by listening (which has its own limitations, because it can often be difficult to do an apples-to-apples comparison under the same conditions), but I am trying to get a sense of the considerations that determine what output voltage a given cartridge will be designed to produce.
On a related topic, the instructions that come with the Lyra Delos specify suggested load impedance ranges for use at the phono stage. My question is: how did you determine those ranges? By SPICE analysis, or some other method?
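For what it's worth, my rough guess at the kind of calculation involved is below; the inductance and capacitance values are ones I made up for illustration, not Delos specifications, so please correct me if this is the wrong way to look at it:

```python
# Rough illustration (assumed numbers, not Lyra's) of why the load resistance
# might matter: the cartridge coil inductance L and the total cable + phono
# input capacitance C form a resonance well above the audio band, and the load
# resistance R sets how strongly that resonance is damped (Q ~= R * sqrt(C/L)).
import math

L_COIL = 10e-6     # hypothetical coil inductance, henries (~10 uH)
C_INPUT = 200e-12  # hypothetical cable + input capacitance, farads (~200 pF)

f0 = 1 / (2 * math.pi * math.sqrt(L_COIL * C_INPUT))
print(f"Resonant frequency: {f0 / 1e6:.2f} MHz")

for r_load in (100, 1_000, 47_000):
    q = r_load * math.sqrt(C_INPUT / L_COIL)
    print(f"Load {r_load:>6} ohms -> Q of about {q:.2f}")

# Lower load resistances damp the ultrasonic peak, while very high loads leave
# it underdamped, which I gather can upset some phono stages downstream.
```

Is something along these lines how the suggested ranges were arrived at, or was it primarily determined by listening tests?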