You now have distortion only when the output signal is very high, and no crossover distortion (gm-doubling, or whatever you like to call it; let's say transition distortion) when the output signal is very low and the stage runs in Class A (both sides conducting).
This is quite correct, but the gm-doubling transition distortion is much worse than the crossover distortion. As for audibility . . . it all depends on the application; each road has its burdens to bear.
A small absolute amount of distortion on a large signal is better than the same absolute distortion on a small signal.
I've heard it asserted that crossover distortion manifests itself more strongly (driving THD upward) as the signal level is reduced . . . and honestly, I'm not sure whether it's true. It seems to make intuitive sense, but I've measured lots of amplifiers, and I'm doubtful that the measured data supports it. Complicating the issue is that THD+N of course rises as the signal level is reduced, since the fixed noise floor becomes a larger fraction of the reading . . . and when you measure just the noise, you get the same rise, so the reading by itself doesn't separate the distortion from the noise.
I think it may be that as the signal level is reduced, the proportion of the total signal that lies in the crossover region increases, but at the same time the crossover non-linearities are being spread across a larger proportion of the waveform, making them less severe. Whether these opposing factors cancel each other out is the question; I certainly haven't the skill to investigate it with pure mathematics, and my current measurement equipment isn't sensitive enough to find the answer empirically.
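One way to get a feel for the question without lab-grade equipment is a toy numerical experiment. The sketch below is not a model of any real output stage: it assumes a simple dead-zone nonlinearity (the `crossover` function and its `dead` parameter are invented for illustration) standing in for an under-biased Class-B crossover region, drives it with sine waves of decreasing amplitude, and computes THD from the FFT of the result.

```python
import numpy as np

def crossover(x, dead=0.05):
    # Toy crossover nonlinearity: a small dead zone around zero,
    # a crude stand-in for an under-biased Class-B output stage.
    return np.sign(x) * np.maximum(np.abs(x) - dead, 0.0)

def thd(amplitude, dead=0.05, n=8192):
    # Drive the dead zone with a sine and compute THD from the spectrum.
    t = np.arange(n)
    f = 64  # test tone on an exact FFT bin, so no windowing is needed
    y = crossover(amplitude * np.sin(2 * np.pi * f * t / n), dead)
    spec = np.abs(np.fft.rfft(y)) / n
    fundamental = spec[f]
    harmonics = np.sqrt(np.sum(spec[2 * f::f] ** 2))  # bins 2f, 3f, ...
    return harmonics / fundamental

for a in (1.0, 0.5, 0.2, 0.1):
    print(f"amplitude {a:4.1f}: THD = {thd(a):.1%}")
```

In this simplified model the dead zone is a fixed absolute error, so the relative distortion grows as the signal shrinks; THD climbs steadily as the amplitude drops toward the dead-zone width. A real output stage has a far softer transition than a hard dead zone, so this only illustrates the intuition, not the measured behaviour of any actual amplifier.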