I'll admit, I only completed freshman college calculus and I'm no math wizard, but I can't follow the author's math. I wish he'd explain how he arrives at the following:
Let's say the loudspeaker has a sensitivity of 90 dB for one watt at one meter, and the desired maximum sound pressure level during peaks is 110 dB (that's really loud). That is a 20 dB increase, and every 10 dB of SPL requires ten times the power, so this will require 10^(20/10) = 100 watts. If the signal source is 1 volt maximum, how much amp gain is required?
100 watts into 8 ohms requires about 28 volts, since V = √(P × R) = √800 ≈ 28.3 V. So the amp needs to provide a voltage gain of 28 times, which is 20 × log₁₀(28) ≈ 29 dB.
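The arithmetic above can be sketched out step by step. This is just a worked check of the quoted numbers, assuming the stated values (90 dB/1W/1m speaker, 110 dB peak target, 8 ohm load, 1 V source); the variable names are mine, not the author's:

```python
import math

sensitivity_db = 90.0   # SPL at 1 watt, 1 meter (assumed from the quote)
target_db = 110.0       # desired peak SPL
load_ohms = 8.0         # nominal speaker impedance
source_volts = 1.0      # maximum signal source voltage

# Each 10 dB of SPL increase requires 10x the power:
power_watts = 10 ** ((target_db - sensitivity_db) / 10)   # 10^(20/10) = 100 W

# Power into a resistive load: P = V^2 / R, so V = sqrt(P * R)
output_volts = math.sqrt(power_watts * load_ohms)         # sqrt(800) ~ 28.3 V

# Voltage gain as a ratio, and in dB (20*log10 for voltage ratios):
gain_ratio = output_volts / source_volts
gain_db = 20 * math.log10(gain_ratio)

print(f"{power_watts:.0f} W, {output_volts:.1f} V out, "
      f"gain {gain_ratio:.1f}x = {gain_db:.1f} dB")
# → 100 W, 28.3 V out, gain 28.3x = 29.0 dB
```

The two different dB formulas are the usual point of confusion: power ratios use 10·log₁₀, while voltage ratios use 20·log₁₀ (because power goes as voltage squared), which is why a 28× voltage gain and a 100× power gain both come out near 29–20 dB apart from their raw ratios.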

