Amp power (watts) correlates to output (dBspl) in a logarithmic fashion, which is to say, twice the watts does not give twice the output.
A (relatively) simple equation can give us the difference in output level (in dB) for a given change in power (watts):
dB = 10 log(m/r)
where "m" is the measurement (new power rating in watts),
"r" is the reference (old power rating in watts),
and "dB" is the change in output level in decibels.
Still with me? Let's put in some numbers and compare the output of a 50 watt amp with a 100 watt amp. So "r" will be 50 (watts) and "m" will be 100 (watts):
10 log(100/50) = 3.01, call it 3dB
So a 100 watt amp feeding the same speaker as a 50 watt amp will only produce a signal that is 3dB louder. This is noticeably louder, but not by a lot.
It's worth noting that a perceived doubling of volume equates to a change of about 10dB.
Let's try another, a 4 watt amp versus a 40 watt amp:
10 log(40/4)=10
So a 40 watt amp is 10dB louder than a 4 watt amp, which equates to a perceived doubling of volume: twice as loud.
So as a rule of thumb, doubling the power (watts) gives you 3dB more output volume (noticeable but not great) and ten times the power (watts) gives you 10dB more output volume (twice as loud).
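If you'd rather let a computer do the arithmetic, the formula above is a one-liner in Python (the function name here is just my own label for it):

```python
import math

def db_change(new_watts, old_watts):
    """Change in output level (dB) going from old_watts to new_watts."""
    return 10 * math.log10(new_watts / old_watts)

print(db_change(100, 50))  # ~3dB: doubling the power
print(db_change(40, 4))    # 10dB: ten times the power, perceived twice as loud
```

Plug in any two wattages you like and you'll get the dB difference straight away.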
If you consider that every speaker has a specific sensitivity rating (how much output volume you get for a given wattage), then the choice of speaker can also play a big role in overall loudness.
a 100 watt amp feeding a speaker rated 97dB @ 1w, 1m
- will be the same volume as -
a 50 watt amp feeding a speaker rated 100dB @ 1w, 1m
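You can check that claim with the same formula: the total output is roughly the speaker's sensitivity plus the dB gain from the power above 1 watt. A quick sketch (again, the function name is mine):

```python
import math

def output_spl(sensitivity_db, watts):
    """Approximate output (dB SPL at 1m): sensitivity at 1 watt
    plus the dB gain from driving the speaker with more watts."""
    return sensitivity_db + 10 * math.log10(watts)

print(output_spl(97, 100))  # 100 watts into a 97dB speaker -> 117dB
print(output_spl(100, 50))  # 50 watts into a 100dB speaker -> ~117dB
```

Both combinations land at about 117dB, which is why the two rigs above end up the same volume.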
I hope that all made sense!
Leave a post if you want me to explain this better!
Cheers,
Spud