Is it true then that the output stage of the amp 'transforms' the amplified signal from the gain stage (in voltage) to current (in amps)?
That is literally true for valve amps, which have an actual output transformer; similarly, a key feature of the output stage of a solid-state (SS) amp is to provide the current required to drive the speakers.
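To put a rough number on "providing the current": a quick Ohm's law sketch (I = V / R) shows how the current demand on the output stage grows as speaker impedance falls, for the same voltage swing. The 40 V figure and the impedances here are just illustrative values, not from any particular amp.

```python
def drive_current(v_peak, impedance_ohms):
    """Peak current (A) the output stage must source at a given peak voltage,
    treating the speaker as a simple resistive load (a simplification)."""
    return v_peak / impedance_ohms

# The same 40 V peak swing into different nominal impedances:
for z in (8, 4, 2):
    print(f"{z} ohm load: {drive_current(40, z):.1f} A peak")
# 8 ohms -> 5.0 A, 4 ohms -> 10.0 A, 2 ohms -> 20.0 A
```

This is why a difficult (low-impedance) speaker stresses the output stage and power supply even when the voltage swing is unchanged.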
So does a bigger power supply allow a wider peak voltage swing to be delivered as current to the speakers before the amp clips?
Not strictly speaking, but I think I know what you're getting at... it is common for manufacturers to use a power supply that is rated lower than full RMS output 100% of the time. That's because music doesn't require that level of power all of the time (google 'crest factor' if you want to understand why). How much the power supply is derated, and how much reservoir capacitance is fitted (needed for sudden peaks in the music), is a judgement call... some amps may struggle with heavy loads. I spec my power supplies for 100% RMS output, which is over-engineering but means the amps deliver what they say in the specs without fail.
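A little arithmetic makes the crest-factor point concrete. The figures below (100 W into 8 ohms, a 12 dB crest factor) are illustrative assumptions, not a real amp's spec; the relationships (V = sqrt(P·R), peak = RMS × √2, dB as a power ratio) are standard.

```python
import math

def rms_voltage(p_rms, load_ohms):
    """RMS output voltage needed for a given continuous (RMS) power into a load."""
    return math.sqrt(p_rms * load_ohms)

def peak_voltage(v_rms):
    """Peak of a sine wave at that RMS level; the supply rails must clear this
    (plus device losses) or the amp clips."""
    return v_rms * math.sqrt(2)

def average_power_for_crest(p_peak, crest_db):
    """Average power implied by a given peak power and crest factor in dB."""
    return p_peak / (10 ** (crest_db / 10))

v = rms_voltage(100, 8)                    # ~28.3 V RMS for 100 W into 8 ohms
vp = peak_voltage(v)                       # ~40 V peak -> rails must exceed this
p_avg = average_power_for_crest(100, 12)   # ~6.3 W average for music with a
                                           # 12 dB crest factor
```

That last number is the whole argument for derating: music that peaks at 100 W only draws a handful of watts on average, so a supply (plus reservoir caps for the peaks) rated well below 100 W continuous can still sound unstrained most of the time.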
Thus, power equals headroom? And a lot of gain + minimal headroom = a limited amplifier?
Kind of... if the ceiling in my house is 2m high then there's a risk I might bump my head if I jump in the air. 3m would eliminate that risk, but there's no additional advantage in having 10m-high ceilings... that's about as far as I can stretch that analogy. If your amplifier is capable of 300W but you only need 10W (you may be surprised at how little you actually need) then that extra headroom is of little use. If the gain of the amplifier is high to allow you to achieve 300W output from a low-level signal, then you will likely be getting a worse signal-to-noise ratio than if you went for a 60W amp with a well-specced power supply.
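The 300W-vs-10W point is easy to express in decibels, which is how headroom and noise are usually compared. A minimal sketch, using only the figures from the analogy above:

```python
import math

def db_ratio(p1, p2):
    """Express a power ratio in decibels: 10 * log10(p1 / p2)."""
    return 10 * math.log10(p1 / p2)

# Headroom you never use: a 300 W amp run at 10 W has ~14.8 dB in hand.
unused_headroom = db_ratio(300, 10)

# The gain trade-off: the extra voltage gain needed to reach 300 W rather
# than 60 W from the same source level is ~7 dB, and (all else being equal)
# it raises the amplifier's noise floor at the speaker by roughly the same
# amount -- which is the signal-to-noise penalty described above.
extra_gain = db_ratio(300, 60)
```

So the "10m ceiling" costs you something after all: the gain needed to reach power you never use is also amplifying the noise all the time.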
I'm trying to explain without maths and jargon... hope that helps.