Let's assume the device really needs a certain amount of power. Look at the equation for power: power = volts * amps. Then look at another fundamental formula, Ohm's law: volts = amps * resistance. A little algebra combines the two into power = amps^2 * resistance. Changing the resistance, the volts, or the amps changes the amount of power available to the device, and also how much is wasted as heat in the wiring. A smaller wire means higher resistance.
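To make that concrete, here's a quick sketch of the algebra with hypothetical numbers. The resistance-per-foot figures are commonly published approximate values for solid copper at room temperature; the 50 ft run length and 15 A load are just assumptions for illustration.

```python
def power_dissipated(current_a: float, resistance_ohm: float) -> float:
    """Heat dissipated in a conductor, in watts: P = I^2 * R,
    which follows from P = V * I and V = I * R."""
    return current_a ** 2 * resistance_ohm

# Assumed 50 ft run, so 100 ft of conductor round-trip.
# Approximate published figures for solid copper:
#   14 AWG: ~2.525 ohms per 1000 ft
#   12 AWG: ~1.588 ohms per 1000 ft
r_14awg = 2.525 / 1000 * 100
r_12awg = 1.588 / 1000 * 100

current = 15.0  # amps, assumed load
print(f"14 AWG: {power_dissipated(current, r_14awg):.1f} W lost as heat")
print(f"12 AWG: {power_dissipated(current, r_12awg):.1f} W lost as heat")
```

Same current, smaller wire, noticeably more heat in the walls: that's the smaller-wire = higher-resistance point in numbers.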
The codes are written to limit both the voltage drop and the heating of the wiring. What a wire can 'handle' is different from what is safe, especially long-term. Throw in some connections along the way (screw terminals, wire nuts, etc.) and each of those is imperfect: each adds some resistance. Heating/cooling cycles can affect things too, and if you haven't torqued things down properly, a screw can loosen, raising the resistance considerably. So workmanship comes into play as well.
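A small sketch of why a single loose connection matters so much. All the numbers here are assumptions for illustration: a made-up round-trip wire resistance, a few milliohms per well-made joint, and 50 milliohms for one loose screw terminal.

```python
def voltage_drop(current_a: float, wire_ohm: float, connection_ohms: list) -> float:
    """Total voltage drop across the wire run plus every connection in series."""
    return current_a * (wire_ohm + sum(connection_ohms))

wire_r = 0.16                # assumed round-trip wire resistance, ohms
good_joints = [0.001] * 4    # four well-made connections, ~1 milliohm each
loose_joint = 0.05           # one poorly torqued screw terminal, assumed 50 milliohm

i = 15.0  # amps, assumed load
print(f"Good workmanship: {voltage_drop(i, wire_r, good_joints):.2f} V dropped")
print(f"One loose joint:  {voltage_drop(i, wire_r, good_joints + [loose_joint]):.2f} V dropped")

# The real danger is where that extra drop turns into heat:
# P = I^2 * R, all concentrated at the one bad terminal.
print(f"Heat at the loose terminal alone: {i**2 * loose_joint:.1f} W")
```

Ten-plus watts concentrated in one screw terminal inside a box is how charred devices happen, even though the wire itself is fine.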
Almost any device will come with installation instructions that dictate the minimum wire gauge required along with the associated protection device (typically a circuit breaker). You do not have to guess at this; the instruction manual will tell you what is required. You can always use a heavier gauge wire if you wish, up to the ability of the device to accept it (this usually only applies to things like screw terminals; wire nuts don't care as long as they are adequate for the wire(s) used).