Set a thermostatic iron to, say, 300°C and it'll turn the power on until the tip reaches 300, turn it off, wait for the temperature to drop a couple of degrees, turn it back on, heat it back up to 300, and so on. Bring that tip into contact with something (i.e. the joint you want to solder) and heat will flow away from the tip more rapidly. The thermostat compensates by keeping the heater on for a greater fraction of the time, maintaining 300°C. All is well (until the heat-sinking capability of what you're soldering exceeds the power rating of the iron, anyway).
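To make that concrete, here's a minimal sketch of that on/off (bang-bang) loop with a couple of degrees of hysteresis. Everything in it is made up purely for illustration: the toy thermal model, the numbers, and the stub I/O. A real iron would read a thermocouple in the tip and switch the heater with a triac or MOSFET rather than fiddling with doubles.

```c
/* Toy simulation of a thermostatic (bang-bang) iron.
 * The thermal model and constants are invented for illustration only. */
#include <stdio.h>

#define SETPOINT_C   300.0   /* what the knob is set to                  */
#define HYSTERESIS_C   2.0   /* the "drop a couple of degrees" band      */

int main(void)
{
    double tip_c = 25.0;     /* tip starts at room temperature           */
    int heater = 0;

    for (int t = 0; t < 600; t++) {          /* 600 ticks of fake time   */
        /* Thermostat: switch on below (setpoint - hysteresis),
         * switch off once the setpoint is reached. */
        if (tip_c <= SETPOINT_C - HYSTERESIS_C)
            heater = 1;
        else if (tip_c >= SETPOINT_C)
            heater = 0;

        /* Toy thermal model: the heater adds heat, the surroundings take
         * it away. Raise the 0.01 loss coefficient to mimic touching a
         * big joint; the loop simply leaves the heater on for more of
         * the time, with nobody touching the knob. */
        double gain = heater ? 5.0 : 0.0;
        double loss = 0.01 * (tip_c - 25.0);
        tip_c += gain - loss;

        if (t % 50 == 0)
            printf("t=%3d  tip=%6.1fC  heater=%s\n",
                   t, tip_c, heater ? "ON" : "off");
    }
    return 0;
}
```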
With a cheapo temperature-controlled iron, you set the knob to something, the duty cycle of the power supply to the element is set accordingly, and the tip heats up until it's in thermal equilibrium with the surrounding air. Bring it into contact with the joint and the tip cools down, but the electronics know nothing about that; they just keep supplying the same amount of power. Fine, so you find a setting that works for soldering your through-hole resistors, or whatever. All is well. Then you come to solder the power jack, which is a bigger component with more metal in the legs. It sinks more heat, the tip cools more than it did with the resistors, and your solder joints are crap. You twiddle the knob a bit and give it a few seconds to warm up. The jack solders fine. Then you come to solder some fiddly little SMD transistor and end up lifting the pads, because you've left the power too high and the tip's now too hot.
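For contrast, here's the same toy model driven open-loop: the knob fixes the duty cycle and nothing ever reacts to the tip temperature. The `loss_factor` values are invented stand-ins for how strongly each workpiece sinks heat away.

```c
/* Toy simulation of an open-loop "knob" iron.
 * Same invented thermal model as the sketch above; the duty cycle is
 * fixed by the knob position and never corrected. */
#include <stdio.h>

static double settle(double duty, double loss_factor)
{
    double tip_c = 25.0;
    for (int t = 0; t < 5000; t++) {
        double gain = duty * 5.0;              /* knob sets power, full stop */
        double loss = loss_factor * (tip_c - 25.0);
        tip_c += gain - loss;
    }
    return tip_c;                              /* equilibrium temperature    */
}

int main(void)
{
    double duty = 0.55;  /* a setting that happens to suit the resistors */

    printf("in air:       %.0fC\n", settle(duty, 0.010));
    printf("small joint:  %.0fC\n", settle(duty, 0.012));
    printf("power jack:   %.0fC\n", settle(duty, 0.020)); /* too cold now */
    return 0;
}
```

Run it and the equilibrium temperature falls as the heat load grows (roughly 300°C in air, 250°C on a small joint, 160°C on the jack in this made-up model), which is exactly the knob-twiddling problem described above.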
In other words, it's usable, but it requires more judgement and knob-twiddling to get consistent results that a thermostatic iron would give you automatically. How much this stuff matters depends on what you're soldering. Small SMD electronics and lead-free solder are the things that will really make you appreciate thermostatic control. If you're just putting connectors on cables, say, it's much less important.