A simple explanation of AMD's PowerTune technology is that the GPU runs at its maximum clock until it detects a potential thermal or power problem, at which point the card downclocks. The nominal GPU clock is 880MHz, and PowerTune keeps the GPU within its TDP "budget".
AMD implemented several low-frequency states so that PowerTune can dynamically alter clocks based on the GPU's internal power-consumption calculations, effectively managing TDP without noticeably affecting performance. Nvidia relies on similar hardware power monitoring: the GTX 580 was the first card to implement it, and the GTX 570 uses it as well. However, compared to the HD 6900's approach, Nvidia's engine seems simplistic, as it downclocks the GPU by 50% upon detecting FurMark and similar apps, and that's that. Basically, the GTX 580 proves that muscles don't usually mix with brains.
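The idea behind such a power governor can be sketched in a few lines. This is a simplified illustration, not AMD's actual firmware logic: the clock states, wattage figures, and the `select_clock` helper are all hypothetical, and real hardware estimates power from on-chip activity counters rather than a single load factor.

```python
# Hypothetical clock states: (clock in MHz, worst-case power draw in watts).
# The 880MHz nominal clock matches the article; the wattages are invented.
CLOCK_STATES = [(880, 250), (700, 200), (600, 170), (500, 140)]

def select_clock(load_factor, tdp_budget):
    """Pick the highest clock whose projected power fits the TDP budget.

    load_factor scales each state's worst-case draw to the current
    workload (e.g. >1.0 for a power virus like FurMark).
    """
    for clock_mhz, state_power in CLOCK_STATES:
        if state_power * load_factor <= tdp_budget:
            return clock_mhz
    # Nothing fits: fall back to the lowest-frequency state.
    return CLOCK_STATES[-1][0]
```

Under a normal load and the full budget, `select_clock(1.0, 250)` keeps the card at 880MHz; raise the load factor or lower the budget slider and the governor steps down through the low-frequency states instead of cutting the clock by a fixed 50%.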
This technology is a rather nice feature if you're dealing with a smoothly running game, as it lets you conserve energy and lower noise simply by sliding the PowerTune controls. Reducing the clocks also keeps temperatures down, and lower temps are always a good idea.