Published in Graphics

Nvidia lets Optimus loose

on 09 February 2010


Switchable graphics done right

Nvidia has officially unveiled its Optimus technology, which is essentially a new way of implementing switchable graphics. The main difference between Optimus and the earlier, Decepticon-like switchable graphics is that Optimus makes it all look easy: simple and automatic. Optimus works with a combination of certain Nvidia discrete GPUs and Intel processors with IGPs.


Nvidia came up with Optimus in order to maintain the long battery life that Intel is so eager to market with its Pine Trail and CULV platforms. Optimus will work with Nvidia's Geforce 300M, 200M and "Geforce M Next-gen" parts, as well as the upcoming ION 2 GPU, paired with Intel's Arrandale Core i3, i5 and i7, Penryn Core 2 Duo and Pine Trail N4xx processors.

The previous "switchable graphics" technology was a nice idea, but it lacked simplicity and ease of use: you had to close all applications, wait for the switch to happen, create profiles, remember to switch back and endure annoying flickering. Unlike hardware switchable graphics, Nvidia's Optimus is a software-based solution that switches to the discrete GPU whenever it is actually needed and back to the IGP once it is not, so it avoids all the problems that switchable graphics had.

The switch happens almost instantaneously thanks to the driver, which, according to our info, should be released quite soon and will receive periodic updates. The routing is done via executable file name recognition, as well as a more complex method where Optimus detects on its own whether it needs to power up the discrete GPU. For example, when surfing the Internet the IGP is usually enough, unless you are viewing Flash video with the Flash 10.1 beta installed in your browser, in which case you could actually benefit from the decoding power of the dedicated GPU.
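The profile side of that routing can be pictured as a simple lookup: the driver checks a launched application's executable name against a profile list and decides which GPU should render. The sketch below is purely illustrative; the names and logic are assumptions for clarity, not Nvidia's actual driver code.

```python
# Hypothetical sketch of profile-based GPU routing, in the spirit of
# Optimus's executable-name recognition. The profile list and function
# names are invented for illustration.

DISCRETE_GPU_PROFILES = {"crysis.exe", "cuda_app.exe", "flashplayer.exe"}

def select_gpu(executable_name: str) -> str:
    """Return which GPU should handle rendering for this application."""
    if executable_name.lower() in DISCRETE_GPU_PROFILES:
        return "discrete"  # power up the Nvidia GPU for this workload
    return "igp"           # stay on the Intel IGP to save battery

# A game hits a profile and gets the discrete GPU; a text editor does not.
print(select_gpu("Crysis.exe"))   # discrete
print(select_gpu("notepad.exe"))  # igp
```

In the real driver, the second, more complex method would sit alongside this lookup, inspecting the workload itself rather than just the executable name.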

Nvidia's Optimus can also be used for games, CUDA applications and other various GPU intensive tasks.

The entire concept makes sense, and it makes you wonder why it wasn't done this way in the first place. Of course, most of Nvidia's Geforce xxxM GPUs are not really meant for playing Crysis at high resolutions, but compared to Intel's infamous IGPs, anything is better.

You can check out the in-depth reveal of Optimus over at Anandtech.


Last modified on 10 February 2010