Wednesday, 24 October 2007 12:12

nForce 750i also gets N200 treatment

Written by test


But gets less bandwidth


It looks as if Nvidia is relying heavily on its N200 chipset to patch up current chipsets and to make them support PCI Express 2.0. It might be a strategic move, as it's cheaper to add a PCI Express controller than to make a new chipset, but it's not a good solution.

The 750i still uses the C55 SLI X8 and MCP51 combination, but with the addition of the N200 the board gains PCI Express 2.0 in its feature set. However, it will be limited to the same dual x8 slot bandwidth as the 650i chipset.
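To put the dual x8 limitation in perspective, here is a rough back-of-the-envelope calculation (a minimal sketch, not from the article; it assumes the standard 8b/10b encoding used by both PCIe 1.x and 2.0):

```python
def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s, per direction."""
    gt_per_s = {1: 2.5, 2: 5.0}[gen]   # raw transfer rate per lane (GT/s)
    gbit_per_lane = gt_per_s * 8 / 10  # 8b/10b: 8 data bits per 10 line bits
    return gbit_per_lane / 8 * lanes   # bits -> bytes, times lane count

print(pcie_bandwidth_gbs(2, 8))   # a PCIe 2.0 x8 slot
print(pcie_bandwidth_gbs(1, 16))  # a PCIe 1.x x16 slot, for comparison
```

Since PCIe 2.0 doubles the per-lane rate, an x8 slot at 2.0 speeds (4.0 GB/s per direction) matches a full x16 slot at 1.x speeds, so the bandwidth ceiling is less painful than the slot width suggests.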

Boards based on the 750i chipset will also support up to six x1 slots or devices and 800MHz DDR2 memory, and this time it will apparently support SLI memory as well.

We're curious why chipset manufacturers keep imposing these artificial limitations just to pad out their product ranges. After all, who would willingly settle for less graphics bandwidth than the hardware can deliver?
Last modified on Wednesday, 24 October 2007 20:36