NVIDIA GeForce GTX 660 and 670 Ti Coming

Cut-down versions of the GK104 GPU will power them

May 2nd, 2012 06:31 GMT
NVIDIA's GK104 Kepler GPU was designed for the GeForce GTX 680 graphics card, but the company has decided to use it in lower-end video boards as well.

TSMC produces loads of GK104 wafers each day, but many of the GPUs come out with defects, meaning that not all of their 3.5 billion transistors work properly.

That doesn't mean the chips are unusable, though. NVIDIA is doing what all chip makers do: disabling the defective areas of the die and using the chips in lesser cards.

Two video cards that will use cut-down versions of the GK104 graphics processing unit are the GeForce GTX 660 and GTX 670.

We only know the exact chip name for the latter board, courtesy of WCCFTech: the GK104-400-A2 GPU.

Speaking of which, we have even seen a photo of the GTX 670. Not NVIDIA's reference card, but one bearing Leadtek's brand.

And so we arrive at the part where we lay down the specs.

The GTX 660 has an entire GPC (Graphics Processing Cluster) disabled, which means that 1,152 CUDA cores are paired with a 192-bit memory interface and either 768 MB or 1.5 GB of video memory. The clock speeds are still unknown.

The GTX 670 will have 1,344 CUDA cores, the full 256-bit memory interface, 2 GB of GDDR5 VRAM and clock speeds of 915-950 MHz for the GPU and 1.25 GHz QDR for the memory.

That said, the Leadtek card mentioned earlier is a factory-overclocked iteration, running at GTX 680 speeds (1,006/1,058 MHz GPU).

Price-wise, the GTX 660 / 660 Ti will challenge the AMD Radeon HD 7800 line with a tag somewhere in the $199-249 segment (probably around 200 Euro or more in Europe, regardless of what exchange rates say), while the GTX 670 / 670 Ti will hover between $399 and $429 (399 Euro, give or take).

NVIDIA should formally launch this duo in a week or so, but sales won't start until June, at Computex Taipei 2012 (June 5-9).

Strangely, previous reports suggested that the GTX 660 would be built around a GK106 GPU instead of a GK104. Those plans either changed or were never true in the first place. Or maybe this new rumor is faulty, who knows.
