Opening a can of whoopass on NVIDIA, or not?

Apr 16, 2007 07:48 GMT

Although NVIDIA was the first to bring the stream processing concept to desktop video cards, it was ATI that first put the concept to use in the form of the ATI Xenos, the GPU that powers the Xbox 360. But has that head start been enough to overcome NVIDIA? ATI has released video cards for the desktop segment and has gotten along just fine up until now, so the question is: will ATI's R600 series wrest success from the G80's grasp?

After all of the details AMD has so graciously given us concerning the processing power of 320 stream processors, why don't we see big differences when it comes to gaming? And, for that matter, I'm not saying that Microsoft's Vista isn't a good operating system, but haven't enough tests shown that gaming performance is better under Windows XP? And if so, why are all the major hardware websites using Vista for testing? Could it be that Microsoft, for now, offers DX10 support only on Vista precisely to help drive its sales? These are questions that raise serious doubts about the direction the industry has taken, where it's headed, and why it's taking us along for the ride.

But leaving that to the "great minds" that have planned this all ahead, the question we should ask ourselves right now is why the ATI HD 2000 series yields such modest results. Given that the high-end models are said to have 320 stream processors, one would expect a huge performance gap between ATI and NVIDIA. And still, the "gap" isn't that large, not at all actually, according to a website that got its hands on the 512MB ATI HD 2900XT video card. The test platform consisted of an Intel Core 2 Extreme QX6800 processor, an ASUS P5W DH motherboard, 2x1GB of DDR2-800 memory (5-5-5-12 timings) and a Western Digital Raptor 150 hard drive. The video cards were an NVIDIA 8800GTX and an ATI HD 2900XT, and the operating system used was Windows (*cough*) Vista.

In 3DMark06, NVIDIA's card scored higher than ATI's offering, with 10951 3DMarks for the 8800GTX and 10356 3DMarks for the HD 2900XT, a lead of roughly 6 percent. The second round involved a demo of Crysis, in which ATI's card took the lead by a small margin. At 1024x768, the HD 2900XT scored 81.4fps while the 8800GTX got 72.5fps; at 1024x768 with 4xAA and 16xAF, ATI got 74.7fps and NVIDIA got 67.2fps. In the high resolution test, the HD 2900XT kept its advantage, scoring 52.8fps at 1600x1200 with 4xAA and 16xAF, while the 8800GTX got 50.2fps. In other words, ATI's lead runs from roughly 12 percent at the lower resolution down to about 5 percent at the highest settings.

The ATI HD 2900XT has 512MB of GDDR4 memory, the NVIDIA 8800GTX has 768MB of GDDR3; ATI has more stream processors and a higher video RAM frequency, yet it gains only a small advantage over NVIDIA's card. The question is WHY? It doesn't make sense, and even though many would say that it has to do with the card being optimized for DX10, well, so is NVIDIA's G8x series, and seeing as there are just a handful of DX10-based games and demos out there, something is rotten in the kingdom of DAAMIT.