Discrete graphics cards will become obsolete as the CPU takes over graphics

Apr 4, 2008 13:33 GMT

The era of the dedicated graphics card might be coming to an end as multi-core processor architectures bloom. Ron Fosner, an Intel Graphics and Gaming Technologist, claims that there will "probably be no need" to purchase a dedicated graphics card in the near future.

The claim was made during a demonstration of the "Smoke" application at the Intel Developer Forum. The software ran on a test rig equipped with a four-core Nehalem processor, with Hyper-Threading enabled on each core.

The Smoke demo was initially planned to include a group of firefighters putting out a blazing fire, yet, for undisclosed reasons, Intel chose to showcase a house being set on fire by a meteor instead. "We didn't put in the fire fighters yet," said Fosner.

The demo makes heavy use of particles, which are known as graphics card killers. It showed a meteor tearing a house into pieces, then setting fire to the flammable objects in its vicinity. The simulation is driven by a particle emitter system with bounding boxes: when particles hit another object, that object becomes an emitter itself, as sketched below.
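Intel has not published the Smoke source code, so the following is only a minimal sketch of how such emitter chaining might work. The types Particle, Emitter, SceneObject and the axis-aligned bounding-box test are assumptions for illustration, not the demo's actual code.

```cpp
// Hypothetical sketch of emitter chaining: when a particle enters an
// object's bounding box, that object starts emitting particles itself.
#include <cstdlib>
#include <vector>

struct Vec3 { float x, y, z; };

struct AABB {                        // axis-aligned bounding box
    Vec3 min, max;
    bool contains(const Vec3& p) const {
        return p.x >= min.x && p.x <= max.x &&
               p.y >= min.y && p.y <= max.y &&
               p.z >= min.z && p.z <= max.z;
    }
};

struct Particle { Vec3 pos, vel; float life; };

struct SceneObject {
    AABB bounds;
    bool flammable;
    bool emitting;                   // becomes true once hit by a particle
};

struct Emitter {
    Vec3 origin;
    void spawn(std::vector<Particle>& out) const {
        // Emit a small burst of short-lived particles with random velocities.
        for (int i = 0; i < 8; ++i) {
            float rx = (std::rand() % 100 - 50) / 100.0f;
            float ry = (std::rand() % 100) / 100.0f;
            float rz = (std::rand() % 100 - 50) / 100.0f;
            out.push_back({origin, {rx, ry, rz}, 1.0f});
        }
    }
};

void step(std::vector<Particle>& particles,
          std::vector<SceneObject>& objects,
          std::vector<Emitter>& emitters, float dt)
{
    // Integrate particle motion and age.
    for (auto& p : particles) {
        p.pos.x += p.vel.x * dt;
        p.pos.y += p.vel.y * dt;
        p.pos.z += p.vel.z * dt;
        p.life  -= dt;
    }
    // Any flammable object touched by a live particle becomes an emitter.
    for (auto& obj : objects) {
        if (obj.emitting || !obj.flammable) continue;
        for (const auto& p : particles) {
            if (p.life > 0.0f && obj.bounds.contains(p.pos)) {
                obj.emitting = true;
                emitters.push_back({p.pos});
                break;
            }
        }
    }
    // Existing emitters keep feeding the particle pool.
    for (const auto& e : emitters) e.spawn(particles);
}
```

The chain reaction described in the demo falls out of the last two loops: fire particles from the meteor impact ignite nearby objects, which then emit their own fire particles and ignite whatever they touch next.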

According to Fosner, multi-core processors could handle life-like animations, such as weather or particle effects, better than even a high-end graphics board could.
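Fosner did not detail how Smoke distributes its workload, but the general idea of spreading a particle update across CPU cores can be sketched as follows. This is an illustrative assumption written in modern C++ with std::thread, not Intel's implementation; each core updates its own disjoint slice of the particle array, so no locking is needed.

```cpp
// Hypothetical sketch: splitting a particle update across CPU cores.
#include <algorithm>
#include <thread>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// Integrate one contiguous slice of the particle array.
void update_slice(std::vector<Particle>& particles,
                  std::size_t begin, std::size_t end, float dt)
{
    for (std::size_t i = begin; i < end; ++i) {
        particles[i].x  += particles[i].vx * dt;
        particles[i].y  += particles[i].vy * dt;
        particles[i].z  += particles[i].vz * dt;
        particles[i].vy -= 9.8f * dt;    // simple gravity
    }
}

void update_parallel(std::vector<Particle>& particles, float dt)
{
    const std::size_t workers =
        std::max<std::size_t>(1, std::thread::hardware_concurrency());
    const std::size_t chunk = (particles.size() + workers - 1) / workers;

    std::vector<std::thread> threads;
    for (std::size_t w = 0; w < workers; ++w) {
        const std::size_t begin = w * chunk;
        const std::size_t end   = std::min(begin + chunk, particles.size());
        if (begin >= end) break;
        // Each hardware thread works on its own slice of the array.
        threads.emplace_back(update_slice, std::ref(particles), begin, end, dt);
    }
    for (auto& t : threads) t.join();
}
```

On the demo's four-core, Hyper-Threaded Nehalem, hardware_concurrency() would report eight hardware threads, so the particle pool would be split into eight slices per frame.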

He even stated that Intel's chips could offer "more bang for the buck", as it is more efficient to switch to multi-core CPUs than to keep adding dedicated graphics cards to the system.

"The fact of the matter is that you're going to have one graphics card, you may have a dual graphics card, but you're not going to have a four graphics card or eight graphics card system," said Fosner. Moreover, a CPU is easier to program than the more and more sophisticated graphics cards, that require deep understanding of shader models and DirectX. This makes a lot of sense, given the fact that every programmer can deliver a software application for the CPU, while graphics programming is usually a little more "mystical".

Fosner also claimed that the computers that stormed the market back in the '80s did not come with dedicated graphics cards, and that today's processors are powerful enough to substitute for one.