They must have been reading from a crystal ball

Apr 19, 2007 10:39 GMT

If you are considering an upgrade in the near future, the CPU probably scores high on your list of components. It can give you the biggest performance boost, it can set you back a couple hundred dollars, or both. And according to iSuppli, one of the best-known market research firms, quad-core processors will be present in "half of all mainstream PCs by Q4 2009".

Now, to be honest, this isn't quite the shift the computer industry saw when dual-core processors came to replace single-core ones. That was a different matter: people who have upgraded to dual-core processors have already seen the improvements, whereas I don't know how the multi-core era will affect users. The common conception right now is that quad-core processors aren't a necessity but a viable solution for the future. That's what everybody says, but based on what? Based on the number of products that can actually use multiple CPU cores at present.

Viewed from this angle, a few things would have to happen for these predictions to become reality. First, software and game developers would need a majority of products on the market that know how to work with multiple CPU cores. Then there is the question of price: quad-core processors would have to hit the $200-$300 mark for people to consider them part of the mainstream market. And finally, for the change to become even more viable, single-core processors would have to disappear from sale entirely. At that point people would consider dual-core processors the base for a new system, with quad-core chips, and whatever comes after them, occupying the mainstream to high-end segment.
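To make the first condition concrete, here is a minimal sketch (in Python; the names and workload are illustrative, the article names no specific software) of what "software that knows how to work with multiple CPU cores" means in practice: splitting a CPU-bound job into chunks and handing each chunk to a separate worker process.

```python
# Illustrative sketch: dividing a CPU-bound workload across cores.
from multiprocessing import Pool
import os

def partial_sum(bounds):
    """Sum the integers in [start, end) -- a stand-in for real CPU-bound work."""
    start, end = bounds
    return sum(range(start, end))

def parallel_sum(n, workers=None):
    """Split summing 0..n-1 into one chunk per worker process."""
    workers = workers or os.cpu_count() or 1
    chunk = (n + workers - 1) // workers
    bounds = [(i * chunk, min((i + 1) * chunk, n)) for i in range(workers)]
    with Pool(workers) as pool:
        # Each chunk runs in its own process, so the OS can
        # schedule them on different cores.
        return sum(pool.map(partial_sum, bounds))

if __name__ == "__main__":
    # Produces the same result as sum(range(1_000_000)),
    # but spread across all available cores.
    print(parallel_sum(1_000_000))
```

A program written this way scales with the core count; the single-threaded equivalent leaves three of a quad-core chip's cores idle, which is exactly the adoption problem the predictions hinge on.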

But there are a lot of "ifs" before, as I've said, such a prediction becomes reality. For one, all processor manufacturers, except perhaps VIA, promote multi-core processors while assuring us that if a program doesn't need the processing power, the remaining cores are shut down and stop consuming power, and so on. My question is: if your computer basically runs programs that use a single core, why do you need multiple cores that sit powered down when they aren't being used? Because of some rare programs that actually use all of those cores? Because it's a marketing play, and that is what you must believe is good for you? Or because current CPU architectures have hit a frequency wall, leaving parallel computing as the industry's only alternative?

The answer to these questions is, like Einstein's theory, relative, but one of them hits an interesting spot. Intel stated that the Core microarchitecture could allow a processor to hit 10GHz (without mentioning a timeline or the manufacturing technology it would have to be built on), as opposed to the Netburst microarchitecture, which the company deemed inappropriate. Yet overclocking attempts have taken Netburst-based processors to 8GHz, while Core-based processors have only reached 5.5GHz. In light of this, are their words to be trusted, or was this just a move to shift the computer industry toward parallel processing? You decide.