Now if only it had some actual bearing on the near term

Nov 18, 2014 15:01 GMT  ·  By

Change can be a wonderful thing. It can also be totally, totally horrible, especially if you ask a particularly dramatic specimen of the human race, yours truly included. The way people react to change is the most interesting part to watch, though, at least for me, which is why I'm a bit ambivalent about the slow death of the PC market, such as it is.

On the one hand, it appears that the PC era is finally passing, as many people have been hinting or outright claiming over the past half-decade. On the other hand, it's looking as though the transition is occurring pretty naturally and smoothly, so there won't be any excitement.

It's not even a case of choosing between going out with a bang or with a quiet whimper. It's just happening, and the dullness of the process kind of takes away all the excitement of the “change.”

Kind of a shame, considering that periods of transition like this one are ripe ground for strange and groundbreaking inventions, since the people behind the products being left behind or pushed forward tend to jump the gun a lot during such times.

Back when consumer PCs were first invented, for example, there was a huge leap across all related technologies. HDDs were finally small enough to hold in one hand when, not so long before, it took a fridge-sized machine to hold 5 MB.

With the post-PC era coming so gradually, we won't get to see such things. But before we get to that, I may as well explain why I'm calling the post-PC era real.

Intel has essentially confirmed it

I really should have seen it back in September, at IFA 2014. I was there when the company released the Core M CPU in Berlin. Seeing that the first-ever Broadwell, 14 nm chip was a mobile CPU rather than a desktop or at least a full-sized laptop processor was a big enough giveaway.

However, after all those years of being exposed to hot air about the post-PC era that never came, I dismissed the possibility.

Now, Intel has essentially decided to fold up its computer division. Oh, it isn't dropping its tech roadmaps, but it has decided to merge its PC group and its mobile chip group into a single unit called Client Computing. The merger is set to complete in 2015.

Meanwhile, AMD and NVIDIA are extending their reach further into the supercomputing segment than before, since the consumer GPU and APU product lines have proven rather problematic as of late.

What we'll be seeing ten or twenty years from now

Not all desktops will be gone, since the DIY parts market will be alive and kicking for as long as the Internet endures. Most people will just settle for a laptop or tablet/hybrid device, however, content to use their TV as a secondary display if they need a bigger screen.

Still, many could give up PCs altogether by 2020 or 2030. Why? Because of cloud computing and the Internet of Things.

NVIDIA is actually helping this along with its GRID architecture, graphics cards that give servers enough graphics prowess to render several high-end 3D games at once. Good for supporting multiple account holders.

All people will need is a TV with a keyboard and mouse, plus a home/neighborhood server or a cloud storage account somewhere, and they'll be able to run all their games and programs remotely. The server will, say, run Crysis, Civilization, Assassin's Creed or whatever else and stream it to the TV over the LAN, wired or wireless.

This essentially makes high-end PCs unnecessary. Even if you don't want to go so far, you only need a smart TV to surf the web. And everything can be done on the small screen of a phone easily enough too.

Now consider that Intel and everyone else in the multimedia and appliance markets want all electronics in the home to be interconnected. Add to that all the voice and gesture recognition solutions already running rampant, and you'll be able to connect to the web or dictate a Word document from anywhere in your house or a suitably equipped office building.

From there, for a small price premium you could install a transparent screen instead of a normal door on, say, a microwave oven, or use LCDs instead of normal windows. You'll be able to stream the game session or film playing on the living room TV straight to the kitchen if the whim strikes you.

Great for those midnight snacks or sudden visits from friends that would otherwise cut into the viewing of your favorite show. Same goes for video chat and conferences, and everything else you can think of.

I still wish the transition to the post-PC era weren't happening quite so gradually, though. Someone might have come up with something a bit more unusual than smartwatches and 2-in-1 devices, which are really just laptops with detachable or foldable bottoms.

It might also have allowed some of those things I mentioned to be implemented already, at least in first-world countries. Then again, I might simply be impatient, though I'm sticking with "eager" since it sounds better to my ears.

Take a moment to contemplate these future casualties (5 images): the Alienware Aurora gaming desktop, Intel Core i7 Broadwell CPUs, and the Intel Core M Broadwell mobile CPU, among others.