It will automatically adjust refresh rate, much like NVIDIA G-Sync does

May 13, 2014 09:54 GMT  ·  By

Some time ago, NVIDIA introduced the G-Sync technology, which could sync the refresh rate of a monitor with the frame output rate of a video card, outclassing V-Sync by miles. Now, though, an open standard that does the same thing has been introduced.

In a sense, this is AMD's way of sticking it to its rival, because it's always better and more cost-effective (thus, more convenient) to have an open standard to use.

After all, open standards don't come with royalties. And there is one big advantage that the new adaptive refresh rate technology possesses compared to NVIDIA's: it doesn't rely on dedicated hardware.

Sure, you'll need the new DisplayPort 1.2(a) specification, but it's not the same as requiring a special printed circuit module to be installed in the monitor.

That's how NVIDIA G-Sync works, you see. A monitor maker needs to add a computing module to the display, in-factory.

That module syncs the refresh rate with the video card's frame rate, eliminating screen tearing and greatly reducing stutter and other artefacts.
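The benefit of syncing the two rates can be sketched in a few lines. The snippet below is a simplified illustration (the function names and timings are hypothetical, not part of either standard): with a fixed 60 Hz refresh and V-Sync, a frame that misses a scan-out slot must wait for the next one, while an adaptive scheme can start scan-out as soon as the frame is ready, bounded only by the panel's maximum refresh rate.

```python
import math

FIXED_REFRESH_HZ = 60
FIXED_INTERVAL_MS = 1000 / FIXED_REFRESH_HZ  # ~16.7 ms per scan-out slot


def vsync_display_time(frame_ready_ms):
    """Fixed refresh + V-Sync: the frame waits for the next scan-out slot."""
    slots_elapsed = math.ceil(frame_ready_ms / FIXED_INTERVAL_MS)
    return slots_elapsed * FIXED_INTERVAL_MS


def adaptive_display_time(frame_ready_ms, min_interval_ms=FIXED_INTERVAL_MS):
    """Adaptive refresh: scan-out starts when the frame is ready,
    no earlier than the panel's shortest allowed refresh interval."""
    return max(frame_ready_ms, min_interval_ms)


# A frame that takes 20 ms to render misses the 16.7 ms slot:
print(vsync_display_time(20))     # pushed to the 33.3 ms slot (visible stutter)
print(adaptive_display_time(20))  # displayed at 20 ms
```

The same logic explains the power-saving angle VESA mentions: static content can simply be refreshed less often instead of being re-sent at a constant rate.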

The new DisplayPort 1.2(a) standard from VESA, however, adds a G-Sync/FreeSync-like capability, called Adaptive-Sync, to all compliant graphics cards and monitors. No wonder Advanced Micro Devices was so heavily involved in the research and development.

There are very high odds that G-Sync will become obsolete, because pretty much every display manufacturer will adopt VESA's (AMD's) solution.

We can't imagine NVIDIA being all that giddy about this. On the flip side, the Santa Clara, California-based company doesn't have to worry about it for now.

After all, it will take six months or even a year for Adaptive-Sync monitors to reach the market, but NVIDIA already has G-Sync ones on sale. Well, its partners do anyway.

“DisplayPort Adaptive-Sync enables a new approach in display refresh technology,” said Syed Athar Hussain, display domain architect at AMD and VESA board vice chairman.

“Instead of updating a monitor at a constant rate, Adaptive-Sync enables technologies that match the display update rate to the user’s content, enabling power efficient transport over the display link and a fluid, low-latency visual experience.”

Alas, it really does look like NVIDIA's G-Sync will play the role of proof of concept for adaptive sync technologies in general, much as the bulky, refrigerator-sized hard drives of computing's early decades predated the HDDs of today.

Not a comparison we expect NVIDIA to appreciate, but the hardware module that G-Sync relies on is, one might say, in an even worse position, because DisplayPort 1.2(a) Adaptive-Sync doesn't seem to require such hardware at all.