
In Montreal, NVIDIA gathered the world's hardware press to present G-Sync, a hardware module meant to solve the tearing, lag and stuttering long experienced by PC gamers.
Before going into the details of this solution, Jen-Hsun Huang wished to explain to an audience of insiders why these problems stem, broadly speaking, from a lack of synchronization between the graphics card and the monitor. The V-Sync option only partially corrects this: it is not a convincing fix, since it trades tearing for lag and stuttering.
NVIDIA put its engineers to work, and the result is a module called G-Sync. A small printed circuit board (pictured below), G-Sync is integrated into the monitor to manage its refresh rate so that it stays in step with the game's animation speed.
On paper, the idea is simple: rather than operating at a fixed frequency, the monitor times the delivery of each image to the rendering speed of the GPU. As a result, the two are perfectly synchronized and tearing disappears; lag, too, is practically nonexistent, since we are talking about Kepler-class technology, i.e. GPUs that are rather swift by nature.
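To make the principle concrete, here is a minimal, purely illustrative sketch (not NVIDIA's implementation) comparing the two behaviors: with a fixed-rate monitor, a finished frame waits for the next scheduled refresh tick, while a G-Sync-style monitor refreshes as soon as the frame is ready. The 16.7 ms (60 Hz) and 6.9 ms (144 Hz panel limit) figures are assumptions chosen for the example.

```python
import math

def fixed_refresh_latency(frame_done_ms, interval_ms=16.7):
    """V-Sync on a fixed 60 Hz monitor (assumed): each finished frame is
    held until the next refresh tick; the wait shows up as lag/stutter."""
    shown = [math.ceil(t / interval_ms) * interval_ms for t in frame_done_ms]
    return [s - t for s, t in zip(shown, frame_done_ms)]

def adaptive_refresh_latency(frame_done_ms, min_interval_ms=6.9):
    """G-Sync-style: the monitor refreshes whenever the GPU delivers a
    frame, limited only by the panel's maximum refresh rate (assumed
    144 Hz, i.e. ~6.9 ms minimum between two refreshes)."""
    latencies, last_refresh = [], -float("inf")
    for t in frame_done_ms:
        shown = max(t, last_refresh + min_interval_ms)  # cap at panel max
        latencies.append(shown - t)
        last_refresh = shown
    return latencies
```

For frames finishing at 10, 25, 43 and 60 ms, the fixed-rate monitor adds several milliseconds of wait to every frame, while the adaptive monitor displays each one immediately; only when the GPU outpaces the panel does the `min_interval_ms` cap kick in.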
In practice, this little card replaces the monitor's traditional scaler and requires a Kepler GPU (GeForce GTX 600 series and above) on the graphics-card side. At present, only four display manufacturers have signed with NVIDIA (Asus, BenQ, Philips and Viewsonic), but others are likely to follow soon.
We were, of course, treated to a convincing demonstration of this technology, with a pendulum example and the latest Tomb Raider from Crystal Dynamics. A useful clarification: if the game's frame rate is very high, the system caps it to what the monitor can handle, so that everything remains perfectly synchronized.
NVIDIA has not yet given any indication of the price premium for G-Sync monitors, nor has it announced a release date.