Saturday, 28 March 2015

PC Gaming Week: Nvidia G-Sync vs AMD FreeSync

In the frame


Calling all gamers, we have good news and bad news. The good news is that AMD and Nvidia have both solved the problem of screen tearing and frame stuttering in demanding PC games. The bad news is that they each have their own solutions which are not compatible – in short we have a format war… again.




What's the problem?


The root of the problem with displaying high-performance PC games is that monitors have a constant refresh rate: 75Hz, for example, means that the screen is updated 75 times per second. Meanwhile, graphics cards (GPUs) redraw the screen at a variable rate, depending on the computational load they're bearing.


Source: Nvidia


The difference in timing between the two means that the current frame on the monitor and the current frame on the GPU become unsynchronised. Therefore, partway through the process of sending a frame to the monitor, the GPU moves on to the next frame.


Source: Nvidia


This switch appears as a discontinuity in what you see on-screen. Usually, the discontinuity travels down the screen over consecutive frames as the phase difference between the GPU and monitor changes. This discontinuity is what we call tearing, and in extreme cases there can be several tears at once.


Source: Nvidia
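
To make the timing mismatch concrete, here is a minimal Python sketch of the idea (an illustration, not a measurement of any real system). It assumes a 75Hz monitor that scans out from top to bottom at a constant rate, takes some made-up GPU frame times, and estimates how far down the panel the scanout has got at the moment each new frame replaces the old one, which is roughly where the tear would appear.

# Rough sketch: where does a tear land when the GPU swaps buffers
# mid-scanout? Assumes a 75Hz monitor that scans out top-to-bottom
# at a constant rate, and a GPU with variable frame times.

REFRESH_HZ = 75
SCANOUT_MS = 1000 / REFRESH_HZ          # ~13.3 ms to draw one full screen

# Hypothetical GPU frame times in milliseconds (variable load).
frame_times_ms = [14.1, 16.8, 12.2, 19.5, 15.0]

gpu_clock = 0.0
for n, ft in enumerate(frame_times_ms, start=1):
    gpu_clock += ft                      # moment the GPU finishes frame n
    # Fraction of the current refresh that has already been scanned out
    # when the new frame replaces the old one (no VSync, so the swap
    # happens immediately).
    phase = (gpu_clock % SCANOUT_MS) / SCANOUT_MS
    print(f"frame {n}: tear at roughly {phase:.0%} down the screen")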


Most PC games employ something called VSync (vertical sync) as a way to reduce the tearing effect. VSync effectively limits the frame rate of the GPU: if a particular frame takes too long to render and misses its slot on the monitor, the GPU delays sending any graphics data until the next screen refresh.


Source: AMD


Brilliant! Problem solved then? Well, not quite. VSync is not perfect: the delay in sending a frame to the monitor causes stuttering and lag precisely when the GPU is under the heaviest processing load, which is also when a gamer needs the most responsiveness. Hence, even though VSync has been the only remedy for tearing, many gamers choose to disable it to get the most responsive system, despite the ugly tearing that results.
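
As a rough, back-of-the-envelope sketch of why that hurts (assuming a 60Hz monitor, simple double-buffered VSync and made-up render times): any frame that misses a refresh slot has its display interval rounded up to the next whole refresh period, so a frame that takes just over 16.7ms to render ends up being shown at an effective 30fps.

# Sketch of VSync's frame-time quantisation on an assumed 60Hz monitor:
# a frame that misses a refresh slot waits for the next one, so display
# intervals get rounded up to whole refresh periods.

import math

REFRESH_MS = 1000 / 60                   # ~16.7 ms per refresh

def displayed_interval(render_ms):
    """Time between displayed frames with VSync: the render time
    rounded up to the next refresh boundary."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (10.0, 16.0, 17.0, 25.0, 34.0):
    interval = displayed_interval(render_ms)
    print(f"{render_ms:5.1f} ms render -> shown every {interval:4.1f} ms "
          f"(~{1000 / interval:.0f} fps)")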


Nvidia to the rescue, kind of


Since 2014, Nvidia has been promoting its solution to the VSync problem, which it has dubbed G-Sync. The basic concept behind G-Sync is that the GPU controls the refresh rate of the monitor. Because the monitor and GPU are always in sync, there is never any tearing or stuttering. Prior to this, Nvidia had already been working on Adaptive VSync.


As PC Perspective notes, there are three regimes that any variable refresh rate GPU/monitor system needs to operate in: A) when the GPU's frame rate is below the minimum refresh rate of the monitor; B) when the GPU's frame rate is between the minimum and maximum refresh rates of the monitor; and C) when the GPU's frame rate is greater than the maximum refresh rate of the monitor.


Source: Nvidia


Case B mentioned above is straightforward: the GPU simply sets the refresh rate of the monitor to match its frame rate.


When a G-Sync compatible GPU and monitor are operating in case C, Nvidia has decided that the GPU should default to VSync mode. In case A, however, G-Sync sets the monitor's refresh rate to an integer multiple of the current frame rate coming from the GPU. This is similar to VSync's strategy of delaying frames, but has the advantage of keeping in step with the monitor because of the whole-number multiplier.
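
Pulling the three cases together, here is an illustrative Python sketch of the behaviour described above. It is not Nvidia's actual algorithm; the 30-144Hz panel range and the sample frame rates are assumptions made purely for the example.

# Illustrative sketch of the three G-Sync regimes described above,
# for an assumed monitor with a 30-144Hz variable refresh range.
# This is not Nvidia's implementation, just the behaviour in outline.

MIN_HZ, MAX_HZ = 30, 144

def gsync_refresh(fps):
    if fps > MAX_HZ:
        # Case C: GPU is faster than the panel, so fall back to VSync
        # at the maximum refresh rate.
        return MAX_HZ, "case C: VSync at max refresh"
    if fps >= MIN_HZ:
        # Case B: the refresh rate simply tracks the frame rate.
        return fps, "case B: refresh follows frame rate"
    # Case A: repeat each frame so that the panel refreshes at an
    # integer multiple of the frame rate, staying above its minimum.
    multiple = 2
    while fps * multiple < MIN_HZ:
        multiple += 1
    return fps * multiple, f"case A: each frame shown {multiple}x"

for fps in (20, 45, 160):
    hz, regime = gsync_refresh(fps)
    print(f"{fps:3d} fps -> panel at {hz} Hz ({regime})")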


The (somewhat literal) price of this solution is that Nvidia needs to have a proprietary chip in every G-Sync compatible monitor. The undesirable result is that G-Sync monitors cost more, since manufacturers have to build in the extra electronics and pay the associated license fees to Nvidia. Nor is G-Sync supported by AMD GPUs.


AMD strikes back


While Nvidia was first to come up with the idea of the GPU controlling the monitor's refresh rate, AMD has struck back hard with its own solution, called FreeSync. It is based on an open standard – DisplayPort 1.2a. AMD collaborated with the VESA group to modify the DisplayPort standard to incorporate Adaptive-Sync, which allows compatible GPUs and monitors to automatically negotiate the optimal refresh rate for the monitor. It thus requires no proprietary hardware, which in turn keeps costs lower than Nvidia's offerings. AMD even goes so far as to claim that G-Sync will reduce frame rates, rather than making things better.


Source: AMD


The key technical difference between G-Sync and FreeSync, apart from the licensing requirements (or lack thereof!), is the way in which they handle GPU output that lies outside of the refresh rate range of the monitor. FreeSync is limited to matching refresh rates to frame rates via AdaptiveSync.


It cannot perform any of the other refresh rate tricks that G-Sync can when the GPU's frame rate is outside the monitor's refresh rate range. Therefore, when a FreeSync GPU's frame rate falls outside its monitor's refresh rate range, it defaults back to working with or without VSync, as per the user's preference, which means there will be tearing or stuttering again.
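
For contrast, here is a sketch of the FreeSync behaviour just described (again with an assumed 40-144Hz panel range and sample frame rates, not AMD's code): inside the range the refresh rate follows the frame rate, while outside it the system simply reverts to ordinary VSync-on or VSync-off behaviour.

# Contrast sketch: FreeSync (per the description above) only matches
# the refresh rate to the frame rate inside the panel's range; outside
# it, behaviour reverts to plain VSync on or off. The 40-144Hz range
# and the frame rates are assumptions for illustration.

MIN_HZ, MAX_HZ = 40, 144

def freesync_behaviour(fps, vsync_enabled):
    if MIN_HZ <= fps <= MAX_HZ:
        return f"refresh follows frame rate at {fps} Hz"
    if vsync_enabled:
        return "outside range: plain VSync (possible stutter/lag)"
    return "outside range: VSync off (possible tearing)"

for fps in (30, 90, 200):
    print(f"{fps:3d} fps -> {freesync_behaviour(fps, vsync_enabled=True)}")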


Oh great, another format war


It is early days in the world of variable refresh rate graphics and so your options are limited if you want to try either technology. According to AMD, there are eight FreeSync monitors on the market, and Nvidia reports six G-Sync monitors are now available.


It is almost impossible to compare any of these monitors given the vastly differing specifications, but generally the G-Sync monitors are more expensive than the FreeSync models. As for GPUs, Nvidia's website says that any GPU from the 600 series onwards will support G-Sync. Meanwhile, compatible AMD Radeon GPUs are limited to the "R9 295X2, 290X, R9 290, R9 285, R7 260X and R7 260 GPUs".


An important consideration for gamers is that we tend to upgrade monitors far less regularly than GPUs. So whatever type of variable refresh rate technology you choose now is going to determine your choice of GPU brand for a long time to come.


As ever, when big brands get into a format war, consumers become collateral damage.

from Techradar - All the latest technology news http://ift.tt/1CZq82O
