When thinking about which graphics card and monitor to buy, there are more factors to consider than raw performance. No matter how many FPS your graphics card can spit out, if it isn’t synchronized with your monitor you may notice tearing or stuttering even at stable framerates. In this article, we summarize how G-Sync and Freesync combat this problem.
What Is Tearing and How Does It Happen?
You’ve probably seen it before if you’ve played PC games. Some people notice it more than others, but the fact is, it happens almost all the time. Tearing occurs when the GPU and monitor aren’t properly synchronized: the monitor doesn’t always receive exactly one frame per refresh, but sometimes parts of two or more, causing a noticeable “tear” across the image whenever it happens.
To a certain degree, Vertical Sync (VSync) can reduce this artifact. With VSync activated, the framerate is capped at the monitor’s refresh rate. If, for example, your monitor supports a maximum of 60 Hz (60 FPS) while your GPU could happily crank out 160 FPS, VSync limits the output to 60 FPS. In the real world, however, this doesn’t always work as desired. One drawback is that if your graphics card can’t reach 60 FPS in a certain game, your display has to “wait” for new frames to display. This causes lag and stuttering even though the game should theoretically run smoothly at 50 FPS. VSync also increases input lag, which can become especially annoying in fast-paced shooters or e-sports in general.
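To see why framerates below the refresh rate cause stutter, here is a minimal sketch. It assumes an idealized double-buffered display that can only swap to a new frame on a refresh boundary; real drivers and swap chains are more complex.

```python
# Sketch: why VSync stutters when the GPU renders slower than the display
# refreshes. Idealized double buffering: the display may only swap to a
# new frame on a refresh boundary. Times are in units of 1/300 s so the
# arithmetic stays exact in integers.

REFRESH = 5   # one 60 Hz refresh cycle = 1/60 s = 5 units
FRAME = 6     # one rendered frame at 50 FPS = 1/50 s = 6 units

displayed = []               # refresh cycles each frame stays on screen
next_frame_ready = FRAME
cycles_shown = 0
for tick in range(1, 13):            # simulate twelve refresh cycles (0.2 s)
    now = tick * REFRESH
    cycles_shown += 1
    if now >= next_frame_ready:      # a new frame is ready: swap buffers
        displayed.append(cycles_shown)
        cycles_shown = 0
        next_frame_ready += FRAME

print(displayed)  # -> [2, 1, 1, 1, 1, 2, 1, 1, 1, 1]
```

Most frames stay up for one refresh cycle, but every fifth frame lingers for two. That uneven pacing is the stutter you perceive, even though the GPU is delivering a perfectly steady 50 FPS.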
However, there is already a solution to both problems: Nvidia’s G-Sync and AMD’s Freesync. Both technologies do pretty much the same thing but use slightly different methods: they synchronize GPU and display so that every frame the graphics card produces is displayed onscreen immediately – as long as it falls within the limits of the display’s refresh rate.
Freesync vs. G-Sync
So, which solution is the better one? From a technical standpoint, AMD’s Freesync has the edge because, unlike Nvidia, AMD isn’t trying to push a proprietary standard that has to be licensed to equipment manufacturers. Freesync relies only on the DisplayPort standard’s Adaptive-Sync feature, introduced with DisplayPort 1.2a. This means that display manufacturers just have to implement said DisplayPort feature in order to support Freesync. The result is a much larger number of compatible displays that don’t carry an added premium just for supporting Freesync. Prices for entry-level models are $200 or less.
With the brand-new Freesync 2.0, AMD is further improving its free standard and adding more features. The graphics card can now handle HDR tone mapping itself and deliver the final image to the display. Previously, GPU and display had to do the tone mapping separately, which resulted in higher input lag when using HDR. To use this new feature, however, you’ll need a brand-new display that supports Freesync 2.0; older Freesync displays are not compatible. The first of said displays are hitting the market as we speak. The upside is that all AMD graphics cards from the GCN 1.2 generation onward already support Freesync 2.0.
Nvidia, on the other hand, is betting on its own proprietary G-Sync standard, which requires a special chipset to be integrated into the display. While the end result is essentially the same as with AMD, Nvidia charges quite the premium for this extra. While a 24” Freesync-compatible display can be had for as little as €140, a G-Sync monitor with similar specs can go for almost €400. Of course, we still need to compare the details of the two standards, but you can already guess which direction we’re heading.
Freesync does have some disadvantages compared to G-Sync, though. While G-Sync supports a fixed range of 30–144 Hz, with Freesync this range can vary from panel to panel. Naturally, a panel with a maximum refresh rate of 60 Hz can only go up to 60 FPS. But the lower boundary varies as well, meaning you have to research before purchasing whether Freesync will actually do much for you. Some displays, for example, have a tiny range of just 48 to 60 Hz.
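A quick way to sanity-check a panel before buying is to compare its advertised sync range against the framerates you actually play at. A tiny sketch; the two example ranges mirror the ones discussed above:

```python
# Sketch: does a panel's Freesync range cover a given framerate?
# Panel ranges are the examples from the text, not real product specs.

def in_sync_range(fps, panel_range):
    """True if the framerate falls inside the panel's adaptive-sync range."""
    low, high = panel_range
    return low <= fps <= high

wide_panel = (30, 144)    # a generous range
narrow_panel = (48, 60)   # the tiny range mentioned above

for fps in (40, 55, 90):
    print(fps, in_sync_range(fps, wide_panel), in_sync_range(fps, narrow_panel))
```

On the narrow panel, both a dip to 40 FPS and a peak of 90 FPS fall outside the range, so adaptive sync simply stops helping there.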
There is, however, a feature called Low Framerate Compensation (LFC), available in both ecosystems, that compensates for low FPS by showing certain frames twice or more – essentially as filler material.
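The frame-duplication idea behind this can be sketched in a few lines. The panel range and the function name `lfc_refresh` are illustrative assumptions, not either vendor’s actual implementation:

```python
# Sketch of the frame-duplication idea behind Low Framerate Compensation:
# if the game's FPS drops below the panel's minimum refresh rate, each
# frame is shown several times so the panel keeps refreshing inside its
# range. The 48-144 Hz range here is a hypothetical example.

PANEL_MIN, PANEL_MAX = 48, 144

def lfc_refresh(fps):
    """Return (multiplier, effective refresh rate) for a given game FPS."""
    multiplier = 1
    while fps * multiplier < PANEL_MIN:
        multiplier += 1
    return multiplier, fps * multiplier

print(lfc_refresh(30))  # -> (2, 60): each frame shown twice
print(lfc_refresh(20))  # -> (3, 60): each frame shown three times
print(lfc_refresh(90))  # -> (1, 90): already in range, no duplication
```

This trick only works when the panel’s maximum refresh rate is a large enough multiple of its minimum, which is one more reason wide-range panels are worth the research.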
Let’s look at a practical example: in order to upgrade to G-Sync or Freesync, I need a new monitor and graphics card. If I go with Freesync, for only $400 I can get a brand-new AMD RX 580 OC with 8 GB of GDDR5 memory as well as a 24” Freesync display. If I want G-Sync, the cheapest monitor available right now runs $400 by itself. A comparable graphics card, the GTX 1060 with 6 GB of GDDR5 memory, will set you back another $240.
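Spelled out as plain arithmetic (using the prices quoted above, which of course vary by region and over time):

```python
# The example prices from the text, as simple arithmetic. These are the
# quoted figures, not current market prices.
freesync_total = 400        # RX 580 8 GB plus a 24" Freesync display
gsync_monitor = 400         # cheapest G-Sync monitor on its own
gtx_1060 = 240              # comparable graphics card
gsync_total = gsync_monitor + gtx_1060

print(gsync_total)                   # -> 640
print(gsync_total - freesync_total)  # -> 240 saved by going Freesync
```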
While some people say that G-Sync is better than Freesync, offers lower input lag, and so on, there isn’t any solid evidence for that notion. Sometimes one is better, sometimes the other takes the cake – which is pretty much true of any long-running Nvidia vs. AMD battle. In the real world, it all depends on what gear you already have and which company you’re more attached to. GPU fanboys always like to partake in heated arguments over which manufacturer is best… In this case, however, AMD comes out a clear winner.