
G-Sync vs V-Sync

At LEVEL51 we have G-Sync screens too! But what is G-Sync? How is it different from V-Sync? Let me tell you.

Before we get to know G-Sync and answer the question of how it differs from V-Sync, we need to understand a little about how your graphics card (GPU) works, so that we can see what problem G-Sync was created to fix.

FRAME BUFFER, DOUBLE BUFFERING, TRIPLE BUFFERING

When the GPU renders an in-game image, the 3D objects exist only in its memory, where we of course cannot see what they look like. The GPU has to project those 3D objects onto a two-dimensional board, calculating where each object lands on the screen and what color each pixel must be. There are many complicated steps involved, which we could talk about for a long time.

This board is called the "Frame Buffer", the same "frame" as in "Frames per Second" or "Frame Rate". The Frame Buffer lives in the GPU's main memory, taking up space equal to the number of pixels to be displayed multiplied by the bits per pixel of the image. For example, a 1920x1080 screen has 2,073,600 pixels, and if we display 32-bit color, it uses 2,073,600 x 32 = 66,355,200 bits, or 8,294,400 bytes (approximately 8 MB) of memory.
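The arithmetic above can be checked with a few lines of Python (the function name here is just for illustration):

```python
# Frame buffer size = pixels * bits-per-pixel, converted to bytes.

def framebuffer_bytes(width, height, bits_per_pixel):
    """Return the size in bytes of one frame buffer."""
    total_bits = width * height * bits_per_pixel
    return total_bits // 8

size = framebuffer_bytes(1920, 1080, 32)
print(size)                            # 8294400 bytes
print(round(size / (1024 * 1024), 2))  # about 7.91 MiB, the "approximately 8 MB" above
```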

Each of these "frames" is an image that we are about to see on the screen. The display controller then reads the image in this Frame Buffer and sends it out for display.

Usually, the GPU and the display controller do not share a single frame buffer, because they would compete with each other: one writes data into it (rendering) while the other reads it out for display. Since memory bandwidth is limited, each side would constantly have to wait for the other, which would be far too slow. (In the past, though, it was sometimes done this way, because there was very little memory.)

To solve this problem, the Double Buffering technique was born. The GPU draws into a "Back Buffer" (some people call it an "Off-screen Buffer"), which is the frame buffer that is not being displayed, so the GPU can draw a complete frame in one go, faster than scrambling to read and write the same buffer. Once the frame is drawn, a switch (Buffer Swap) makes the Back Buffer become the Front Buffer for the screen to display, and the old Front Buffer becomes the Back Buffer, recycled for the GPU to draw the next frame into. Triple Buffering is similar, but uses up to 3 buffers.
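The swap can be sketched roughly like this (a toy model; the class and attribute names are made up for illustration, not any real graphics API):

```python
# Minimal sketch of double buffering: the GPU always draws into the
# back buffer, and a swap simply exchanges the two references.

class DoubleBuffer:
    def __init__(self):
        self.front = "frame 0"   # what the display controller scans out
        self.back = None         # what the GPU is currently drawing into

    def render(self, frame):
        # The GPU writes a complete frame off-screen...
        self.back = frame

    def swap(self):
        # ...then the roles flip: back becomes front, and the old
        # front buffer is recycled as the next back buffer.
        self.front, self.back = self.back, self.front

buf = DoubleBuffer()
buf.render("frame 1")
buf.swap()
print(buf.front)  # frame 1 is now the displayed image
```

The key point is that the swap is just an exchange of pointers, so the screen always reads from a buffer the GPU is not writing to.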

Everything looks good. The screen gets complete images to display quickly, with no fighting over memory and no waiting... until...

Tearing

The funny thing is that our efforts at Double, Triple, even Quadruple Buffering can be wasted, because reducing Tearing was the whole point in the first place.

With only a Front Buffer, a person watching the screen would see the picture being drawn up line by line from top to bottom. (Remember Windows 95/98? You could see that when a program had just opened, it would paint blue from the top, running down to the bottom.)

But why do we still get Tearing with a Double Buffer!? Because in the end, everything bottlenecks at the monitor itself. Think of the monitor as one more Frame Buffer, one that projects the color information in each pixel into the light that we see. In that sense, it is no different from having only a Front Buffer.

The way the screen shows the colors we see is by scanning the image it receives from top to bottom, one line at a time, from line 1 to line 1080 on a 1920x1080 screen. At 60Hz, each scan takes exactly 1/60 of a second, every single time. This is called the "Refresh Rate".

Of course, that rate is not equal to the rate at which the GPU renders, because frame rate is difficult to keep constant (unless we deliberately write the program to enforce it). Sometimes it can be as fast as 100fps (or 100Hz), sometimes slower, for example 40fps (or 40Hz). A frame rate higher than the monitor's Hz causes Tearing, because the screen has not finished scanning out the first image the GPU sent when the GPU swaps in a second one. And if the fps and Hz are very different, we may accidentally see pieces of 3 images on the same screen, like the example picture above.
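A toy simulation makes this concrete. Assuming a 60Hz scan and a GPU that finishes a frame every 10ms (100fps), each scanned line shows whichever frame happened to be in the front buffer at the moment that line was scanned:

```python
# Toy model of tearing: the screen scans 1080 lines over ~16.7 ms while
# the GPU swaps in a new frame every 10 ms (100 fps, no V-Sync).

SCAN_MS = 1000 / 60      # one full top-to-bottom scan at 60 Hz
LINES = 1080
GPU_FRAME_MS = 10        # the GPU finishes a frame every 10 ms

def frame_on_line(line):
    time_ms = line / LINES * SCAN_MS      # when this line gets scanned
    return int(time_ms // GPU_FRAME_MS)   # which GPU frame is front buffer then

frames_seen = sorted({frame_on_line(line) for line in range(LINES)})
print(frames_seen)  # [0, 1]: two different frames within one scan
```

More than one frame number appearing within a single scan is exactly the torn image described above; widen the gap between fps and Hz and even more frames can land in one scan.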

If you can't quite picture it, check out the NVIDIA graphic.

V-Sync

So there is a solution to this problem: have the GPU and the screen match up. Actually, it is not the screen doing the syncing, because the Display Controller is on the GPU, and the frame buffer the screen reads from is also on the GPU. So the graphics card designers can make the GPU check: OK, has the screen scanned down to the last line yet, or not? Hence the origin of the word V-Sync, for "Vertical Synchronization".

Everything seems to fit quite well... if the GPU can render at 60fps or more all the time...

Which, of course, it can't!

Stutter

V-Sync works well for us if the GPU is faster than the monitor. For example, the GPU may take 12ms to draw a frame while the screen takes 16ms per scan cycle, so the GPU has 4ms to stop and wait for the monitor to finish showing the previous image (which is a very long time for a GPU). It then swaps buffers and sends the frame to display (Scan 1). But then it may realize that the second frame takes longer than the first, more than 16ms, so it misses the V-Sync deadline. It can't send the frame mid-scan, because that could cause Tearing, so it has to wait for the next V-Sync cycle before it can swap buffers.

During the moment the GPU doesn't send a new image, the screen's frame buffer still holds the same image. The controller doesn't know that this image has already been scanned for a full cycle; it still performs its duty, scanning the Frame Buffer out to the display. The result is that we see the same image twice!

Seeing the same picture repeated is what English speakers call Stuttering, and the bad thing is that stuttering doesn't show up in the fps number. Sometimes the fps reads 30-40fps, which seems OK, yet the game feels jerky, especially when the camera pans back and forth in FPS (First Person Shooter) games and driving games. After all, when we watch movies at 24fps, we don't feel them stutter at all. That sluggish feeling really is Stuttering.
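The missed-deadline behavior can be modeled in a few lines (a simplification: we assume the GPU starts drawing the next frame right after each swap):

```python
# With V-Sync, a frame can only be swapped in at a refresh boundary.
# Toy timeline: per-frame render times in ms, refresh at 60 Hz.
import math

REFRESH_MS = 1000 / 60

def display_ticks(render_ms):
    """Return the refresh tick at which each frame gets swapped to the screen."""
    t = 0.0
    ticks = []
    for r in render_ms:
        finish = t + r
        tick = math.ceil(finish / REFRESH_MS)  # wait for the next V-Sync
        ticks.append(tick)
        t = tick * REFRESH_MS                  # GPU resumes after the swap
    return ticks

# 12 ms frames make every refresh; the 20 ms frame misses one deadline,
# so the previous image is scanned out twice: that repeat is the stutter.
print(display_ticks([12, 12, 20, 12]))  # [1, 2, 4, 5]: tick 3 repeats frame 2
```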

Adaptive V-Sync

To mitigate this, NVIDIA previously offered an option called Adaptive V-Sync, which combines the benefits of turning V-Sync off and on. The driver disables V-Sync when it finds the fps lower than the screen's Hz, to reduce stuttering and input lag, and enables V-Sync when the fps is higher than the screen's Hz. But it still doesn't help much; it only reduces the symptoms, because it doesn't fix the root cause. The monitor should simply display an image as soon as the GPU finishes drawing it, right!?
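The policy itself is simple enough to write down (a sketch of the idea only, not the actual driver logic):

```python
# Adaptive V-Sync as a one-line policy: sync only when the GPU is
# outrunning the monitor, otherwise let frames through immediately.

REFRESH_HZ = 60

def vsync_enabled(current_fps):
    """Sync above the refresh rate (avoid tearing); free-run below it."""
    return current_fps >= REFRESH_HZ

print(vsync_enabled(100))  # True: cap to 60 so frames don't tear
print(vsync_enabled(40))   # False: free-run to avoid stutter and input lag
```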

G-Sync

G-Sync is a technology from NVIDIA (AMD's equivalent is called FreeSync), and the concept of both is similar. It is based on the fact that an LCD screen does not actually have to refresh the image at a fixed interval: if we don't do anything to it, it keeps showing the same image. Now that we know this, why not use this ability???

(An ancient CRT screen had to refresh the image all the time, because without refreshing, the image would fade from the screen. That is why some people see CRTs flicker, and why some people got eyestrain from sitting at a computer.)

In short, the screen no longer needs to refresh at a constant rate. When the GPU finishes drawing, it just sends the image to the screen; the screen scans it to the end, then holds the image and waits until there is a new one. That's all!!!

But deep down it's still a bit more complicated, because each screen has different capabilities. On some monitors, if you don't refresh, the image fades, so the panel still needs an occasional refresh to maintain the same brightness. According to the FreeSync Wikipedia page, most monitors still need to refresh at a minimum of around 30Hz. So if the GPU hasn't sent a new image in time, G-Sync needs to redraw the old image first to keep the panel stimulated. And if, with bad luck, a new frame from the GPU arrives while that redraw is halfway done, the new frame has to wait a bit until the old picture finishes its stimulation scan before being sent. I believe NVIDIA's engineers have enough magic to time this well, since a G-Sync screen must pass NVIDIA certification too!!!
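That minimum-refresh behavior can be sketched as follows (a toy model; the 1/30s hold time follows the FreeSync figure mentioned above, and the real module's timing is certainly smarter):

```python
# Toy model of the minimum-refresh constraint: if the GPU stays silent
# longer than the panel's maximum hold time, the module re-scans the
# previous frame to keep the panel stimulated.

MAX_HOLD_MS = 1000 / 30   # panel must be refreshed at least every ~33 ms

def panel_refreshes(frame_arrival_ms):
    """Return (time_ms, kind) events: 'new' GPU frames and 'repeat' refreshes."""
    events = []
    last = 0.0
    for t in frame_arrival_ms:
        # Insert repeat refreshes while waiting for the next GPU frame.
        while t - last > MAX_HOLD_MS:
            last += MAX_HOLD_MS
            events.append((round(last, 1), "repeat"))
        events.append((t, "new"))
        last = t
    return events

# Frames arrive at 10 and 20 ms, then an 80 ms gap: the panel has to
# self-refresh twice before the next real frame shows up at 100 ms.
print(panel_refreshes([10, 20, 100]))
```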

A simple summary, as in the slide:

  • No Tearing: tearing is gone, because G-Sync still waits for the current scan to finish before drawing the next frame, just like V-Sync.
  • No Stutter: stuttering is far less likely, because G-Sync can refresh the screen at any moment. It may also pace the GPU's frame delivery to keep its drawing cadence in step with the screen.
  • Less Lag: notice that it is not "No Lag", because we are still bound to the screen, which needs some stimulation refreshes, and a new frame still has to wait for an in-progress scan to finish if the timing doesn't quite line up.

In Real Life?

How is it in real life? I tried it myself and found that it works for some games! Why admit it doesn't work for every game? Because we are being sincere with you, and fortunately G-Sync can be turned off per game.

What I found is that the type of game where G-Sync doesn't work, where turning it on doesn't help and sometimes makes things worse, is the game that locks or caps its fps. Locking or capping fps, whether at 30 or 60, is a very nasty move: people with 144Hz screens want to play at over 100fps, and the game caps at 30fps!!!

The game that annoys me the most, because in addition to being difficult to connect and play, it also caps fps at 30/60, is The Crew. I noticed that G-Sync works very well when the fps swings up and down; the wider the range, the smoother it feels, because the stutter almost completely disappears. My understanding is that NVIDIA designed G-Sync on the assumption that fps will fluctuate, but when the fps sits stable at a cap, the algorithm seems to get confused, and the picture ends up looking sluggish and jerky. As far as I can tell, G-Sync tries to blend the frame pacing so it doesn't jump too much, in order to keep the picture looking smooth. Of course, this "smooth or not smooth" is pure feeling; I think you should try playing it yourself.

I would like to leave you with a test video from the AnandTech website. The footage is slowed to half speed, so you can clearly see the stuttering and how G-Sync helps.

And of course, if some games get worse, turning sluggish as I said, you can turn G-Sync off for individual games in the NVIDIA Control Panel.

Thank you for following and reading this far. Disagree anywhere? Let's talk about it.
