Quality time with G-Sync
We spent some time with the new NVIDIA G-Sync prototype monitor and came away just as impressed as we did in Montreal.
Readers of PC Perspective will already know quite a lot about NVIDIA's G-Sync technology. When it was first unveiled in October, we were at the event and listened to NVIDIA executives, product designers and engineers discuss and elaborate on what it is, how it works and why it benefits gamers. This revolutionary new take on how displays and graphics cards talk to each other enables a new class of variable refresh rate monitors that offer the smoothness advantages of having V-Sync off along with the tear-free images normally reserved for gamers who enable V-Sync.
NVIDIA's Prototype G-Sync Monitor
We were lucky enough to be at NVIDIA's Montreal tech day while John Carmack, Tim Sweeney and Johan Andersson were on stage discussing NVIDIA G-Sync among other topics. All three developers were incredibly excited about G-Sync and what it meant for gaming going forward.
Also on that day, I published a somewhat detailed editorial that dug into the background of V-Sync technology, why the 60 Hz refresh rate exists and why the system in place today is flawed. That led up to an explanation of how G-Sync works, including how it extends the Vblank signal and how NVIDIA is enabling the graphics card to retake control over the entire display pipeline.
In reality, if you want the best explanation of G-Sync, how it works and why it is a stand-out technology for PC gaming, you should take the time to watch and listen to our interview with NVIDIA's Tom Petersen, one of the primary inventors of G-Sync. In this video we go through quite a bit of technical explanation of how displays work today and how the G-Sync technology changes gaming for the better. It runs over an hour, but I selfishly believe it is the most concise and well put-together collection of information about G-Sync for our readers.
The story today is more about extensive hands-on testing with the G-Sync prototype monitors. The displays we received this week were modified versions of the 144Hz ASUS VG248QE gaming panels, the same ones that should, in theory, be upgradeable by end users sometime in the future. These are 1920×1080 TN panels and, though they have incredibly high refresh rates, they aren't usually regarded as the highest image quality displays on the market. However, the story about what you get with G-Sync is really more about stutter (or lack thereof), tearing (or lack thereof), and a better overall gaming experience for the user.
One more thing worth noting right away is that performance testing with G-Sync displays takes a wildly different spin. As you know, PC Perspective has adopted our own Frame Rating graphics testing process that uses direct capture from a GPU running a game into a hardware capture system, with an overlay that we then post-process and analyze to get real-world performance data that you cannot get with software like FRAPS. The downside of that method is that it currently requires DVI connectivity, which hasn't been a problem since all graphics cards support DVI today. But G-Sync is a DisplayPort-exclusive feature, meaning that we cannot use our Frame Rating capture systems currently. We are working with some partners to enable DP 1.2 capture and thus performance testing, but there are other hiccups in the way. We are definitely working on those issues and I think we'll have them solved in early 2014.
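For readers curious how an overlay-based capture analysis can recover frame times at all, here is a simplified, hypothetical sketch: each rendered frame tints a known overlay region with the next color in a repeating sequence, and the analysis counts how many captured scanlines each color band occupies. The constants and function names below are illustrative only, not PC Perspective's actual tooling.

```python
CAPTURE_RATE_HZ = 60           # assumed capture rate of the recording hardware
SCANLINES_PER_FRAME = 1080     # assumed vertical resolution of the captured signal

def frame_times_from_overlay(scanline_colors):
    """Return per-frame on-screen times (ms) from the overlay color of every
    captured scanline, listed in scan order across the whole recording."""
    ms_per_scanline = 1000.0 / (CAPTURE_RATE_HZ * SCANLINES_PER_FRAME)
    times = []
    run_color, run_length = scanline_colors[0], 0
    for color in scanline_colors:
        if color == run_color:
            run_length += 1
        else:
            # A color change marks the boundary between two rendered frames.
            times.append(run_length * ms_per_scanline)
            run_color, run_length = color, 1
    times.append(run_length * ms_per_scanline)
    return times

# Example: three frames whose overlay bands span 540, 1080 and 540 scanlines.
print(frame_times_from_overlay(["red"] * 540 + ["green"] * 1080 + ["blue"] * 540))
```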
That being said, performance with G-Sync, in terms of frames per second or frame times, is going to be closely analogous to a current monitor running with V-Sync disabled. Today's monitors display at a fixed refresh rate; with V-Sync off, when a new frame is completed it simply replaces part of the buffer and the data is sent immediately to the screen, resulting in the horizontal tear I am sure everyone here is familiar with.
With G-Sync, as soon as that frame is done the graphics driver is polled to check whether the display is in the middle of a scan. If it is, the frame waits, and that poll takes about 1 ms to complete. The driver then tells the monitor to prepare for a new frame by resending the Vblank signal that NVIDIA's driver has been holding. The end result is that a G-Sync monitor and enabled system will, in terms of frames per second, very closely mirror a standard configuration with V-Sync disabled. The benefit, of course, is that you no longer have any of that distracting, distorting horizontal tearing on the display.
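To make that scheduling difference concrete, here is a minimal sketch of when a finished frame reaches the screen under each mode, following the behavior described above. The 144 Hz refresh interval and the roughly 1 ms poll are figures from this article; everything else (function names, example numbers) is illustrative, not NVIDIA's implementation.

```python
import math

REFRESH_INTERVAL_MS = 1000.0 / 144   # fixed-refresh interval of the 144 Hz panel
POLL_OVERHEAD_MS = 1.0               # approximate poll cost cited above

def vsync_off_display_time(frame_ready_ms):
    # V-Sync off: the new frame is swapped in immediately, even mid-scan,
    # which is what produces the horizontal tear.
    return frame_ready_ms

def vsync_on_display_time(frame_ready_ms):
    # V-Sync on: the finished frame waits for the next fixed refresh boundary.
    return math.ceil(frame_ready_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

def gsync_display_time(frame_ready_ms, current_scan_end_ms):
    # G-Sync: the driver is polled (~1 ms) to see whether a scan is still in
    # progress; the frame goes out once the poll and any in-flight scan are
    # done, with no fixed refresh boundary to wait for.
    return max(frame_ready_ms + POLL_OVERHEAD_MS, current_scan_end_ms)

# Example: a frame finishes 21.5 ms into the run; the previous scan-out ends at 20.0 ms.
print(vsync_off_display_time(21.5), vsync_on_display_time(21.5), gsync_display_time(21.5, 20.0))
```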
Because of that polling time, NVIDIA did warn us that there is currently a 1-2% performance delta between V-Sync off frame rates and G-Sync enabled frame rates; G-Sync is a little bit slower because of the roughly 1 ms poll Tom Petersen described. Interestingly though, NVIDIA did say it plans to reduce that time to essentially 0 ms with driver updates once monitor partners begin shipping production units.
So the performance results here are going to be minimal, and in fact we are only going to show you a handful of graphs. We are going to show you V-Sync on versus V-Sync off, where the V-Sync off results emulate the performance of G-Sync, though obviously without G-Sync's tear-free presentation. In the graphs below we are using our standard GPU test bed with an SNB-E platform and processor, 16GB of DDR3 memory and an SSD, and we are testing with a single GeForce GTX 760 2GB reference card using the latest NVIDIA 331.93 beta drivers. The two sets of results you see are Frame Rating captured results, one with V-Sync enabled and one with it disabled.
NVIDIA's G-Sync Demo Application
I decided to use the GeForce GTX 760 graphics card as it is a very common, mainstream GPU and it also allows us to find instances in games where G-Sync is very effective. In scenarios where you are gaming on a 60 Hz monitor and running enough graphics hardware to keep the frame rate over 60 FPS 100% of the time, it is true that many of the benefits of G-Sync will be lost. However, I will argue that even dropping under that 60 FPS mark for 5% of your game time results in a sub-par experience for the gamer. Take into account that even the mighty GeForce GTX 780 Ti can be brought to its knees at 1920×1080 with the highest quality settings in Crysis 3, and I think that G-Sync technology will be useful for mainstream and enthusiast gamers alike.
Also note that higher resolution displays are likely to be shown at CES for 2014 release.
Results will show you instantaneous frame rates, average frame rates over time and how variance is affected, in an attempt to demonstrate how stutter and frame time variance affect your actual user experience. This is very similar to how we have tested SLI and CrossFire over the past two years, helping to showcase visual experience differences in a numeric, quantitative fashion. It's difficult to do, no doubt, but we believe attempting it is required for a solid overview of G-Sync technology.
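As a rough sketch of the kind of numbers behind the graphs that follow (not our actual analysis scripts), the snippet below turns a list of captured frame times into average frame rate, instantaneous frame rate and frame-to-frame variance figures; names and example values are illustrative.

```python
from statistics import mean, pstdev

def summarize(frame_times_ms):
    # Instantaneous frame rate for each frame, plus the overall average.
    instantaneous_fps = [1000.0 / t for t in frame_times_ms]
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    # Frame-to-frame deltas are what the eye perceives as stutter, even when
    # the average frame rate looks healthy.
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return {
        "avg_fps": round(avg_fps, 1),
        "min_fps": round(min(instantaneous_fps), 1),
        "frame_time_stdev_ms": round(pstdev(frame_times_ms), 2),
        "mean_frame_delta_ms": round(mean(deltas), 2),
    }

# A steady 22 ms cadence versus a 16.6/33.3 ms oscillation with a similar average.
print(summarize([22.0] * 6))
print(summarize([16.6, 33.3] * 3))
```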
Based on the first graph, you might think that the experience of playing Crysis 3 (High, High, 4xMSAA settings) would be the same with V-Sync enabled or disabled, but it clearly is not. Even though the average frames per second are nearly the same, the second graph, which shows instantaneous frame times, tells a different story. The black line, representing the V-Sync disabled test results, shows a rather smooth transition of frame rates from the 0 second mark through the 60 second mark, with a couple of hiccups along the way.
The orange line, showing the V-Sync enabled result, oscillates rapidly between 16.6ms and 33.3ms frame times, essentially hitting either a 60 FPS or a 30 FPS mark at any given moment. The result is unsmooth animation: the human eye is quite adept at seeing variances in patterns, and the "hitching" or stutter that appears when the game transitions between these two states is explained very well in our interview with Tom Petersen above.
A zoomed-in graph (just the first 3 seconds) shows the back and forth frame times more clearly. The orange line shows a few frames at 16.6ms (great!) and then a spike to 33.3ms (not great), repeating over and over. The black line shows a more regular and consistent frame time of anywhere from 20-23ms.
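To put numbers on that oscillation, here is a toy model (my own simplification, not NVIDIA's math) of why V-Sync quantizes displayed frame times on a 60 Hz panel: any frame that is not ready at a refresh boundary is held until the next one, so frames either land at 16.7 ms or get bumped to 33.3 ms, never the 20-23 ms the GPU actually needs.

```python
import math

REFRESH_MS = 1000.0 / 60   # 16.7 ms refresh interval on a 60 Hz panel

def vsync_on_screen_time(render_ms):
    # The displayed duration is rounded up to the next whole refresh interval,
    # so only multiples of 16.7 ms (60 FPS, 30 FPS, 20 FPS...) are possible.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (15.8, 16.4, 20.9, 22.4, 21.5):
    print(f"rendered in {render_ms:4.1f} ms -> on screen for {vsync_on_screen_time(render_ms):.1f} ms")
```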
It would seem obvious then that in this case, where performance is not able to stay above the refresh rate of the panel, the black line shows the better, smoother experience. However, with the standard V-Sync options available today, getting that experience meant horrible tearing across the screen. G-Sync offers nearly the same performance levels as the V-Sync off result, but without the horizontal tearing.
Another interesting angle to take on this debate is that with V-Sync off, and by analogy with G-Sync enabled, you are getting the full performance out of your graphics card, 100% of the time. Your GPU is not waiting on monitor refresh cycles to begin outputting a frame or to begin rendering the next frame. Gamers have been getting this for years by disabling V-Sync, but tearing was again the result. It was a trade-off between frame rate, responsiveness and tearing on one side versus stutter, input latency and a tear-free image on the other.
I’ll hold off until someone does some serious testing and not some PR repost.
Guru3D
“So if the game has “game engine” issues, G-SYNC will not fix them”
So all games will still have issues in one way or another. LOL!!!..
What if the games are story-driven with cut-scenes and/or videos at 24fps?
Will the experience be screwed up and be a stutter-fest when the game switches to the cut-scene?
Excellent review Ryan. Great job man. There is a lot of effort in your analysis and we definitely appreciate it.