
Montreal, to me, has always meant hockey masks, maple syrup and Mounties. I never thought it was the doorway to gaming heaven.

But when we launched NVIDIA G-SYNC here today, I could see my life as a gamer getting better. Actually, I could see the rest of my life getting better, too, because we’ve been working flat out to get this out the door for several years.

The idea behind G-SYNC is simple, even if the technology isn’t. It’s to deliver visually stunning games, without the artifacts that jolt you out of the zone – like a guy who keeps standing up in front of you during a great movie.

Gaming headaches? Take one of these and call us in the morning.

An Obstacle to Great Gaming

Now, I’ve been gaming on PCs for just about 20 years. I started out with “Doom” and “Descent.” Ever since, I’ve been reaching for the hard stuff. A perfect evening for me is coming home after a long day at NVIDIA, cracking open a Guinness and sitting down to a session of “Starcraft II.” Or even better, jumping into a new indie release like “Antichamber.” That thing still blows my mind.

But much as I love gaming, I’ve always hated the choices you have to make when synchronizing to your monitor. With V-SYNC off you get fast input response, but images are seriously corrupted by tearing. Or you can turn V-SYNC on, but then games get laggy, and any time the GPU’s FPS falls below the monitor’s refresh rate, animation stutters badly.

Imagine that your fully armed buddies can see you, but your system won’t let you see them. The input lag can get you killed. Given the options, it’s not surprising that competitive gamers pick the lesser of two evils and run with V-SYNC off. But it’s still short of perfection.
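To make the tearing half of that trade-off concrete, here’s a minimal sketch in Python, my own illustration rather than anything from our drivers. With V-SYNC off, a buffer swap that lands partway through a 60Hz scanout splits the screen: the rows above the swap point come from the old frame, the rows below from the new one.

REFRESH_MS = 1000 / 60   # one 60Hz scanout takes ~16.7 ms
ROWS = 1080              # vertical resolution of a 1080p panel

def tear_row(swap_ms):
    """Approximate row where the tear lands for a swap at the given time (ms)."""
    offset = swap_ms % REFRESH_MS           # how far into the current scanout
    return int(ROWS * offset / REFRESH_MS)  # rows scan out top to bottom

for swap in (4.0, 8.0, 12.0):
    print(f"swap {swap:4.1f} ms into a refresh -> tear near row {tear_row(swap)}")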

What I want from gaming is to get immersed in the experience. I want to feel the cracked concrete under my feet. Or the buzzing jungle closing in around me. Or the flash of the grenade nearby. Stuttering and tearing are distortions that bring me back to the beige carpet in my game lounge. They make me wonder what caused them and I get jolted out of the zone pretty fast.

G-Sync: your ticket to gaming nirvana.

Getting to the Root of The Problem

This same observation led Jen-Hsun, our CEO, to commission some of the brightest minds at NVIDIA to solve the problem. We brought together about two dozen GPU architects and other senior guys to take the problem apart and look at why some games are smooth and others aren’t.

It turns out that our entire industry has been syncing the GPU’s frame-rendering rate to the monitor’s refresh rate – usually 60Hz – and it’s this syncing that’s causing a lot of the problems.

Sadly, for historical reasons, monitors have fixed refresh rates, usually 60Hz. PC monitors initially borrowed a lot of technology from TVs, and in the U.S. we standardized on a 60Hz refresh way back in the 1940s, around the time Ed Sullivan was still a fresh face. That happened because the U.S. power grid runs on 60Hz AC power, and matching TV refresh rates to it made early TV electronics easier to build. The PC industry simply inherited this behavior because TVs were the low-cost way to get a display.

So back at NVIDIA, we began to question whether we shouldn’t do it the other way. Instead of trying to get a GPU’s output to synchronize with a monitor refresh, what if we synchronized the monitor refresh to the GPU render rate?

No More Tearing, No More Stutters

Hundreds of engineer-years later, we’ve developed the G-SYNC module. It’s built to fit inside a display and work with the hardware and software in most of our GeForce GTX GPUs.

With G-SYNC, the monitor begins a refresh cycle right after each frame is completely rendered on the GPU. Since the GPU’s render time varies from frame to frame, the monitor’s refresh no longer has a fixed rate.

This brings big benefits for gamers. First, since the GPU drives the timing of the refresh, the monitor is always in sync with the GPU. So, no more tearing. Second, the monitor update is in perfect harmony with the GPU at any FPS. So, no more stutters, because even as scene complexity is changing, the GPU and monitor remain in sync. Also, you get the same great response time that competitive gamers get by turning off V-SYNC.
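If you like to see that spelled out, here’s a toy model in Python, a simplification I’ve written for illustration (it ignores buffering details), not the actual G-SYNC logic. With a fixed 60Hz refresh, a finished frame waits for the next refresh boundary; with a variable refresh, scanout starts the moment the frame is done.

import math

REFRESH_MS = 1000 / 60  # a fixed 60Hz panel refreshes every ~16.7 ms

# Hypothetical GPU render times (ms) for five consecutive frames.
render_ms = [12.0, 19.0, 15.0, 24.0, 14.0]

t = 0.0
for i, r in enumerate(render_ms):
    t += r  # the moment the GPU finishes frame i
    # Fixed refresh: the finished frame waits for the next refresh boundary.
    vsync = math.ceil(t / REFRESH_MS) * REFRESH_MS
    print(f"frame {i}: done at {t:5.1f} ms | fixed refresh shows it at "
          f"{vsync:5.1f} ms (+{vsync - t:4.1f}) | variable refresh: +0.0")

In this little model, the variable wait before each scanout (anywhere from 0 to 16.7 ms) is the stutter and lag that V-SYNC adds; a GPU-driven refresh makes it zero by construction.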

G-SYNC moves us a little closer to gaming nirvana – a world of great image quality with no tearing, no monitor stutter, and really fast input response. That lets me get back in the zone when I game. Already, I can see my beige shag carpet and Costco art prints in their crooked frames slowly replaced by the hard rock roads and dragon fire of “Skyrim.”

Life is good again.

  • Thomag0

    Could this also fix the video stuttering problems ReClock is a (for the most part not really working) makeshift solution for?

  • BlacKHeaDSg10

    Probably every fan of Nvidia will try to kill me for the question I’m about to ask: will G-Sync work only with Nvidia graphics cards?

  • Johan de la Rosée

    So we will need to buy a new monitor/graphics card? :)

  • bro3886

    NVIDIA CEO Jen-Hsun Huang hinted at this during the Nvidia Montreal gig: “[...] It includes a G-SYNC module designed by NVIDIA and integrated into gaming monitors, as well as hardware and software incorporated into certain Kepler-based GPUs. [...]” Source. From a business point of view, this move makes very good sense for Nvidia, as monitors are not replaced as frequently as graphics cards, urging gamers to upgrade to – yet another – G-Sync-compatible graphics card.

  • bro3886

    If you don’t own one of the Kepler GPUs, which go with the tech, yes.

  • Soup

    There are 2 big issues with cranking up the refresh rate.

    The first is the amount of bandwidth available in the display interface. In order to drive a display at 500Hz, you’d need more than 4x the bandwidth necessary to drive a 120Hz display. From the information I’ve been able to find, DisplayPort will max out at 240Hz at 1080p.
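    As a back-of-the-envelope check of that claim in Python (raw pixel data only; real links also carry blanking intervals and encoding overhead such as DisplayPort’s 8b/10b, so the true requirements are higher):

    WIDTH, HEIGHT = 1920, 1080   # 1080p
    BPP = 24                     # 8 bits per RGB channel

    def gbps(hz):
        """Raw pixel bandwidth in gigabits per second at a given refresh rate."""
        return WIDTH * HEIGHT * BPP * hz / 1e9

    for hz in (60, 120, 240, 500):
        print(f"{hz:3d} Hz -> {gbps(hz):5.1f} Gbit/s raw")  # 500Hz needs ~4.2x 120Hz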

    The other big issue is the display response time itself. Not only is the manufacturer-reported response time an ideal grey-to-grey (so any other colors will be slower) but they are achieving these timings by overdriving the display, which causes ghosting artifacts. Those ghosting artifacts will be amplified as the refresh rate increases, due to not allowing them time to decay before the next frame is drawn.

  • BlacKHeaDSg10

    Thanks for the info

  • Gunnar Lilleaasen

    There is no good reason why some special NVIDIA hardware should be able to solve the issue, because there are limitations in the whole chain they are relying on – the chain is no stronger than its weakest link.

    Sure, the bandwidth of DP is not enough to transfer 500 frames per second, but it should be able to transfer 120-240 frames per second. So why not just set the display to allow refresh rates up to that limit, regardless if you have NVIDIA hardware in the monitor?

    And if the display itself isn’t able to achieve a non-ghosting picture at 120+ frames per second, then there is no way hardware from NVIDIA can solve that issue, because it’s inherently a problem with the LCD panel, which is what actually displays the pixels.

    The point being, you still need LCD screens that are able to draw the frames and decay fast enough, you still need a display interface that has the available bandwidth, and these problems could be solved without limiting yourself to using NVIDIA hardware.

    Sure, you can push 150 frames per second from your NVIDIA card to some NVIDIA hardware inside your monitor, but that hardware still can’t draw the 150 frames without causing ghosting artifacts unless the display could already have done so fine on its own, had it simply reported a higher refresh rate to the OS.

  • Soup

    I think you’re missing the point of G-sync. It’s not meant to increase the refresh rate of the display. It’s meant to reduce input latency, provide smoother motion, and eliminate screen tearing, and it can do this BETTER than a higher-refresh-rate display.

    The reason they’re making this is that there are so many technical limitations preventing us from increasing the refresh rate (not to mention how expensive those displays would be; I expect G-sync to be very reasonably priced).

  • Gunnar Lilleaasen

    They should be able to render 120 FPS on the GPU side and, with V-SYNC, still output 60 FPS without ever dropping below 60 FPS unless the GPU renders at a lower rate. So their current V-SYNC is bad; they could make it a lot better and fix this on the GPU side, yet their solution is to implement a device in the monitor that does what should be done on the GPU side of the link.

    Because even if you push 60+ FPS from the GPU to the device inside the monitor and then handle the output with that device, you’ll still end up showing only as many frames as the max refresh rate allows. So you haven’t removed any actual latency.

    Example:
    GPU generates 75 FPS. Screen supports 60 FPS. But we’ve disabled V-SYNC, so it still gets 75 FPS. G-SYNC device now gets 15 FPS more than the display can handle. Hence it drops frames to avoid ghosting and you still get as many actual frames drawn as you would with V-SYNC.

    Where in that process is it I am missing out on something? How is it different from just a V-SYNC that works well?

  • lozandier

    Well Kepler is licenseable tech; Nvidia announced that a few weeks ago.

    I’m beginning to think they may be open to allowing non-Nvidia cards to use the technology, but at a price. It’ll just be immediately possible only on new/newer Nvidia cards.

    That’s sort of moot of course because it’ll take a while for each new architecture they make to be licenseable for others, and others may not want to push that new cost to the end users of their products with an increased price point.

    However, I think it may be irrelevant to point out that Kepler Arch. is licenseable.

  • Soup

    I don’t understand the problem you perceive with vsync. You can’t display more than 60 fps on a 60 Hz display even with vsync off. What you end up with is screen tearing, with lots of partial frames, which has a negative impact on motion quality. In the very best case, you’re still only getting 60 mostly tear-free frames.

    As for G-sync, I’m assuming it’s going to cap your framerate at the maximum refresh rate of your display since generating more would negatively impact motion quality no matter how smart the device is.

    The benefit of G-sync won’t be seen when your framerate matches the display’s refresh rate, it’s when your framerate drops. For instance, if I’m generating 40 fps vsynced on a 60 Hz display, frames will alternate between 1 refresh and 2 refreshes per frame. Turning vsync off would cause significant screen tearing which will on average cause every other refresh to display half of two different frames. Both of these methods will cause significant visual stuttering. With G-sync, the display will simply refresh at 40 Hz in time with the GPU, eliminating both problems.
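    Here’s a small Python sketch of that 40 fps case, my own model of the timing rather than anything official:

    import math

    REFRESH_MS = 1000 / 60  # 16.7 ms per 60 Hz refresh
    FRAME_MS = 1000 / 40    # the GPU finishes a frame every 25 ms

    prev = 0.0
    for i in range(1, 7):
        done = i * FRAME_MS
        # vsync: the frame waits for the next 60 Hz boundary, so on-screen
        # frame times alternate between one and two refresh intervals.
        vsync = math.ceil(done / REFRESH_MS) * REFRESH_MS
        print(f"frame {i}: vsync interval {vsync - prev:4.1f} ms | "
              f"G-sync interval {FRAME_MS:4.1f} ms")
        prev = vsync

    The vsync intervals jitter between 16.7 and 33.3 ms while G-sync holds a steady 25 ms, which is exactly the stutter-vs-smooth difference described above.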

    I should note that at this point we don’t know if G-sync will even be offered on 60 Hz displays. The only model nVidia has mentioned so far is a 144 Hz display.

  • Vinson Peters

    Glad I jumped into PC gaming a couple of weeks ago. Guess I’ll be waiting to pick up a monitor.

  • bro3886

    No problem.

  • pocketdrummer

    I agree with Clevername: I found that a 120-144Hz monitor was much better suited for fast refresh rates than for 3D. I really wish the marketing department would catch up with that concept and stop trying to push 3D.

  • pocketdrummer

    Well, the better way for them to do this would be to double up: if it’s running 30fps, run it at 60Hz; at 20fps, 40Hz. Anything above 30 can run 1:1.
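    Something like this Python sketch, purely hypothetical on my part, not a confirmed G-Sync feature:

    MIN_HZ = 30  # assume the panel can't hold a stable refresh below ~30 Hz

    def refresh_for(fps):
        """Scan each frame out enough times to keep the refresh above 30 Hz."""
        repeats = 1
        while fps * repeats <= MIN_HZ:  # anything above 30 runs 1:1
            repeats += 1
        return fps * repeats, repeats

    for fps in (60, 40, 30, 20, 15):
        hz, n = refresh_for(fps)
        print(f"{fps} fps -> draw each frame {n}x -> {hz:.0f} Hz refresh")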

  • Horation_Tobias_HumpleDinK

    Is this something you can buy for a monitor, or does it have to be built in? I assume the latter (as in, you need a new monitor); shame you can’t get drivers or something that can enable the feature.

  • BobBobson

    Since v-sync stands for vertical synchronization, shouldn’t yours be called F-sync, for frame synchronization? Or at the very least, the self-promoting N-sync? I suppose GPU synchronization could work, but frame makes more sense, since a lot of stuff happens in the GPU but you are only synchronizing to the frame output.

  • Gunnar Lilleaasen

    So what you are describing is variable refresh rate. Which should be standardized and not be implemented as something NVIDIA specific IMO.

  • Peter Mcteague

    So, I’d be able to buy something for my BenQ XL2420t that would allow me to use g-sync, without buying a new monitor?

  • abstraction

    I know info on the G-sync mod kit is still a bit hazy; nevertheless, I’m curious about whether or not I’ll have the option of modding an Asus PB278Q monitor. Additionally, are there any 27-inch IPS or PLS monitors on the way that will support G-sync? Any information regarding this would be greatly appreciated, as I am in the market for both a new card and a monitor. Thanks!

  • Wanton

    Clearly you don’t know what you are talking about.
    Watch this video; I don’t think vsync can be explained more simply than this.
    http://www.youtube.com/watch?v=KhLYYYvFp9A

  • bro3886

    By far the most informative post yet. Thanks, Wanton!

  • Wanton

    G-SYNC sounds so good, and I’d like to know: do Asus or any of your other partners plan to release 27-inch monitors with this tech during the Q1 2014 launch?

  • ThePurpleGamer

    You guys should make your own version of the Oculus Rift

  • Jeffrey Byers

    Sony PS4 +
    Sony G-Sync HDTV

    = awesome!

    Seriously, have NVidia license G-Sync. It helps Sony sell both Sony HDTVs and PS4s. It’s a huge win for NVidia as well, not just because of licensing. Many people look at new consoles and say “they use AMD so I should buy AMD for my PC.”

    People could look at consoles and say “wow, that G-Sync is great, I’ll get that tech for my PC!”

  • THANATOSXR

    MEH, AMD’s R9s are better and cheaper; I seriously have no clue how Nvidia isn’t bankrupt yet.

  • Katana Seiko

    Soo… how long until you go to market with this?

  • bro3886

    “[...] ASUS plans to release a G-SYNC-enhanced VG248QE gaming monitor in the first half of 2014 with pricing set at $399 USD in North America. [...]” Source…

  • Denis

    Will this work on an Apple Cinema Display?

  • Mikle Sorokin

    When did the NVidia developers get the idea for G-Sync?

  • waymon04

    Will there be a place in Chicago I can try out a G-Sync monitor?

  • David Barnes

    Ok, so I have a 144Hz G-Sync monitor and I mostly get 144+ fps… so am I right in saying that whenever my frame rate drops below 144fps the monitor drops with it, and whenever my GPU would be putting out more than 144fps the GPU is capped at 144… so always in sync? I have a G-sync monitor and a 780 Ti… and I’m getting a weird stutter at high frame rates. And I believe it’s because the GPU is trying to put out many more frames than 144/sec… or do I have a dodgy unit?