
Montreal, to me, has always meant hockey masks, maple syrup and Mounties. I never thought it was the doorway to gaming heaven.

But when we launched NVIDIA G-SYNC here today, I could see my life as a gamer getting better. Actually, I could see the rest of my life getting better, too, because we’ve been working flat out to get this out the door for several years.

The idea behind G-SYNC is simple, even if the technology isn’t. It’s to deliver visually stunning games, without the artifacts that jolt you out of the zone – like a guy who keeps standing up in front of you during a great movie.

Gaming headaches? Take one of these and call us in the morning.

An Obstacle to Great Gaming

Now, I’ve been gaming on PCs for just about 20 years. I started out with “Doom” and “Descent.” Ever since, I’ve been reaching for the hard stuff. A perfect evening for me is coming home after a long day at NVIDIA, cracking open a Guinness and sitting down to a session of “Starcraft II.” Or even better, jumping into a new indie release like “Antichamber.” That thing still blows my mind.

But much as I love gaming, I’ve always hated the choice you’re forced to make about synchronizing to your monitor. With V-SYNC off, you get fast input response, but images are badly corrupted by tearing. With V-SYNC on, games get laggy, and any time the GPU’s frame rate falls below the monitor’s refresh rate, animation stutters badly.

Imagine your fully armed buddies can see you, but your system won’t let you see them. The input lag can get you killed. Given those options, it’s not surprising that competitive gamers pick the lesser of two evils and run with V-SYNC off. But it’s still short of perfection.
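
To make that trade-off concrete, here’s a rough sketch – illustrative numbers only, nothing from our actual drivers – of why double-buffered V-SYNC on a 60Hz monitor snaps the displayed frame rate down in steps:

```python
import math

# Illustration only: with double-buffered V-SYNC on a 60Hz monitor, a frame
# that misses a refresh has to wait for the next one, so the displayed rate
# snaps down in steps (60, 30, 20 FPS...) even if the GPU is only a bit slow.
REFRESH_INTERVAL = 1.0 / 60  # ~16.7 ms between refreshes

def displayed_interval(render_time_s):
    """How long a frame stays on screen when it must wait for a refresh tick."""
    refreshes_needed = math.ceil(render_time_s / REFRESH_INTERVAL)
    return refreshes_needed * REFRESH_INTERVAL

for gpu_fps in (100, 60, 55, 40, 30):
    shown = displayed_interval(1.0 / gpu_fps)
    print(f"GPU renders at {gpu_fps:>3} FPS -> screen updates every "
          f"{shown * 1000:5.1f} ms ({1.0 / shown:.0f} FPS displayed)")
```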

What I want from gaming is to get immersed in the experience. I want to feel the cracked concrete under my feet. Or the buzzing jungle closing in around me. Or the flash of the grenade nearby. Stuttering and tearing are distortions that bring me back to the beige carpet in my game lounge. They make me wonder what caused them and I get jolted out of the zone pretty fast.

G-Sync: your ticket to gaming nirvana.

Getting to the Root of The Problem

This same observation led Jen-Hsun, our CEO, to commission some of the brightest minds at NVIDIA to solve this problem. We brought together about two dozen GPU architects and other senior guys to take the problem apart and look at why some games are smooth and others aren’t.

It turns out that our entire industry has been syncing the GPU’s frame-rendering rate to the monitor’s refresh rate – usually 60Hz – and it’s this syncing that’s causing a lot of the problems.

Sadly, for historic reasons, monitors have had fixed refresh rates – usually 60Hz. That’s because PC monitors initially borrowed a lot of technology from TVs, and in the U.S. we standardized on a 60Hz refresh way back in the 1940s, around the time Ed Sullivan was still a fresh face. That happened because the U.S. power grid runs on 60Hz AC power, and setting TV refresh rates to match it made early TV electronics easier to build. The PC industry simply inherited this behavior because TVs were the low-cost way to get a display.

So back at NVIDIA, we began to question whether we shouldn’t do it the other way. Instead of trying to get a GPU’s output to synchronize with a monitor refresh, what if we synchronized the monitor refresh to the GPU render rate?

No More Tearing, No More Stutters

Hundreds of engineer-years later, we’ve developed the G-SYNC module. It’s built to fit inside a display and work with the hardware and software in most of our GeForce GTX GPUs.

With G-SYNC, the monitor begins a refresh cycle right after each frame is completely rendered on the GPU. Since the GPU takes a variable amount of time to render each frame, the monitor’s refresh no longer has a fixed rate.
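
Here’s a tiny sketch of that idea – made-up frame times, not the actual G-SYNC logic – comparing when frames reach the screen with a fixed 60Hz refresh versus a refresh that follows the GPU:

```python
import math

# Illustration with made-up frame times (not the actual G-SYNC logic):
# compare the interval between displayed frames when the monitor refreshes
# on a fixed 60Hz schedule versus the moment each frame finishes rendering.
REFRESH = 1.0 / 60
render_times = [0.014, 0.021, 0.017, 0.025, 0.015]  # seconds per GPU frame

def fixed_refresh(frame_times):
    """Each finished frame waits for the next fixed 60Hz scan-out."""
    t, shown = 0.0, []
    for ft in frame_times:
        t += ft
        shown.append(math.ceil(t / REFRESH) * REFRESH)
    return shown

def gpu_driven_refresh(frame_times):
    """The monitor starts a refresh as soon as each frame is complete."""
    t, shown = 0.0, []
    for ft in frame_times:
        t += ft
        shown.append(t)
    return shown

for label, shown in (("fixed 60Hz ", fixed_refresh(render_times)),
                     ("GPU-driven ", gpu_driven_refresh(render_times))):
    gaps = [round((b - a) * 1000, 1) for a, b in zip(shown, shown[1:])]
    # Stutter shows up as a mismatch between how long a frame took to render
    # and how long the previous image actually stayed on screen.
    mismatch = [round(g - rt * 1000, 1) for g, rt in zip(gaps, render_times[1:])]
    print(f"{label} display intervals (ms): {gaps}  mismatch vs. GPU: {mismatch}")
```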

This brings big benefits for gamers. First, since the GPU drives the timing of the refresh, the monitor is always in sync with the GPU. So, no more tearing. Second, the monitor update is in perfect harmony with the GPU at any FPS. So, no more stutters, because even as scene complexity is changing, the GPU and monitor remain in sync. Also, you get the same great response time that competitive gamers get by turning off V-SYNC.

G-SYNC moves us a little closer to gaming nirvana – a world of great image quality with no tearing, no monitor stutter, and really fast input response. That lets me get back in the zone when I game. Already, I can see my beige shag carpet and Costco art prints in their crooked frames slowly replaced by the hard rock roads and dragon fire of “Skyrim.”

Life is good again.

  • unt1tled

    If my setup allows for 130fps+ gaming, what happens when the GPU renders frames at a higher rate than the monitor naturally supports?

  • dextronian

    Will this be a future licensed technology, or will we be buying special nVidia-compatible monitors?

  • realjjj

    Any chance Nvidia will bring G-SYNC to mobile devices? Not just when connected to external screens – make a display driver chip if you have to.

  • brobsrt

    I want this on my Overlord Tempest 120Hz IPS panel! Give the IPS peeps some love too!!

  • Dave Reed

    This could be awesome – way more exciting than Mantle :)

    Let’s hope this gets supported on a good selection of monitors (16:10 preferred!) Would be great to see it on mobile, too. Shame the next-gen consoles are going to miss out…

  • bro3886

    Will we be seeing a “test drive” with/without G-SYNC anytime soon? A YouTube video demo setup would be great.

  • stealthgyro

    Watched the PCPer Live stream, and they said it doesn’t translate well over a video of it. Much like trying to show off 4k on a 1080p screen just doesn’t work. Hopefully Frys, Tigerdirect, Microcenter, and other Retail outlets get a good setup I can go checkout.

  • HombreGranJefe

    As a HTPC gamer, I hope you expand this technology into partners in the TV industry; refresh rate is one of the most difficult things for me to combat in my gaming experience and I’d rather not have this tech exclusive to when I use my machine as a regular PC.

  • panik

    Or just get a 144Hz Asus monitor and not see any stuttering. The issue isn’t as bad as the article says it is. While I agree the 60Hz standard is too low, there are already better alternatives on the market that aren’t proprietary products made by Nvidia. Also, don’t tell me that the human eye can’t see above 24Hz anyway – you can see a massive difference with a 120Hz+ monitor compared to 60Hz.

  • http://www.geekministry.org/ TC Johnson

    This is a very exciting technology. How close is it to a market release?

  • stealthgyro

    Given the number of monitor manufacturers listed at the event itself, I’m going to assume it’s licensed technology. Nvidia has been on a PR parade since AMD won both consoles. Their GameWorks API helps AMD performance, and they are showing tools available on Linux. Nvidia wants to be the catch-all: yes, our products work better together, but we also help the other guys.

  • Jure Slegel

    I wonder how low framerates like 20–40fps will affect LCD flickering.

  • panik

    If someone doesn’t have a PC that can run the game above 100 Fps of course they won’t get the performance increases.

  • PillowSmeller

    Can they do this to more than just monitors? I use a 32″ Samsung HDTV, so I can save a power outlet from a dedicated audio setup.

  • panik

    A monitor that can’t display above a certain Frequency will just turn off and display an error.

  • http://www.stephen3.com/ Stephen Smith

    The TV probably has to be built with it inside.

  • Brian_Caulfield

    Reached out to Tom Petersen, the blog’s author. Here’s his comment: “G-SYNC will initially be available only as a module that NVIDIA will sell to monitor OEMs like ASUS and BENQ. We will also likely make a DIY kit available to gamers to retrofit selected existing monitors. We have no plans to license the technology at this point.”

  • Brian_Caulfield

    Great question. Checked with Tom Petersen, the blog’s author. Here’s his response: ‘The GPU will limit its render rate to match the max refresh rate of a monitor. Effectively, G-SYNC acts similar to double-buffered V-SYNC when a game is running at extremely high FPS.’

  • Brian_Caulfield

    Checked with Tom Petersen, this blog’s author. Here’s his response: ‘Good idea… but no mobile announcements today.’

  • Zagdul

    Can you guys please cure world hunger and fix the United States national debt when this is done?

  • Brian_Caulfield

    Great question. I reached out to Tom Petersen, one of G-SYNC’s creators, for comment. Here’s what he had to say: ‘G-SYNC changes the refresh rate of a monitor and really can’t be shown on a traditional monitor. It is very hard/impossible to see the full benefit of G-SYNC in a video capture played back at a fixed rate. We may in the future make some slow-motion videos available on the web… but they honestly don’t do G-SYNC justice.’

  • Brian_Caulfield

    Great feedback. Will pass it on!

  • Brian_Caulfield

    :-)

  • Brian_Caulfield

    Thanks for the feedback! I’ll pass it along!

  • Brian_Caulfield

    Will pass on your comment about mobile!

  • realjjj

    So that’s a yes when the tech can be made small and cheap enough?
    And some folks are asking about laptops – that one should be doable a lot more easily, so any intention to push it into laptops, maybe as soon as next year?

    The tech is exciting if it works well. Price seems to be a huge problem at this point, but one can hope it will drop to close to insignificant soon enough.

  • Dave Reed

    For the first time ever, we’ll be able to see a ‘steady 40fps’, and any other framerate between 30 and 60, without tearing.

    40fps will probably look a fair bit better than 30. But the main advantage will be games that would run at ’60ish’ before – if they slow down when there’s a lot of action, it won’t be the dramatic drop to 30fps, it’ll be much more graceful, and still free from tearing.

    At the higher end, running at 100fps+ won’t give any visual benefit over a vsynced 60fps, at least to most average human eyes, but it should help to reduce latency between input and seeing a visual response to the input – and with this tech, it won’t introduce tearing any more.

    Oh, and it’ll hopefully mean that emulators for old PAL systems can finally run at 50hz, too.

    It’s all good news, so long as you can afford both a monitor and graphics card upgrade when this launches :)

  • CuddleFuddle

    This sounds like something that may be useful for future VR headsets too like Oculus. Users are extremely sensitive to latency in those cases to the point where too much will make them barf!

  • Dave Reed

    YouTube would be useless for demonstrating this tech…

    1) YouTube is limited to, I think, 30fps. Which sucks, as it can’t even show the difference between 30fps and 60fps console games.

    2) The screen that you’re viewing the video on won’t have G-SYNC! It’d be about as effective as trying to show a 3D video on a non-3D-capable screen.

    This is going to be one of those things, like the Oculus Rift, that you just have to see with your own eyes

  • Gunnar Lilleaasen

    You can solve the described issue by getting a proper monitor that refreshes at, say, 120 Hz. You get the same benefit, you don’t need any NVIDIA-specific stuff, and you’d have had to get a new monitor to make use of the NVIDIA-specific stuff anyway.

    The other solution is that you fix your V-Sync so that it doesn’t cause lag on the GPU side. Why not focus your engineering teams on that?

  • Alex

    Please pair this with your Lightboost tech for no motion blur.

  • Brian_Caulfield

    Thanks! I’ll pass your feedback along.

  • Soup

    If this technology is anywhere as good as it sounds, I really hope it becomes universal. Vsync + triple buffering works really well for me, but I know that’s not true for everyone.

  • Zepid

    When can I buy a monitor with this tech and what will be the first models shipping with this? Any news? Partners announced? What?

  • Gunnar Lilleaasen

    This “technology” does not provide anything you couldn’t achieve with getting a screen with higher refresh rate.

  • mlmcasual

    Brian, this is exciting news.

    But a big question I have is what sizes and resolutions are expected to be implemented? 1080p is simply not high enough resolution for moderate and power users. Will this technology be in monitors with resolutions greater than 1080p? This is a critical question.

  • Clevername

    Yeah, I’m a little disappointed there hasn’t been more official adaptation of this from Nvidia. A feature that was originally for 3D, yet people found more use for in 2D mode is pretty big.

  • xenphor

    Will this technology be retroactive? For example, there are games such as Doom 3 that are locked at 60fps. If you uncap the framerate, the physics go out of whack. This isn’t a problem if your monitor is 60Hz and you can maintain 60fps at all times, but that is not the best solution.

    Plus, I believe syncing to a lower FPS than your refresh (like a 30fps-locked game on a 60Hz monitor) increases lag even more.

  • Soup

    The complete elimination of screen tearing can’t be achieved with a higher refresh rate, the effects are only lessened. The latency reduction is much more powerful though. At 60 fps, the worst case display latency is 16.67ms plus the response time of the display. At 120 Hz, the worst case is 8.33ms plus response time. With a technology like this, not only would the display latency be constant (which is a great improvement over fixed refresh rate) but it should be in the order of 1ms plus response time. That would be on par with a 1000 Hz display, which is far beyond the capabilities of our current LCD technology.
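
As a rough illustration of the arithmetic in the comment above (scan-out wait only; panel response time is left out because it varies by display):

```python
# Worst-case wait for the next refresh tick on a fixed-rate panel.
for refresh_hz in (60, 120, 144, 1000):
    worst_case_ms = 1000 / refresh_hz
    print(f"{refresh_hz:>4} Hz: up to {worst_case_ms:6.2f} ms before the next refresh")
```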

  • GravyMaker

    I’m very unlucky with new hardware, so my apologies if I seem overly pessimistic (because this sounds like a great technology), but supposing the G-Sync hardware in a given monitor were to have some sort of issue, would the monitor stop displaying entirely, or is there some failsafe to keep the basic functionality going?

  • Ramon II

    I just love NVIDIA like no other, just pure NVIDIA. Thanks for a wonderful blog, sir Brian_Caulfield.

  • Younus

    I am about to buy an Asus 144Hz monitor, but since G-SYNC is about to come out, can you tell me how long I’ll have to wait before this thing is released?

  • bro3886

    A comparative slow mo vid would be better than nothing.

  • pbvider

    VG248QE will debut next year at $400 with Nvidia’s G-Sync technology.

  • Younus

    I am interested in the VG27 – basically I want the 27-inch version. Will it also debut next year?
    When we say next year, will it be Q1, Q2 or even later?
    Please answer. Thank you :)

  • Gunnar Lilleaasen

    The limitation of the LCD display is the time it takes for the crystals to physically change their angle. That’s a hardware limit that you cannot bypass with just embedding another piece of hardware.

    For most screens that would be like 2 ms today. Meaning the screen should be able to draw a frame within 2 ms. This means it could draw a frame 500 times per second.

    So, the big problem here is the refresh rate. The screen only draws 60 frames per second. So, if the manufacturer had made the screen report its actual higher refresh rate and refresh frames at that rate (which it should be capable of), there would be no tearing.
    So the only thing any embedded NVIDIA hardware could solve would be to bypass the limitation on how often frames are drawn by the screen (if not, it would mean it still draws 60 frames per second and you’d still see the same display latency). Why do we need NVIDIA-specific hardware to solve such an issue, when the manufacturer could just make the screen draw frames at the rate it is capable of?

  • pbvider

    Q1 2014. I really dunno anything about the 27-inch version; the G-SYNC kit will go on sale for $100 at most.

  • Enrico Prova

    OR… since the monitor follows the GPU, I imagine you can simply set an fps limit on your card :)

  • Younus

    OK, last question, buddy – sorry for bugging you though.
    If I go for the VG27 now, will the G-SYNC kit be compatible with the 27″ version?
    If yes, do I have to do any modification to my monitor – open it up, cut things down and so on? Because I am very, very poor at modding stuff.
    And if it does need advanced-level modding, then I might wait.
    Thanks a lot :)

  • pbvider

    I dunno m8, we need to wait and see…

  • Thomag0

    Could this also fix the video stuttering problems ReClock is a (for the most part not really working) makeshift solution for?

  • BlacKHeaDSg10

    Probably every fan of Nvidia will try to kill me for the question I’m about to ask: will G-Sync work only with Nvidia graphics cards?

  • Johan de la Rosée

    So we will need to buy a new monitor/graphics card? :)

  • bro3886

    NVIDIA CEO Jen-Hsun Huang hinted at this during the Nvidia Montreal gig: “[…] It includes a G-SYNC module designed by NVIDIA and integrated into gaming monitors, as well as hardware and software incorporated into certain Kepler-based GPUs. […]” Source. From a business point of view, this move makes very good sense for Nvidia, as monitors are not replaced as frequently as graphics cards, urging gamers to upgrade to – yet another – G-Sync-compatible graphics card.

  • bro3886

    If you don’t own one of the Kepler GPUs, which go with the tech, yes.

  • Soup

    There are 2 big issues with cranking up the refresh rate.

    The first is the amount of bandwidth available in the display interface. In order to drive a display at 500Hz, you’d need more than 4x the bandwidth necessary to drive a 120Hz display. From the information I’ve been able to find, DisplayPort will max out at 240Hz at 1080p.

    The other big issue is the display response time itself. Not only is the manufacturer-reported response time an ideal grey-to-grey (so any other colors will be slower) but they are achieving these timings by overdriving the display, which causes ghosting artifacts. Those ghosting artifacts will be amplified as the refresh rate increases, due to not allowing them time to decay before the next frame is drawn.
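
A back-of-the-envelope version of the bandwidth point above (raw 1080p pixel data only, ignoring blanking intervals and protocol overhead; exact DisplayPort limits depend on the version and link configuration):

```python
# Raw pixel bandwidth needed at 1920x1080, 24 bits per pixel.
WIDTH, HEIGHT, BITS_PER_PIXEL = 1920, 1080, 24

def raw_gbps(refresh_hz):
    return WIDTH * HEIGHT * BITS_PER_PIXEL * refresh_hz / 1e9

for hz in (60, 120, 240, 500):
    print(f"{hz:>3} Hz at 1080p ~ {raw_gbps(hz):4.1f} Gbit/s of raw pixel data")

print("500 Hz needs", round(raw_gbps(500) / raw_gbps(120), 1), "x the bandwidth of 120 Hz")
```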

  • BlacKHeaDSg10

    Tnx for info

  • Gunnar Lilleaasen

    There is no good reason why some special NVIDIA hardware should be able to solve the issue, because there are limitations in the whole chain they are relying on – the chain is no stronger than its weakest link.

    Sure, the bandwidth of DP is not enough to transfer 500 frames per second, but it should be able to transfer 120-240 frames per second. So why not just set the display to allow refresh rates up to that limit, regardless if you have NVIDIA hardware in the monitor?

    And if the display itself isn’t able to achieve a non-ghosting picture with 120+ frames per second, then there is no way hardware from NVIDIA can solve that issue because it’s inherently a problem with LCD which is what actually displays the pixels.

    The point being, you still need LCD screens that are able to draw the frames and decay fast enough, you still need a display interface that has the available bandwidth, and these problems could be solved without limiting yourself to using NVIDIA hardware.

    Sure, if you push 150 frames per second from your NVIDIA card to some NVIDIA hardware inside your monitor, that hardware still cannot draw the 150 frames without causing ghosting artifacts – unless the display was able to do that fine to begin with, in which case it could simply have reported a higher refresh rate to the OS.

  • Soup

    I think you’re missing the point of the G-sync. It’s not meant to increase the refresh rate of the display. It’s meant to reduce input latency, provide smoother motion, and eliminate screen tearing and it can do this BETTER than a higher refresh rate display.

    The reason they’re making this is that there are so many technical limitations preventing us from increasing the refresh rate (not to mention how expensive those displays would be; I expect G-SYNC to be very reasonably priced).

  • Gunnar Lilleaasen

    They should be able to render 120 FPS on the GPU side and with V-SYNC still output 60 FPS without ever dropping below 60 FPS unless the GPU renders at a lower rate. So their current V-SYNC is bad, they can make it a lot better and fix this on the GPU side, yet their solution is to implement a device in the monitor that does what should be done on the GPU side of the link.

    Because even if you push 60+ FPS from the GPU to the device inside the monitor and then handle the output with that device, you’ll still end up with showing only as much as the max refresh rate allows. So you haven’t removed any actual latency.

    Example:
    GPU generates 75 FPS. Screen supports 60 FPS. But we’ve disabled V-SYNC, so it still gets 75 FPS. G-SYNC device now gets 15 FPS more than the display can handle. Hence it drops frames to avoid ghosting and you still get as many actual frames drawn as you would with V-SYNC.

    Where in that process is it I am missing out on something? How is it different from just a V-SYNC that works well?

  • lozandier

    Well Kepler is licenseable tech; Nvidia announced that a few weeks ago.

    I’m beginning to think it may be possible that they’re open to allowing non-Nvidia cards to use the technology, but at a price. It’ll just be immediately possible on new/newer Nvidia cards.

    That’s sort of moot of course because it’ll take a while for each new architecture they make to be licenseable for others, and others may not want to push that new cost to the end users of their products with an increased price point.

    However, I think it may be irrelevant to point out that Kepler Arch. is licenseable.

  • Soup

    I don’t understand what this problem you perceive with vsync is. You can’t display more than 60 fps on a 60 Hz display even with vsync off. What you end up with is screen tearing with lots of partial frames which has a negative impact on motion quality. In the very best case, you’re still only getting 60 mostly tear-free frames.

    As for G-sync, I’m assuming it’s going to cap your framerate at the maximum refresh rate of your display since generating more would negatively impact motion quality no matter how smart the device is.

    The benefit of G-sync won’t be seen when your framerate matches the display’s refresh rate, it’s when your framerate drops. For instance, if I’m generating 40 fps vsynced on a 60 Hz display, frames will alternate between 1 refresh and 2 refreshes per frame. Turning vsync off would cause significant screen tearing which will on average cause every other refresh to display half of two different frames. Both of these methods will cause significant visual stuttering. With G-sync, the display will simply refresh at 40 Hz in time with the GPU, eliminating both problems.

    I should note that at this point we don’t know if G-sync will even be offered on 60 Hz displays. The only model nVidia has mentioned so far is a 144 Hz display.
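
A small sketch of the 40 fps case described in the comment above (assumed ideal timings, purely illustrative): on a fixed 60 Hz panel with V-SYNC, a 25 ms frame is held for either one refresh or two, so display intervals alternate, while a refresh that follows the GPU would show every frame for 25 ms.

```python
import math

# Illustrative timings only: 40 FPS content on a fixed 60 Hz panel with V-SYNC.
REFRESH_MS = 1000 / 60   # ~16.7 ms per refresh
FRAME_MS = 1000 / 40     # 25 ms per rendered frame

shown_at = []
t = 0.0
for _ in range(6):
    t += FRAME_MS
    # The frame becomes visible at the first refresh tick after it is ready.
    shown_at.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)

gaps = [round(b - a, 1) for a, b in zip(shown_at, shown_at[1:])]
print("fixed 60 Hz + V-SYNC, time each frame stays on screen (ms):", gaps)
print("GPU-driven refresh, time each frame stays on screen (ms):", round(FRAME_MS, 1))
```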

  • Vinson Peters

    Glad I jumped into PC gaming a couple of weeks ago. Guess I’ll be waiting to pick up a monitor.

  • bro3886

    No problem.

  • pocketdrummer

    I agree with Clevername – I found that a 120-144Hz monitor was much better suited to fast refresh rates than to 3D. I really wish the marketing department would catch up with that concept and stop trying to push 3D.

  • pocketdrummer

    Well, the better way for them to do this would be to double up. If it’s running at 30fps, run it at 60Hz; at 20fps, 40Hz. Anything above 30 can run 1:1.
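
Reading the “double up” suggestion above literally, it would look something like this (a hypothetical sketch with an assumed 30Hz cutoff, not anything NVIDIA has announced):

```python
# Hypothetical: repeat frames so the panel never refreshes at or below the cutoff.
THRESHOLD_HZ = 30  # assumed cutoff from the comment above, for illustration only

def repeated_refresh(gpu_fps):
    repeats = 1
    while gpu_fps * repeats <= THRESHOLD_HZ:
        repeats += 1
    return gpu_fps * repeats, repeats

for fps in (45, 30, 20, 12):
    hz, n = repeated_refresh(fps)
    print(f"{fps:>2} FPS -> show each frame {n}x -> panel refreshes at {hz} Hz")
```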

  • Horation_Tobias_HumpleDinK

    Is this something you can buy for a monitor, or does it have to be built in? I assume the latter (as in, need a new monitor). Shame you can’t get drivers or something that can enable the feature.

  • BobBobson

    Since v-sync stands for vertical-synchronization, then shouldn’t yours be called F-sync for frame-synchronization? Or at the very least, the self-promoting N-sync? I suppose GPU-synchronization could work, but frame makes more sense since a lot of stuff happens in the GPU, but you are only synchronizing to the frame output.

  • Gunnar Lilleaasen

    So what you are describing is variable refresh rate. Which should be standardized and not be implemented as something NVIDIA specific IMO.

  • Peter Mcteague

    So, I’d be able to buy something for my BenQ XL2420t that would allow me to use g-sync, without buying a new monitor?

  • abstraction

    I know info on the G-SYNC mod kit is still a bit hazy; nevertheless, I’m curious about whether or not I’ll have the option of modding an Asus PB278Q monitor. Additionally, are there any 27-inch IPS or PLS monitors on the way that will support G-SYNC? Any information regarding this would be greatly appreciated, as I am in the market for both a new card and a monitor. Thanks!

  • Wanton

    Clearly you don’t know what you are talking about.
    Watch this video – I don’t think vsync can be explained more simply than this.
    http://www.youtube.com/watch?v=KhLYYYvFp9A

  • bro3886

    By far the most informative post yet. Thanks, Wanton!

  • Wanton

    G-SYNC sounds so good, and I’d like to know: do Asus or any of your other partners plan to release 27-inch monitors with this tech during the Q1 2014 launch?

  • ThePurpleGamer

    You guys should make your own version of the Oculus Rift

  • Jeffrey Byers

    Sony PS4 +
    Sony G-Sync HDTV

    = awesome!

    Seriously, have NVidia license G-Sync. It helps Sony sell both Sony HDTV’s and PS4’s. It’s a huge win for NVidia as well not just because of licensing. Many people look at new consoles and say “they use AMD so I should buy AMD for my PC.”

    People could look at consoles and say “wow, that G-Sync is great, I’ll get that tech for my PC!”

  • THANATOSXR

    MEH AMD’s R9s are better and cheaper, I seriously have no clue how Nvidia isnt bankrupt yet.

  • Katana Seiko

    Soo.. How long until you go onto the market with this?

  • bro3886

    “[…] ASUS plans to release a G-SYNC-enhanced VG248QE gaming monitor in the first half of 2014 with pricing set at $399 USD in North America. […]” Source…

  • Denis

    Will this work on Apple Cinema displays?

  • Mikle Sorokin

    When did the NVidia developers get the idea for G-Sync?

  • waymon04

    Will there be a place in Chicago I can try out a G-Sync monitor?

  • David Barnes

    OK, so I have a 144Hz G-Sync monitor and I mostly get 144+ fps… so am I right in saying that whenever my frame rate drops below 144fps the monitor drops with it, and whenever my GPU would be putting out more than 144fps the GPU is capped at 144, so it’s always in sync? I have a G-Sync monitor and a 780 Ti, and I’m getting a weird stutter at high frame rates. I believe it’s because the GPU is trying to put out many more frames than 144/sec… or do I have a dodgy unit?