G-Sync is the god-level gaming upgrade.

I'm interested in seeing this solution in action since I never use RTSS and think I always have smooth performance but maybe I'm missing something.

Do yourself a massive favour - if you're happy and think you have smooth performance, don't change anything. Seriously.
 
Do yourself a massive favour - if you're happy and think you have smooth performance, don't change anything. Seriously.

The one thing that is bothering me, and I doubt this is related to G-Sync, is that in 2D isometric games like Pillars of Eternity or the old Divine Divinity (the first one), the background becomes blurry enough during scrolling that it is slightly irritating to the eye, even though I'm running those games at a 144Hz refresh rate. Is this blurriness normal? Can anyone suggest some troubleshooting tips?
 
The one thing that is bothering me, and I doubt this is related to G-Sync, is that in 2D isometric games like Pillars of Eternity or the old Divine Divinity (the first one), the background becomes blurry enough during scrolling that it is slightly irritating to the eye, even though I'm running those games at a 144Hz refresh rate. Is this blurriness normal? Can anyone suggest some troubleshooting tips?

How much blurriness? All(?) TFT screens have some amount of blurriness; ULMB can help with it but doesn't completely remove it.
 
How much blurriness? All(?) TFT screens have some amount of blurriness; ULMB can help with it but doesn't completely remove it.

Yeah I agree.

I was just playing Rayman Legends (at 60hz) and that was really blurry with the scrolling. With 3D stuff I don't notice it much or at all, but high-res art scrolling to the side really highlighted the problem.
 
The one thing that is bothering me, and I doubt this is related to G-Sync, is that in 2D isometric games like Pillars of Eternity or the old Divine Divinity (the first one), the background becomes blurry enough during scrolling that it is slightly irritating to the eye, even though I'm running those games at a 144Hz refresh rate. Is this blurriness normal? Can anyone suggest some troubleshooting tips?
Try lightboost
 
The one thing that is bothering me, and I doubt this is related to G-Sync, is that in 2D isometric games like Pillars of Eternity or the old Divine Divinity (the first one), the background becomes blurry enough during scrolling that it is slightly irritating to the eye, even though I'm running those games at a 144Hz refresh rate. Is this blurriness normal? Can anyone suggest some troubleshooting tips?
The refresh rate may be high, but the framerate is low. Since G-Sync displays are flicker-free monitors (most LCD and OLED displays are, but G-Sync displays must be), that means the amount of motion blur is directly tied to the framerate.
ULMB, if your monitor has the option, strobes the backlight to reduce motion blur, but the problem is that it only works at 85/100/120Hz. 30 FPS strobed at 120Hz will break up into four distinct images in motion, instead of there being motion blur. You really need the framerate to match the strobe rate for it to work correctly. But you may still prefer this to motion blur.
Flicker is the reason why CRT displays had such clear motion. The less time an image is held on-screen, the less motion blur we will perceive, and CRTs had a very short "hold time" - which is why they flickered a lot.
What I'm really hoping for is NVIDIA to combine ULMB with G-Sync so that it will automatically strobe the image in sync with the framerate, instead of ULMB only working at 85/100/120Hz.
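The image-splitting described above is just arithmetic, and a tiny Python sketch makes it concrete (illustrative numbers only, nothing monitor-specific): when the strobe rate is a multiple of the framerate, each unique frame is flashed several times while the eye keeps tracking, so the eye sees that many distinct copies.

```python
# Sketch: why strobing (ULMB) at a rate above the framerate splits motion
# into multiple distinct images instead of blur. Illustrative numbers only.

def distinct_images(strobe_hz: int, fps: int) -> int:
    """Each unique frame is flashed strobe_hz / fps times, and the eye
    sees each flash at a different position along its tracking motion."""
    assert strobe_hz % fps == 0, "cleanest case: strobe rate is a multiple of fps"
    return strobe_hz // fps

print(distinct_images(120, 30))   # the 30 FPS strobed at 120Hz case -> 4 copies
print(distinct_images(120, 120))  # framerate matches strobe rate -> 1 (no splitting)
```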
 
So I just got an Acer Predator x34, some questions:

There's no reason not to overclock this to 100Hz, right? And once I do, can I just leave it there?

If a game isn't hitting near there and averaging closer to 60 fps, are you better off being at 60hz, or should you still keep it at 100?

I currently have a 1080 Ti and a Ryzen 7 1700 (not overclocked yet).

Any suggestions or tips for the monitor, or settings to make the most of all this, would be a great help.
 
The refresh rate may be high, but the framerate is low. Since G-Sync displays are flicker-free monitors (most LCD and OLED displays are, but G-Sync displays must be), that means the amount of motion blur is directly tied to the framerate.
ULMB, if your monitor has the option, strobes the backlight to reduce motion blur, but the problem is that it only works at 85/100/120Hz. 30 FPS strobed at 120Hz will break up into four distinct images in motion, instead of there being motion blur. You really need the framerate to match the strobe rate for it to work correctly. But you may still prefer this to motion blur.
Flicker is the reason why CRT displays had such clear motion. The less time an image is held on-screen, the less motion blur we will perceive, and CRTs had a very short "hold time" - which is why they flickered a lot.
What I'm really hoping for is NVIDIA to combine ULMB with G-Sync so that it will automatically strobe the image in sync with the framerate, instead of ULMB only working at 85/100/120Hz.

The framerate in Divine Divinity is very high, over 100 FPS at all times... I'll try ULMB.
 
How much blurriness? All(?) TFT screens have some amount of blurriness; ULMB can help with it but doesn't completely remove it.

It's not much by any means; I suspect this is a normal thing for TFT screens. I was just hoping there is a way to improve it.
 
Got a Dell S2716DG because it seemed really cheap for a 27" G-Sync ($450 at best buy) and I like my other Dell monitors.

I wasn't instantly blown away, but it's looking really good after a lot of calibration. Still getting some slight ghosting when scrolling in Windows, which apparently I can fix by enabling ULMB, but that disables G-Sync. Is there a workaround for this?

G-Sync is pretty sweet. It's also my first time with a >60Hz monitor. Rocket League at 140fps/144Hz feels pretty great. So I want to cap FPS to around 142 in each game if my refresh rate is 144Hz, with in-game V-Sync off? Are those basically the only checkboxes for setting up each game for G-Sync?

I noticed the biggest difference in Planet Coaster. The FPS gets a little chunky when in the ride camera, but now it feels really smooth even when it drops. I was also surprised when playing XCOM 2 that my FPS was only ~40, because it felt smoother. My framerate in that game has been all over the place since I got the DLC; not sure what's up with that...
 
So I just got an Acer Predator x34, some questions:

There's no reason not to overclock this to 100Hz, right? And once I do, can I just leave it there?

If a game isn't hitting near there and averaging closer to 60 fps, are you better off being at 60hz, or should you still keep it at 100?

I currently have a 1080 Ti and a Ryzen 7 1700 (not overclocked yet).

Any suggestions or tips for the monitor, or settings to make the most of all this, would be a great help.

There's no reason not to overclock to 100Hz, no. Well, I suppose it does put some strain on the monitor, since you're technically overclocking it, and some people did report having to slightly lower the overclock after months of use because the monitor wouldn't do 100Hz anymore, but I wouldn't worry about it. Certainly not while still under warranty. 100Hz is one of the selling points of this monitor; it'd be a sin not to take advantage of it.

You don't have to lower the refresh rate to 60hz in any scenario. This is a g-sync monitor, its refresh rate is tied to the framerate. G-sync takes care of everything, just be sure to enable it and turn v-sync off in games.

https://www.howtogeek.com/270672/how-to-enable-optimize-and-tweak-nvidia-g-sync/

As for settings, I'm very happy with the ICC profile from TFTCentral.

http://www.tftcentral.co.uk/articles/icc_profiles.htm
 
You don't have to lower the refresh rate to 60hz in any scenario. This is a g-sync monitor, its refresh rate is tied to the framerate. G-sync takes care of everything, just be sure to enable it and turn v-sync off in games.

That’s not entirely true. I own a few games where game speed is tied to framerate and they’re essentially broken unless I force my 144 hz G-Sync monitor to 60 hz. The new Darius game, for example.
 
That's not entirely true. I own a few games where game speed is tied to framerate and they're essentially broken unless I force my 144 hz G-Sync monitor to 60 hz. The new Darius game, for example.


There are exceptions and a few other G-Sync tricks, like limiting your FPS to a couple of frames below your monitor's max refresh rate, but in general he should be golden.
 
Man, I wish they still sold the G-Sync kits. I can't find them anywhere on the internet. I would honestly buy three right now if I could. Does anyone know where to get one, or does anyone have an extra?
 
I own a few games where game speed is tied to framerate and they’re essentially broken unless I force my 144 hz G-Sync monitor to 60 hz. The new Darius game, for example.
Usually creating a profile in RTSS to cap them to 60 FPS works for games like that. Not always, but most of the time.
That way you still benefit from the higher refresh rate and lower input lag.
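On why a 60 FPS cap on a high-refresh G-Sync panel still beats forcing the monitor to 60Hz: on a variable-refresh display, each frame is scanned out at the panel's maximum refresh speed. A hedged sketch in Python (simplified model; it counts only scan-out time and ignores the rest of the pipeline):

```python
# Sketch: why 60 FPS capped on a 144Hz G-Sync panel has less display latency
# than the same 60 FPS on a 60Hz panel. Simplified model: we only count
# scan-out time (the panel draws each frame top to bottom at its max speed).

def scanout_ms(panel_max_hz: int) -> float:
    """Time for one full frame to be drawn onto the panel, in milliseconds."""
    return 1000.0 / panel_max_hz

print(f"60Hz panel:  {scanout_ms(60):.1f} ms to scan a frame out")   # ~16.7 ms
print(f"144Hz panel: {scanout_ms(144):.1f} ms to scan a frame out")  # ~6.9 ms
```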
 
Yeah, ULMB at 120Hz is significantly clearer than G-Sync at 144Hz.

Problem is you can't use G-Sync with ULMB, and ULMB darkens your monitor quite a bit (depending on your monitor).

Good place to test a comparison here: https://www.testufo.com/#test=framerates

So I tried ULMB and honestly couldn't see a difference in Divine Divinity or the UFO test you linked. In any case, does getting an IPS monitor improve this in any way?
 
It shouldn't be stuttering when the framerate hits the refresh rate, unless you have either disabled V-Sync or enabled Fast Sync.
Optimal settings are to have V-Sync enabled in the NVIDIA Control Panel, and an RTSS framerate limit set 3 FPS lower than the maximum refresh rate. In-game V-Sync is optional, though some games may perform better with it enabled.
If it is a known-good framerate limiter, it's possible to slightly reduce latency even more by using the game's own framerate limiter instead of RTSS, but RTSS is universal and will give you a more consistent experience.

This is good to know, thanks.
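The "3 FPS below maximum refresh" rule quoted above is easy to put numbers on. A quick sketch (Python, plain arithmetic; the exact margin is a community rule of thumb, not an official figure):

```python
# Sketch of the "cap ~3 FPS below refresh" rule of thumb.
# Illustrative only; the exact margin people recommend varies slightly.

def rtss_cap(refresh_hz: int, margin_fps: int = 3) -> int:
    """Suggested RTSS framerate limit for a given maximum refresh rate."""
    return refresh_hz - margin_fps

for hz in (60, 100, 120, 144):
    cap = rtss_cap(hz)
    headroom_ms = 1000 / cap - 1000 / hz  # extra frame-time slack vs. refresh
    print(f"{hz}Hz -> cap at {cap} FPS ({headroom_ms:.2f} ms of slack per frame)")
```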
 
Ay bruhs, any release date news for the ROG Swift PG35VQ and Acer Predator X35?

I decided one of them will be my next upgrade, but I'm only seeing "Q4" for release dates so far. Anything more definitive? I have my ~$1500 ready for the 3440x1440 @ 200Hz + Quantum Dot and HDR glory.
 
Ay bruhs, any release date news for the ROG Swift PG35VQ and Acer Predator X35?
I decided one of them will be my next upgrade, but I'm only seeing "Q4" for release dates so far. Anything more definitive? I have my ~$1500 ready for the 3440x1440 @ 200Hz + Quantum Dot and HDR glory.
I think they were recently delayed to 2018, and they may be more than $1500.
 
Picked up the ViewSonic XG2703-GS. So far so good, but I am getting occasional artifact bars flickering at various parts of the screen when games are loading. Changing my game from windowed to full-screen mode apparently fixes it, but does anyone know what's the deal here?

Running dual monitor with a second 1920x1200 60Hz monitor on a 1070.
 
Picked up the ViewSonic XG2703-GS. So far so good, but I am getting occasional artifact bars flickering at various parts of the screen when games are loading. Changing my game from windowed to full-screen mode apparently fixes it, but does anyone know what's the deal here?

Running dual monitor with a second 1920x1200 60Hz monitor on a 1070.

Check if G-Sync for windowed mode is enabled in the NVIDIA Control Panel.
 
No, it stops you from going over your monitor's refresh rate, which disables G-Sync.

Then why limit the framerate below the monitor's max?
That is the one thing I don't understand. I mean, doesn't G-Sync/V-Sync above the monitor's limit just cover the whole framerate spectrum? Is it for input lag?
 
Then why limit the framerate below the monitor's max?
That is the one thing I don't understand. I mean, doesn't G-Sync/V-Sync above the monitor's limit just cover the whole framerate spectrum? Is it for input lag?
V-Sync is a hard cap. The game cannot exceed the V-Sync limit.
But when the framerate hits the V-Sync limit it can start buffering frames, which adds input lag.

Framerate limiters are supposed to be a hard limit, but do not have the same guarantee like V-Sync that they will never be exceeded.
Due to frame-time variance, even if you have a framerate limit set below the maximum refresh rate, sometimes you might get a frame that exceeds this limit - which will tear if you don't have V-Sync enabled.
V-Sync compensates for this frame-time variance and prevents tearing, but the framerate limiter prevents V-Sync from starting to buffer frames.

So 99% of the time the framerate limiter prevents tearing/stuttering, but having V-Sync enabled at the same time guarantees it.
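The frame-time variance argument above can be shown with a toy example. A hedged Python sketch (the frame times are made up for illustration): even with a 141 FPS cap on a 144Hz panel, a couple of frames come in faster than the refresh interval, and those are exactly the ones V-Sync catches.

```python
# Sketch: why a frame cap a few FPS below refresh still wants V-Sync as a
# backstop. Frame times jitter around the cap; any frame delivered faster
# than the refresh interval would tear with V-Sync off. Numbers illustrative.

REFRESH_HZ = 144
CAP_FPS = 141

refresh_interval = 1000.0 / REFRESH_HZ   # ~6.94 ms
cap_interval = 1000.0 / CAP_FPS          # ~7.09 ms

# Hypothetical measured frame times (ms): mostly at the cap, with jitter.
frame_times = [7.09, 7.10, 7.08, 6.90, 7.12, 7.09, 6.85, 7.11, 7.09, 7.10]

early = [t for t in frame_times if t < refresh_interval]
print(f"{len(early)} of {len(frame_times)} frames beat the refresh interval")
# Those frames tear with V-Sync off; with V-Sync on they are held instead.
```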

Picked up the ViewSonic XG2703-GS. So far so good, but I am getting occasional artifact bars flickering at various parts of the screen when games are loading. Changing my game from windowed to full-screen mode apparently fixes it, but does anyone know what's the deal here?
Running dual monitor with a second 1920x1200 60Hz monitor on a 1070.
What do you mean by "artifact bars"? Screen tearing, or something else?
It does sound like screen tearing, if windowed mode fixes it. Make sure that V-Sync is set ON for the global profile in the NVIDIA Control Panel.

What happens if you disable the other monitor? (WIN+P)
 
I am having second thoughts about getting a G-Sync monitor after reading up on this thread a bit. All this stuff with ghosting and blurring and overclocking. I think I would do just fine with a 144Hz refresh rate and save myself some money in the process. However! I have a very stupid question for you GAF; I'm sorry, I have never actually seen anything higher than 75Hz until just recently. I have been living my whole life thinking that the human eye can't "really" see anything higher than 60. Recently I witnessed a 75Hz refresh rate at a friend's place (playing Overwatch) and I could immediately notice the smoothness of it. Hence my consideration for G-Sync. Anyway, sorry, I got derailed off my question: so what happens if I'm running my games at a lower FPS than my screen's refresh rate? Let's say I'm playing Red Dead Redemption 2 at 60 FPS on a 144Hz monitor?
 
I am having second thoughts about getting a G-Sync monitor after reading up on this thread a bit. All this stuff with ghosting and blurring and overclocking. I think I would do just fine with a 144Hz refresh rate and save myself some money in the process. However! I have a very stupid question for you GAF; I'm sorry, I have never actually seen anything higher than 75Hz until just recently. I have been living my whole life thinking that the human eye can't "really" see anything higher than 60. Recently I witnessed a 75Hz refresh rate at a friend's place (playing Overwatch) and I could immediately notice the smoothness of it. Hence my consideration for G-Sync. Anyway, sorry, I got derailed off my question: so what happens if I'm running my games at a lower FPS than my screen's refresh rate? Let's say I'm playing Red Dead Redemption 2 at 60 FPS on a 144Hz monitor?
If the game is refreshing at a different rate than a non-G-Sync display, then you get tearing, unless you use some form of V-Sync, which will add lag. If the display has a G-Sync module, then the display's refresh rate is adjusted to be in sync with the game's refresh rate, meaning no tearing, and therefore no need to add lag with V-Sync.
 
If the game is refreshing at a different rate than a non-G-Sync display, then you get tearing, unless you use some form of V-Sync, which will add lag. If the display has a G-Sync module, then the display's refresh rate is adjusted to be in sync with the game's refresh rate, meaning no tearing, and therefore no need to add lag with V-Sync.

Yes, thank you. What I meant, though, is with V-Sync on. Would that mean that the monitor's refresh rate is also variable? In other words: a 144Hz monitor with V-Sync on and a game running at, say, 60 FPS. Would that introduce tearing? Or would the monitor's refresh rate go down to 60 to match that of the game?
 
I am having second thoughts about getting a G-Sync monitor after reading up on this thread a bit. All this stuff with ghosting and blurring and overclocking. I think I would do just fine with a 144Hz refresh rate and save myself some money in the process. However! I have a very stupid question for you GAF; I'm sorry, I have never actually seen anything higher than 75Hz until just recently. I have been living my whole life thinking that the human eye can't "really" see anything higher than 60. Recently I witnessed a 75Hz refresh rate at a friend's place (playing Overwatch) and I could immediately notice the smoothness of it. Hence my consideration for G-Sync.
I don't think that high refresh rate displays are worth it without also supporting variable refresh rates. (G-Sync for NVIDIA, FreeSync for AMD)
It is only "overclocking" in the sense that it exceeds the DisplayPort 1.2 spec, so it's possible that not all cables will support it. It is expected that most people will use their displays with this option enabled.

G-Sync was not created to eliminate motion blur. By supporting >60Hz refresh rates and having tightly controlled overdrive, it does reduce motion blur somewhat.
The technology is designed to improve game smoothness and eliminate screen tearing without adding latency, rather than eliminate motion blur.

The way to fix motion blur is to make the display flicker by strobing the backlight, just like CRTs used to flicker.
That's what ULMB does. You cannot (currently) combine G-Sync with ULMB - you have to pick one or the other. Most 144Hz+ G-Sync displays also support ULMB, though it's typically limited to 85/100/120Hz.

Anyway, sorry, I got derailed off my question: so what happens if I'm running my games at a lower FPS than my screen's refresh rate? Let's say I'm playing Red Dead Redemption 2 at 60 FPS on a 144Hz monitor?
Without G-Sync, you will have stuttering (always) and either tearing (V-Sync Off) or input lag (V-Sync On).

Yes, thank you. What I meant, though, is with V-Sync on. Would that mean that the monitor's refresh rate is also variable? In other words: a 144Hz monitor with V-Sync on and a game running at, say, 60 FPS. Would that introduce tearing? Or would the monitor's refresh rate go down to 60 to match that of the game?
V-Sync will prevent tearing. However it adds lag, and you would have to lock the game to a divisor of the refresh rate for it to be smooth: 144, 72, 48, 36 FPS.
You cannot play 60 FPS games smoothly on a 144Hz display - it will stutter constantly. You could manually set the refresh rate to 60Hz or 120Hz though.

The point is that G-Sync automatically syncs the refresh rate to the framerate no matter what it is, rather than updating at a fixed rate.
With a fixed refresh rate display, even if you drop the refresh rate to 60Hz you cannot guarantee that the framerate will always be 60 FPS unless it's an old game that isn't demanding to run at all.
If the game drops to 58 FPS on a 60Hz display, it will stutter. On a G-Sync display, the refresh rate will also drop from 60Hz to 58Hz and it will remain smooth rather than stuttering.
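The 60-on-144 stutter described above is easy to demonstrate. A small Python sketch (simplified model: V-Sync on a fixed 144Hz refresh, frames ready at perfect 60 FPS intervals, each frame becoming visible on the next refresh tick):

```python
# Sketch: frame pacing of a 60 FPS game on a fixed 144Hz display with V-Sync.
# A new frame can only appear on a refresh tick, so frames are held for an
# uneven mix of 2 and 3 refresh cycles -> visible judder instead of a steady
# cadence. Simplified model; real frame delivery jitters further.
import math

REFRESH_HZ = 144
GAME_FPS = 60

tick = 1 / REFRESH_HZ
frame = 1 / GAME_FPS

# For each game frame, find the refresh tick on which it becomes visible.
visible_ticks = [math.ceil(i * frame / tick - 1e-9) for i in range(1, 13)]
hold_cycles = [b - a for a, b in zip(visible_ticks, visible_ticks[1:])]
print(hold_cycles)  # alternating 2s and 3s rather than a constant hold time
```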
 
If the game drops to 58 FPS on a 60Hz display, it will stutter. On a G-Sync display, the refresh rate will also drop from 60Hz to 58Hz and it will remain smooth rather than stuttering.

Also worth noting: quite a few games don't support G-Sync, and loads more have problems with framerates higher than 60, so depending on what games you play you might not even notice any difference.
 
Also worth noting: quite a few games don't support G-Sync, and loads more have problems with framerates higher than 60, so depending on what games you play you might not even notice any difference.
The vast majority of games do work with G-Sync.
There is still a latency benefit for games capped to 60 FPS when used on a high refresh rate display. And again, even games capped to 60 still have the potential to drop below 60.
 
I don't think that high refresh rate displays are worth it without also supporting variable refresh rates. (G-Sync for NVIDIA, FreeSync for AMD)
It is only "overclocking" in the sense that it exceeds the DisplayPort 1.2 spec, so it's possible that not all cables will support it. It is expected that most people will use their displays with this option enabled.

G-Sync was not created to eliminate motion blur. By supporting >60Hz refresh rates and having tightly controlled overdrive, it does reduce motion blur somewhat.
The technology is designed to improve game smoothness and eliminate screen tearing without adding latency, rather than eliminate motion blur.

The way to fix motion blur is to make the display flicker by strobing the backlight, just like CRTs used to flicker.
That's what ULMB does. You cannot (currently) combine G-Sync with ULMB - you have to pick one or the other. Most 144Hz+ G-Sync displays also support ULMB, though it's typically limited to 85/100/120Hz.


Without G-Sync, you will have stuttering (always) and either tearing (V-Sync Off) or input lag (V-Sync On).


V-Sync will prevent tearing. However it adds lag, and you would have to lock the game to a divisor of the refresh rate for it to be smooth: 144, 72, 48, 36 FPS.
You cannot play 60 FPS games smoothly on a 144Hz display - it will stutter constantly. You could manually set the refresh rate to 60Hz or 120Hz though.

The point is that G-Sync automatically syncs the refresh rate to the framerate no matter what it is, rather than updating at a fixed rate.
With a fixed refresh rate display, even if you drop the refresh rate to 60Hz you cannot guarantee that the framerate will always be 60 FPS unless it's an old game that isn't demanding to run at all.
If the game drops to 58 FPS on a 60Hz display, it will stutter. On a G-Sync display, the refresh rate will also drop from 60Hz to 58Hz and it will remain smooth rather than stuttering.

Much obliged, sir, much obliged. You're making me reconsider now. Should I bite the bullet? I love metroidvanias a lot though; I've been playing Sundered lately (great game for genre lovers) and I would hate to have it all blurred out and stuff. Can't have your cake and eat it too, I guess. There is always a compromise.
 
The vast majority of games do work with G-Sync.
There is still a latency benefit for games capped to 60 FPS when used on a high refresh rate display. And again, even games capped to 60 still have the potential to drop below 60.

I'd frame it that a vast majority of games that would benefit from G-Sync will work with G-Sync.
I play a lot of Visual Novels, and I've yet to find one that works with it ;)
Games using Unity don't seem to support G-Sync either. :\

And yeah, the reduced input latency is golden, and it works even when the source framerate is capped at 60 FPS; it's really, really noticeable in emulators when played with a gamepad. I had to switch to my secondary, non-G-Sync, 60Hz monitor a few weeks ago, and despite the game running at the same framerate on both monitors, it felt like molasses on the secondary.
 
Much obliged, sir, much obliged. You're making me reconsider now. Should I bite the bullet? I love metroidvanias a lot though; I've been playing Sundered lately (great game for genre lovers) and I would hate to have it all blurred out and stuff. Can't have your cake and eat it too, I guess. There is always a compromise.
Unless you are using a CRT, all displays have a lot of motion blur due to flat panels generally being flicker-free.
G-Sync displays are some of the best non-CRT displays for it due to the support for high refresh rates and tightly-controlled overdrive.
Just make sure that you buy one with a TN or IPS panel rather than a VA panel if you are concerned about ghosting.

I'd frame it that a vast majority of games that would benefit from G-Sync will work with G-Sync.
I play a lot of Visual Novels, and I've yet to find one that works with it ;)
Games using Unity don't seem to support G-Sync either. :\
Do you have G-Sync set to Fullscreen-only, rather than Fullscreen+Windowed Mode?
I don't really play Visual Novels, but I just checked and it seems to work in VA-11 Hall-A and Danganronpa.
Danganronpa is really the only one where that matters at all, since it's a 3D game, while most games in the genre display mostly-static 2D images with text.
 
Do you have G-Sync set to Fullscreen-only, rather than Fullscreen+Windowed Mode?

I generally leave it as Fullscreen only, and switch Windowed mode per app with Inspector, as the latter introduces issues in places.
But yeah, I just tried VA-11 Hall-A and it supports G-Sync in all of its 30 fps glory :P

All three currently used panel technologies have issues - TN with poor colors and terrible viewing angles, IPS with bad blacks and VA with response time, as you mentioned. Just a question what is more important for you.

Oh, and I did some tinkering and was able to force G-Sync on in an Unity game (Hollow Knight) by making the game use DXGI flip presentation model.
 
I generally leave it as Fullscreen only, and switch the Windowed mode per app with Inspector, as the later introduces issues in places.
That will be why you aren't seeing it working in so many games then.
Unity games nearly always use borderless windowed mode rather than true fullscreen exclusive mode.

Oh, and I did some tinkering and was able to force G-Sync on in an Unity game (Hollow Knight) by making the game use DXGI flip presentation model.
You shouldn't have to do anything to get G-Sync working with Hollow Knight.

I would suggest creating a G-Sync blacklist profile that you add specific applications to, rather than enabling it per-game via NVIDIA Profile Inspector.
Create a new profile, set Application State and Application Requested State to "Disallow" and add the executables you want to blacklist to that profile.
That will soft-disable G-Sync for those applications rather than switching the monitor out of G-Sync mode.
 
That will be why you aren't seeing it working in so many games then.
Unity games nearly always use borderless windowed mode rather than true fullscreen exclusive mode.

Yeah, but having G-Sync enabled for windowed mode makes any UAC screen flicker like hell. Not really something a blacklist will fix, AFAIK.
 
You mean when the system prompts for elevated permissions? That shouldn't be happening.

Hmm, just checked and it's not doing that anymore. Either a driver upgrade solved it, or it was a bug in my G-Sync firmware (I had to send my screen to a service depot recently and they had to upgrade the thing).
 
I would suggest creating a G-Sync blacklist profile that you add specific applications to, rather than enabling it per-game via NVIDIA Profile Inspector.
Create a new profile, set Application State and Application Requested State to "Disallow" and add the executables you want to blacklist to that profile.
That will soft-disable G-Sync for those applications rather than switching the monitor out of G-Sync mode.

What's the difference between 'Disallow' and 'Fixed Refresh Rate' ?

I myself have it set to 'Fixed Refresh Rate' globally, and just set it to 'Allow' on any profile where I actually want to use V-sync.
 
What's the difference between 'Disallow' and 'Fixed Refresh Rate' ?
I myself have it set to 'Fixed Refresh Rate' globally, and just set it to 'Allow' on any profile where I actually want to use V-sync.
If I recall correctly, the other options would disable G-Sync, so the screen would flicker every time you switched applications. Disallow has the same effect, but G-Sync is still kept active so the screen doesn't flicker.
Can I ask why you would keep G-Sync disabled by default? It doesn't make a lot of sense to me.
 
BTW, I had a talk with Kaldaien, and apparently the NVIDIA driver can state that a game supports G-Sync even if it's not really working.
Case in point: https://i.imgur.com/yTK7tid.jpg

According to Kaldaien, the game's MSAA implementation disables G-Sync. Having the game use a Swap Effect other than Discard does the same. Same with the Swap Interval being set to zero. Games using sRGB color mode will not work either...

Not sure what to think of it. NVIDIA wouldn't so blatantly lie to consumers, would they?
 
BTW, I had a talk with Kaldaien, and apparently the NVIDIA driver can state that a game supports G-Sync even if it's not really working.
Case in point: https://i.imgur.com/yTK7tid.jpg

According to Kaldaien, the game's MSAA implementation disables G-Sync. Having the game use a Swap Effect other than Discard does the same. Same with the Swap Interval being set to zero. Games using sRGB color mode will not work either...

Not sure what to think of it. NVIDIA wouldn't so blatantly lie to consumers, would they?

What happens if you turn v-sync on in this scenario?
 
BTW, I had a talk with Kaldaien, and apparently the NVIDIA driver can state that a game supports G-Sync even if it's not really working.
Case in point: https://i.imgur.com/yTK7tid.jpg
According to Kaldaien, the game's MSAA implementation disables G-Sync. Having the game use a Swap Effect other than Discard does the same. Same with the Swap Interval being set to zero. Games using sRGB color mode will not work either...
Not sure what to think of it. NVIDIA wouldn't so blatantly lie to consumers, would they?
I don't think this would be NVIDIA "lying to customers" so much as there being certain combinations which cause the driver to report that it's enabled without properly being activated if you're using a tool which can adjust those settings.
Since I use Fullscreen+Windowed, the indicator LED is always red on my monitor anyway. I use the monitor's "FPS Counter" to confirm whether G-Sync is working or not.
Dynamically changing = working, fixed at maximum = not. It's usually pretty obvious whether G-Sync is active or not anyway, and I'm still not seeing why it matters in a game that is 90% static screens.
 
Dynamically changing = working, fixed at maximum = not. It's usually pretty obvious whether G-Sync is active or not anyway, and I'm still not seeing why it matters in a game that is 90% static screens.

That's just a game I'm playing at the moment; I have the same issue in other games, like, say, TRI: Of Friendship and Madness (the NVIDIA driver reports G-Sync being enabled, while Special K doesn't).
That's a game that would benefit from G-Sync, so the discrepancy between the two tools' reports bothers me a bit. Just trying to understand what's at issue here, that's all.
 