G-Sync is the god-level gaming upgrade.

Best upgrade I've made in recent years. It allows you to stay a bit behind the frontier on everything else, and as such it has saved me a lot of upgrade money.

So nice to have a 144Hz monitor as well!
 
I'm missing something here, why not just cut some settings and lock 60fps? If developers implemented vsync correctly, the input lag may be minimal.

I thought this is mostly useful for enthusiasts who want really high settings at 1440p but can't consistently hit 60 on those.
Some games just struggle to maintain a consistent enough framerate to hold vsync at 60fps without a ton of overhead. Gsync is just a great cover-all solution that means you have to worry less about being able to specifically hold a refresh-rate divisible framerate in the first place. And many Gsync monitors are 120/144hz so you've got a huge range of potential framerates that are open to you without any fear of tearing or capping/vsync or input lag or anything.

It's freedom.
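To put a number on the "refresh-rate divisible" point: with classic vsync on a fixed-refresh display, the judder-free targets are the framerates that divide evenly into the refresh rate, which is why 144Hz opens up far more usable steps than 60Hz. A rough sketch of that idea (illustrative, not any vendor's actual logic):

```python
def vsync_friendly_framerates(refresh_hz):
    # Judder-free targets under classic vsync: framerates where each
    # frame is held for a whole number of refresh intervals (1, 2, 3, ...).
    return [refresh_hz // d for d in range(1, refresh_hz + 1)
            if refresh_hz % d == 0]

print(vsync_friendly_framerates(60)[:4])   # [60, 30, 20, 15]
print(vsync_friendly_framerates(144)[:6])  # [144, 72, 48, 36, 24, 18]
```

Hitting 48 or 72fps cleanly is only possible on the 144Hz panel, and variable refresh removes even that constraint.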
 
Eh, it's okay.

It's not going to make 40fps feel smooth, and if you have a game with an fps up in the 100s, it's so much more worth it to fuck with the settings to get a stable 120fps and turn on ULMB.

In the 50s~90s it's pretty nice to have. But then again, post-60 framerates kind of feel overrated to me. It's pretty much like 60fps, but less blurry as the framerate goes higher. It's a cool thing to see, but when you're putting down the dosh to get hardware that can crank games up that high with a monitor that can display it, you should be aware that you're doing it for enthusiast reasons, not for practicality.
 
How's the difference between a 22 inch 1080p screen and a 27 inch 1440p screen? Will the increase in actual screen size potentially mean more jaggies? I don't really see how this works yet. I can assume 1440p would look better, but I could also see a smaller screen being more compressed, which would then lead to fewer jaggies. Correct me if I'm wrong (which I probably am lol).

Couple of new questions btw:

Are there also 60Hz G-sync screens? I will never aim for framerates higher than 60, because I don't like the feel of 60 fps (too gamey) and because I'd rather use my GPU (GTX 980) for downsampling. 144Hz is complete overkill in that regard. My aim is 45 fps. Nothing higher than that. So my question is: how is G-sync for 45 fps?

The screen I'm currently looking into is the ASUS ROG SWIFT PG278Q. It doesn't have IPS though. Is that terrible? And what else about this screen? Is it any good?

Nah, you are correct. It's about the resolution in relation to the screen size (and viewing distance). You'd have a slightly higher relative resolution with 1440P on 27 inch.

And G-Sync at 45 FPS works well, but finding 60 Hz monitors with it will be difficult, if there are any.
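The "relative resolution" point can be made concrete with pixel density: PPI is the diagonal pixel count divided by the diagonal size in inches. A quick sketch using the two sizes discussed above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch: diagonal resolution over diagonal screen size.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 22), 1))  # 22" 1080p: ~100 PPI
print(round(ppi(2560, 1440, 27), 1))  # 27" 1440p: ~109 PPI
```

So the 27" 1440p screen is actually slightly denser despite being much bigger, at a comparable viewing distance.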

What's the story with this guy?

http://www.amazon.com/dp/B00NUCRBCU/?tag=neogaf0e-20

I'm thinking about buying it...

That is an older version, with a lower resolution.

http://www.neogaf.com/forum/showthread.php?t=998372

That's the monitor that has been extremely well received recently; it's a newer version. The old version might still be good, but it has a lower resolution, and there might be better alternatives. The new one is $200 more, I think, or isn't in stock.
 
Nah, you are correct. It's about the resolution in relation to the screen size (and viewing distance). You'd have a slightly higher relative resolution with 1440P on 27 inch.

So does that still mean I'll get better DSR-applied anti-aliasing results with a smaller 1080p monitor? But on the other hand, I can imagine the increase in native resolution being a much, much more worthwhile visual upgrade. All in all, at least a good justification for upgrading to a 27 inch 1440p screen from a 22 inch 1080p screen.
 
So does that still mean I'll get better DSR-applied anti-aliasing results with a smaller 1080p monitor? But on the other hand, I can imagine the increase in native resolution being a much, much more worthwhile visual upgrade. All in all, at least a good justification for upgrading to a 27 inch 1440p screen from a 22 inch 1080p screen.

Difficult to say whether you'll have a better looking image with a smaller monitor with some DSR applied or a bigger 27 inch monitor.

If the screens were the same size, native 1440P would be better than a DSR resolution of 1440P.

When the 1440P monitor is a lot bigger it gets a bit more difficult to tell, but I doubt you'd regret it.
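For a rough sense of how much supersampling is in play here: the ratio of rendered pixels to native pixels is how much extra detail gets averaged down into each displayed pixel. A simple sketch (the resolutions are the ones from this comparison):

```python
def dsr_factor(render_res, native_res):
    # Rendered pixels per native pixel: how much extra detail is
    # averaged down into each displayed pixel when downsampling.
    rw, rh = render_res
    nw, nh = native_res
    return (rw * rh) / (nw * nh)

# Rendering 1440p and downsampling to a native 1080p display:
print(dsr_factor((2560, 1440), (1920, 1080)))  # ~1.78x
# Rendering 4K and downsampling to a native 1440p display:
print(dsr_factor((3840, 2160), (2560, 1440)))  # 2.25x
```

Both give meaningful AA; the native 1440p route additionally keeps the extra pixels on screen instead of averaging them away.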
 
Would gsync with 144hz at 1440p be better than 4k without? I have a Titan X coming, but I think I might get a better experience going all-out at 1440p.
 
So does that still mean I'll get better DSR-applied anti-aliasing results with a smaller 1080p monitor? But on the other hand, I can imagine the increase in native resolution being a much, much more worthwhile visual upgrade. All in all, at least a good justification for upgrading to a 27 inch 1440p screen from a 22 inch 1080p screen.
Downsampling from 1440p or something on a 22" 1080p monitor would certainly grant you a pretty nice level of AA, no doubt.

But it's also possible that the general 'visual upgrade' of going to a native 1440p resolution with a 27" monitor would be negligible considering the extra size. Depends on just how far away from your monitor you are.

Do you not value screen size at all, though? Using a 27"+ monitor is like playing on a big screen TV. I think you'd have to be quite the aliasing stickler to stick with a smaller 1080p screen while still running 1440p or whatever through DSR.

Would gsync with 144hz at 1440p be better than 4k without? I have a Titan X coming, but I think I might get a better experience going all-out at 1440p.
You'll definitely get better performance in games, that's for sure.
 
Some games just struggle to maintain a consistent enough framerate to hold vsync at 60fps without a ton of overhead. Gsync is just a great cover-all solution that means you have to worry less about being able to specifically hold a refresh-rate divisible framerate in the first place. And many Gsync monitors are 120/144hz so you've got a huge range of potential framerates that are open to you without any fear of tearing or capping/vsync or input lag or anything.

It's freedom.

All this AND the fact that Vsync can cause stutter, as frames may have to be repeated by the GPU to meet the display's requirements. Fixed refresh intervals are a holdover from CRTs, and LCD technology does not require them. It makes no sense to continue with them for a lot of games on the PC, where hardware capability is extremely unpredictable and so are framerates. It was simply way easier for display manufacturers to never innovate and continue selling the same technology forever. Thank fuck some engineers cared enough about the problem to begin trying to tackle it.
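The repeated-frame stutter described above is easy to model: on a fixed-refresh display a finished frame can only appear at the next refresh boundary, so a frame that misses its slot forces the previous one to be held for an extra whole interval. A simplified toy model (it ignores buffering strategy, but it shows the quantization):

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz display: a slot every ~16.7 ms

def presented_durations(frame_times_ms):
    # Each frame stays on screen for a whole number of refresh
    # intervals; a frame that takes longer than one interval to
    # render means the previous frame gets repeated.
    return [max(1, math.ceil(ft / REFRESH_MS)) * REFRESH_MS
            for ft in frame_times_ms]

# A game rendering a steady 50 fps (20 ms/frame) on a 60 Hz display:
print(presented_durations([20.0] * 3))  # every frame held ~33.3 ms -> effectively 30 fps
```

On a variable-refresh display the panel simply waits the 20 ms and shows the frame, so 50fps actually looks like 50fps.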
 
So does that still mean I'll get better DSR applied anti-aliasing results with a smaller 1080p monitor? But on the other hand I can imagine the increase in native resolution being much, much, much more a worthwhile visual upgrade. All in all at least a good justification for upgrading to a 27 inch 1440p screen from a 22 inch 1080p screen.

Why not get a smartphone with a 1080p screen and be happy with the insane pixel density? If you use a smaller screen you'll sit closer; with a bigger screen you'll sit farther away, and the pixel density will look the same. You're so worried about perfect IQ that you're willing to game on a small 1080p screen? Why? You can't get a locked 45fps with a non-Gsync/Freesync panel (I mean you can, but it will look worse than 30fps). Get a 30Hz 4K panel if you're so worried about IQ and don't care about framerate, or get a 1440p Gsync monitor and downsample from 4K. That's what I do, and you'll start seeing 1080p as if it were 720p; you can't go back.
 
I'm missing something here, why not just cut some settings and lock 60fps? If developers implemented vsync correctly, the input lag may be minimal.

I thought this is mostly useful for enthusiasts who want really high settings at 1440p but can't consistently hit 60 on those.

Considering that G-Sync is almost exclusively available on displays supporting high refresh rates, yeah, the actual benefits may well be circumstantial depending upon use. Think about it: if one owns a monitor that can display a high refresh rate, regardless of whether it supports G-Sync, why wouldn't one then want to prioritize adjusting game settings as needed to achieve a framerate that properly takes advantage of what's arguably the most important feature of the display, the high refresh rate?

Now, if one does have a high, stable framerate to go along with a high refresh rate, what's the benefit of G-Sync relative to V-sync being off? Honestly, if one were to claim that they can actually see any appreciable screen tearing in properly developed games that are consistently rendering anywhere north of 100 frames per second, I wouldn't believe them. That's like claiming to be able to see the difference between 120 fps and 144 fps - it doubtlessly exists, but on a practical level normal people just don't have the acuity to see it. Similarly, screen tearing becomes less noticeable at high framerates: the severity of any tearing, whether lateral or vertical (and your ability to see it as a result), decreases significantly as frametime does.
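The claim that tearing severity scales with frametime checks out geometrically: the visible jump at a tear line is how far the scene moved between the two frames meeting at it, i.e. motion speed times frame interval. A hypothetical example (the pan speed is made up for illustration):

```python
def tear_offset_px(pan_speed_px_per_s, fps):
    # Discontinuity at a tear line: the on-screen motion that
    # happened between two consecutive frames.
    return pan_speed_px_per_s / fps

# A camera panning at 2000 px/s:
print(tear_offset_px(2000, 40))   # 50.0 px jump per tear at 40 fps
print(tear_offset_px(2000, 120))  # ~16.7 px jump at 120 fps
```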

In a situation where one has a stable, high framerate there's no terrible downside to just playing with v-sync off - there's no additional input lag nor appreciable tearing. Alternatively, one can pay a ~$200 premium for G-Sync (and ULMB, of equally finicky benefit to gaming) and then be forced to accept having only a DP input as a tradeoff. Even in the case of 1440p displays, for which G-Sync becomes much more practically beneficial, again, considering that all G-Sync enabled monitors have high refresh rates, it would better suit the most important capabilities of the monitor, in any situation in which fully ideal hardware is absent, to just lower the rendering resolution and any other settings to maintain the high framerate.

I'd like to think most PC gamers, given the choice of picking either smoother gameplay or better looking gameplay, would choose the former in most situations, and if that's the case for you then G-Sync is of limited benefit. Perhaps part of the misconception some folks seem to have about G-Sync lies in the presupposition that it's solving a problem to begin with, when that's not necessarily true in all cases - one of those being a case that happens to be applicable to each and every current owner of a G-Sync display.
 
Considering that G-Sync is almost exclusively available on displays supporting high refresh rates, yeah, the actual benefits may well be circumstantial depending upon use. Think about it: if one owns a monitor that can display a high refresh rate, regardless of whether it supports G-Sync, why wouldn't one then want to prioritize adjusting game settings as needed to achieve a framerate that properly takes advantage of what's arguably the most important feature of the display, the high refresh rate?
Because you value image quality over framerates above ~40 in lots of genres? I know I do.

Honestly, if one were to claim that they can actually see any appreciable screen tearing in properly developed games that are consistently rendering at frames per second anywhere north of 100, I wouldn't believe them.
I do, but that's beside the point. The point is that to maintain a stable framerate above 100, you need to give up a lot in terms of IQ and effects, compared to simply maintaining around 60 FPS on average. And at such framerates, playing with no Vsync is completely unbearable.

ULMB, of equally finicky benefit to gaming
What do you mean by "finicky" in this case? For gamers who want motion clarity, ULMB is pretty much the biggest thing since CRTs. Personally, I've never been too fussed by minor smearing, but if you talk to the people at blurbusters any LCD without strobing is completely unusable, and I can see their position.


I feel like you are creating one particular use case in your mind and basing a general conclusion about variable refresh rates on it, which couldn't be further from the truth. Speaking as a PC gamer of 20 years now, the general case is that you have a framerate which varies around 60 -- and that's exactly where G-sync is at its most useful.
 
I feel like you are creating one particular use case in your mind and basing a general conclusion about variable refresh rates on it, which couldn't be further from the truth. Speaking as a PC gamer of 20 years now, the general case is that you have a framerate which varies around 60 -- and that's exactly where G-sync is at its most useful.

Yep. Basically, for me the minimum is 45 fps in many games, like all multiplayer games, or any without motion blur and direct camera control.
If I can reach an average of around 60fps with drops to max [min? :p] 45fps, then I'm totally happy with the performance, and G-Sync completely eliminates the weakness of LCDs at those framerates.

And I don't see the benefit of playing games like Skyrim at 120Hz. Actually, I could play it locked at 30Hz if it had motion blur, and I'd prefer to crank IQ and post-processing up to eleven instead.
 
I have been playing on my ASUS ROG since launch (well, a month or so later) and I couldn't be happier.

G-sync is one of those upgrades where once you have it, there's no going back.

Think of it as something like an SSD: you didn't think it would make a difference, and yet it's one of the biggest improvements you can make to any build.

G-sync is the greatest improvement in gaming since 3dfx. xD
 
I'm toying with the idea of getting a G-Sync monitor now (I need a monitor either way; my old 22" is gone and I'm currently running my PC on a TV only). Question is: is it worth it with a modest configuration?

I currently have an X51 with an i5 3450 and a GTX 660. I was thinking of investing in a GTX 960, possibly the 4GB version, to extend its gaming life a bit before a completely new rig in 2016 (a GTX 970 has been an option, but I don't feel confident about fitting it in the X51 in terms of size AND with a 330W PSU).

I was aiming at the Acer Predator 24", with the purpose of it serving me for my next rig too.
 
I'm toying with the idea of getting a G-Sync monitor now (I need a monitor either way; my old 22" is gone and I'm currently running my PC on a TV only). Question is: is it worth it with a modest configuration?

I currently have an X51 with an i5 3450 and a GTX 660. I was thinking of investing in a GTX 960, possibly the 4GB version, to extend its gaming life a bit before a completely new rig in 2016 (a GTX 970 has been an option, but I don't feel confident about fitting it in the X51 in terms of size AND with a 330W PSU).

I was aiming at the Acer Predator 24", with the purpose of it serving me for my next rig too.

Gsync shines at exactly this: it makes games on mid-range hardware look smooth at lower framerates.
 
I just took the jump to 122 Hz gaming (which is fucking amazing) recently on the cheap and it'll last me a while until my next major GPU bump.

I'll take a look at FreeSync-compatible monitors when I have the funds to splurge on one that doesn't suck/have intense ghosting issues.
 
I'm missing something here, why not just cut some settings and lock 60fps? If developers implemented vsync correctly, the input lag may be minimal.

I thought this is mostly useful for enthusiasts who want really high settings at 1440p but can't consistently hit 60 on those.

1. I don't want to be limited to 60fps. I have a 144hz monitor, I'd like to use it.
2. I'd have to downgrade my visual experience even further than with G-Sync.
3. Vsync just isn't implemented correctly, and even when it is, it's a solution to a problem that we shouldn't have to deal with in the first place. The default relationship should always have been, in an ideal world, the GPU controlling refresh, and now we finally have that.

Lower frame rates are more tolerable with G-Sync and that's one of the main cited benefits, but for me, I still try and stay above 60, but the dips into the 50s aren't jarring and are still smooth. But also, when I'm getting well above 60, into the 70s, 80s or even 100s, I'm getting the benefit of every single frame.

I couldn't get that before. I had to either lock at 60 or try and get to 120 or 144 and with all the problems that caused, G-Sync is far, far superior. I like to describe G-Sync as a fix, not necessarily an 'upgrade'. It fixes headaches, if you're fine with the current situation, then you probably won't care for it so much, but I wasn't.
 
Because you value image quality over framerates above ~40 in lots of genres? I know I do.

I do, but that's beside the point. The point is that to maintain a stable framerate above 100, you need to give up a lot in terms of IQ and effects, compared to simply maintaining around 60 FPS on average. And at such framerates, playing with no Vsync is completely unbearable.

What do you mean by "finicky" in this case? For gamers who want motion clarity, ULMB is pretty much the biggest thing since CRTs. Personally, I've never been too fussed by minor smearing, but if you talk to the people at blurbusters any LCD without strobing is completely unusable, and I can see their position.


I feel like you are creating one particular use case in your mind and basing a general conclusion about variable refresh rates on it, which couldn't be further from the truth. Speaking as a PC gamer of 20 years now, the general case is that you have a framerate which varies around 60 -- and that's exactly where G-sync is at its most useful.


In the games I play, I value a smooth gaming experience more than I value a good looking experience, within reasonable limits, of course. Personally, I don't play any games where I'd be satisfied with a framerate in the forties, but if you say you would be, that's cool with me, different strokes.

I agree, at framerates around 60 screen-tearing is an undeniable problem, and G-Sync solves it. Fortunately for me, I don't have that problem, because I'd more often go for the option of lowering fidelity instead of fluidity. I think the financial compromise of going the other way is a huge deterrent. Screen tearing becomes an issue in such a situation, and unless one pays a premium for G-Sync to remedy it, with a solution that provides no other benefits (low FPS will remain low), that person will have to suffer through the tearing.

In my experience with popular games like CS:GO, TF2, Civilization V, or Diablo III, backlight strobing doesn't make an appreciable difference in motion clarity because there simply isn't enough fast paced motion to begin with. This is a matter of practicality for me. If one is playing Quake 3, for example, maybe they'll get more benefit out of ULMB. Most games aren't nearly as fast paced, though, so there's less benefit to a reduced persistence. The bottom line is that these two proprietary Nvidia technologies should make one ask oneself if the benefit is justified given the ~$200 premium. I don't know what games the folks over at Blurbusters play, but they must not play the same games you do because they certainly wouldn't care for the "unusable" amount of ghosting that would be present in any game rendering at around 40 fps, I'd imagine.

If one spends the extra money to purchase a monitor with a high refresh rate it would stand to reason that it would be important to such a person to achieve a framerate that takes advantage of that feature. Such a person's general use case would likely be targeting a framerate that takes advantage of their hardware, in my opinion. That happens to be my general case, and apparently yours is different. That's fine with me.
 
The difference between your post now and your previous post is that you more clearly mark your own preferences as just that, rather than using them to question the value of variable refresh rates in general. That's fine. It's true that if you are planning on playing everything at 100+ FPS, variable refresh might not be worth it.

However, what a G-sync display does is give you a huge amount of flexibility: for your general RPG/strategy/action-adventure game (which is most of what I play) you can crank up the IQ and still get non-juddery, V-synced fluidity at, say, a variable 45-80 FPS. And for highly competitive or very reflex-heavy games you can take the significant IQ hit and lock on to 85, 100 or 120 Hz with ULMB.

You see buying a G-sync display as buying a high refresh rate monitor that incidentally has a minor extra feature. I see it as buying a variable refresh rate display that also incidentally supports high refresh rates when necessary. Really, the only drawback until recently was that they were all TN, and that is fixed now so I'm jumping in.
 
The difference between your post now and your previous post is that you more clearly mark your own preferences as just that, rather than using them to question the value of variable refresh rates in general. That's fine. It's true that if you are planning on playing everything at 100+ FPS, variable refresh might not be worth it.

However, what a G-sync display does is give you a huge amount of flexibility: for your general RPG/strategy/action-adventure game (which is most of what I play) you can crank up the IQ and still get non-juddery, V-synced fluidity at, say, a variable 45-80 FPS. And for highly competitive or very reflex-heavy games you can take the significant IQ hit and lock on to 85, 100 or 120 Hz with ULMB.

You see buying a G-sync display as buying a high refresh rate monitor that incidentally has a minor extra feature. I see it as buying a variable refresh rate display that also incidentally supports high refresh rates when necessary. Really, the only drawback until recently was that they were all TN, and that is fixed now so I'm jumping in.
Did you pick up an XB270HU yet, Durante? I cannot remember if you said you were going to.
 
People seem oblivious to the fact that monitors with variable refresh and G-Sync in particular, are a different breed to your standard LCD. They provide many modes of operation, including standard fixed refresh and Vsync operation. It's this flexibility afforded by a more capable panel that people like and that also raises the cost of these monitors.
 
I'd have to see it in action. Triple buffering has made tearing and lag a non-issue for me. But there's less lag than with triple buffering, right?
 
Some games just struggle to maintain a consistent enough framerate to hold vsync at 60fps without a ton of overhead. Gsync is just a great cover-all solution that means you have to worry less about being able to specifically hold a refresh-rate divisible framerate in the first place. And many Gsync monitors are 120/144hz so you've got a huge range of potential framerates that are open to you without any fear of tearing or capping/vsync or input lag or anything.

It's freedom.
That actually sounds pretty good once you put it that way.

I can play at desired settings instead of hardcapping 60 in an unhappy compromise.
 
I'd have to see it in action. Triple buffering has made tearing and lag a non-issue for me. But there's less lag than with triple buffering, right?
Most of the ways you can get triple buffering do introduce some lag. However, more importantly, triple buffering invariably introduces some judder (or more accurately, logical frame time and presentation time mismatch) if you drop below 60 Hz (since your frametimes will only be either exactly 16.6 or 33.3 ms). G-sync eliminates that judder.
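Durante's frametime quantization can be sketched numerically: with fixed 60Hz scan-out, a frame's presented duration snaps up to the next multiple of 16.6 ms regardless of its logical duration, and the gap between the two is the judder. A rough illustration:

```python
import math

REFRESH_MS = 1000 / 60  # ~16.6 ms at 60 Hz

def presentation_mismatch_ms(frame_time_ms):
    # Judder: gap between a frame's logical duration and the
    # whole-interval duration it's actually displayed for.
    shown = math.ceil(frame_time_ms / REFRESH_MS) * REFRESH_MS
    return shown - frame_time_ms

print(round(presentation_mismatch_ms(16.0), 1))  # 0.7 -> barely visible
print(round(presentation_mismatch_ms(22.0), 1))  # 11.3 -> clear judder
# With G-sync the display waits for the frame, so the mismatch is 0.
```

A logical 22 ms frame gets shown for 33.3 ms; G-sync keeps logical and presented intervals matched instead.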
 
Most of the ways you can get triple buffering do introduce some lag. However, more importantly, triple buffering invariably introduces some judder (or more accurately, logical frame time and presentation time mismatch) if you drop below 60 Hz (since your frametimes will only be either exactly 16.6 or 33.3 ms). G-sync eliminates that judder.

I have a 144hz monitor.
 
If games actually used triple buffering properly, gsync wouldn't be necessary.

Or think of it this way, we're in the ideal situation now with adaptive refresh. The relationship between the GPU and monitor should have always had the GPU in the driving seat in regards to refresh, things like vsync were there to deal with issues because of the existing relationship which had the monitor refreshing at set intervals.

vsync can be seen as a fix for an issue that didn't need to exist.
 
Haha, yeah of course I noticed that screen too. Unfortunately, it's not available in any web-shop in Holland. It's just unavailable here, I guess.

But still, every (pro and consumer) review I've read of the ROG Swift is really positive, so I'm positive I've bought a good product. That the colours were pretty good for a TN screen was something that was mentioned a lot, so at least I won't be losing too much in comparison to the Acer Predator XB270HUbprz (which I believe is probably the only reason you guys were hinting at that screen anyway).
 
I've had the ROG Swift for a little less than a week now and G-sync is indeed amazing.

Forget 60fps. I'm getting a consistent 100+ fps with no tearing in some games, and the ones below 100 feel smoother as well @_@

The colors are definitely far better than your usual TN, though I spent quite a bit of time adjusting it to where I wanted it. And there is no discernible backlight bleed.
 
Oh snap, now I found a shop that does have the Acer xb270hu, but the delivery time is a fair bit longer than my just ordered ROG Swift. Okay, GAF, should I really have patience and get the Acer xb270hu?
 
Oh snap, now I found a shop that does have the Acer xb270hu, but the delivery time is a fair bit longer than my just ordered ROG Swift. Okay, GAF, should I really have patience and get the Acer xb270hu?

Yes. As an owner of a ROG Swift, I'm not a huge fan of the TN panel, even though it's probably the best TN panel out there. The screen finish looks a bit grainy.
 
Dark Souls2: SOTFS still tears like crazy for some reason even with G-Sync on. >:-O

Following up, after shutting down my machine, unplugging, re-plugging the monitor and booting back up, Dark Souls 2: SOTFS is running perfectly with G-Sync.

Anyone have this?

http://www.asus.com/us/Monitors/ROG_SWIFT_PG278Q/

Can anyone comment on it?

It's great, the TN panel has a tiny amount of grain that you can notice on the desktop, but not at all while gaming.

As far as overall picture quality goes, I do photo editing occasionally and recently switched to the ROG Swift from one of those Korean IPS displays, the Crossover 27QW. This seems completely backwards, but after calibrating the Swift, I was seeing artifacts in the black areas of pictures I was editing that I couldn't see before on the Korean IPS because of the extremely bright and probably uneven back-light. The colors aren't as vibrant, but it's damn near perfect and actually serving me better for business and pleasure.

If I could, I'd still upgrade to the XB270HU, I'd expect it to be a fair bit better than the Crossover.
 
That's not really true. Even correct triple buffering still induces a mismatch between rendering intervals and presentation intervals.

Correct.

Before G-sync, the only way to achieve judder-free perfect motion was with double-buffered Vsync (or the "add a frame to the render queue" version of triple-buffered Vsync, though the latter is obsolete for this purpose) if you consistently rendered at or above your refresh rate (technically I mean each frame time, or instantaneous fps).

There are several problems with this though, for one, it's limited to intervals of your refresh rate. Consistency is difficult to achieve, and spikes or fluctuations are common unless you're "brute-forcing" an older engine like Source. Additionally, input latency is often a matter of some criticism with Vsync, and although you can lower it (by using a frame cap at your refresh rate, or using a high refresh display), it's still present.

G-sync solves all of the above. You no longer have to compromise and pick and choose what aspects matter most, you get the best of every world.
 
Yeah, that sounds like a hassle to me, considering my increasingly frequent use of said mode. Does Freesync suffer from that problem too?
Of course.

Really, it's not that it "doesn't work". It's that the DWM actually refreshes at a rate which is completely divorced from whatever any windows on it may end up doing.
 
Of course.

Really, it's not that it "doesn't work". It's that the DWM actually refreshes at a rate which is completely divorced from whatever any windows on it may end up doing.

Hmm, I see. Though I'm not sure I understand it thoroughly yet. Would the problem arise when the DWM would try to up the framerate above the game's framerate in some situations, e.g. when you mouse quickly over several icons on your desktop? So you'd need some interface access to the DWM to limit the framerate dynamically. Then there's the question how to deal with multiple games open in borderless windowed mode and so on.

Well, I can see the problem now.
 