> Would console games with unlocked framerates benefit from gsync or is it exclusively a PC gaming thing?

No, the consoles currently aren't capable of sending a variable refresh rate signal.
Color me stupid, but does 144Hz mean the game has to run at 144fps for the optimal experience? 1440p at that frame rate sounds expensive, GPU-wise!
Would console games with unlocked framerates benefit from gsync or is it exclusively a PC gaming thing?
> There are basically three broad types of LCD panels sold right now: "TN", "IPS" and "VA". [..]

Really appreciate the detailed post, thanks.
That, and actual input delay... I've been burned by great monitors, only to plug in a controller and find it has a delay... seriously, a fricken delay.
Keep in mind I'm talking about max settings. Could easily turn anti-aliasing or a lighting setting or two down for better results. The 50s feel really close to 60.
LG announced a freesync ultrawide for CES. No price/release date though.
I didn't say it doesn't help in high framerate scenarios, just that the benefit does get smaller. Tearing at 45fps is a lot more noticeable than at 100fps.
This is very true in my experience. Tearing at 120fps is really hard to see, unlike at 40 or so.
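A rough back-of-the-envelope way to see why tearing is less noticeable at high framerates: without any sync, a torn frame sits on screen until the next frame replaces it, so the artifact persists for roughly one frame time. This is a simplified model (real visibility also depends on refresh rate and motion speed), but the arithmetic lines up with the experiences above:

```python
# Simplified model: a tear artifact persists until the next frame arrives,
# i.e. for roughly one frame time (1/fps seconds).
for fps in (40, 45, 100, 120):
    frame_time_ms = 1000 / fps
    print(f"{fps:>3} fps -> artifact visible for ~{frame_time_ms:.0f} ms")
```

At 40fps a tear hangs around three times longer than at 120fps, which is plenty of time for the eye to catch it.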
The same concept, implemented in some way for consoles, would be a great thing though.
> Getting one step closer to that 4k, 144hz G-sync monitor that I dream about.

Is 120 Hz enough? Even DisplayPort 1.3 tops out at 129 Hz at 4k.
It would be kind of pointless atm considering how much horsepower it would take to drive 4K @ 144hz
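That 129 Hz figure is in the right ballpark; here is a quick sanity check of the DisplayPort 1.3 numbers (assuming a 4-lane HBR3 link, 8b/10b encoding, and 24 bits per pixel, with blanking intervals ignored, so the real ceiling sits a bit lower):

```python
# DisplayPort 1.3: 4 lanes x 8.1 Gbit/s raw, with 8b/10b line coding
# leaving 80% of the raw bandwidth for pixel data.
lanes, gbps_per_lane, coding_efficiency = 4, 8.1, 0.8
payload_bps = lanes * gbps_per_lane * coding_efficiency * 1e9  # ~25.92 Gbit/s

width, height, bits_per_pixel = 3840, 2160, 24
bits_per_frame = width * height * bits_per_pixel

max_refresh = payload_bps / bits_per_frame
print(f"~{max_refresh:.0f} Hz at 4K, before blanking overhead")
```

Which is why a true 4k/144hz panel needs more link bandwidth than DP 1.3 can offer.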
> I know, I was saying I don't think it does get smaller. The benefit of the particular problem you mention diminishes, but other benefits come into play at higher frame rates. As someone who plays Counter-Strike, tearing is still unbearable at 140fps, and vsync isn't an option as it introduces too much input lag for that type of game.
>
> I dunno, I just found that the biggest benefits of Gsync once I got it were just as big, if not greater, above 60fps than below. But maybe that's because I find 45fps quite low, so it isn't somewhere I'd like to stay. It made the dips from 60 into the mid-to-high 50s almost unnoticeable and a good experience, and that's the extent of the benefit of adaptive refresh at lower frame rates for me. But maybe I'm more sensitive to lower frame rates than some other people.
>
> I disagree. Again, back to my Counter-Strike example. Because I didn't want the game running at 300 or 400fps, as that's just mostly wasted power and heat, I capped it at around the typical maximum server tickrate (e.g. 128). So my FPS cap was around 130-140, and tearing was still really, really annoying.
>
> But you've also done what Seanspeed did in choosing examples from two opposite ends of the scale. It isn't 40fps or 120; what if I can only reach 70fps average? Tearing is still terrible there. 80? 90? Still all quite bad in my experience.
>
> ____
>
> As for it being implemented in consoles, it's something I don't expect this generation (adaptive refresh in any way), but I really hope it has caught on by the next generation. My PC is also hooked up to my plasma TV for occasions where I want to play on that, but since getting a Gsync monitor, I rarely want to do that. The only time I will is when a game is capped at 60fps and that 60fps is easily attainable (in which case Gsync's benefits aren't as applicable, since I can easily get a locked, solid 60).

The point of the example is that the impact of Gsync *does* increase/decrease with framerate.
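For what it's worth, the framerate cap described above is set in Source engine games via the `fps_max` console variable, e.g. in an autoexec.cfg (130 here just mirrors the 128-tick reasoning, not a magic number):

```
// autoexec.cfg (Source engine)
fps_max 130   // cap just above the 128-tick server rate
```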
Red is the non-G-Sync one, unfortunately.
> For one purpose, yes. For others, I have found it to be the opposite (as in the benefit of the package increases with frame rate, or at the very least remains the same, but doesn't diminish). I just felt that needed to be represented. There isn't just one benefit of adaptive refresh.

But that one purpose is related to how it works in general. :/
I do agree with you that screen tearing can be pretty brutal, but in general I have noticed that I have a hard time seeing it at 120 or so. It is obviously there though, I just do not notice it nearly as much as I used to with other monitors that were limited to 60hz.
> It doesn't matter because lower framerate is shit even if "wow, now perfectly synced". I don't want "perfectly synced lower framerate". If I buy expensive hardware it's because I simply don't want lower framerate, period.

A stable framerate has the benefit of predictable controller response.
In what area does the impact not diminish at higher framerate?
> Wake me up when we have the same stats with 4k and freesynch. Probably going to be a long ass sleep.

We'll call you in a decade or so, when a 144hz 4k panel will be on the market and there will be GPUs powerful enough to make proper use of it.
> A stable framerate has the benefit of predictable controller response.

Talking about merely milliseconds, human "prediction" doesn't really come into play much, nowhere near as much as the advantages of faster rendering and higher refresh rate reducing input lag, anyway.
Who knows, maybe at the time Freesync will even be an actual thing instead of merely AMD promising "We'll catch up too, someday".
> Getting one step closer to that 4k, IPS 144hz G-sync monitor that I dream about.

Yep. I'm willing to wait a couple years for it. It will be glorious.
> Talking about merely milliseconds, human "prediction" doesn't really come into play much, nowhere near as much as the advantages of faster rendering and higher refresh rate reducing input lag, anyway.

Depends on how much framerates fluctuate. With a reasonable upper limit I guess you're mostly right.
VA also often has much better viewing angles and colors than TN, though slightly worse than IPS. Personally, I believe VA-type panels are best for media consumption and general gaming.
Yup. Dunno if you already noticed the Samsung SE790 announcement, 3440x1440 VA:
http://www.samsung.com/ch/consumer/...tor/curved-ultra-wqhd-monitors/LS34E790CNS/EN
I'm fine with my Eizo for now, but the specs of that Samsung plus G-Sync would be an insta-buy.
Smokey with the L after the swift purchase lul
Under $1,000 and I'll bite.
*throws RoG Swift out window*
Ashamed to say this but... over $1000 and I'll still bite.
"Zero frame" is marketing bullshit - instead of a bezel you have an internal frame on the screen, which ruins the perceived black level.
Dell's Ultrasharp U3415W is rolling out now and it has a slight curve.
> As someone who uses a single 27" 1440p display at home (and when working from home) and 2x 1080p displays at work, there isn't much real difference. The higher vertical res of the 1440p display helps when having code editors or web inspectors open (I'm a web developer), but overall the difference from two 27" displays isn't much. At one point I had a 1440p and a 1600p display side by side, but at least for my uses I didn't find much need for that much desktop space.

I agree with what you're saying, but that's a pretty specific situation. You're comparing an ultrawide with higher vertical res to two lower-res monitors.
How does it ruin "perceived black level" any more than the already thick, super black frames on every monitor?
> I was already looking at Acer's new 32" 4k monitor before this was announced.

Out of curiosity, do you have a spreadsheet or blog or something like that which shows your current PC setup, past purchases, hardware "lying" around?
I'm always on the move