First 144Hz 1440p IPS G-Sync display announced by Acer

Color me stupid, but does 144Hz mean the game has to run at 144fps for the optimal experience? 1440p at that frame rate sounds expensive GPU-wise :D

There is never any need for that. G-Sync helps make the experience smoother when you're running, say, 40-60 fps, but above that I opt for ULMB (which AFAIK is part of the G-Sync module, so it should be available on the Acer as well) as it makes the picture clearer even on a display as fast as the ROG Swift. The difference is noticeable in very fast-paced games (think first-person shooters etc.), although despite running at the max I still prefer playing something like MGS Ground Zeroes with G-Sync instead.

But generally the benefit of a high refresh rate is that there's less overall blur, depending on the display panel's capabilities. I'm interested to see how the Acer ends up, as previously only TN panels have been fast enough. I also have little faith in Acer as a manufacturer, so I'm not sure how well calibrated the displays will be from the factory.

With the new real 8-bit TN panels, image quality is no longer an issue unless you need wide gamut (which most don't, as most content is sRGB). Viewing angles are still a problem though, mainly vertically. With my ROG Swift I have no issues in everyday use when going side by side. Aside from the high default brightness, it came calibrated perfectly from the factory. I replaced my trusty old 30" IPS Dell with the ASUS and don't regret it at all, despite giving up a bit of size and vertical resolution.
 
Would console games with unlocked framerates benefit from G-Sync, or is it exclusively a PC gaming thing?

G-Sync requires a recent NVIDIA graphics card, so it's not compatible with any existing consoles.

The same concept implemented in some way for consoles would be a great thing, though.
 
There are basically three broad types of LCD panels sold right now: "TN", "IPS" and "VA". [..]
really appreciated for the detailed post, thanks.

I'd heard of TN and IPS before and thought they were the only variants, with OLED being the new tech on the horizon.

Something new learned for when I get a new monitor. VA it is then.
 
That, and input delay, actually... I've been scorned by great monitors, only to plug a controller in and find it has a delay... seriously, a fricken delay.

Delay is caused by the scaler and display electronics. Since it uses the same G-Sync module as the ASUS Swift, we can also expect the same near-zero input lag.
 
Keep in mind I'm talking about max settings. You could easily turn anti-aliasing or a lighting setting or two down for better results. The 50s feel really close to 60.

LG announced a FreeSync ultrawide for CES. No price/release date though.

If the price isn't too high I'm going for that, unless something better releases this year.
 
I didn't say it doesn't help in high framerate scenarios, just that the benefit does get smaller. Tearing at 45fps is a lot more noticeable than at 100fps.

I know, I was saying I don't think it does get smaller. The benefit of the particular problem you mention diminishes, but other benefits come into play at higher frame rates. As someone who plays Counter-Strike, tearing is still unbearable at 140fps, and vsync isn't an option as it introduces too much input lag for that type of game.

I dunno, I just found that the biggest benefits of G-Sync once I got it were just as big, if not greater, above 60fps than below. But maybe that's because I find 45fps quite low, so it isn't something I'd want to stay at. It made the dips from 60 into the mid-to-high 50s almost unnoticeable and a good experience, and that's the extent of the benefit of adaptive refresh at lower frame rates for me. But maybe I'm more sensitive to lower frame rates than some other people.

This is very true from my experience. Tearing at 120 fps is really hard to see, unlike 40 or so.

I disagree. Again, back to my Counter-Strike example. Because I didn't want to have the game running at 300 or 400fps, as that's just mostly wasted power and heat, I capped it at around the typical maximum server tickrate (e.g. 128). So my FPS cap was around 130-140, and tearing was still really, really annoying.

But you've also done what Seanspeed did in choosing examples from two opposite ends of the scale. It isn't just 40fps or 120; what if I can only reach 70fps average? Tearing is still terrible there. 80? 90? Still all quite bad in my experience.
____

As for it being implemented in consoles, it's something I don't expect this generation (adaptive refresh in any form), but I really hope it has caught on by the next generation. My PC is also hooked up to my plasma TV for occasions where I want to play on that, but since getting a G-Sync monitor, I rarely want to do that. The only times I will is when a game is capped at 60fps and that 60fps is easily attainable (in which case G-Sync's benefits aren't as applicable, since I can easily get a locked, solid 60fps).
 
I'm waiting for FreeSync to release with a 1440p/144Hz monitor. My current Yamakasi 1440p display looks nice, but the build quality is shit, and it only has a D-DVI input.
 
It would be kind of pointless atm considering how much horsepower it would take to drive 4K @ 144Hz.

Indeed, I'm not the type of person to go buy two or three of the current best GPUs. That being said, my current ASUS 1080p monitor has lasted five years. It's more about having the features I want and sticking with a monitor for a long time.

Is 120 Hz enough? Even DisplayPort 1.3 tops out at 129 Hz at 4k.

I didn't know that DP 1.3 currently had that limitation. 120hz would be more than enough.
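
For reference, here's roughly where that ~129 Hz ceiling comes from, as a back-of-envelope sketch under my own assumptions (4 lanes of DP 1.3 HBR3 at 8.1 Gbit/s each with 8b/10b encoding, plain 8-bit RGB, blanking intervals ignored, so treat it as an upper bound):

    # Rough upper bound on 4K refresh rate over DP 1.3 (ignores blanking,
    # so the real-world figure lands a bit lower, around 120-129 Hz).
    lanes = 4
    link_rate_bps = 8.1e9                 # HBR3, per lane
    efficiency = 8 / 10                   # 8b/10b line encoding overhead
    usable_bps = lanes * link_rate_bps * efficiency   # ~25.92 Gbit/s

    bits_per_pixel = 24                   # 8-bit RGB
    pixels_per_frame = 3840 * 2160

    ceiling_hz = usable_bps / (pixels_per_frame * bits_per_pixel)
    print(f"Theoretical 4K ceiling: ~{ceiling_hz:.0f} Hz")    # ~130 Hz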
 
The point of the example is that the impact of Gsync *does* increase/decrease with framerate.
 
I do agree with you that screen tearing can be pretty brutal, but in general I have noticed that I have a hard time seeing it at 120 or so. It is obviously there though, I just do not notice it nearly as much as I used to with other monitors that were limited to 60hz.

To each his own of course; I still think G-Sync and variable refresh rate with no tearing or stutter is amazing.
 
The point of the example is that the impact of Gsync *does* increase/decrease with framerate.

For one purpose, yes. For others, I have found it to be the opposite (as in the benefit of the package increases with frame rate, or at the very least remains the same, but doesn't diminish). I just felt that needed to be represented. There isn't just one benefit of adaptive refresh.
 
Which of these monitors would be better for competitive console FPS and StarCraft 2? Are they both overkill for these purposes? I'm pretty happy with my monitor, but the picture quality is kind of crappy.
 
For one purpose, yes. For others, I have found it to be the opposite (as in the benefit of the package increases with frame rate, or at the very least remains the same, but doesn't diminish). I just felt that needed to be represented. There isn't just one benefit of adaptive refresh.
But that one purpose is related to how it works in general. :/

In what area does the impact not diminish at higher framerate?
 
I do agree with you that screen tearing can be pretty brutal, but in general I have noticed that I have a hard time seeing it at 120 or so. It is obviously there though, I just do not notice it nearly as much as I used to with other monitors that were limited to 60hz.

Well, it's not a coincidence: since each refresh takes half as long, the higher the refresh rate, the less noticeable screen tearing becomes, regardless of whether the refresh rate is synced or not.
A 120Hz panel literally has twice the chance of "accidentally" lining up its refresh with the frame being displayed, compared to a typical 60Hz one.
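
To put some rough numbers on that (purely illustrative figures of my own, nothing measured):

    # Illustration: how long a torn refresh stays on screen (one scan-out) and
    # how far apart in time the two fragments on either side of the tear are.
    def tear_metrics(fps, refresh_hz):
        fragment_gap_ms = 1000.0 / fps      # content difference across the tear line
        scanout_ms = 1000.0 / refresh_hz    # how long that torn refresh is displayed
        return fragment_gap_ms, scanout_ms

    for fps, hz in [(45, 60), (120, 120)]:
        gap_ms, scan_ms = tear_metrics(fps, hz)
        print(f"{fps} fps on a {hz} Hz panel: fragments ~{gap_ms:.1f} ms apart, "
              f"tear on screen for ~{scan_ms:.1f} ms")
    # 45 fps on a 60 Hz panel:   fragments ~22.2 ms apart, tear on screen for ~16.7 ms
    # 120 fps on a 120 Hz panel: fragments ~8.3 ms apart, tear on screen for ~8.3 ms

The smaller both numbers get, the less there is for the eye to catch, which lines up with people finding tearing hard to see around 120 fps.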
 
It doesn't matter, because a lower framerate is shit even if it's "wow, now perfectly synced".
I don't want "perfectly synced lower framerate". If I buy expensive hardware, it's because I simply don't want a lower framerate, period.
A stable framerate has the benefit of predictable controller response.
 
But that one purpose is related to how it works in general. :/

In what area does the impact not diminish at higher framerate?

The benefit is being able to take advantage of a smooth, tear-free image at any refresh rate (above a certain point), and because a higher frame rate is better than a lower one, the higher your frame rate, the better your image naturally is. A higher frame rate is better, no? G-Sync making things tear-free and smooth at those higher frame rates is a benefit of G-Sync plus the higher refresh rate monitor. The benefit of the package (a high refresh rate monitor + G-Sync) climbs with the frame rate.

In the 5 or 6 months I've been using G-Sync, I'd say the benefits below 60fps are quite overstated. It's an improvement, just an exaggerated one when it comes to the actual real world results, because it doesn't make 45fps (for example) a significantly better experience.
 
Wake me up when we have the same stats with 4K and FreeSync. Probably going to be a long-ass sleep :(.
We'll call you in a decade or so, when a 144Hz 4K panel is on the market and there are GPUs powerful enough to make proper use of it.
Who knows, maybe by then FreeSync will even be an actual thing instead of merely AMD promising "We'll catch up too, someday."

A stable framerate has the benefit of predictable controller response.
We're talking about mere milliseconds; human "prediction" doesn't really come into play much, and nowhere near as much as the advantages of faster rendering and a higher refresh rate reducing input lag, anyway.
 
We'll call you in a decade or so, when a 144Hz 4K panel is on the market and there are GPUs powerful enough to make proper use of it.
Who knows, maybe by then FreeSync will even be an actual thing instead of merely AMD promising "We'll catch up too, someday."


We're talking about mere milliseconds; human "prediction" doesn't really come into play much, and nowhere near as much as the advantages of faster rendering and a higher refresh rate reducing input lag, anyway.

10 years sounds about right at the rate tech evolves lately ;). Just don't use a buzzer when you wake me up... :)
 
Hopefully they have a better rollout with this than ASUS had with the ROG Swift...

I'm so glad I didn't buy a ROG Swift now.
 
We're talking about mere milliseconds; human "prediction" doesn't really come into play much, and nowhere near as much as the advantages of faster rendering and a higher refresh rate reducing input lag, anyway.
Depends on how much framerates fluctuate. With a reasonable upper limit I guess you're mostly right.
Of course there are special cases where it's unacceptable, like virtual reality.
 
Nice. Very interested in swapping this 4k out.

4K is nice, but I can't play most games at 4K. I can do 2560x1440, maybe a bit higher with DSR, but it doesn't work with 4K. Not with my ASUS one, anyway. Well, it works, but it just won't stretch to fullscreen; it only takes up part of the 4K screen. It works if you want to play at higher than 4K resolutions.

Nah. Give me 2560x1440 native and let me choose. G-Sync will be nice. I'll make sure to get rid of mine in the next month or two.
 
I know it sounds stupid, but I would buy it in a heartbeat and sell my new G-Sync TN monitor if there were a 1920x1080 version. I'm still comfortable with 1080p and I want to maintain that high framerate as much as I possibly can. My machine is pretty beefy, but 1440p in demanding games is tough if you want to keep the framerate above 60fps. It's targeting high-end consumers and it will be expensive, so I know it won't happen.
 
Oh is this Acer using the AHVA displays from AU Optronics? I just realized this is an IPS panel. I was wondering why people were saying they were throwing away their Swifts.
 
Zero frame is marketing bullshit - instead of a bezel you have an internal frame on the screen, which ruins the perceived black level.

How does it ruin "perceived black level" any more than the already thick, super black frames on every monitor?

"Zero frame" > ugly archaic black frame.
 
Dell's UltraSharp U3415W is rolling out now and it has a slight curve:

[Image: dell-u3415w-curved-monitor-01.jpg]

What game is this?
 
As someone who uses a single 27" 1440p display at home (and when working from home) and 2x 1080p displays at work, there isn't much real difference. The higher vertical res of the 1440p display helps when having code editors or web inspectors (I'm a web developer) open, but overall the difference compared to two 27" displays isn't much. At one point I had a 1440p and a 1600p display side by side, but at least for my uses I didn't find much need for that much desktop space.
I agree with what you're saying, but that's a pretty specific situation. You're comparing an ultrawide with higher vertical res to two lower res monitors.

What happens when you are comparing monitors with the same vertical res and height?
 
How does it ruin "perceived black level" any more than the already thick, super black frames on every monitor?

The flush inner bezel will give it slightly more obvious contrast against the poor IPS black level. That said, no light bezel will make the grey IPS blacks be perceived as dark. Complaining about such things is the height of nitpicking, honestly. If you want better contrast, get a VA/Plasma/OLED panel.
 
I was already looking at Acer's new 32" 4K monitor before this was announced.

I'm always on the move
Out of curiosity, do you have a spreadsheet or blog or something like that which shows your current PC setup, past purchases, and hardware "lying" around?
 
Ever since moving up to a 1440p panel, I've been wanting a higher refresh rate model. Looks like I may have to take the dive when these release, if they aren't instantly sold out forever like the ROG Swift.
 
So, maybe a silly question, but if you decide to play a game at 1080p instead of 1440p on this monitor (or any 1440p monitor really), does it look worse (fuzzier) than on a native 1080p screen?
 