120/144Hz Thread of Lightboosting Pixel Perfect Headshots

The display changes the image every ~8ms (120Hz) or ~7ms (144Hz) instead of every ~16ms, so tear lines, dropped frames, etc. are visible for roughly half as long as on a 60Hz monitor.
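The refresh-period arithmetic behind those numbers, as a quick sketch:

```python
# Refresh period = 1000 ms / refresh rate. A tear line or dropped frame
# stays on screen for roughly one refresh period, so halving the period
# halves how long the artifact is visible.
for hz in (60, 120, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per refresh")

# 60 Hz  -> 16.7 ms per refresh
# 120 Hz -> 8.3 ms per refresh
# 144 Hz -> 6.9 ms per refresh
```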

Thanks! In other words, if the display updates at 120Hz, tearing is less noticeable regardless of how many frames are rendered per second. And as a corollary, if you are not using vsync, there is no reason to run a display slower than its maximum Hz?
 
Is the reduced visual impact of tearing from running a display at 120Hz preserved irrespective of how many frames per second the game renders?
Is there any benefit to limiting the maximum frames per second to, say, 30 or 60 when the display runs at 120Hz? (Thinking about performance-intensive games with variable fps, like Assassin's Creed Unity.)

1 - Yes, it's preserved;
2 - Not intrinsically because of 120Hz. Capping the framerate at a divisor of your display's refresh rate will make things judder-free (see the sketch below). The same applies when capping at 30 on a 60Hz monitor. This is only true if V-SYNC is OFF, though.
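A minimal sketch of that divisor rule: a cap is judder-free when each game frame occupies a whole number of refreshes (the refresh/cap pairs below are just examples):

```python
def refreshes_per_frame(refresh_hz: int, cap_fps: int) -> float:
    """How many display refreshes each game frame occupies."""
    return refresh_hz / cap_fps

for refresh, cap in [(120, 60), (120, 40), (120, 30), (60, 30), (120, 50)]:
    r = refreshes_per_frame(refresh, cap)
    verdict = "judder-free" if r.is_integer() else "uneven repeats -> judder"
    print(f"{cap} fps @ {refresh} Hz: {r:g} refreshes/frame ({verdict})")
```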
 

Thanks for answering the second question as well. Looking forward to testing Assassin's Creed Unity at 120Hz, capped at 40fps, no vsync, when I get home.
 

I capped Watch Dogs at 30fps in 120Hz by using Nvidia Inspector and telling it to Vsync 1/4 of my refresh rate, as well as using MSI Afterburner to cap it at 30. Looked great, and had less judder compared to 60Hz. You need an Nvidia card, though. I'm sure there's something similar in RadeonPro.
 
I just spent a couple of hours testing vsync off, 120Hz, and capping max fps in Nvidia Inspector. Advanced Warfare plays wonderfully at 60 fps with 4x supersampling. Assassin's Creed Unity is completely smooth at 40 fps. Tearing is hardly noticeable, input delay is minimal, and the experience is judder-free. I wish I'd started experimenting with capping fps sooner ...

Additionally, I think I like the black frame insertion tech ("turbo 240") even when running at 120Hz/60fps. Motion still looks much sharper, despite not hitting 120fps.

What is the downside of using "turbo 240" (or LightBoost, I guess) when running at 120Hz/60fps? I only noticed that I felt some eye strain after playing for a while, but switching between on/off, "turbo 240" definitely made the visuals clearer in motion.
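For what it's worth, the clarity gain tracks with the usual persistence arithmetic: while your eye follows a moving object, it smears by roughly (tracking speed × time the image stays lit), and strobing cuts the lit time far below the refresh period even at 60fps. A rough sketch; the 960 px/s pan speed and the ~1 ms strobe pulse are assumptions for illustration:

```python
def blur_width_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate perceived smear while eye-tracking: speed x lit time."""
    return speed_px_per_s * persistence_ms / 1000

speed = 960  # px/s -- e.g. a half-screen-per-second pan on a 1920-wide panel
for label, persistence_ms in [("60 Hz sample-and-hold", 16.7),
                              ("120 Hz sample-and-hold", 8.3),
                              ("strobed backlight, ~1 ms pulse", 1.0)]:
    print(f"{label}: ~{blur_width_px(speed, persistence_ms):.0f} px smear")
```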


I capped Watch Dogs at 30fps in 120Hz by using Nvidia Inspector and telling it to Vsync 1/4 of my refresh rate, as well as using MSI Afterburner to cap it at 30. Looked great, and had less judder compared to 60Hz. You need an Nvidia card, though. I'm sure there's something similar in RadeonPro.

The input delay is quite bad on my SLI system when running 30fps with vsync, but I guess that may be a good option if you are using a single GPU and are sensitive to tearing.
 
I capped Watch Dogs at 30fps in 120Hz by using Nvidia Inspector and telling it to Vsync 1/4 of my refresh rate, as well as using MSI Afterburner to cap it at 30. Looked great, and had less judder compared to 60Hz. You need an Nvidia card, though. I'm sure there's something similar in RadeonPro.

Afterburner? You mean RivaTuner? You know, if you're capping at 30fps in RivaTuner, you can set the in-game cap to 60fps or even uncapped and it won't matter; RivaTuner will still do its job. A lot of games perform better with a 60fps in-game cap than with a 30fps one, for whatever reason. I've seen the minimum framerate almost double (forget what game), from ~19fps with a 30fps cap to ~34fps with a 60fps cap.
 
The question of G-Sync and ULMB working together was directed at Nvidia back in September. You can watch their response here: http://youtu.be/cr-GToUHino?t=1h25m58s

I wouldn't be surprised if Nvidia came out with a G-Sync 2 that does VRR+ULMB.

Yeah, but as Tom says right in the video, the technologies aren't really compatible. You get a backlight pulse every time the screen refreshes, so it only really works if the framerate is stable, and G-Sync is a technology specifically designed for non-stable framerates. Though from his comment they are clearly investigating it; I wonder what they'll come up with.
 
Yeah, but as Tom says right in the video, the technologies aren't really compatible. You get a backlight pulse every time the screen refreshes, so it only really works if the framerate is stable, and G-Sync is a technology specifically designed for non-stable framerates.
They "aren't compatible" only if you make additional assumptions which you did not state. Is there anything that would prevent pulsing the backlight at an unstable rate and adjusting the pulse length to stabilize perceived brightness?
 
Why is it that I can't find any 28"+ 120Hz monitors? I am trying to replace my tv setup with a large monitor but it just doesn't seem feasible.

You won't find many >27" monitors with a high refresh rate because the native resolution most commonly supported in such displays is 1080p. 1080p's pixel density at 27" is already in the low 80s (PPI), and given a normal desk viewing distance, if that number went much lower you might as well have a fly screen taped to your display. Monitors are commonly associated with computers and desks, so you don't often see "desktop monitors" in 30"+ sizes. There are plenty of desktop displays with non-standard aspect ratios whose diagonals exceed 27", but they appeal to an audience with the hardware and desk space to accommodate them and usually run higher resolutions to compensate.
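The low-80s figure is just the diagonal pixel count over the diagonal size, as a quick check shows:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'1080p at 27": {ppi(1920, 1080, 27):.1f} PPI')  # ~81.6 PPI
```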

There are plenty of 120hz television set options in the ~30" range, though.
 
There are plenty of 120hz television set options in the ~30" range, though.
Yeah, but those aren't actually 120Hz panels. They don't display a 120Hz signal.

But in any case...



Acer XR341CK 34″ 144Hz G-SYNC ultrawide :P
 
Aren't they 120Hz panels that just lack the ability to actually process a 120Hz signal?
Yeah, so the only thing you can do with that extra refresh rate is interpolation shit that is terrible in every situation. But really, if you can't display a 120Hz signal, GTFO with your 120Hz marketing. Amirite?
This is it. This is the one.

Please be IPS or VA

EDIT: I didn't think anyone would be crazy enough to make a monitor with all of these features so soon...
I'm almost positive it's either IPS or AHVA (IPS kinda).
 
Yeah, so the only thing you can do with that extra refresh rate is interpolation shit that is terrible in every situation. But really, if you can't display a 120Hz signal, GTFO with your 120Hz marketing. Amirite?

Eh, I'm torn, tbh. It made sense when they started to exist because they displayed 24Hz film better, but now that we have true 120/144Hz panels for PCs, it's a bit weird.
 
Yeah, but those aren't actually 120Hz panels. They don't display a 120Hz signal.

But in any case...

Acer XR341CK 34″ 144Hz G-SYNC ultrawide :P

Actually there are a few that can display 120hz without interpolation.

Aren't they 120Hz panels that just lack the ability to actually process a 120Hz signal?

The limiting factor for a modern television is the bandwidth capacity of the connection interface, which is HDMI in most cases. HDMI has been able to transmit 1080p/120Hz for a while; the latest version enables 2160p/60Hz (4K), which uses about twice as much bandwidth.
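That "about twice as much" checks out on raw pixel rate alone; a back-of-the-envelope sketch (ignoring blanking intervals, bit depth, and link encoding overhead):

```python
def pixel_rate(width: int, height: int, hz: int) -> int:
    """Active pixels per second, ignoring blanking and bit depth."""
    return width * height * hz

hd120 = pixel_rate(1920, 1080, 120)  # 248,832,000 px/s
uhd60 = pixel_rate(3840, 2160, 60)   # 497,664,000 px/s
print(uhd60 / hd120)                 # 2.0 -- 4K60 moves twice the pixels
```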
 
I'm just posting to say that going from 60Hz to 120Hz is unbelievable! Holy shit. I got a BenQ XL2411Z, and after calibrating and using the blur-strobe thingy, it blows my mind.

Upgraded from a BenQ GW2400 from 2008, so this is a major improvement. Wow!
 
I plan on possibly snagging a 27" 144hz monitor at some point in the very near future. The only problem I can see is I'm currently gaming on a 42" tv on my desk, and I love having everything right there for me. I mostly just play League of Legends but I'd really rather have the responsiveness of a 144hz monitor at this point, I'm just not sure if I'd be sacrificing a lot of visual information by downsizing screen size that much.

Anyone done anything similar in size downgrades?

Yep, I had a 30-something-inch setup. Despite the size, I hated the color accuracy and lack of responsiveness compared to the kind of monitor we're talking about. As long as you sit close, 24" or 27" should be more than enough.
 

I really wish monitor manufacturers would reach out to arcade developers and get them to start using these monitors rather than their old 60Hz dinosaurs (assuming they're willing to pack in the necessary GPU hardware).

What better place to showcase this stuff? Arcades were always the next frontier of new hardware, and most people have never seen games running at a G-Synced 144Hz.
 
Diablo 3 still has this tiny little stutter; I'm not sure how to explain it. Framerate stays around 140 fps, G-Sync enabled, SLI or single GPU, and there's this slight stutter that I have a hard time ignoring now. It's very strange. I guess it's just something I'll always have to deal with when I play that game.
I think everyone gets this. I get it even when there are hardly any enemies on screen.
 
I've been researching the world of 120hz monitors and I was wondering if actual and experienced owners could help me out.

My goal is to get a single, high quality monitor on which a PC, a PS3, a PS4 and possibly more will be connected. I'd really like to get rid of as much motion blur as possible by going for a 120hz panel.

As I understand it (correct me if I'm wrong), 120fps on a 120Hz monitor is optimal, and general OS usage (Windows) will see a definite improvement in fluidity: cursor and window movement.

But
  • What happens to games outputting 30 and 60 fps on the PS3/PS4? Will they look the same as they do on a 60Hz monitor, or will they benefit, and in what way? What happens when there are framerate drops?
  • What happens to PC video playback of 23.98->24fps content (2D/CG animation, live action)? Do I get the so-called "soap opera effect"? How is camera-pan judder affected?
  • The specs I have in mind will be capable of 60fps for the newest PC games. Can I use vertical sync and a limiter to benefit from the 120Hz panel's lower motion blur without 120fps?
  • Are emulators affected in any special way?

Thanks.
 
  • What happens to games outputting 30 and 60 fps on the PS3/PS4? Will they look the same as they do on a 60Hz monitor, or will they benefit, and in what way? What happens when there are framerate drops?
The only thing that matters is the display-mode Hz the computer/console initializes with the monitor.
Consoles will most likely just use a 60Hz mode, so there will be no advantage.
  • What happens to PC video playback of 23.98->24fps content (2D/CG animation, live action)? Do I get the so-called "soap opera effect"? How is camera-pan judder affected?
If you have a 120Hz monitor in 120Hz display mode and 24fps content, you can display the content without judder, just like in a movie theater (every image is shown 5 times).
No, there is no soap opera effect. (What that term describes are motion-estimation methods that create new frames in between the 24 source frames.)

With a 60Hz monitor there is always some judder with 24fps content.
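The repeat-count arithmetic behind both answers, as a small sketch (assuming a fixed refresh rate and no interpolation):

```python
import math

def repeat_pattern(refresh_hz: int, content_fps: float, frames: int = 6) -> list:
    """How many refreshes each source frame occupies on a fixed-rate display."""
    due = [math.floor(f * refresh_hz / content_fps) for f in range(1, frames + 1)]
    return [b - a for a, b in zip([0] + due, due)]

print(repeat_pattern(120, 24))  # [5, 5, 5, 5, 5, 5] -> even repeats, no judder
print(repeat_pattern(60, 24))   # [2, 3, 2, 3, 2, 3] -> uneven 2:3 cadence = judder
```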

  • The specs I have in mind will be capable of 60fps for the newest PC games. Can I use vertical sync and a limiter to benefit from the 120hz panel's lower motion blur without 120fps?
The panel's motion blur is independent of the game's fps; the screen updates 120 times per second regardless of game framerate. (When using a traditional monitor, not one with G-Sync/FreeSync.)

  • Are emulators affected in any special way?

If the display mode is 120Hz and the emulator/game is not limited to a certain framerate, you can play at a very nice 120fps.
 
As I understand it (correct me if I'm wrong), 120fps on a 120Hz monitor is optimal, and general OS usage (Windows) will see a definite improvement in fluidity: cursor and window movement.
It is optimal for everything. It's like reclaiming the CRT days.

What happens to games outputting 30 and 60 fps on the PS3/PS4? Will they look the same as they do on a 60Hz monitor, or will they benefit, and in what way? What happens when there are framerate drops?
I don't know how the PS3/PS4 work, but my Wii U will only output at 60Hz, which sets the monitor to 60Hz, so yes, it is the same. HDMI won't go higher, so unless those other two have DVI you won't get 120Hz. You will probably get less input lag than on a TV, though, as many TVs have postprocessing like crazy.

What happens to PC video playback of 23.98->24fps content (2D/CG animation, live action)? Do I get the so-called "soap opera effect"? How is camera-pan judder affected?
No, because nothing is adding interpolated frames to make it 120fps. If you were playing a 24fps video over a 120Hz signal, it would just show each frame multiple times. I still notice judder in DVDs I play in Windows, even though I have software set to try to fix it, but those aren't using the HDMI 24Hz mode like a Blu-ray. Even with that, 24Hz is going to look like shit when it pans anyway, as I think it does in the theater. Making it the least shit it can be at 120Hz is all complicated math I can't be arsed to figure out.

The specs I have in mind will be capable of 60fps for the newest PC games. Can I use vertical sync and a limiter to benefit from the 120hz panel's lower motion blur without 120fps?
This is a bit complicated, because it depends on the game. Not all games allow 120Hz output, but most newer games will. When you can get it, yes, it will update your screen at 120Hz for maximum clarity while only changing the displayed image as often as the game does. I personally do not think it looks as smooth as a 60fps game on a 60Hz monitor, but it is sharper for sure, and I'd only do 30fps at 120Hz; otherwise the blur is unbearable.

If you use vsync, that just means the game waits until the next monitor cycle, so with a 120hz monitor it should never have to wait. If your CPU/GPU can do it, then it will be fine. I have found that just using 120fps with strobing eliminates most screen tearing so I don't think vsync is even needed. If a game only lets you output at 60hz, strobing is very noticeable and can give you a headache.

Are emulators affected in any special way?
The ones I have tried all worked fine. However, not all Windows games have. Some of them are made for 60Hz, so if you try to run in windowed mode, where the game must comply with the desktop setting, it just shows up black and then crashes. Other times, for some unknown reason, games have a hard time running 120Hz in a window but will do fine at 120Hz fullscreen. But most of the time, if a game disagrees with your monitor, it will just make the monitor change to 60Hz and you'll notice immediately.
 
Thanks a ton, pottuvoi and Dice, for your valuable information.

It seems that PC + 120hz monitor is a dream combo for every PC use I have in mind.


The panel's motion blur is independent of the game's fps; the screen updates 120 times per second regardless of game framerate. (When using a traditional monitor, not one with G-Sync/FreeSync.)

If the display mode is 120Hz and the emulator/game is not limited to a certain framerate, you can play at a very nice 120fps.

In other words, the image will be clearer (less motion blur) when, for example, quickly rotating the camera, regardless of whether I configure the hardware to cap at 60 fps or 120 fps. I presume the same applies to emulators running at 60 fps? 120fps is prohibitive for many old games that tie game logic to framerate, and it can't work too well with the inherent instability of emulation. But that's not a problem; I'm after a clearer moving image first and foremost, and 120fps fluidity comes second.


I don't know how the PS3/PS4 work, but my Wii U will only output at 60Hz, which sets the monitor to 60Hz, so yes, it is the same. HDMI won't go higher, so unless those other two have DVI you won't get 120Hz. You will probably get less input lag than on a TV, though, as many TVs have postprocessing like crazy.


This is very good info. So your monitor can easily switch between 120Hz and 60Hz and effectively act as a 2-in-1 (is that standard for 120Hz monitors? Is a 120Hz monitor in 60Hz mode equal to a regular 60Hz one?).

Is it even possible to force the monitor to stay at 120Hz when connected to a console? I'm looking to create a situation similar to 60fps PC gaming on a 120Hz monitor. Console games of course have predetermined fps caps, but according to the information above, 30 or 60 fps doesn't prevent one from benefiting from 120Hz clarity so long as the monitor maintains that refresh rate.

In another thread I mentioned Ico on PS3 in particular, with its free and fast camera. It's a pain in the eyes to play due to the induced blur.

Many thanks!
 
Yes, the clarity comes from the monitor's refresh rate and strobing working in combination, so it does not need the game to be running at a high framerate to work. I play FF XIII at 30fps because, for whatever reason, GeDoSaTo can only run it like that, but I run it windowed fullscreen so it is 120Hz, and it is very clear. High smoothness of motion will only come from actual higher framerates in the game, giving your eyes/brain more motion-tracking information.

This is very good info. So your monitor can easily switch between 120Hz and 60Hz and effectively act as a 2-in-1 (is that standard for 120Hz monitors? Is a 120Hz monitor in 60Hz mode equal to a regular 60Hz one?).
Uhhh... monitors have been this way for as long as I can remember. They all have their own ranges of supported resolutions and frequencies, and typically games will take over whatever setting they are on, just like you can set the options in Windows.

Is it even possible to force the monitor to stay at 120hz when connected to a console?
As far as I know, no. The monitor follows the signal being fed to it, so it depends on the limitations of the console. I don't think HDMI goes above 60hz. If any console supports it, it would need to have a DVI out.
 
I think I read that newer HDMI specs can do 120Hz 1080p. I'd love to buy a monitor with my tax refund, but I have a notebook with HDMI and VGA only. :(
 
HDMI 1.4 should be able to, but you'd need the ports and cable and firmware to support it. That won't be for a while.
 
  • What happens to games outputting 30 and 60 fps on the PS3/PS4? Will they look the same as they do on a 60Hz monitor, or will they benefit, and in what way? What happens when there are framerate drops?


PS4 will look fine at 1080p, but most PC monitors' scaling really sucks, so any game from your PS3 catalog should just be played on the TV.
 
PS4 will look fine at 1080p, but most PC monitors' scaling really sucks, so any game from your PS3 catalog should just be played on the TV.
I was under the impression that PS3 and any console from that time or later scales things according to your settings. I know Wii U scales 720p games to 1080p.

Edit: Just tested now and my monitor's upscaling looks just as good as Wii U.
 
PS4 will look fine at 1080p, but most PC monitors' scaling really sucks, so any game from your PS3 catalog should just be played on the TV.

I've heard that before, but I can't verify it myself since I have an LG 2-in-1 TV monitor, and scaling looks pretty good.



How does 144Hz factor into all of this? Is it simply a slightly better (120+24) 120Hz, or is there a drastic difference?


This site is absolutely amazing for this kind of topic. The article I'm linking in particular showcases some BenQ monitors that are the ultimate solution for both PC and console motion-blur-free gaming. Incredible.
 
I've never truly tested 144Hz. I tried to set Jedi Knight to it, but that turned off support for playing at the game's aspect ratio, and I didn't want to play it stretched to 16:9, so I went back to 120Hz. Not sure if going that high disables other features or not. I know it strangely keeps different settings (hue, saturation, black levels, etc.) locked to different picture modes (I just use the mode that lets me adjust my most desired ones), so it may be arbitrary rather than a hardware limitation.
 
I was under the impression that PS3 and any console from that time or later scales things according to your settings. I know Wii U scales 720p games to 1080p.

Edit: Just tested now and my monitor's upscaling looks just as good as Wii U.

The PS3 does not have a built in global setting to upscale your games. It changes from game to game. The Xbox 360 and the Wii U do upscale everything. I don't know about the new consoles.
 
The PS3 does not have a built in global setting to upscale your games. It changes from game to game. The Xbox 360 and the Wii U do upscale everything. I don't know about the new consoles.


Assuming a 1920x1080 monitor: according to my findings, the PS3 will display properly if the game is at 1080p and terribly at anything else. It doesn't have an upscaler (hardware or software); it expects the TV to do that (TVs do have hardware scalers). But we're talking about a monitor here, which expects the PC to do the scaling, if it isn't simply fed the native resolution.

It's a messy situation that arises from trying to use a single screen for all those devices.

Thankfully both the PS4 and the Xbox One have hardware scalers of their own, and don't expect the display device to do anything other than display 1080p.

The 360 seems to be fine too, so the only one with a problem is the PS3 and previous generations (which can probably be emulated). Same series monitor, 720p is busted.

I've been looking at the BenQ XL-Z series monitors, specifically the well-known XL2411Z, and concluded that it's one of the best 120/144Hz monitors for eliminating motion blur in both PC and console (60Hz) use. The only unknown parameter is what happens when the PS3 outputs something other than 1080p. In that case it's up to the BenQ to scale the image, and I can't find any information about that. There is a 'Smart Scale' mode, so there's some hope for this specific model at least. If anyone knows more, please share.


EDIT: Well that's Google for you, a user on the EU PS forums confirms the inevitable.
 
I've been looking at the BenQ XL-Z series monitors, specifically the well-known XL2411Z, and concluded that it's one of the best 120/144Hz monitors for eliminating motion blur in both PC and console (60Hz) use. The only unknown parameter is what happens when the PS3 outputs something other than 1080p. In that case it's up to the BenQ to scale the image, and I can't find any information about that. There is a 'Smart Scale' mode, so there's some hope for this specific model at least. If anyone knows more, please share.
I have the monitor; what do you want me to test? Smart Scale lets you choose whether the image takes up the entire monitor or adds black bars.
 
I have the monitor; what do you want me to test? Smart Scale lets you choose whether the image takes up the entire monitor or adds black bars.

Thanks for the offer. It appears I confirmed my suspicion (EDIT) that if the PS3 outputs anything other than 1080p on this BenQ series, image quality is busted. They apparently have no scaler, like most monitors. EDIT: This can be solved to some extent with an HDMI upscaler between the PS3 and the monitor, but how it will handle 1080p (which doesn't need scaling) and what effect this will have on motion-blur reduction requires on-site testing.


Do you connect a console to it too, apart from your PC? I'd be interested to know if the monitor changes by itself from 144Hz to 60Hz.
 
Does anyone know what happens in the following scenario?



144Hz monitor
Half vertical sync (= vsync at 72Hz) = 72fps

In other words: running half vsync at 72Hz limits the fps to 72 WHILE maintaining the monitor refresh rate at 144Hz.
  • Does the image benefit from the reduced motion blur afforded by 144Hz?
  • Is input lag the same as it would be at 60fps/60Hz?
  • Do those additional 12 fps over 60 produce even a slight smoothness increase on the above setup?
  • Do any issues occur (e.g. judder)?


Thanks. I'm getting conflicting information all the time and I'm totally confused.
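The cadence arithmetic for that scenario, at least, is clean (a sketch assuming the half-rate vsync really locks at 72fps; input lag and blur depend on the sync implementation and panel, and need actual testing):

```python
refresh_hz, cap_fps = 144, 72

repeats = refresh_hz / cap_fps  # 2.0 -> every frame shown exactly twice,
                                # so no cadence judder
print(f"{repeats:g} refreshes per frame")
print(f"frame time: {1000 / cap_fps:.1f} ms (vs {1000 / 60:.1f} ms at 60 fps)")
# 72 fps -> 13.9 ms/frame; 60 fps -> 16.7 ms/frame
```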
 