Playing at 40hz is garbage. Who started this shit?

I was getting around 40-50fps with RT and all that jazz in the AC Shadows benchmark (the game was free; I've never spent a dime on any AC game). Playing with fluctuating FPS that low sucks, so I locked it to 40fps (120Hz monitor, G-Sync). Fluctuating FPS between, say, 80 and 120 is totally fine and smooth.

I played for about 4 or 5 hours at 40Hz and it just looked so terrible. The motion clarity is garbage. I'm team green, but FSR frame gen can be enabled along with DLSS in AC Shadows. I unlocked my framerate and turned on frame gen. Wow, motion clarity skyrocketed, beautiful game. And honest to God I did not notice any input delay, which is insane. Most areas were over 100fps and it felt great (I was playing with a wired controller, so it might have been more noticeable with kb/m).

40hz is definitely slightly better than 30 but honestly it's not a big difference at all. You're kidding yourself if you say otherwise.

I see a lot of folks here get excited and praise a 40Hz option. Pointless topic, I know, but FUCK 40HZ, it sucks dick just like 30.

Have a great Saturday
40fps is better than 30fps, yea it isn't 240fps but at least it ain't 30.
 
You are better off lowering graphics until you get to 60 fps, then use frame generation to get to 120.
And use Reflex to reduce input lag a bit.

And yes, everything below 60fps sucks.
Not if you're a console friend: on consoles, 60fps is the holy golden standard all great games should strive for. On PC, 60fps is dogshit garbage from the 2003 LCD days. On consoles the display tops out at 120Hz; on PC we have 500+Hz monitors and climbing, with 1000Hz monitors coming soon. It is what it is.
 

lh032

I cry about Xbox and hate PlayStation.
Is playing games under VRR and playing at 40fps under 120Hz the same thing? Can any experts explain?
 

Radical_3d

Member
We’re gonna need a template.
 

Soodanim

Member
OP has to be trolling. And it worked, because you lot can't resist rage bait. You get caught up in the faint whiff of platform wars this is built on and completely ignore the retarded idea of 40fps being good to someone who is used to 120fps.

The only alternative is far worse, of course: OP believing his own premise. But given his taste for the correct use of "you're", I don't think that's the case.
 

Filben

Member
40fps renders one picture every 25ms, as opposed to every 33ms at 30fps. If you notice the difference between 16.6ms (60fps) and 10ms (100fps) and up, you'll notice 30 vs 40.

If you don't notice the difference and the improvement, then don't play at 40fps. I don't know what to tell you or what you want to hear. I feel like you expected the difference to be much bigger (for whatever reason) and found out it ain't magic.
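Those frame-time figures are just 1000 divided by the framerate; a quick sketch to verify the numbers (Python, nothing assumed beyond the arithmetic):

```python
# Frame time in milliseconds for a given framerate: 1000 / fps.
for fps in (30, 40, 60, 100, 120):
    print(f"{fps:>3}fps -> {1000 / fps:.1f}ms per frame")

# Output:
#  30fps -> 33.3ms per frame
#  40fps -> 25.0ms per frame
#  60fps -> 16.7ms per frame
# 100fps -> 10.0ms per frame
# 120fps -> 8.3ms per frame
```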
 

Bojji

Member
Is playing games under VRR and playing at 40fps under 120Hz the same thing? Can any experts explain?

It's not. 40fps on a full VRR "window" will most likely end up being 80Hz with LFC; it won't look as good as a fixed 40/120 1/3 divide.

40fps with a fixed 120Hz is IMO the best. It's hard to set up 40fps on PC to be as good as on consoles; I tried it at some point and just gave up. Games don't support it internally (outside of FFXVI), and most of the software that caps framerate doesn't produce satisfactory results at this kind of framerate.

On consoles 40fps can look quite decent and be a nice bump over 30fps IF developers set it up correctly.
 

yamaci17

Gold Member
Is playing games under VRR and playing at 40fps under 120Hz the same thing? Can any experts explain?
first of all, almost no one actually plays at an actual 40hz.
there are four ways to get a 40 FPS game (a rough sketch of the first two follows this list):

with VRR, LFC and a framerate cap: you just cap to 40 FPS and VRR and LFC handle it. in this case it usually uses LFC to sync 40fps to 80hz. this is the lowest input lag option because there's no vsync delay.

with VRR and a framerate cap: you just cap to 40 FPS and VRR handles it. in this case it uses VRR to sync 40fps to 40hz. this is the second lowest input lag option; it is laggier than LFC because you're getting more scanout delay from the screen. it will also look horrible because most screens are not designed to run at 40hz, and it may even cause headaches due to such a low refresh rate. this is only possible if the screen has a 40-144 VRR range.

with 1/3 vsync, so the screen refreshes at 120hz (what consoles often do in their 40 FPS balanced modes, even if you have VRR enabled). this is a high input lag option because there's a big vsync delay. however, it may appear smoother than VRR: since you buffer so many frames with vsync, you end up with smoother framepacing in many cases.

1/1 vsync at 40hz. almost no one does this; almost no screen actually supports 40hz, and consoles don't support 40hz output either.
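To make the first two cases concrete, here's a minimal Python sketch of the usual mental model of LFC: repeat each frame an integer number of times so the effective refresh lands inside the panel's VRR range. The function name and the exact multiplier rule are assumptions for illustration, not any vendor's documented algorithm.

```python
def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> float | None:
    """Hypothetical LFC model: find the smallest integer frame-repeat
    multiplier m such that fps * m falls inside [vrr_min, vrr_max]."""
    m = 1
    while fps * m < vrr_min:
        m += 1
    refresh = fps * m
    return refresh if refresh <= vrr_max else None

# Case 1 above: 40fps on a 48-120Hz VRR panel -> LFC doubles frames to 80Hz
print(lfc_refresh(40, 48, 120))  # 80
# Case 2 above: a panel with a 40-144 VRR range can show 40fps at a raw 40Hz
print(lfc_refresh(40, 40, 144))  # 40
```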

It's not. 40fps on a full VRR "window" will most likely end up being 80Hz with LFC; it won't look as good as a fixed 40/120 1/3 divide.

40fps with a fixed 120Hz is IMO the best. It's hard to set up 40fps on PC to be as good as on consoles; I tried it at some point and just gave up. Games don't support it internally (outside of FFXVI), and most of the software that caps framerate doesn't produce satisfactory results at this kind of framerate.

On consoles 40fps can look quite decent and be a nice bump over 30fps IF developers set it up correctly.

you can actually use 1/3 vsync through nvidia inspector. it's quite good actually, but I don't often bother because I don't like the vsync lag myself; I prefer less smooth but more responsive gameplay. anyway, I just play uncapped with reflex, so it doesn't matter much.

not that it matters, but forza horizon 5, a plague tale: requiem and dragon age: the veilguard are games that I know support multiple vsync modes.
 

kevboard

Member
I was getting around 40-50fps with RT and all that jazz in the AC Shadows benchmark (the game was free; I've never spent a dime on any AC game). Playing with fluctuating FPS that low sucks, so I locked it to 40fps (120Hz monitor, G-Sync). Fluctuating FPS between, say, 80 and 120 is totally fine and smooth.

I played for about 4 or 5 hours at 40Hz and it just looked so terrible. The motion clarity is garbage. I'm team green, but FSR frame gen can be enabled along with DLSS in AC Shadows. I unlocked my framerate and turned on frame gen. Wow, motion clarity skyrocketed, beautiful game. And honest to God I did not notice any input delay, which is insane. Most areas were over 100fps and it felt great (I was playing with a wired controller, so it might have been more noticeable with kb/m).

40hz is definitely slightly better than 30 but honestly it's not a big difference at all. You're kidding yourself if you say otherwise.

I see a lot of folks here get excited and praise a 40Hz option. Pointless topic, I know, but FUCK 40HZ, it sucks dick just like 30.

Have a great Saturday

So, I hope you didn't actually play at 40Hz like you claim, because that would be horrendous.
I hope what you actually did was play at 40fps while your screen ran at either 80Hz or 120Hz.

But 40fps isn't meant to be a "good" framerate; it's meant to be a compromise for people who don't mind playing below 60fps but also don't want to go all the way down to 30fps.
on a PC I would never lock to 40fps; I'd always lower the settings to reach 60 instead.
 

Allandor

Member
Use 120Hz and disable frame generation. Then the picture should be much better.
It is way better than 30fps, but not better than 60fps. But on console, 60fps often means sacrificing too much image quality.

Like everything on console, it's a compromise.
 

rofif

Can’t Git Gud
I would prefer 40 over frame gen.
Also, it's closer in feel to 60 than to 30.
But as I said before, even 30 is fine if done right.
 

rofif

Can’t Git Gud
Starfox ran at like 14fps and it's a classic.
Because as kids, we didn’t care.
I was so surprised to learn that Unreal 1 ran at like 20fps on 3dfx… I remember playing that game so much and being so impressed.
It's only recently that people have played some high-fps games and now think they're too good for 30fps. It takes one evening to get used to it.
 
Because as kids, we didn’t care.
I was so surprised to learn that Unreal 1 ran at like 20fps on 3dfx… I remember playing that game so much and being so impressed.
It's only recently that people have played some high-fps games and now think they're too good for 30fps. It takes one evening to get used to it.
we were playing those games on CRT monitors, which make a big difference in motion clarity vs our "modern" screens.
 

rofif

Can’t Git Gud
we were playing those games on CRT monitors, which make a big difference in motion clarity vs our "modern" screens.
You are lying to yourself. If it had been on the shittiest LCD, that would not have made a difference back then.
And now we have amazing OLEDs. I find them better than CRT on every front.
 

Portugeezer

Member
40hz is definitely slightly better than 30 but honestly it's not a big difference at all. You're kidding yourself if you say otherwise.
And 60fps is just slightly better than 40fps? And 120fps just slightly better than 60fps? We all have our limits, I suppose.

In terms of frame time, going from 40 to 60fps is the same difference as 30 to 40fps. Each has an improvement of about 8ms.
 

Trilobit

Absolutely Cozy
I've never tried 40Hz gaming, but 40fps is the lowest threshold I have for games on my OLED TV. 30fps is a complete mess, unplayable, and I always go for 60fps on my PS5 even if the IQ suffers greatly. 40fps is something I can get used to, but not 30fps. On the other hand, if I play Switch, for example, then 30fps on my Full HD monitor is acceptable.
 
40hz is fucking shite. Anyone who defends this should be thrown into the gulag.

Ideally I prefer 90-120 FPS because the motion clarity is so much better.

I can still enjoy 60fps but that is the bare minimum I will accept.
 

Fafalada

Fafracer forever
Because as kids, we didn’t care.
I mean - we cared when we saw the alternative.
I remember being super impressed with an SGI flight simulator demo (which actually only ran at 30fps) just because, well, it was 'that smooth' compared to what we mostly had at home.
Hell, Doom was that game too, even though it only ran at 35fps - in an era when PC software was lucky to break 15.

But indeed we get used to it quick.

we were playing those games on CRT monitors which make a big difference of motion clarity vs our "modern" screens.
If frame-doubling (and worse) is considered 'clear'. Most people used their CRTs in the 50-70Hz range - so whenever you were at 15-20fps, 'motion clarity' wasn't what you were getting.
Even Doom would have looked massively better on a CRT had you run it at a native 70fps, but there was no such option.

And now we have amazing OLEDs. I find them better than CRT on every front.
CRT's main advantage is low persistence. Standard OLEDs will not match the clarity unless you simulate it (https://blurbusters.com/crt-simulation-in-a-gpu-shader-looks-better-than-bfi/).
 

Throttle

Member
And 60fps is just slightly better than 40fps? And 120fps just slightly better than 60fps? We all have our limits, I suppose.

In terms of frame time, going from 40 to 60fps is the same difference as 30 to 40fps. Each has an improvement of about 8ms.
And the same as the difference from 60 to 120Hz: another ~8ms off the frame time.

40fps is the best "bang for buck" of all of these, since it only requires 1/3 more horsepower to deliver a meaningful improvement.

On the same note, 80fps can be to the 60-120 divide what 40 is to 30-60: a way to get most of the improvement for only a bit of extra horsepower.
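A quick sketch of that "bang for buck" math, assuming render cost scales roughly linearly with framerate (a simplification; real scaling depends on where the bottleneck is):

```python
# Extra GPU cost vs. frame-time saved for each framerate step,
# under the assumption that cost scales ~linearly with fps.
steps = [(30, 40), (40, 60), (60, 80), (60, 120)]
for lo, hi in steps:
    extra_cost = hi / lo - 1            # fractional extra horsepower
    saved_ms = 1000 / lo - 1000 / hi    # frame-time improvement
    print(f"{lo}->{hi}fps: +{extra_cost:.0%} cost, -{saved_ms:.1f}ms per frame")

# 30->40fps: +33% cost, -8.3ms per frame
# 40->60fps: +50% cost, -8.3ms per frame
# 60->80fps: +33% cost, -4.2ms per frame
# 60->120fps: +100% cost, -8.3ms per frame
```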
 
...

If frame-doubling (and worse) is considered 'clear'. Most people used their CRTs in the 50-70Hz range - so whenever you were at 15-20fps, 'motion clarity' wasn't what you were getting.
Even Doom would have looked massively better on a CRT had you run it at a native 70fps, but there was no such option.


CRT's main advantage is low persistence. Standard OLEDs will not match the clarity unless you simulate it (https://blurbusters.com/crt-simulation-in-a-gpu-shader-looks-better-than-bfi/).
You mean the main advantage is low persistence? But isn't that perceptually much the same thing as motion clarity? Those LCD screens are so bad in motion, and everything becomes a blurry mess as soon as you go under 60fps (not saying 60fps is perfect either). When I replay Zelda at 17fps (on a 50Hz Trinitron), it's so much better than anything played at 30fps on a PS4 + LCD. I also don't own an OLED screen, so I wouldn't know there.
 

TheUnicornGuy

Gold Member
You know, this makes me wistful. I didn't even know what screen resolution was. I was innocent.
Now I feel like a holocaust survivor.
I remember playing games where your character was just a square and enemy characters were the same square but a different colour, and when you moved you essentially teleported to the space next to you, kind of like the screen being a checkerboard and the Speccy not being able to display on two squares at the same time. I had fun back then and I still have fun now, regardless of framerate.

Fake edit: I could be misremembering and actually thinking of the ZX81, but the point still stands that it was fun.
 

kevboard

Member
It makes sense for racing games.
But fps games @120fps are bullshit with controllers.

you are 100% wrong.
there are literally controller settings that you basically cannot use properly unless your input latency is below 60ms, and to get there you need 120fps.

the biggest advantage a PC user had over a console user in Apex Legends, for example, was the 120+fps you could get on PC.
now that the console version has a 120fps mode, the playing field is more even. if you tried playing with PC friends in PC lobbies at 60fps, you'd be slaughtered.
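A back-of-envelope way to see why 120fps matters here: treat end-to-end input latency as a few frame times in flight plus some fixed overhead. The pipeline depth and overhead constants below are illustrative assumptions, not measurements:

```python
# Hypothetical latency model: N frames in flight plus fixed input-polling
# and display overhead. The constants are assumptions for illustration.
def estimated_latency_ms(fps: float, frames_in_flight: int = 3,
                         fixed_overhead_ms: float = 10.0) -> float:
    return frames_in_flight * (1000 / fps) + fixed_overhead_ms

print(f"60fps:  ~{estimated_latency_ms(60):.0f}ms")   # ~60ms
print(f"120fps: ~{estimated_latency_ms(120):.0f}ms")  # ~35ms
```

Under these assumed numbers, 60fps sits right at the ~60ms threshold mentioned above, while 120fps lands comfortably below it.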
 

Fafalada

Fafracer forever
You mean the main advantage is low persistence?
Yes.
If you've ever tried VR on a high-persistence display (it can be a state-of-the-art OLED, it won't matter), all you see is a smeary, blurry mess, even at 120Hz.
You would need either something past 500fps with that approach to avoid the blur, or low-persistence 90-120Hz (which is the sweet spot for mostly clear VR).

Those LCD screens are so bad in motion, and everything becomes a blurry mess as soon as you go under 60fps (not saying 60fps is perfect either). When I replay Zelda at 17fps (on a 50Hz Trinitron), it's so much better than anything played at 30fps on a PS4 + LCD.
Yes, a CRT will be better than an LCD in those scenarios, though the biggest benefits come when you're close to the CRT's native refresh; e.g. setting the refresh to 25Hz for a 25fps game will have better motion clarity than 50Hz/25fps. However, 25Hz is low enough to introduce perceptible flicker on a CRT, so at that point it becomes a trade-off.
 

mathello

Member
The human eye can't see above 40fps. What drugs are you guys on?

Honestly, 45fps is great on the Steam Deck OLED. It's about not having a comparison nearby.
 

kevboard

Member
Honestly, 45fps is great on the Steam Deck OLED. It's about not having a comparison nearby.

depends. the issue I have with limiting the framerate on the Deck is that the built-in system-level vsync introduces a lot of input lag.

if a game has an engine-level half-refresh vsync, then 45fps on the Deck is more than fine. sadly most games don't offer that, and most games also don't have granular framerate settings; usually you only get the typical 30, 60, 120, unlocked, etc.

I tried playing Kena on the Deck with a 40fps lock, for example, but the input lag was just so bad that I opted to run it completely unlocked at 90Hz, as the stutters were less of an issue than the bad input lag (especially in Kena, where you do need to react to stuff fast at times).
 
40fps is better than 30fps. If a game can't do 60 or above without looking like a complete blurry mess, then 40fps is a decent compromise. Hopefully with the PS6 it will be 60fps minimum with good IQ, but if not, then I hope we'll at least have a 40fps option in every game, to finally move on from 30fps.
 

Spukc

always chasing the next thrill
you are 100% wrong.
there are literally controller settings that you basically cannot use properly unless your input latency is below 60ms, and to get there you need 120fps.

the biggest advantage a PC user had over a console user in Apex Legends, for example, was the 120+fps you could get on PC.
now that the console version has a 120fps mode, the playing field is more even. if you tried playing with PC friends in PC lobbies at 60fps, you'd be slaughtered.
Bruh.

60 to 120fps on controllers is bull.
With MKB the step is insane
 

kevboard

Member
Bruh.

60 to 120fps on controllers is bull.
With MKB the step is insane

you are literally wrong. people in Apex on console, before the 120fps patch came, were not able to use certain settings that PC players used, because at 60fps they were basically impossible to control.

max sensitivity and 0 deadzone with a linear curve, for example, is basically unusable at 60fps, because you constantly have to manage the micro-movements that randomly happen due to the lack of a deadzone and the fast linear curve.

that's a setting one of the best controller players used for a while, and it's only viable at 120+ fps.
 

PeteBull

Member
Many legendary games are low fps.
Think of it like this:
Once a girl has a bf with a luxury car/huge house/high-paying job that provides her a lavish lifestyle, she will never downgrade to a loser who lives with his parents, even though she dated such a guy back in high school or even in college...

In the same way, we gamers who back in the 90s and early 2000s were "fine" with 20fps, or 30 that dipped all the time, now in 2025 have an extremely hard time going back to anything under a stable 60; hell, for multiplayer, lots of guys with 144/165 or even 240Hz monitors go balls to the wall and reduce settings/res until they get the max possible fps...

Personal standards always go up with time, never down; that's the reason we've got millions of alpha-widowed 35+ year old women who hit the wall but find it's too late to downgrade and date an average or barely-above-average man; they'd rather stay single (aka sluts ;P ) or become the side chick of some high-profile guy :)
 

drotahorror

Member
Didn’t say otherwise.
So, I hope you didn't actually play at 40Hz like you claim, because that would be horrendous.
I hope what you actually did was play at 40fps while your screen ran at either 80Hz or 120Hz.

But 40fps isn't meant to be a "good" framerate; it's meant to be a compromise for people who don't mind playing below 60fps but also don't want to go all the way down to 30fps.
on a PC I would never lock to 40fps; I'd always lower the settings to reach 60 instead.

120Hz, with 40fps locked in the NVIDIA Control Panel. I used fps/Hz interchangeably, which I shouldn't have.
 

Fbh

Gold Member
Nah, it's a great option, at least on console.

Is it ideal? Like something I'd try to target if I had a decent PC? Obviously not.
Does it look and feel as good as 60fps? No.

But with the growing popularity of 120Hz TVs it's a nice option to have, since it often provides visuals similar to the "graphics" mode in games while still looking and feeling significantly smoother than 30fps.
Now that we're seeing more console releases drop to PS3-era resolutions in their performance modes, the 40fps modes offer a decent compromise between visuals and performance.

Yeah, it's obviously a compromise, because that's always going to be the case with $400 consoles in a market where we have $3000 GPUs. Personally I'd prefer it if all devs built their games targeting 60fps at 1440p on console, but sadly that's just not the reality.
 