Is 60 FPS killing overall graphical fidelity?

I know you're joking, but I wonder what a 1080p/30fps game would look like with top-notch AA.

For example, a Blu-ray movie still looks really good on a 4K TV, so why wouldn't a 1080p game with really good AA look good on a 4K TV? If they didn't focus on frames and added as much detail and lighting quality as possible, it could be pretty mind-blowing.
It doesn't quite work out like you're thinking. For video games, the more detail you have inside a surface and the more polygons you have, the more shimmering there will be, and you need higher resolutions to resolve those extra textures as well. Image reconstruction is getting better and better, but we have a way to go; base resolution is still important.

Take a look at how blurry Crash Bandicoot 4 is on PS4 despite it being 1080p with TAA upscaling, and it's like the blurriest thing I've ever seen on the Switch version lol

Motion resolution is important as well; the lower the fps, the more those details will blur in motion.
 
60fps+ is far more important than a minor asset or resolution boost, and it is a selling point these days. Even young kids care about fps now because of Twitch etc.
 
Okay, fair enough, I get you now. I guess I am happy with 30fps because the majority of my favourite games have all been 30 at release 🤷‍♂️

You raise an interesting point about the default being 60. People expect a solid 30 as a minimum these days, so there's no reason why it can't be 60.

Do you think all games need 60 though? The most obvious example is turn-based RPGs, where it would seem irrelevant. For slow-paced stealth games as well, it might be worth sacrificing the frames for more ambitious level design.

And dare I say Souls games (people will hate me for that). Gamers love the huge winding, looping level design, and this appears to stress the framerate (Blighttown being the obvious example). Obviously the extra frames help with the combat, but it's not as vital as in, say, Devil May Cry or Bayonetta.

Out of interest, has every game had a 60fps mode on next-gen consoles so far? That's pretty cool.

I wouldn't be dogmatic about it. If a game is slow paced like you say, 30 should be fine.

For action games, though, I don't think 30fps is ever really the right choice. But what I would say is that PERSONALLY, having played Ratchet at 40fps, that seems pretty much ok. It's not as good as 60 but it's fine, and it's actually how I play that game now.

It's really beyond time for Sony to patch in VRR support, because that's probably how I'd play most games; 45 to 50fps is probably perfect for me and would allow a few more bells and whistles, or pixels, etc.

I tried 120fps on Doom Eternal and I LITERALLY can't tell the difference. Like, at all. I'm sure some people can but they're probably so rare that it'd be silly to cater to them - just let them buy a PC.
 
I wouldn't be dogmatic about it. If a game is slow paced like you say, 30 should be fine.

For action games, though, I don't think 30fps is ever really the right choice. But what I would say is that PERSONALLY, having played Ratchet at 40fps, that seems pretty much ok. It's not as good as 60 but it's fine, and it's actually how I play that game now.

It's really beyond time for Sony to patch in VRR support, because that's probably how I'd play most games; 45 to 50fps is probably perfect for me and would allow a few more bells and whistles, or pixels, etc.

I tried 120fps on Doom Eternal and I LITERALLY can't tell the difference. Like, at all. I'm sure some people can but they're probably so rare that it'd be silly to cater to them - just let them buy a PC.
120 vs 60 is a huge difference, but not worth the resolution hit unless you're on PC at 4K already.
 
Lol, what? Is this thread sponsored by Ubisoft, the home of 30fps cinematic experiences? What is ruining what? Don't you mean 30fps is ruining the graphical experience on consoles?
 
I was playing Ratchet & Clank: Rift Apart today in fidelity mode and I have to admit the 30fps wasn't that bad. But I can't tell the difference between fidelity and performance RT. The lighting looks better, maybe.
 
I was playing Ratchet & Clank: Rift Apart today in fidelity mode and I have to admit the 30fps wasn't that bad. But I can't tell the difference between fidelity and performance RT. The lighting looks better, maybe.

It's basically just resolution. It really depends how far you are from the screen: if it's 1x your screen size away, you'll probably find it quite a big difference; if it's 1.5x or more, you're probably not going to notice much at all, just a slight softness.

I sit fairly close, so I played through it on the performance mode, cos the extra clarity looked better than the RT. With the 120fps mode now available, I'd go with fidelity at 40fps.
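Fwiw, the viewing-distance point can be sanity-checked with a quick pixels-per-degree calculation. The rough rule of thumb is that ~60 px/degree is the 20/20 acuity limit; the TV size below is just an assumption for illustration:

```python
import math

def pixels_per_degree(screen_width_in, horizontal_res, distance_in):
    """How many pixels are packed into each degree of your field of view."""
    # Angle the whole screen width subtends at the eye, in degrees
    screen_angle = 2 * math.degrees(math.atan((screen_width_in / 2) / distance_in))
    return horizontal_res / screen_angle

WIDTH = 48.0  # assumed: a ~55" 16:9 TV is roughly 48 inches wide
for res, label in [(3840, "4K"), (1920, "1080p")]:
    for mult in (1.0, 1.5):
        ppd = pixels_per_degree(WIDTH, res, WIDTH * mult)
        print(f"{label} at {mult}x screen width: {ppd:.0f} px/deg")
```

At 1x screen width, 1080p comes out around 36 px/deg, well under the ~60 px/deg limit, so the jump to 4K is obvious; at 1.5x, 1080p is already near that limit, which matches the "slight softness" impression.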
 
Pretty much every game will have a 60fps option, so in a way they have to compromise the graphics detail. Having said that, you can argue that while playing and moving around, the loss of detail is not very noticeable; in fact, the game moving faster can trick your brain into thinking it looks better because you see more stuff. Racing games definitely benefit from 60fps gameplay.
 
I was playing Ratchet & Clank: Rift Apart today in fidelity mode and I have to admit the 30fps wasn't that bad. But I can't tell the difference between fidelity and performance RT. The lighting looks better, maybe.
Mayhaps, but can you deny that 60fps + ray tracing plays a lot smoother?
 
What's the point of 60fps when the entire world is static and everything is hard as a rock? There's more to graphics than just still screenshots.
 
Pretty much every game will have a 60fps option, so in a way they have to compromise the graphics detail. Having said that, you can argue that while playing and moving around, the loss of detail is not very noticeable; in fact, the game moving faster can trick your brain into thinking it looks better because you see more stuff. Racing games definitely benefit from 60fps gameplay.
I still can't get over how sluggish FH4 felt in the 4K 30fps mode on Xbox One X.
 
I wouldn't be dogmatic about it. If a game is slow paced like you say, 30 should be fine.

For action games, though, I don't think 30fps is ever really the right choice. But what I would say is that PERSONALLY, having played Ratchet at 40fps, that seems pretty much ok. It's not as good as 60 but it's fine, and it's actually how I play that game now.

It's really beyond time for Sony to patch in VRR support, because that's probably how I'd play most games; 45 to 50fps is probably perfect for me and would allow a few more bells and whistles, or pixels, etc.

I tried 120fps on Doom Eternal and I LITERALLY can't tell the difference. Like, at all. I'm sure some people can but they're probably so rare that it'd be silly to cater to them - just let them buy a PC.
Cool, nothing to really disagree with here. Not many variable-framerate games these days.

I remember people being pretty satisfied with God of War 3 back in 2009, which averaged around 45fps; I was certainly fine with it. That was interesting because I remember the devs of Dante's Inferno bent over backwards to achieve a bulletproof 60fps, but it got crushed by God of War 3 commercially and critically.

That's probably a better example of devs cutting back on framerate to achieve more ambitious set pieces.
 
What's the point of 60fps when the entire world is static and everything is hard as a rock? There's more to graphics than just still screenshots.
Are you seriously trying to argue that 30 fps looks better in motion? Fucking lol mate. 30 FPS games turn into a blur the moment you turn the camera.
 
Are you seriously trying to argue that 30 fps looks better in motion? Fucking lol mate. 30 FPS games turn into a blur the moment you turn the camera.
Man, I never saw any 30fps game turning blurry when moving the camera.
What makes the game blurry is the resolution and/or some post-processing AA.

What game are you playing? Is it something that happens on PC?
 
Man, I never saw any 30fps game turning blurry when moving the camera.
What makes the game blurry is the resolution and/or some post-processing AA.

What game are you playing? Is it something that happens on PC?
Lol what? Literally every single 30fps game has an enormous amount of motion blur the moment the camera is turned, because 30fps games would be completely unplayable without it.
 
Lol what? Literally every single 30fps game has an enormous amount of motion blur the moment the camera is turned, because 30fps games would be completely unplayable without it.
I never played any like that.
I have no idea what you are saying; the camera movement is not blurry.
And I play a lot of 30fps games.
 
Lol what? Literally every single 30fps game has an enormous amount of motion blur the moment the camera is turned, because 30fps games would be completely unplayable without it.
Ethomaz, you usually seem to know a lot about console settings, but you really didn't realize this about 30fps games? Try Dying Light 2 or Control Ultimate Edition on console and experiment with turning motion blur on and off. You will not only see how strong it is, but you should also see how awful it is trying to play them at 30fps without motion blur on.
 
Ethomaz, you usually seem to know a lot about console settings, but you really didn't realize this about 30fps games? Try Dying Light 2 or Control Ultimate Edition on console and experiment with turning motion blur on and off. You will not only see how strong it is, but you should also see how awful it is trying to play them at 30fps without motion blur on.
I will try Control because I don't have Dying Light 2.

In any case, when I turn the camera in 30fps games it is really normal… it's like turning around… there is no blur unless of course the game's IQ is blurry from low resolution, weird CBR, or post AA.

Even Zelda at 20fps was fine on N64… even turning the camera around in GoldenEye at 15fps was fine.
 
Lol what? Literally every single 30fps game has an enormous amount of motion blur the moment the camera is turned, because 30fps games would be completely unplayable without it.
He's right about Forbidden West, though. In this particular game the 60fps mode kills the IQ. Yeah, the 30fps mode will have lots of motion blur, but the resolution differences are the culprit here (I'm telling you guys the reason the IQ is so bad is that checkerboarding does not play nice with DRS; look at Avengers for an almost identical scenario).
 
I will try Control because I don't have Dying Light 2.

In any case, when I turn the camera in 30fps games it is really normal… it's like turning around… there is no blur unless of course the game's IQ is blurry from low resolution, weird CBR, or post AA.

Even Zelda at 20fps was fine on N64.
I bet you the reason the IQ is so bad in Forbidden West is the combo of checkerboarding and DRS. Avengers was also 1800p checkerboarded with DRS, and that game was such a disappointment for me on PS5. Everyone was upset because it looked worse than a typical 1440p game.

This is a shame with H:FW because Zero Dawn was 1800p checkerboard (I think) and had pretty darn good IQ. I'm telling y'all, DRS and checkerboard don't play well together.
 
I bet you the reason the IQ is so bad in Forbidden West is the combo of checkerboarding and DRS. Avengers was also 1800p checkerboarded with DRS, and that game was such a disappointment for me on PS5. Everyone was upset because it looked worse than a typical 1440p game.

This is a shame with H:FW because Zero Dawn was 1800p checkerboard (I think) and had pretty darn good IQ. I'm telling y'all, DRS and checkerboard don't play well together.
I haven't liked Guerrilla's CBR since they first used it in Shadow Fall MP… it is really bad :( and it's the reason I only played a few hours of it, back when I was very addicted to the Killzone 2 and 3 MP.
 
I will try Control because I don't have Dying Light 2.

In any case, when I turn the camera in 30fps games it is really normal… it's like turning around… there is no blur unless of course the game's IQ is blurry from low resolution, weird CBR, or post AA.

Even Zelda at 20fps was fine on N64… even turning the camera around in GoldenEye at 15fps was fine.
Too bad you don't have DL2, because that is a perfect example of really strong motion blur. Anytime you move the camera it's so strong it blurs the image to the point where you can't see any detail in the environment.

Btw, if you're OK with 30fps, H:FW might not be too bad. Zero Dawn was as smooth a 30 as possible imo.
 
I will try Control because I don't have Dying Light 2.

In any case, when I turn the camera in 30fps games it is really normal… it's like turning around… there is no blur unless of course the game's IQ is blurry from low resolution, weird CBR, or post AA.

Even Zelda at 20fps was fine on N64… even turning the camera around in GoldenEye at 15fps was fine.

I'd be surprised if you don't see how blurry Control becomes in quality/30fps mode, though I think that game had other engine issues adding to it.
 
I can't go back to 30fps; I'm loving my PS5 right now. Everything 60, please! I try fidelity mode in most games and it suuuucks.
 
Too bad you don't have DL2, because that is a perfect example of really strong motion blur. Anytime you move the camera it's so strong it blurs the image to the point where you can't see any detail in the environment.

Btw, if you're OK with 30fps, H:FW might not be too bad. Zero Dawn was as smooth a 30 as possible imo.
If that is the case, then it is the motion blur implementation? So a game-by-game case? I really don't remember any game with that issue.

Motion blur should not be an issue in games either… or at least I expect it not to be.
 
It had a lot of artifacts. GoW has excellent checkerboarding, but Zero Dawn's IQ on Pro looks better than FW's by a mile.
Yeap… sometimes I had the impression only the middle of the screen was actually being refreshed, and the corners were off and not synced with the middle lol.

Man, it was awful… SP at 30fps was fine… no issue, but I found the campaign mediocre compared with the K2 & K3 campaigns.
 
I never played any like that.
I have no idea what you are saying; the camera movement is not blurry.
And I play a lot of 30fps games.
Yea guys, no idea what this blur thing is you guys are talking about.
(screenshot attachments)
 
Recently we saw GT7, and let's be real, it looks worse than Driveclub. Guess which one is 30 (Driveclub) and which is 60 (GT7). Halo Infinite received huge backlash for its graphics, and again it was 60fps. Just look at RDR2's graphics, how jaw-dropping the lighting is, and compare it to a next-gen 60fps game like Far Cry 6. I mean, clearly there is a pattern.

All those games are last gen, with higher settings, fps + res.

Games made for the current gen will have better visuals at both 30fps and 60fps.
Like last gen, when games like Call of Duty and Doom were competitive visually with 30fps games. But the very best looking games were 30fps, and I don't think it will be different this gen, apart from the chance devs may go for 40fps or maybe 45fps (which would feel close to 60fps with VRR).
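On the 40fps point: the reason 40 (and not 45) slots cleanly into current TVs is simple arithmetic against a 120Hz panel; a frame rate paces evenly only when it divides the refresh rate, and anything else needs VRR to stay smooth. A quick sketch (the 120Hz figure is just the common HDMI 2.1 TV case):

```python
REFRESH_HZ = 120  # assumed: a typical 120Hz HDMI 2.1 display

def refreshes_per_frame(fps, refresh_hz=REFRESH_HZ):
    """How many display refreshes one game frame spans (even pacing iff integer)."""
    return refresh_hz / fps

for fps in (30, 40, 45, 60, 120):
    r = refreshes_per_frame(fps)
    pacing = "even" if r == int(r) else "needs VRR"
    print(f"{fps:3d} fps -> {r:.2f} refreshes/frame ({pacing})")
```

So 30, 40, 60 and 120 all land on whole refreshes (4, 3, 2, 1), while 45 to 50fps falls between refreshes and judders on a fixed-rate panel, which is exactly the range VRR exists for.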
 
Reads the OP… not sure if serious.

- GT7 destroys Driveclub. It's not even close, like wtf.
- 60fps is not enough. I can't play certain games at 60fps, like COD or BF or Halo. They need to be butter smooth: 100+ fps for sure, 120 is ideal (one of the reasons I switched to PC). I can no longer play any game at 30fps; the minimum is 60 frames.
 
Man, I never saw any 30fps game turning blurry when moving the camera.
What makes the game blurry is the resolution and/or some post-processing AA.

What game are you playing? Is it something that happens on PC?
Have you got a game like The Last of Us Part II on PlayStation 5? Try the two different modes, 30 and 60, and forget about motion blur with 30fps: if you move the camera left or right, the whole screen shimmers because the framerate can't keep up with fast movement.

The difference is so huge I can't believe anyone would be unable to see it.
 
Are you seriously trying to argue that 30 fps looks better in motion? Fucking lol mate. 30 FPS games turn into a blur the moment you turn the camera.
Man this is almost not fair. Ethomaz resurrects a thread and all of a sudden Rykan is getting blasted out of nowhere for something he wrote almost six months ago.
 
Well, the new Horizon is another case.
The 60fps mode kills IQ.
Come on now. One could also say that 30fps kills playability. See how that goes? Both statements are hyperbolic (and I'm saying this as a firm 60fps stan).

What is evident is that the anti-aliasing technique GG used in HFW is lacking, much like FH5, which only looks great at 4K. On the other hand, there are loads of games that utilize TAA much more effectively and look quite good and sharp at sub-native resolutions. So in my view, it's not 60fps that "kills" IQ; it's just a suboptimal AA implementation.
 
Yea guys, no idea what this blur thing is you guys are talking about.
(screenshot attachments)
That's the point of motion blur: to blend frames together. If you watch any movie and pause during a scene with movement, a moving object will be blurred. A camera captures a moment in time, not a time freeze.
It can be overdone in some games when it comes to camera rotation… that's why they tweaked it in the UC4 remaster.
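To illustrate the shutter analogy: per-object motion blur is basically averaging several sub-frame samples of the moving scene, the same way a camera integrates light over its exposure. A toy sketch (the 1D "renderer" here is entirely hypothetical, NumPy assumed):

```python
import numpy as np

def render_frame(position, width=32):
    """Render a 1-pixel 'object' on a 1D scanline at the given position."""
    frame = np.zeros(width)
    frame[int(position) % width] = 1.0
    return frame

def motion_blurred_frame(start, velocity, samples=8, width=32):
    """Approximate a camera shutter: average several sub-frame renders
    taken while the object keeps moving during the exposure."""
    subframes = [render_frame(start + velocity * (i / samples), width)
                 for i in range(samples)]
    return np.mean(subframes, axis=0)

sharp = render_frame(10)                        # instantaneous sample: one bright pixel
blurred = motion_blurred_frame(10, velocity=4)  # same object smeared over 4 pixels
print(np.count_nonzero(sharp), np.count_nonzero(blurred))  # 1 vs 4
```

The blurred frame spreads the same total brightness across the object's path, which is why a fast pan at 30fps reads as a smear: each frame covers a lot of movement, so the smear is long.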
 
And people wonder why world interactivity is getting worse with each gen and AI barely improves.

At the end of the day, we PLAY games rather than stare at the screen counting pixels...

But fuck all that! It's all about GRAPHICS!!!!
 
That's the point of motion blur: to blend frames together. If you watch any movie and pause during a scene with movement, a moving object will be blurred. A camera captures a moment in time, not a time freeze.
It can be overdone in some games when it comes to camera rotation… that's why they tweaked it in the UC4 remaster.
If you're trying to argue that the effect in a movie is the same as in games, as shown here, you are absolutely and entirely wrong. A movie will not blur the entire screen the moment the camera pans or moves. Every 30fps game does.
 
If you're trying to argue that the effect in a movie is the same as in games, as shown here, you are absolutely and entirely wrong. A movie will not blur the entire screen the moment the camera pans or moves. Every 30fps game does.
It's not my fault you can't read. My post is clear
 
It's not my fault you can't read. My post is clear
I can read just fine, and this isn't the first time you've made piss-poor arguments defending 30fps.

The "point" of motion blur in these cases is not an artistic choice. It is necessary to keep the game playable at the low framerate. The higher the framerate, the less motion blur is required.
 
Gamers keep demanding better fidelity in games… it's valid, but!!!

That means longer time to develop a game, bigger teams, more money needed, too many risks…

Then we have threads complaining about:
How broken games are, needing a day 1 patch

How a game got delayed

How a publisher raised the game's price

How the gaming industry is stagnant and filled with GaaS etc.

It's all part of a long chain of events. I'd rather see a balance.
 