Is 60 FPS killing overall graphical fidelity?

If I have to choose between ~1620p60 and 2160p30, I'm almost always going to choose 2160p30 because of the screen I'm playing on: it makes 30 look great compared to a monitor, and it's big enough that I can really notice the difference in detail/sharpness.

This is as long as the usual things are in place, i.e. good frame pacing, no massive increase in input lag (aside from the average frame-latency increase from 8.33ms to 16.67ms), good motion blur, and it's not a game that requires insanely precise button presses, a racing game, or an FPS. Although if a game is stuttering at 60fps for whatever reason then I'd usually just lock to 30 regardless, like Psychonauts 2, which is fucked on my setup.
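For what it's worth, here's the quick math behind those latency numbers. It's just a rough sketch assuming the only extra lag comes from the longer frame time, with input landing, on average, halfway through a frame:

```python
# Rough sketch: added input latency from frame time alone, assuming input
# lands, on average, halfway through a frame (a simplification).
for fps in (60, 30):
    frame_time_ms = 1000 / fps        # 16.67 ms at 60fps, 33.33 ms at 30fps
    avg_extra_ms = frame_time_ms / 2  # 8.33 ms at 60fps, 16.67 ms at 30fps
    print(f"{fps} fps: frame time {frame_time_ms:.2f} ms, avg latency contribution {avg_extra_ms:.2f} ms")
```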

It's a shame, since Psychonauts 2 doesn't have a lot of high-frequency detail, so you don't get nearly as big a benefit going from 1440p to 2160p as you normally would; the UE4 TAA with sharpening looks fantastic and cleans up all the aliasing even at 1080p, let alone 1440p, and the animations have extreme speed increases in them, so the more frames the better.

At the end of the day, I don't care if it's 240fps: if it's not (reasonably) locked in gameplay and cutscenes and, almost more importantly, well frame-paced, then it can

go away gtfo GIF
 
Don't blame it on framerate. Blame it on resolution. The jump from 1440p to 4K is not enormous. Wasting shit tons of resources for 4K or some other arbitrary number with checkerboard 4K is just that, a waste.

You guys ate up the 4K marketing and now you're stuck in it. 60fps has been here since the beginning of gaming and it's not going anywhere as the bare minimum standard for most people.

Next up is 8K, imagine the diminishing returns.
Saw a commercial from Samsung for an 8k TV. There is practically zero 8k content out there! It's just madness.
 
Saw a commercial from Samsung for an 8k TV. There is practically zero 8k content out there! It's just madness.
They do it because it's something else they can market and sell. Won't deny that 8K is good for extremely large TVs, though. For the most part it's useless. We still can't even get high-bitrate 4K streaming. The long-term focus for gaming should be high-framerate 4K with all the ray tracing.
 
60 should be the base for all games no matter the sacrifice. I'll play in 1080p on my PC before going below 60. Smoothness equals graphics to me so even if 30 looks 50% better in screenshots, it still looks worse in motion.
 
You will… how you spend it is another thing… you get double the RT effects, for example.
OK then let's look at a game like Ratchet and Clank or Spider-Man.

They're made for 30fps, so disable RT and slightly lower the resolution and you get 60fps. You're not getting double the graphics.
 
This is a conditioning thing. We've been conditioned by 30 fps console games for decades, and since that was the standard, there was a clear flow of progress in terms of what can be done in realtime on consoles. When you suddenly change the rules and now need to double your workload without any visual changes, it's going to put a damper on how much "better" the underlying technology and graphical prowess can be. In essence, yes, a 60 fps targeted game is going to "look worse" than a 30 fps targeted game designed for the same platform. It's a physics thing. You can't break the time constraints and cycle-count limits.
 
Enjoy this super smooth 60fps framerate from 1993! :messenger_tears_of_joy:


... did you just pick fucking Star Fox as an example for your argument? Really?

Cherrypicking doesn't even begin to describe what you just did there.

Also, joke's on you for not picking Stunt Race FX, which is not only choppy as fuck, but runs in a window too.
 
Saw a commercial from Samsung for an 8k TV. There is practically zero 8k content out there! It's just madness.

Here's a laugh: the first proper commercial 8K TV that Samsung released - the Q900R - only supported 8K input on one of its HDMI ports, and even then it was limited to 8K@30Hz@4:2:0 :messenger_tears_of_joy: None of the apps supported 8K, including youtube, the only app with 8K content on it.

The media player could play 8K videos (8K30 definitely, maybe even 8K60) apparently, but I never found a file that would play after trying a bunch of different containers and formats. It didn't support VP9, which I think is what all 8K and HDR content on youtube is encoded with, brilliant effort from Samsung :messenger_tears_of_joy:

It came in 65", 75" and 85" and was 5k, 7k and 15k RRP respectively. Imagine you paid fucking 7g's for that 75" and found all that out afterwards; what a total joke. You could use it for 8K PC gaming... if it supported refresh rates above 30, lol, not that you would've been able to run many games in 8K above 30fps anyway (this was mid-2018, so there was no RTX yet).
 
Yup, these Atari 2600 games move at 60fps.

All you need to know is that 60fps was the standard until the 5th gen of consoles. That's a very well known fact, and a small percentage of games that didn't follow that standard or had slowdowns but were still designed with 60fps in mind doesn't change that fact.

It wasn't, it was just the refresh rate, and it was usually interlaced, so 60 half-frames per second.
 
Don't blame it on framerate. Blame it on resolution. The jump from 1440p to 4K is not enormous. Wasting shit tons of resources for 4K or some other arbitrary number with checkerboard 4K is just that, a waste.

You guys ate up the 4K marketing and now you're stuck in it. 60fps has been here since the beginning of gaming and it's not going anywhere as the bare minimum standard for most people.

Next up is 8K, imagine the diminishing returns.

Have you actually seen 4K content on a large, high quality 4K TV though? If you have then what model was it?

I'm not trying to be a knob, but I just see this "it's not enormous" a lot and I wonder if said people have even seen the difference it makes on a large display. 720p to 1080p is double the pixels and that's a crazy jump in quality imo, and 1440p to 4K is more than double the pixels.
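The pixel math is easy to check; here's a quick sketch, just straightforward arithmetic on the standard 16:9 resolutions:

```python
# Pixel counts for common 16:9 resolutions and the jump ratios between them.
res = {"720p": (1280, 720), "1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}
px = {name: w * h for name, (w, h) in res.items()}
print(px["1080p"] / px["720p"])   # 2.25x more pixels going 720p -> 1080p
print(px["2160p"] / px["1440p"])  # 2.25x more pixels going 1440p -> 2160p (4K)
```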

I do agree some games don't see a large benefit and it can be a total waste, you have to check on a case-by-case basis.
 
It wasn't, it was just the refresh rate, and it was usually interlaced, so 60 half-frames per second.
Go emulate any NES, SNES game, etc.

Use RetroArch and press the frame advance button. Each frame of the 60hz refresh is a different frame in most games (not Starfox).

There are no repeated frames like in 30 fps games where each frame is repeated once more on a 60hz display.

They are literally, purely, 60 frames per second.
 
Recently we saw GT7 and let's be real, it looks worse than Driveclub. Guess which one is 30 (Driveclub) and which is 60 (GT7). Halo Infinite received huge backlash for its graphics and again it was 60 FPS. Just look at RDR2's graphics, how jaw-dropping the lighting is, and compare it to a next-gen 60 FPS game like Far Cry 6. I mean, clearly there is a pattern.
Yes. I agree. I don't really need or want 60fps. I think games look jaw dropping at 30fps, and sometimes 60fps just makes the game look arcadey or fake. 60fps also makes animations look bad sometimes, and at higher frame rates things like pop-in and other graphical oddities are easier to spot. Like, I never knew how much bad pop-in there was in Horizon Zero Dawn until they unlocked the frame rate. Now it looks kinda like a PS3 game. It's weird.

60fps is a giant waste of resources in my opinion.
 
No - in fact most "next gen" games are getting both great graphics and an option for a high framerate. I love having the option, as personally I'll always go for performance.

As an example, just check out the opening level in Ratchet and Clank's latest adventure. It looks amazing in all modes (the whole game looks great, but that intro level is an amazing showcase).
 
I must be weird. When I play a game at 30fps, I don't see a blurry image, but when I play a low resolution game at a high frame rate, all I see is blur.
 
I don't think there's such a thing as really high-quality graphics anymore. There's 'good' and that's about it. It's not like the Crysis days, where there was an absolute standout, top-tier crop of games and nothing else compared.
 
Go emulate any NES, SNES game, etc.

Use RetroArch and press the frame advance button. Each frame of the 60hz refresh is a different frame in most games (not Starfox).

There are no repeated frames like in 30 fps games where each frame is repeated once more on a 60hz display.

They are literally, purely, 60 frames per second.

Emulators allow games to be played better than on the original hardware.
 
OK then let's look at a game like Ratchet and Clank or Spider-Man.

They're made for 30fps, so disable RT and slightly lower the resolution and you get 60fps. You're not getting double the graphics.
Disable RT at 30fps and see how much you can boost the graphics.
If you disable it at 60fps, you should disable it for the 30fps comparison too.

Or enable RT at 60fps and compare it with what you have at 30fps.

RT is a graphical feature.

At 60fps the GPU is rendering 60 frames per second with X graphical features.
At 30fps the GPU is rendering 30 frames per second with Y graphical features.

If X and Y are the same, you need double the power to deliver 60fps.

Your mistake is disabling RT, which already lowers the GPU load per frame compared with what you have at 30fps… because RT is part of the graphics.
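A rough sketch of the frame-budget math behind that, assuming GPU cost scales linearly with the work done per frame (a simplification, but it makes the point):

```python
# Per-frame GPU time budget at each target framerate.
budget_30_ms = 1000 / 30  # ~33.3 ms to render one frame at 30fps
budget_60_ms = 1000 / 60  # ~16.7 ms to render one frame at 60fps

# With identical graphical features (X == Y), each frame costs the same,
# so hitting 60fps needs roughly double the throughput of 30fps.
print(budget_30_ms / budget_60_ms)  # 2.0
```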
 
Recently we saw GT7 and let's be real, it looks worse than Driveclub. Guess which one is 30 (Driveclub) and which is 60 (GT7). Halo Infinite received huge backlash for its graphics and again it was 60 FPS. Just look at RDR2's graphics, how jaw-dropping the lighting is, and compare it to a next-gen 60 FPS game like Far Cry 6. I mean, clearly there is a pattern.
Hum?

GTS already looks better than DriveClub… what are you smoking?

What they showed for GT7 looks way better than GTS.
 
But then you lose a lot of clarity and fidelity in motion when you go 30fps.

Surely you can see the choppiness and the shimmer effect you get on objects when you pan the camera at 30 fps. It's horrific.
Games like Uncharted, TLOU, GoT, GTA and so on are perfectly playable at 30fps; there are absolutely no clarity or 'fidelity' issues at all.

The consoles have a tech limit; they must decide between top visual quality at 30fps or not-so-good quality at 60fps. In competitive games where fast timing and input lag are key, like multiplayer FPS games, fighting games, racing games, or hack and slash, they typically go with 60fps. In single-player, slow-paced games that are more focused on the story, where most of the time you aren't fighting/shooting and where that fighting/shooting is accessible and forgiving to appeal to mainstream gamers instead of the hardcore competitive niche, they prefer 30fps and top visuals, because top visuals sell games and 60fps doesn't.
 
Recently we saw GT7 and let's be real, it looks worse than Driveclub. Guess which one is 30 (Driveclub) and which is 60 (GT7). Halo Infinite received huge backlash for its graphics and again it was 60 FPS. Just look at RDR2's graphics, how jaw-dropping the lighting is, and compare it to a next-gen 60 FPS game like Far Cry 6. I mean, clearly there is a pattern.
I hardly think that GT7's graphical shortcomings are because of the framerate; it probably has more to do with the cross-gen burden.
 
Who here wants to hold back progress and just make this generation 4K/60fps?

This happens every time a new generation comes out. People go on about how finally the power is there to make 60fps happen, but the truth is... it's not the hardware that dictates the frame rate... it's the developers. So let me ask you this... are you willing to hold back advancements in graphics just to have 60fps at 4K? Because right now, I feel like games could look waaaaaaay better than they currently do and I feel like we haven't even scratched the surface of what's possible with these new machines.

I see so many "Now that I've experienced 60fps, I can't go back to 30 ever again" posts and I'm just like... eh... really? It's your FIRST time playing a 60fps game? Weren't there TONS of 60fps games on PS4 and PS4 Pro? The fact is... there are trade offs to get to 60fps... ALWAYS.

The way I see it is this... if I see a game running at 60fps, I think to myself... this game isn't pushing the hardware hard enough and I know there are cut-backs to make this happen and that bothers me.

By the way, if anyone wants to take a little trip down memory lane, I found some reddit posts from 2013 where people were saying the same things about the *new* PS4. It's pretty funny because most of it didn't come true.

 
60 should be the base for all games no matter the sacrifice. I'll play in 1080p on my PC before going below 60. Smoothness equals graphics to me so even if 30 looks 50% better in screenshots, it still looks worse in motion.

And the irony is the devs add motion blur on top to make 30FPS supposedly look smoother, while in reality they're fucking up motion clarity even further...
 
... did you just pick fucking Star Fox as an example for your argument? Really?

Cherrypicking doesn't even begin to describe what you just did there.

Also, joke's on you for not picking Stunt Race FX, which is not only choppy as fuck, but runs in a window too.
Pretending that old games didn't run below 60fps is a thing that only zoomers do. The frame drops were so prominent in 2D games that we used to even sync the frames differently; there was a time in games, before frame skip, that younger people seem to have forgotten.

Frame drops didn't stop Metal Slug or any other game from pushing hard on the hardware at the expense of framerate, rather than making compromises to keep the experience smooth.
 
It's always been the other way around. Graphical fidelity crippling the frame rate. I'd be happy to see a shift in the other direction or ideally, for most games to include a performance mode and a fidelity mode.
 
Recently we saw GT7 and let's be real, it looks worse than Driveclub. Guess which one is 30 (Driveclub) and which is 60 (GT7). Halo Infinite received huge backlash for its graphics and again it was 60 FPS. Just look at RDR2's graphics, how jaw-dropping the lighting is, and compare it to a next-gen 60 FPS game like Far Cry 6. I mean, clearly there is a pattern.
Forza Horizon 5 is 60 FPS 4K on PC and Series X and looks absolutely gorgeous, and it's open world...
 
Recently we saw GT7 and let's be real, it looks worse than Driveclub. Guess which one is 30 (Driveclub) and which is 60 (GT7). Halo Infinite received huge backlash for its graphics and again it was 60 FPS. Just look at RDR2's graphics, how jaw-dropping the lighting is, and compare it to a next-gen 60 FPS game like Far Cry 6. I mean, clearly there is a pattern.
GT Sport looks better than Driveclub... Wut.
 
I agree with many of you that between 4k and 60 fps, 4k is the bigger whammy.

60 fps vs 30 fps is a difference that's easy to notice. Everything is smooth, it controls more responsively, and you don't need a vaseline blur effect to cover up choppy gameplay or fast camera turns.

Pretty sure everyone can also tell the difference between a clear 4K screen and a 1080p image of the same thing in your face. But the difference seems smaller even though there are tons more pixels in a 4K image. And when you're playing a game (especially fast-paced competitive games), it's not like you're standing there counting pixels. You're moving around asap, which 60 fps helps with every step.

A 4K image isn't even the right choice sometimes, all things being equal. If the dev doesn't adjust text boxes and font size, a 4K image can have microscopic-looking text because they didn't bother scaling up the text to make it similar to playing at 1080p.

Call of Duty games are atrocious for this. First time I played COD4 Remastered on a 4k screen, I thought there was something wrong with the UI and scoreboard.
 
It does, and I'm sure we'll be getting plenty of 30fps games as the gen moves on and devs start leaving last gen behind. With that said, I hope optional 60fps modes continue to be a thing whenever possible; in a lot of genres I'd rather have 1080p 60fps than 4K 30fps.

I think currently cross-gen is a much bigger factor than 60fps when it comes to presentation and especially scope.

Personally, I think it would be awesome if this gen focused on performance and game design instead of visuals, as I think with each gen we are getting higher diminishing returns with graphics. Even if 30fps as the standard comes back, I'd rather see devs use that extra power on physics, AI, draw distance, or new game mechanics instead of more games that play exactly the same as PS4 ones but with much better visuals.
When I think about something like Final Fantasy 7 Remake part 2, I'd much rather get a sequel with the same visuals but bigger and more seamless areas, instead of incredible next-gen graphics that once again force the game to be very linear and made up of small areas with hidden loading (squeezing through places, forced walking, etc) every 5 fucking minutes.
 
It's obvious that targeting 30fps gives you double the render time and lets you increase graphics vs 60fps, though in the case of GT7 that's not an option; I can't see a proper simcade without 60fps. Where I see a problem is 4K: GT Sport on the 4.2 TF PS4 Pro was 1800p checkerboard, so GT7 on PS5 pushes 2.88x more pixels (though yeah, CB also has a perf penalty). Imo that's too much and leaves not much room for improvements in other departments; 2160p CB or dynamic res would be a better solution.
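That 2.88x figure checks out if you count shaded pixels per frame, assuming checkerboard rendering shades roughly half of the output grid (a simplification that ignores the CB resolve cost mentioned above):

```python
# GT Sport on PS4 Pro: 1800p checkerboard, so roughly half of 3200x1800 shaded per frame.
gts_shaded_px = (3200 * 1800) / 2     # 2,880,000 pixels
# GT7 on PS5 at native 2160p.
gt7_native_px = 3840 * 2160           # 8,294,400 pixels
print(gt7_native_px / gts_shaded_px)  # ~2.88x more pixels per frame
```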
 
In the last gen this could have been true since the CPUs were very weak, but now? I don't think so. If you want to know what is holding us back, the answer is 4K resolution. Glad to see upscaling techniques being improved this gen.
 
Bruh, GT7 on PS5 is probably targeting 2160p 60fps minimum with a 120Hz 1440p performance mode. Imagine playing a racing game at 30fps.
 
What the hell is going on here? It's getting fucking ridiculous.

NO! 60 fps does NOT ruin shit! It enhances the experience if anything. And you know what? Developers are often generous enough to give you spoiled brats who don't like 60 fps for whatever reason, the option to play in a 30 fps 'fidelity' mode.

And no, GT7 does not look worse than Driveclub, lol! Get your eyes checked.
 
Recently we saw GT7 and let's be real, it looks worse than Driveclub. Guess which one is 30 (Driveclub) and which is 60 (GT7). Halo Infinite received huge backlash for its graphics and again it was 60 FPS. Just look at RDR2's graphics, how jaw-dropping the lighting is, and compare it to a next-gen 60 FPS game like Far Cry 6. I mean, clearly there is a pattern.
Everything this gen looks worse than Driveclub.
 
What the hell is going on here? It's getting fucking ridiculous.

NO! 60 fps does NOT ruin shit! It enhances the experience if anything. And you know what? Developers are often generous enough to give you spoiled brats who don't like 60 fps for whatever reason, the option to play in a 30 fps 'fidelity' mode.

And no, GT7 does not look worse than Driveclub, lol! Get your eyes checked.

Problem is the devs are building all the assets to hit 60fps at 1440p-ish, then just making the 30fps mode native 4K as an afterthought, which is a gross waste of resources. That is very, very different from building, from the ground up, a 1080-1440 temporally upsampled 30FPS title, which would have a generational leap in asset and mesh complexity (see the UE5 demos). If a 60fps mode were possible at all in such a game, it would be at 540p-720p.
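Rough pixel-throughput math behind that, assuming cost scales with pixels pushed per second (it doesn't exactly, but it shows where the render time goes):

```python
# Pixels per second for a ~1440p 60fps performance mode vs a native 4K 30fps mode.
perf_mode_px_s = 2560 * 1440 * 60      # ~221M pixels/s
fidelity_mode_px_s = 3840 * 2160 * 30  # ~249M pixels/s
# Roughly the same throughput: the 30fps mode spends its doubled frame time
# almost entirely on resolution rather than on richer per-frame work.
print(fidelity_mode_px_s / perf_mode_px_s)  # ~1.13
```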
 
60 FPS all the way.

I understand that in a slow-moving game it doesn't affect the experience much, but when I play a fast-moving game and I pan the camera around and my eyes strain from seeing the frames splitting... Yeah, 60 FPS has shown me the light!
 
I agree with many of you that between 4k and 60 fps, 4k is the bigger whammy.

60 fps vs 30 fps is a difference that's easy to notice. Everything is smooth, it controls more responsively, and you don't need a vaseline blur effect to cover up choppy gameplay or fast camera turns.

Pretty sure everyone can also tell the difference between a clear 4K screen and a 1080p image of the same thing in your face. But the difference seems smaller even though there are tons more pixels in a 4K image. And when you're playing a game (especially fast-paced competitive games), it's not like you're standing there counting pixels. You're moving around asap, which 60 fps helps with every step.

A 4K image isn't even the right choice sometimes, all things being equal. If the dev doesn't adjust text boxes and font size, a 4K image can have microscopic-looking text because they didn't bother scaling up the text to make it similar to playing at 1080p.

Call of Duty games are atrocious for this. First time I played COD4 Remastered on a 4k screen, I thought there was something wrong with the UI and scoreboard.
Idk. Launch up Bloodborne. It's super jaggy and low-res but it plays well. The 30fps there is quite responsive.
 