Is 60 FPS killing overall graphical fidelity?

I'm not going back to 30. Feels like dogshit in comparison.

And before anyone says I'm being dramatic and games have always been 30 FPS, I'll counter by saying I've lived through the N64 era and played and loved 20FPS games, too. It's 2021, and we have 120Hz screens on pretty much every device now. Anything less than 60 isn't acceptable any more.

Offer the option so the schmucks out there can have higher resolution reflections, fine.
 
Metro Exodus on PS5 has ray-traced global illumination and also plays at 60fps (higher resolution with 30fps is not even an option) and it looks amazing. It's all about optimizing. 4K is not really THAT important past 1440p rendering, so I hope more devs choose framerate over resolution going forward this generation.
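Quick back-of-the-envelope numbers on that, for anyone curious. This is just raw pixel counts; real GPU cost doesn't scale perfectly linearly with pixels, but it's a decent first approximation:

```python
# Raw pixel counts for common render resolutions.
# Pixel count is only a rough proxy for GPU cost, but it shows
# the jump 4K demands over 1440p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MPix ({pixels / base:.2f}x 1080p)")

# Output:
# 1080p: 2.1 MPix (1.00x 1080p)
# 1440p: 3.7 MPix (1.78x 1080p)
# 4K: 8.3 MPix (4.00x 1080p)
```

So 4K is roughly 2.25x the pixels of 1440p, which is a lot of GPU time to spend on a difference that's hard to see at couch distance.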
 
Not really; kinda, but not really. Higher framerates are easier to attain with a better CPU, but higher resolutions will always be a big hit on rendering. So it's more the push for UHD that's "killing" (if you want to be so dramatic) graphical fidelity.

Also, actually not really at all, since framerate is an essential part of "graphical fidelity".
 
4K is. 4K gaming is still dumb. 60fps should be a standard across all platforms, one that never goes away. It's objectively better.
 
Just get a PS5. It has the best graphics of all time and 60 FPS.

[image]
 
Old / previous gen game though. Raytracing and 60fps won't happen often, would be my guess.

Personally, I wish people would forget about raytracing; it'll be something that drags games back to 30fps.
Depends on the type of ray tracing. Global illumination like in Metro Exodus is really a generational leap IMO. Ray-traced reflections not so much, although on Spider-Man it makes a huge difference because of how many reflective materials it has.
 
It's not purely relative like that, though.

Everyone's brain effectively has an internal refresh rate which basically governs how they perceive motion. For most people it's around 60hz, and that's why 60fps tends to be the sweet spot.

Yes, going from 20 to 30 will look like a huge improvement, but that doesn't mean that same effect will exist for 60 to 90, or that anyone can ultimately get used to any fps with a sufficient adjustment period.
Of course I agree.
I remember experimenting a lot, and motion blur plays a big role in this too.
I compared Doom 2016 at 60Hz with motion blur against 240Hz without motion blur, and they looked nearly identical. Very close in LOOK; of course, feel is a different topic.
I think even at 60Hz alone, without any motion blur, there are perceptible gaps between frames, and motion blur helps "cheat" by filling them in. At 240Hz the gaps are so small, and each sharp image so close to the next, that it looks good.

I love this example:
 
Ratchet and Clank Rift Apart says I!!!!
[image]

But in all honesty it depends on what the devs are aiming for. Gaming doesn't have to be all about the pinnacle of pixel perfection.

One of the only developers whose games play well enough in 30fps modes IMO. Think it's the motion blur which makes it feel smoother. I played both Ratchet and Spider-Man on PS5 like this after trying the 60fps mode too. Now Assassin's Creed Valhalla plays like shit in 30fps mode, so go 60fps on that one.
 
Yo what the fuck is wrong with this forum? This last year is just unbelievable with all of these opinions and questions and analyses and whatever. It's gotten even worse in the last couple of months.

OT:

No.
Must've gotten banned on reddit.
 
If yes, we're just getting priorities straight at last.

You can either come out with some new TV tech that doesn't make 30fps look like juddering shit and spread it all over the world within a few years.
Or you can make all the games you can 60fps first and care about fidelity last. Graphical fidelity isn't evolution in and of itself.
 
Forza Horizon looks infinitely better than Driveclub, and it's 60fps…and open world. Driveclub's framerate made it a much worse game.

In the new Ratchet game 30 FPS mode feels broken compared to 60.
 
Yes, but that is only part of the issue. We have games now pushing for 4K and 60fps versus the 1080p and 30fps of last gen. There is only so much these consoles can do with that large a jump in both resolution and fps: 4K is four times the pixels of 1080p, and 60fps doubles the frames, so that's roughly 8x the pixel throughput per second.
 
You have twice as much time to render a frame at 30fps, so yes, games targeting 60fps will always be "held back" compared to games targeting 30fps.

A new dawn is coming for the industry with UE5 and the geometry engines, so polygons and draw distances will no longer be the hard barrier to hitting 60fps. That said, it will be ray tracing and Lumen that make 30fps the norm again on console this gen, unless there are optional modes to turn them off and target 60fps.
 
Recently we saw GT7, and let's be real, it looks worse than Driveclub. Guess which one is 30 (Driveclub) and which is 60 (GT7). Halo Infinite received huge backlash for its graphics, and again, it was 60 FPS. Just look at RDR2's graphics, how jaw-dropping the lighting is, and compare it to a next-gen 60 FPS game like Far Cry 6. I mean, clearly there is a pattern.

Is stupidity an infectious disease here on Gaf or something?
 
Nobody gives a damn about any new and better tech until it comes out. Preferring 30fps is like preferring a Nokia 3310 over the latest iPhone (an exaggeration, but the point remains). When a new and better tech comes out it naturally becomes the new standard. Hopefully this gen is the time when 60fps becomes the standard.

Btw I'm not saying one is dumb to prefer 30fps. Not at all! You do you. But the 30fps crowd has to come to terms with the fact that 60fps is now quickly becoming the standard.

It's not new tech; 60fps has been a thing since forever.

Most people weren't aware of it until it became a way to pretend to be better than others on gaming forums.

30fps is just fine for most games; it allows them to be bolder and look better.
 
Lol, good job posting an FX chip 3D game. Because those were the rule, not the exception, am I right?

Again, you talk nonsense to justify your preference for graphics over frame rate. 60 FPS (or 50 for EU) was the STANDARD for the first 4 generations of console gaming. Play with your ATARI 2600 and tell me there's a single game that doesn't move its sprites at 60fps... Or count how many NES, SNES, Genesis, etc. games run at 60 and how many at 30fps. At least 95% run at 60, most likely more.

30fps (or lower) console games were the exception until the 5th generation came along (PS1, SAT, N64) and sacrificed smooth gameplay for textured 3D at affordable prices.

Oh yeah, the smooth 60fps (probably interlaced) of Atari 2600 games!

That's the output frequency you're talking about; by that logic, the PS5 and XSX are 120fps consoles.

Go check some retro DF videos and you'll find plenty of games that were sub-60fps.
 
It's not new tech; 60fps has been a thing since forever.

Most people weren't aware of it until it became a way to pretend to be better than others on gaming forums.

30fps is just fine for most games; it allows them to be bolder and look better.

No no no, even on the Mega Drive you had Streets of Rage 1 at 30fps and Streets of Rage 2 at 60fps.

Sega Rally in the arcade ran at 60fps; the Sega Saturn version ran at 30fps.

And virtually every game benefits from it, even a game like Civilization 5. Just scrolling the map in Civilization at 60fps makes a big difference over 30, where it's choppy.
 
"Killing" it is probably dramatic, but quite literally it's 16ms to render a frame vs 33; with double the compute time per frame you can always do more, so yes.
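The exact numbers, for anyone who wants them. It's simple arithmetic: the per-frame budget is just 1000ms divided by the target framerate:

```python
# Frame time budget at common targets: budget_ms = 1000 / fps.
for fps in (30, 60, 120):
    print(f"{fps}fps -> {1000 / fps:.1f}ms per frame")

# Output:
# 30fps -> 33.3ms per frame
# 60fps -> 16.7ms per frame
# 120fps -> 8.3ms per frame
```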
 
I don't give a shit about graphical fidelity. If it's sub-60 or not a locked 60 I get headaches and motion sickness, so it's a requirement for me.
 
I'd say I do more good for the videogame industry than you and your group of framerate elitists, since I'm not telling devs that they should do what I like. So... no.
It looks like the industry is moving more towards 60fps. More and more people are going this way; deal with it.
 
Depends on the type of ray tracing. Global illumination like in Metro Exodus is really a generational leap IMO. Ray-traced reflections not so much, although on Spider-Man it makes a huge difference because of how many reflective materials it has.
Sure, I'm talking about the raytracing most people think of/want when it's mentioned.
 
Recently we saw GT7, and let's be real, it looks worse than Driveclub. Guess which one is 30 (Driveclub) and which is 60 (GT7). Halo Infinite received huge backlash for its graphics, and again, it was 60 FPS. Just look at RDR2's graphics, how jaw-dropping the lighting is, and compare it to a next-gen 60 FPS game like Far Cry 6. I mean, clearly there is a pattern.
But I've seen The Hobbit in theatres, and it was one of the high-frame-rate (48fps) movies, and it looked terrible: the pacing was way off, every movement seemed artificial. So you say it's actually good for gaming?
 
If instead of targeting 60fps you target 30fps, obviously you can push the game's visuals harder, because you have twice the time to draw each frame. It's a fact, and this is why most AAA companies that focused on high-end visuals frequently targeted 30fps in genres where high FPS isn't important.

The games we see right now on next gen are basically previous-gen games, so most of them have no issue achieving 60fps; in fact, since most of these games are cross-gen, they use FPS to differentiate the two versions and offer it as another selling point for the next-gen version. But I think that once they start to push the new-gen tech hard in next-gen-only games, many of them will go back to 30fps to push the visuals even further.

Nobody gives a damn about any new and better tech until it comes out. Preferring 30fps is like preferring a Nokia 3310 over the latest iPhone (an exaggeration, but the point remains). When a new and better tech comes out it naturally becomes the new standard. Hopefully this gen is the time when 60fps becomes the standard.
In this case the hyperbolic comparison would be the opposite: preferring your games to look like they're on a Nokia 3310 to get 60fps, instead of being OK with 30fps to take advantage of the latest iPhone's horsepower and make them way better.
 
Recently we saw GT7, and let's be real, it looks worse than Driveclub. Guess which one is 30 (Driveclub) and which is 60 (GT7). Halo Infinite received huge backlash for its graphics, and again, it was 60 FPS. Just look at RDR2's graphics, how jaw-dropping the lighting is, and compare it to a next-gen 60 FPS game like Far Cry 6. I mean, clearly there is a pattern.
60fps does hold back fidelity, obviously.
And Driveclub doesn't look much better than GT Sport.
 
If instead of targeting 60fps you target 30fps, obviously you can push the game's visuals harder, because you have twice the time to draw each frame. It's a fact, and this is why most AAA companies that focused on high-end visuals frequently targeted 30fps in genres where high FPS isn't important.
But then you lose a lot of clarity and fidelity in motion when you go 30fps.

Surely you can see the choppiness and the shimmer effect you get on objects when you pan the camera at 30fps. It's horrific.
 
You are talking about cross-gen games running at 60fps.
I would not say RDR2 looks any better than GT7, because they are vastly different types of games, but they share similar technologies and polycounts.
All the games we have seen so far are using last-gen engines and tools with some next-gen features added.
You also have to take into account that devs like R* and Naughty Dog have armies of talented artists putting in many hours to handcraft environments. So while a next-gen game made with UE5 by a smaller team will have superior lighting and geometric detail, some things may look inferior because they have less time and resources.
 