Is 60 FPS killing overall graphical fidelity?

The list I've provided is an approximation; are you sure you realize that? Furthermore, the argument isn't about whether Tony Hawk has a Metascore of 98 and OoT a Metascore of 99. The argument is that in the totality of the list, you will find plenty of games that had 60 FPS from the get-go.

Yes, a bunch of people, myself included, played and enjoyed those games, but we enjoyed them in spite of those frame rates, not because of them.
No, you didn't enjoy them despite the framerates. At the time, you didn't know or care about that. The only thing that mattered was if the game was fun. And that's what matters to most people, reviewers included. 30fps provides fun gameplay and nice graphics. A nice balance and compromise. Just let it go. 30fps will always be here and will always be enough for the vast majority of people.
 
If you can't see the difference you need an appointment with your eye doctor ASAP.


EVERYBODY can see the difference between 30 and 60 fps the second they've experienced/seen it once.

I didn't say I can't see the difference, I said the vast majority won't notice the difference. And this is a fact.
Read the whole thread.
 
That is bullshit... framerate has nothing to do with visual fidelity.
It is about response time.
Totally disagree. I like the look of the graphics when a game runs at 60fps. It has nothing to do with response time; I cannot stand the stuttering and shimmering you get when you move the camera around at 30 frames, it's terrible. I don't care about response times.

60fps does sell consoles because I would not have bought my PS5 if games were running at 30fps. We only have to look at this forum to see how many people want it and some can't accept it.
 
What, never?!

Why are some people so insistent about this? It's like some sacred belief or something. At some point the trade off is going to stop being so stark and 60fps will become standard. It's already happening to some extent.

Also, caring about frame rates IS niche, yes, but so is caring about resolution or ray tracing or ultra settings. It's all niche stuff that casual gamers don't even know exists.

Buuuut, otoh, while casual games generally look shit, they also do happen to usually be 60fps... so maybe they do care just a little bit, even if they don't realise it.
Because they won't, as simple as that. Almost all, if not all, newly released games have performance and quality modes, and that will become standard, but 30fps will never cease to exist. Make your peace with this.
 
The middle ground on consoles is have options. That's what we have been given. Let's just roll with that and leave it.
 
No, you didn't enjoy them despite the framerates. At the time, you didn't know or care about that. The only thing that mattered was if the game was fun. And that's what matters to most people, reviewers included. 30fps provides fun gameplay and nice graphics. A nice balance and compromise. Just let it go. 30fps will always be here and will always be enough for the vast majority of people.

I have my fun with sub-30 FPS games. It's you that argues that 30 FPS gives you a better game experience. lol
 
I didn't say I can't see the difference, I said the vast majority won't notice the difference. And this is a fact.
Your opinion is not a fact....
The "notice" part as in input lag is just as noticeable as the "see/fluidity" part is visible. 30 to 60 fps makes a very perceivable and visible difference in anything faster than a chess game.
Just because the vast majority of the population doesn`t game and doesn´t know what fps even means doesn´t mean that they wouldn´t be able to easily differentiate between 30 and 60 fps once they`ve been shown the difference once or twice.
Same as people suddenly loving the 120hz smartphone screens because "it`s so fluid"......you don´t miss something you`ve never had, but once you had it going back just sucks.
 
Totally disagree. I like the look of the graphics when a game runs at 60fps. It has nothing to do with response time; I cannot stand the stuttering and shimmering you get when you move the camera around at 30 frames, it's terrible. I don't care about response times.

60fps does sell consoles because I would not have bought my PS5 if games were running at 30fps. We only have to look at this forum to see how many people want it and some can't accept it.
Give it some time and most games will be 30fps only. If you bought a PS5 only for 60fps, then you're going to be disappointed.

GPU is way too weak for 60fps throughout the entire generation. If you wanna skip a game that is 30fps only, then that's your choice. I skipped so many good games on PS3 because of screen tearing. I still haven't played the first Uncharted and I don't care one bit.
 
Because they won't, as simple as that. Almost all, if not all, newly released games have performance and quality modes, and that will become standard, but 30fps will never cease to exist. Make your peace with this.

We've already moved much closer to 60fps than we've been since the SNES days.

Games like Deathloop and Ghostwire would never have been 60fps in the past. A big AAA game like Suicide Squad would never have been 60fps. Games having performance modes as standard pretty much across the board is already a significant move towards 60fps as a standard.

There will almost certainly come a point where graphics advance to a point where the sacrifice just isn't worth it anymore. Resolutions, for example, are well into diminishing returns territory at 4k. And as more games are 60fps, more and more people will become aware of the difference and start to demand it as a feature. There will be a critical mass of awareness at some point.

It's so silly to just broadly declare that some trade off will always exist. Not only are 60fps games way more common now, but the old trade off of 20fps is non-existent, as is the subsequent trade off of inconsistent 25 to 30fps.
 
Recently we saw GT7, and let's be real, it looks worse than Driveclub. Guess which one is 30 (Driveclub) and which is 60 (GT7). Halo Infinite received huge backlash for its graphics, and again it was 60 FPS. Just look at RDR2's graphics, how jaw-dropping the lighting is, and compare it to a next-gen 60 FPS game like Far Cry 6. I mean, clearly there is a pattern.
Well you might want to go back and play Driveclub again. I did a few weeks ago and the 30fps is pretty jarring now that most racing games are 60. Plus GT Sport looks better than Driveclub.
 
Your opinion is not a fact....
The "notice" part as in input lag is just as noticeable as the "see/fluidity" part is visible. 30 to 60 fps makes a very perceivable and visible difference in anything faster than a chess game.
Just because the vast majority of the population doesn`t game and doesn´t know what fps even means doesn´t mean that they wouldn´t be able to easily differentiate between 30 and 60 fps once they`ve been shown the difference once or twice.
Same as people suddenly loving the 120hz smartphone screens because "it`s so fluid"......you don´t miss something you`ve never had, but once you had it going back just sucks.
It's not an opinion, it's a fact. The vast majority of people won't notice the difference.
 
We've already moved much closer to 60fps than we've been since the SNES days.

Games like Deathloop and Ghostwire would never have been 60fps in the past. A big AAA game like Suicide Squad would never have been 60fps. Games having performance modes as standard pretty much across the board is already a significant move towards 60fps as a standard.

There will almost certainly come a point where graphics advance to a point where the sacrifice just isn't worth it anymore. Resolutions, for example, are well into diminishing returns territory at 4k. And as more games are 60fps, more and more people will become aware of the difference and start to demand it as a feature. There will be a critical mass of awareness at some point.

It's so silly to just broadly declare that some trade off will always exist. Not only are 60fps games way more common now, but the old trade off of 20fps is non-existent, as is the subsequent trade off of inconsistent 25 to 30fps.
If TV and monitor manufacturers just avoided making higher res TV/Monitors, then we could still play at 1080p and it would look good.

However, 1080p content looks like ass on a 4K screen, but on the flipside, 4K looks stunning on a 4K screen.

I think 4K is the final resolution we need. No need to go higher, and definitely no need to go lower.

8K is a complete waste. You have to sit so freakin' close to the screen to even tell it's 8K, so it's not necessary and it's a waste of money and development.
 
It's either 40 fps (aka Ratchet & Clank mode) or 60 fps. Anything under and I ain't paying for it.

To OP: Of course it's not killing anything, modern games are more than capable of running at 1440p+ resolutions and 60 fps if they are optimized correctly. Problem is: devs don't want it. It's hard work and they tend to settle easily whilst leaving performance on the table.
 
Maybe it's the resolution, RT and other things. I remember the talk from devs about the PS2 being one of the hardest systems to develop for, and yet within around a year it had a bunch of amazing-looking games running at 60fps. Previous gens have had games that looked great and ran at 60fps, and this shouldn't change unless the hardware is weak or they're wasting the power on things that aren't worth their cost.
 
If TV and monitor manufacturers just avoided making higher res TV/Monitors, then we could still play at 1080p and it would look good.

However, 1080p content looks like ass on a 4K screen, but on the flipside, 4K looks stunning on a 4K screen.

I think 4K is the final resolution we need. No need to go higher, and definitely no need to go lower.

8K is a complete waste. You have to sit so freakin' close to the screen to even tell it's 8K, so it's not necessary and it's a waste of money and development.

8k just seems like a completely ludicrous aspiration. I've never actually seen it, so MAYBE there's more to it than just the normal size to distance equation, but I seriously doubt that.

If it's the same trade off as 4k then you'd need something like a 150" screen in a typical lounge to really get the full benefit. It's a crazy idea. Who the fuck could even afford that, let alone accommodate it?
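For anyone curious, here's a rough sketch of that size-to-distance point, assuming the commonly quoted ~60 pixels-per-degree limit for 20/20 vision; the screen sizes and the 3 m couch distance are just illustrative assumptions, not measurements:

```python
import math

# Rough angular pixel density for a flat 16:9 screen viewed head-on.
def pixels_per_degree(diagonal_inches, horizontal_px, distance_m, aspect=16 / 9):
    width_m = diagonal_inches * 0.0254 * aspect / math.hypot(aspect, 1)
    fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return horizontal_px / fov_deg

# ~60 px/deg is a common estimate of 20/20 acuity; much beyond that,
# extra resolution is hard to actually resolve.
for size in (65, 85, 150):
    p4k = pixels_per_degree(size, 3840, 3.0)
    p8k = pixels_per_degree(size, 7680, 3.0)
    print(f'{size}" at 3 m: 4K ~ {p4k:.0f} px/deg, 8K ~ {p8k:.0f} px/deg')
```

By that rough math, even a 150" screen at about 3 m only just brings 4K down to the acuity limit, and 8K doubles straight past it, which is basically the point being made above.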
 
Maybe it's the resolution, RT and other things. I remember the talk from devs about the PS2 being one of the hardest systems to develop for, and yet within around a year it had a bunch of amazing-looking games running at 60fps. Previous gens have had games that looked great and ran at 60fps, and this shouldn't change unless the hardware is weak or they're wasting the power on things that aren't worth their cost.
Doesn't matter if the hardware is powerful or weak.

You can ALWAYS squeeze more quality out of a 30fps presentation because you literally have twice the rendering time per frame. With that budget, in 30fps, you can go even bigger than you could at 60fps.

This will ALWAYS be the case.

Doesn't matter if the PS5 had an RTX 3090 level GPU, you can still render more stuff at 30fps.

So those amazing looking games on PS2 that ran at 60fps, could've looked even better had they increased the frame times to 33ms.
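To put rough numbers on that frame-budget point (nothing engine-specific, just frame-time arithmetic):

```python
# Per-frame render budget at common frame rate targets.
for fps in (20, 30, 40, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")

# A 30 fps target leaves ~33.3 ms per frame vs ~16.7 ms at 60 fps:
# roughly double the budget per frame, no matter how fast the hardware is.
```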
 
Tree self shadowing in GT7 looks terrible at times, but I doubt it was caused by the 60fps target.
That is a good example.
Tree and self shadowing could be on another level in GT7 at 30fps.
Even RT would not have been out of the question for gameplay.
 
Doesn't matter if the hardware is powerful or weak.

You can ALWAYS squeeze more quality out of a 30fps presentation because you literally have twice the rendering time per frame. With that budget, in 30fps, you can go even bigger than you could at 60fps.

This will ALWAYS be the case.

Doesn't matter if the PS5 had an RTX 3090 level GPU, you can still render more stuff at 30fps.

So those amazing looking games on PS2 that ran at 60fps, could've looked even better had they increased the frame times to 33ms.
This is the problem: since screenshots sell and consoles try to justify their existence vs PC, 30 fps is often the result as the game tries to look on par with a good PC, only the game will run at 30 on console vs 60 on PC.

I personally would like ALL games to be designed with 60 fps as the target on their intended hardware, and then have the graphics and other aspects designed around that target. I will always choose and love a game built with that in mind rather than as an afterthought.

Most games are in motion in some way or another, so fluidity is imperative; 60 is the lowest ideal fps for it to FEEL good. DS Remake is an EXCELLENT example of what one can achieve when a game is built around 60 fps.
 
This is the problem: since screenshots sell and consoles try to justify their existence vs PC, 30 fps is often the result as the game tries to look on par with a good PC, only the game will run at 30 on console vs 60 on PC.

I personally would like ALL games to be designed with 60 fps as the target on their intended hardware, and then have the graphics and other aspects designed around that target. I will always choose and love a game built with that in mind rather than as an afterthought.

Most games are in motion in some way or another, so fluidity is imperative; 60 is the lowest ideal fps for it to FEEL good. DS Remake is an EXCELLENT example of what one can achieve when a game is built around 60 fps.
We aren't even close to seeing what this generation is capable of and that's mostly due to 60fps.

It's too bad. Usually by this time, we'd have some jaw dropping games, but sadly if devs continue to push for 60fps, the games are just going to look like slightly better versions of PS4 games.
 
Source? They may not care, but in one way (slideshow) or another (input lag) they will notice.
Another over-exaggerated thing. You would be hard-pressed to notice +16ms of input lag when most people don't even know the input lag their TVs are adding, or the massive lag their online games have even with the best possible ping.
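For context, here's where that +16ms figure comes from; the TV and ping numbers below are just typical ballpark assumptions for comparison, not measurements:

```python
# Extra frame period going from 60 fps down to 30 fps.
extra_ms = 1000 / 30 - 1000 / 60
print(f"Added frame time at 30 fps vs 60 fps: ~{extra_ms:.1f} ms")

# Rough, assumed ballpark figures for scale (they vary wildly by setup):
# - TV in game mode:       roughly 10-20 ms of display lag
# - TV outside game mode:  often 50 ms or more
# - Online play:           commonly 20-60 ms of ping on top
```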
 
We aren't even close to seeing what this generation is capable of and that's mostly due to 60fps.

It's too bad. Usually by this time, we'd have some jaw dropping games, but sadly if devs continue to push for 60fps, the games are just going to look like slightly better versions of PS4 games.
I'd rather see what this generation can do with 60 fps as the baseline, and then continue on forward with 60 as its minimum target.

With the statement "not even close to seeing what this generation is capable of (at 30 fps)"... why not move the goalposts to 24, then 20, and then 15? You could get freakin' amazing graphics then; you could in theory get games that rival the PS6 in fidelity, but where does one draw the line?

Yes, 30 IS playable, but only barely so. 60 is where things get comfortable and 120+ is luxury. I don't want to PLAY games at 30 as it feels sluggish, and it's hard to make out details in the environment when rotating the camera; you have to constantly stop the camera to make out those details and that sux. I even get a bit nauseated when rotating 30 fps cameras that apply heavy motion blur to cover up for the low fps.

I guess there will never be a consensus on this issue. Some will always be ready to sacrifice motion clarity, animation fluidity and responsive controls for maximum graphical fidelity, and then there will be those of us who want a minimum of 60 and games made with that in mind first.
 
Another over-exaggerated thing. You would be hard-pressed to notice +16ms of input lag when most people don't even know the input lag their TVs are adding, or the massive lag their online games have even with the best possible ping.
I play fighting games on the regular and I can notice a difference of around 5-8 ms. There is a noticeable difference between a TV input lag of 10ms vs 20ms.
 
Your opinion is not a fact. And thinking your opinion is a fact doesn't make it a fact either.
Still a fact, sorry you don't like it.
Source? They may not care, but in one way (slideshow) or another (input lag) they will notice.
Still a fact. People don't care.
They also don't notice input lag unless it's a very long lag.
They won't notice lag difference between 30 and 60fps.
Another fact.
 
I play fighting games on the regular and I can notice a difference of around 5-8 ms. There is a noticeable difference between a TV input lag of 10ms vs 20ms.
You're NOT most people.
The vast majority of people don't play fighting games where every single frame counts. You're trained for this, the vast majority of people aren't.
 
I'd rather see what this generation can do with 60 fps as the baseline, and then continue on forward with 60 as its minimum target.

With the statement "not even close to seeing what this generation is capable of (at 30 fps)"... why not move the goalposts to 24, then 20, and then 15? You could get freakin' amazing graphics then; you could in theory get games that rival the PS6 in fidelity, but where does one draw the line?

Yes, 30 IS playable, but only barely so. 60 is where things get comfortable and 120+ is luxury. I don't want to PLAY games at 30 as it feels sluggish, and it's hard to make out details in the environment when rotating the camera; you have to constantly stop the camera to make out those details and that sux. I even get a bit nauseated when rotating 30 fps cameras that apply heavy motion blur to cover up for the low fps.

I guess there will never be a consensus on this issue. Some will always be ready to sacrifice motion clarity, animation fluidity and responsive controls for maximum graphical fidelity, and then there will be those of us who want a minimum of 60 and games made with that in mind first.
If 30fps IS playable but barely, then why do you think 20 and 15fps would work?

You're literally contradicting yourself hahaha
 
I play fighting games on the regular and I can notice a difference of around 5-8 ms. There is a noticeable difference between a TV input lag of 10ms vs 20ms.
Same with me man. If I try playing Rocket League on my friend's TV, I can tell the input lag is worse because it's a very precision-based game. When I play on my monitor, it's very responsive and I can make minute adjustments just fine. On his TV, I constantly miss the ball because my muscle memory is telling me to press the button but his TV doesn't respond in time.
 
You're NOT most people.
The vast majority of people don't play fighting games where every single frame counts. You're trained for this, the vast majority of people aren't.
That is true, I admit, but most people will learn to appreciate things they didn't know they were missing. There are quite a few younger console-exclusive players that are finally getting a taste of what 60 fps brings to the gaming experience.
 
Another over-exaggerated thing. You would be hard-pressed to notice +16ms of input lag when most people don't even know the input lag their TVs are adding, or the massive lag their online games have even with the best possible ping.
Hard? When I lock a game like R6 Siege at 30fps or less I almost vomit. My mouse doesn't lie to me.
 
If 30fps IS playable but barely, then why do you think 20 and 15fps would work?

You're literally contradicting yourself hahaha
I'm stating what I perceive as barely playable. I'm just saying that for people more than happy with 30 fps, hungry for max fidelity, why not lower the fps even more? At 20fps, which some old-school titles played at, you could get some ridiculous graphics on this gen, so where does one draw the line?
 
Give it some time and most games will be 30fps only. If you bought a PS5 only for 60fps, then you're going to be disappointed.

GPU is way too weak for 60fps throughout the entire generation. If you wanna skip a game that is 30fps only, then that's your choice. I skipped so many good games on PS3 because of screen tearing. I still haven't played the first Uncharted and I don't care one bit.
Well so far I've been more than happy; nearly every game has had a 60fps mode. If you look back on this forum before this generation launched, the expectation was that it would be like the generation before and nearly all games would be 30fps, which hasn't happened.

4k/30fps is more demanding than 1080p/60fps. Let's not talk about how demanding Ray tracing is just for some stupid reflections.
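As a very rough sanity check of that claim (raw pixel counts only, ignoring upscaling, VRS and everything else real games do):

```python
# Pixels shaded per second for a few common resolution/frame rate combos.
modes = {
    "1080p @ 60": (1920 * 1080, 60),
    "4K    @ 30": (3840 * 2160, 30),
    "4K    @ 60": (3840 * 2160, 60),
}
for name, (pixels, fps) in modes.items():
    print(f"{name}: {pixels * fps / 1e6:6.0f} million pixels/s")

# 4K at 30 pushes about twice the pixels per second of 1080p at 60,
# and 4K at 60 about four times, before ray tracing even enters the picture.
```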
 
I'm stating what I perceive as barely playable. I'm just saying that for people more than happy with 30 fps, hungry for max fidelity, why not lower the fps even more? At 20fps, which some old-school titles played at, you could get some ridiculous graphics on this gen, so where does one draw the line?
Because if 30fps is barely playable, then 20fps would be far too low.

30fps is a good mix between frame rate and fidelity. That's why it's used. Otherwise, we would see some devs drop down to 20fps for the ultimate eye candy. As far as I know, there are no 20fps games (that were capped that way on purpose)
 
I guess it's a fact that 60fps games are better since 60fps is a better gameplay experience than 30fps and most people agree. Facts are facts. Double stamp that. No erasies.
 
I'm stating what I perceive as barely playable. I'm just saying that for people more than happy with 30 fps, hungry for max fidelity, why not lower the fps even more? At 20fps, which some old-school titles played at, you could get some ridiculous graphics on this gen, so where does one draw the line?
At whatever framerate devs make the console players swallow.
 
At whatever framerate devs make the console players swallow.
That is why I'm so happy to see the abundance of performance modes this generation, but things would be even more amazing if there were only 60 fps and all resources were put into maximizing the games' fidelity within those constraints. DS Remake and GT7 are great examples of how good things can look; heck, even Returnal is beautiful despite its soft appearance.
 
Because if 30fps is barely playable, then 20fps would be far too low.

30fps is a good mix between frame rate and fidelity. That's why it's used. Otherwise, we would see some devs drop down to 20fps for the ultimate eye candy. As far as I know, there are no 20fps games (that were capped that way on purpose)

Devs should target 20fps to pursue that sweet 99 score. /s
 
Well so far I've been more than happy; nearly every game has had a 60fps mode. If you look back on this forum before this generation launched, the expectation was that it would be like the generation before and nearly all games would be 30fps, which hasn't happened.

4k/30fps is more demanding than 1080p/60fps. Let's not talk about how demanding Ray tracing is just for some stupid reflections.
Yep, 4k30 is more demanding than 1080p60, but 1080p60 looks shit on 4k TVs.
 

Devs should target 20fps to pursue that sweet 99 score. /s
If devs want to make a 20fps game and that game is good, I will be glad to play it.
Framerate will never be a defining factor in whether or not I play a good game.

I played that Zelda and it is one of the best gaming experiences I've had.
BTW, GoldenEye 007 runs at up to 20fps too... and people (me included) have a blast playing it.

Edit - Sorry, I forgot GoldenEye can go as low as 10fps in gameplay.
 