
Is 60 FPS killing overall graphical fidelity?

My cousin plays these games, and he screams to everybody in his school classroom about how Fortnite looks way better on PS5 than on PS4.
He wants a PS5 just because of that now.

Ask him about the framerate? lol
He is fine with whatever framerate the game runs at.
Like I said, he plays Fortnite (not a graphics showcase) on newer hardware (PS5). He would play on PS4 if the PS5 didn't exist, no?
 
What defines "normal"? Because I'm sure most gamers play at 30fps instead of 60fps on consoles.
That is the normal.

Now on PC I rarely see anybody playing at 30fps... always 60fps or higher... so on PC, 60fps is normal.
I feel that thanks to the Zen 2 CPUs we will get a 60fps option in most games this gen on PS5/XSX. On the previous gen it was very hard thanks to the really weak Jaguar cores, but luckily AMD could provide console makers with much more CPU power this time around. So yup, 60fps will hopefully be very common; at least that's the trend we are seeing. But like the OP guessed, 60fps won't come at zero price: cuts to resolution and very probably lowered settings will be commonplace too :)
 
You can have a photorealistic game with 8K textures and incredible geometry detail, but if it's 30fps, once you push that right stick left or right, the entire image turns into puke. No thanks.
I don't know how many hundreds of hours I have in Driveclub and all the Sony AAAs, and not once did I get a headache.

If you 'puke', it's your fault, and only yours.
Last gen had 60fps games in specific genres like FPS and fighting games. It was really just the usual suspects. I think people are less likely to notice the difference when those games are limited to specific genres: they'll probably tend to just attribute the smoothness of a COD to it being a shooter rather than the frame rate per se.

Virtually all big cinematic games were 30fps last gen. People didn't complain because they just accepted that was the situation - whether you knew about the weak CPUs or not, it was easy to infer that 60fps just wasn't a viable option.

This generation is different because the decent CPUs actually make it a relatively easy concession just by lowering resolution and settings. It's basically the first time since we moved to 3d visuals that this has been the case.



I would actually love to do a blind test of non-gamers to see what they think looks better. I strongly suspect it's a fallacy that the preference for fidelity at the expense of everything else is some kind of casual position. It looks much more like a graphics nerd's prerogative to me.

Obviously there's a matter of degree: I'm sure a casual player would prefer, say, Uncharted 4 at 30fps over Uncharted 3 at 60fps. But Uncharted 4 at 1440p60 vs 4K30? I bet the vast majority would think 60 looks better.
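For a rough sense of why 1440p60 vs 4K30 is close to a wash in raw pixel throughput, here's a back-of-the-envelope sketch. Illustrative only: real GPU cost doesn't scale purely with pixel count, since geometry, CPU work, and fixed per-frame costs are resolution-independent.

```python
# Pixel budget comparison for the 1440p60 vs 4K30 trade-off.

def pixels_per_frame(width, height):
    return width * height

four_k = pixels_per_frame(3840, 2160)  # 8,294,400 pixels
qhd = pixels_per_frame(2560, 1440)     # 3,686,400 pixels

print(f"A 4K frame has {four_k / qhd:.2f}x the pixels of a 1440p frame")  # 2.25x
print(f"4K30 shades    {four_k * 30:>11,} pixels/second")  # 248,832,000
print(f"1440p60 shades {qhd * 60:>11,} pixels/second")     # 221,184,000
```

In other words, dropping from 4K to 1440p cuts the per-frame pixel count by more than half, which is roughly what's needed to pay for doubling the framerate.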

So I'm not sure why 60fps should be the preserve of "PC Master Race". Who's to say that 4k and RT and ultra settings aren't the real nerdy features that people should have to buy a $3k PC to indulge in?



Doesn't the same apply to locked 30fps modes??
Adding to this reasonable post, and for truth's sake, I would say casual people should get enough exposure time to both 1440p@60 and 4K@30 on a good 4K TV; after all, people have different tastes.

One thing also: 4K with temporal anti-aliasing looks noticeably worse than 4K with MSAA.
Having a 4k display for blurry games kinda takes away the point of it.
 
If you 'puke', it's your fault, and only yours.
[image attachment]
 
Fidelity mode is no doubt awesome for screenshots and shares to YouTube. The ability to zoom in, see each individual blade of grass, and count each pixel is great for online discourse. But actually opting to play the game at 30fps over 60fps is very questionable. Hyperbole or not, you are getting a worse gaming experience at 30.
Subjective
 
Sad and petty? I thought it was funny. 🤷‍♂️

And I will be building a PC. The 4060 Ti is the goal once it releases. I'm less inclined right now with everything being 60fps on current-gen systems. If things go back to 30fps, I'll buy a prebuilt immediately if I can't get my hands on a GPU.
Sad and petty. Some people have different tastes than yours and it doesn't make you any better than anyone else.
 
The problem is that PS5 and XSX set expectations too high by boosting last-gen games to 60 FPS. It's going to be pretty jarring going from a 60 FPS Horizon on PS5 to a 30 FPS Horizon 2. Does Horizon 2 look miles better? Sure, but the gameplay will feel sluggish until you readjust.

There are a ton of console gamers who have 'seen the light' as to why 60 FPS is superior but now will have to settle back into 30 FPS as true next gen games start coming out.
Most people barely understand what this is or even notice it, so I really doubt many have "seen the light" or that they even care about it.
 
In PC games you can turn it off in a lot of cases. On consoles, absolutely not. Very few games offer the option to turn it off. Performance mode typically runs at 60 fps or higher with far less aggressive motion blur.
The blur is still there though. Take a screenshot of spinning around at 60fps and show me what it looks like, because how can you see the details if everything is smeared?
 
I mean, if you want your games to look like The Matrix UE5 demo, then you will eventually have to go to 30fps at some point. But for now 60fps is fine, and this can't be compared to last gen, where the Jaguar CPU caused a bottleneck that prevented even the Xbox One X from achieving 60fps.
right, but the gains from The Matrix vs. say Horizon FW (@60) are absolutely not worth dropping to 30fps for.

marginally better graphics vs. literally double the frames. never going to be worth it.
 
I don't know how many hundreds of hours I have in Driveclub and all the Sony AAAs, and not once did I get a headache.

If you 'puke', it's your fault, and only yours.

Adding to this reasonable post, and for truth's sake, I would say casual people should get enough exposure time to both 1440p@60 and 4K@30 on a good 4K TV; after all, people have different tastes.

One thing also: 4K with temporal anti-aliasing looks noticeably worse than 4K with MSAA.
Having a 4k display for blurry games kinda takes away the point of it.
The bolded is hyperbole and I never believe these statements. Funny how the same people watch movies at 24fps and have no headache. Or puke.
 
I would actually love to do a blind test of non-gamers to see what they think looks better. I strongly suspect it's a fallacy that the preference for fidelity at the expense of everything else is some kind of casual position. It looks much more like a graphics nerd's prerogative to me.
There is this one focused on first-person shooters.


He focuses on the lower framerates... like 2.5, 3, 5, 7, and goes up to 60fps.
There are different curves for shooting and movement... in simple terms, movement shows a smaller difference after 15fps... it gives scores, so 15fps starts at 0.75 (75%) and 60fps is 1 (100%)... 30fps stays around 85-90%.
Things change in shooting, where the curve is more equalized... 15fps received a 50% score, while 30fps received 75% and 60fps received 100%.

For him, framerate affects shooting more than movement.
He concluded that the unplayable framerates are the very low ones, like 3fps or 7fps.
BTW, he asserts that the previous streaming test found 3-7fps acceptable because in a first-person shooter the framerate affects the player's performance (something that doesn't happen when watching a stream).

I would love to have more tests like that... of course, with a mix of graphics and framerate to be chosen by the user.
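The score curves described above can be sketched roughly like this. Assumptions: the exact curve shapes aren't given in the post, so this uses linear interpolation between the reported data points, and 0.875 stands in for the "around 85-90%" movement score at 30fps.

```python
# Rough, illustrative sketch of the reported player-performance curves.
MOVEMENT = [(15, 0.75), (30, 0.875), (60, 1.0)]
SHOOTING = [(15, 0.50), (30, 0.75), (60, 1.0)]

def score(fps, curve):
    """Interpolate a 0..1 performance score from the reported data points."""
    if fps <= curve[0][0]:
        return curve[0][1]
    for (x0, y0), (x1, y1) in zip(curve, curve[1:]):
        if fps <= x1:
            return y0 + (y1 - y0) * (fps - x0) / (x1 - x0)
    return curve[-1][1]  # flat beyond 60fps

for fps in (15, 30, 45, 60):
    print(f"{fps:2d}fps  movement={score(fps, MOVEMENT):.3f}  "
          f"shooting={score(fps, SHOOTING):.3f}")
```

The shooting curve climbing more steeply than the movement curve is exactly the "framerate affects shooting more than movement" conclusion.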
 
There is this one focused on first-person shooters.


He focuses on the lower framerates... like 2.5, 3, 5, 7, and goes up to 60fps.
There are different curves for shooting and movement... in simple terms, movement shows a smaller difference after 15fps... it gives scores, so 15fps starts at 0.75 (75%) and 60fps is 1 (100%)... 30fps stays around 85%.
Things change in shooting, where the curve is more equalized... 15fps received a 50% score, while 30fps received 75% and 60fps received 100%.

For him, framerate affects shooting more than movement.
He concluded that the unplayable framerates are the very low ones, like 3fps or 7fps.
BTW, he asserts that the previous streaming test found 3-7fps acceptable because in a first-person shooter the framerate affects the player's performance (something that doesn't happen when watching a stream).

I would love to have more tests like that.
You can easily test this yourself using a mouse (the most precise "controller"): aim in shooters locked to 30 and then 60+, and see the difference in sluggishness/input lag.
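A quick way to see why that mouse test feels so different: the frame time itself bounds how stale your input is. A minimal sketch; it ignores the rest of the input pipeline (USB polling, game logic, display lag), which is roughly constant across framerates.

```python
# Frame time and a rough worst-case input-latency contribution per framerate.

def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 120):
    ft = frame_time_ms(fps)
    # Worst case: input arrives just after a frame starts, so it waits
    # almost a full frame to be sampled, plus the frame that renders it.
    print(f"{fps:3d}fps: {ft:5.1f} ms/frame, up to ~{2 * ft:5.1f} ms of added latency")
```

At 30fps each frame takes ~33.3 ms versus ~16.7 ms at 60fps, so the framerate-dependent part of the latency roughly doubles, which is exactly the sluggishness a mouse makes obvious.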
 
You can easily test this yourself using a mouse (the most precise "controller"): aim in shooters locked to 30 and then 60+, and see the difference in sluggishness.
Yeah, I can sense the response-time difference when shooting.
That is why I advocate for everybody having the same framerate in PvP, for a level playing field on consoles.
 
30 fps - acceptable
60 fps - great
120 fps - perfect

I've played quite a few games with frame rates even below 30 that were still good or very good, but I haven't played a single one that wouldn't have been better with 60 fps.
 
The blur is still there though. Take a screenshot of spinning around at 60fps and show me what it looks like, because how can you see the details if everything is smeared?
[screenshot attachments]


Still waiting to see someone take a spinning screenshot of Uncharted 4 with performance mode and motion blur on.
If you needed any more evidence that you are absolutely clueless about this subject, then let this be it.
 
I'm concerned that I hurt your feelings and am offering a tissue.

If I wanted to cry, I'd play my games in 30fps modes.
Hm, it seems you're still crying. Oh, well, have some tissues. Don't let 30fps modes trigger you this much, it's not healthy.
 
Some real lames in here acting like 30fps is some blurry mess. Practically all of the best games of all time were 30fps when they came out. Were they unplayable? Y'all need eye surgery.

What a joke.

30FPS rules. Visual fidelity is infinitely more important for pushing the medium forward than fucking framerate.
 
Imagine defending 30fps. What world are you guys living in?

30fps is dog shit and always was; it was a necessary evil back in the day to get games to run at any capable level.

Now it's just shit optimisation. The Horizon game, for example, could easily be running at 60fps if they actually gave two shits about it.
 
Some real lames in here acting like 30fps is some blurry mess. Practically all of the best games of all time were 30fps when they came out. Were they unplayable? Y'all need eye surgery.

What a joke.

30FPS rules. Visual fidelity is infinitely more important for pushing the medium forward than fucking framerate.
[Spider-Man "lol" reaction GIF]
 
Imagine defending 30fps. What world are you guys living in?

30fps is dog shit and always was; it was a necessary evil back in the day to get games to run at any capable level.

Now it's just shit optimisation. The Horizon game, for example, could easily be running at 60fps if they actually gave two shits about it.
Yup. Instead of giving us this option, they should have tried everything they could to get the game to 60 FPS and salvaged the resolution and overall presentation in the process. Instead they tossed in a 60 FPS mode with shitty checkerboard rendering and called it a day.
 
Some real lames in here acting like 30fps is some blurry mess. Practically all of the best games of all time were 30fps when they came out. Were they unplayable? Y'all need eye surgery.

What a joke.

30FPS rules. Visual fidelity is infinitely more important for pushing the medium forward than fucking framerate.
And every single one of these games is much better at 60+.

PS: 15fps is even better. 7.5 even more...
 
And every single one of these games is much better at 60+.

PS: 15fps is even better. 7.5 even more...
Visuals are more important than framerate. Always have been, always will be. This is why developers choose to continue putting out games at 30: it's the best balance between fidelity and smoothness.
Horizon FW looks significantly worse at 60. That trend will only get more noticeable as the gen moves forward. Thank fuck these developers don't listen to framerate nerds lmao.
 
Why do you think graphics/visual fidelity have historically been the biggest selling point for next generation consoles over everything else? Huh??? Riddle me that.
I mean, lets take this list for example:

it's only approximate, but many games on there are 60 FPS.

it's completely fine to take graphical fidelity over frame rate, but it's not like the low frame rate will enhance your game :messenger_tears_of_joy:.
Breath of the Wild is a fantastic game, but the best place to play it isn't on the Switch.
 
Imagine defending 30 fps. what world are you guys living on.

30 fps is dog shit always was, it was a nessesary evil back in the day to get games to run at any capable level.

Now its just shit optimisation. Horizon game for example could easily be running at 60 fps if they actually give 2 shits about it.
What was necessary about 30fps back in the day? It wasn't necessary, just as it isn't necessary today. They could have easily cut down the graphics to reach 60fps. They don't, because you get better graphics at 30fps. Some people are happy with 30fps and better graphics. Judging by the fact that most big games were 30fps, I'd say most preferred better graphics at 30fps vs lower graphics at 60fps.

Horizon Forbidden West has a 60fps mode BTW, if that is what you mean by "Horizon game". The choice nowadays makes the differences between the two modes smaller.
 
I mean, lets take this list for example:

it's only approximate, but many games on there are 60 FPS.

it's completely fine to take graphical fidelity over frame rate, but it's not like the low frame rate will enhance your game :messenger_tears_of_joy:.
Breath of the Wild is a fantastic game, but the best place to play it isn't on the Switch.
that didn't answer my question

Why do you think graphics/visual fidelity have historically been the biggest selling point for next generation consoles over everything else?
:messenger_mr_smith_who_are_you_going_to_call:
 
Visuals are more important than framerate. Always have been, always will be. This is why developers choose to continue putting out games at 30: it's the best balance between fidelity and smoothness.
Horizon FW looks significantly worse at 60. That trend will only get more noticeable as the gen moves forward. Thank fuck these developers don't listen to framerate nerds lmao.
Play a game with PS1 visuals at 120fps and Flight Simulator 2020 at 0 fps and come back here saying visuals are more important than framerate again.
 
I mean, lets take this list for example:

it's only approximate, but many games on there are 60 FPS.

it's completely fine to take graphical fidelity over frame rate, but it's not like the low frame rate will enhance your game :messenger_tears_of_joy:.
Breath of the Wild is a fantastic game, but the best place to play it isn't on the Switch.
OoT was 20fps and is the highest one on your list. lmao
 
As far as I'm concerned, 60FPS is the only true fidelity mode I need.
I gladly sacrifice resolution for more FPS.
 
that didn't answer my question


:messenger_mr_smith_who_are_you_going_to_call:
then why wasn't the Atari Lynx leading the handheld wars? Why didn't the Xbox win against the PS2? Why was the Wii winning against the Xbox 360 and PS3? Why is the Switch killing it right now?

graphic fidelity is nice, but it's not everything.
 
Imagine defending 30fps. What world are you guys living in?

30fps is dog shit and always was; it was a necessary evil back in the day to get games to run at any capable level.

Now it's just shit optimisation. The Horizon game, for example, could easily be running at 60fps if they actually gave two shits about it.
So much hyperbole. I'm honestly glad I'm not as sensitive as y'all, since I regularly play older games that run at sub-30fps. Imagine bitching about that? Nah, I've got better things to do than scream at devs and at the screen because a game isn't 60fps lol.
I'll play HFW at 60fps though, since my set is still 1080p.
 
OoT was 20fps and is the highest one on your list. lmao
and Soul Calibur and Tony Hawk were 60 FPS.

Edit: BTW, OoT can be played at 60 FPS these days, and yes, it's better.
 

That's a pretty good overview of why frame rate matters, and why 60fps isn't just some arbitrary number but is very closely tied to the human ability to perceive motion.

60fps looking much better than 30 is exactly what we'd expect given our biology. The idea that it's just some obsession of audiovisual nerds is wrong. 120fps, yeah, that's imperceptible to almost everyone, but 60 is absolutely a step change from 30.
 
Play a game with PS1 visuals at 120fps and Flight Simulator 2020 at 0 fps and come back here saying visuals are more important than framerate again.
If FS2020 is 0fps, I'm not playing it, I'm watching a picture. Also, FS2020 is 30fps on consoles and people use it as their go-to "omg next gen graphics" game.
 
Visuals are more important than framerate. Always have been, always will be. This is why developers choose to continue putting out games at 30: it's the best balance between fidelity and smoothness.
Horizon FW looks significantly worse at 60. That trend will only get more noticeable as the gen moves forward. Thank fuck these developers don't listen to framerate nerds lmao.
You have a point, but there are certain games that feel different at 60fps. The movement is more responsive. It feels more fluid. The Demon's Souls remake is the best-case scenario, where you can have high-quality visuals at 60fps. I'd like to see more developers take that route than shoot for 4K over everything else. Even Spider-Man: Miles Morales with 1080p ray-traced reflections is a compromise I'm willing to accept if the game runs at a smooth 60. There needs to be a balance, or more options to select.
 
And every single one of these games is much better at 60+.

PS: 15fps is even better. 7.5 even more...

Games should run at 1FPS max, so that you have more time to enjoy the visuals. /s

LMFAO @ people defending 30FPS, go watch a movie or something, this is a gaming forum.
 