
Finally happy that Sony is acknowledging on the record that 30fps is “choppy”

Mythoclast

Member
I'd agree with you about normies not changing options or understanding them, but I'd say the default is the other way around, actually; from memory most of them are set to Fidelity by default. I know because it always made me think the Fidelity mode is the primary target output.
Most of the games I've played, or at least a majority, default to Quality
 

Mythoclast

Member
No one has any doubts that 60fps is better than 30fps; the issue is that some games look horrible in performance mode, like FF7 Rebirth, which is very blurry, so if I have to choose I prefer 30fps in some cases and game genres.
I'll be very honest, and this is my opinion, but any day I would take 900p or 720p at 60 over 4K at 30.

The responsiveness is just on another level
 

Dorago

Member
Higher than the human eye can see, entitled bigots, white rage, heteronormative phallogocentric commodity fetishism, etc. etc. UNTIL... Sony just comes out and says it.

60 good, 90 better, 120 even better, 500 (or whatever the highest is now) best.

They're right about people preferring frame rate too. I switched to quality on DS Director's Cut but switched back after a few hours because of the once in a blue moon frame dips.

Bottom line, frames chugging makes you realize that you're playing a video game!

More frames more immersion, simple as.
 

Fbh

Member
To their credit, Sony has been pretty good with performance this gen.
Both their cross gen and current gen only releases from first party studios have offered solid 60fps modes.

Forbidden West and Demon's Souls are still light years ahead of every other game this gen in terms of offering great visuals with good IQ at 60fps. Every other game that looks as good or better is either 30fps or some upscaled 720p mess that only looks passable in screenshots.
 

nikos

Member
30 FPS is unplayable.
60 FPS is choppy on a higher refresh rate monitor.
Even dips to 120 get me to look at my FPS to see what's wrong.
To be fair, it depends on the type of game though.
 

FeralEcho

Member
And here I am remembering Blighttown in Dark Souls on PS3, with its slowdowns to 15 fps, and still enjoying the game immensely despite its flaws.

I'd take a great game at a shittier framerate any day over the high-fps garbage nowadays, but that's just me.

 
Meh, the best generations had 30 fps games that were just fine. Meanwhile this gen sucks ass and we got more 60 fps games.

Give me PS2/360/PS3 era software output at 30 fps over this crap we have today.

Only young kids who grew up in the suck era talk like that. They don't know any better.
Great points. It doesn't matter how many frames it is if the game is shit. The first-party games need to be the flagships of what their studios have to offer and the reason you'd invest in their platform. The only one stepping up is Nintendo.
 

Eszti

Banned
60 is choppy too…
Yes, but actually no. It's all about getting used to it. I can never get used to 30 fps again, that really is unplayable, but 60 is easy after a few hours, especially with G-Sync/FreeSync, even if you know how 120 fps or even 200+ fps feel on a 240Hz screen. If the frametimes are good, of course, but that counts for every fps target.
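The frametime point above can be made concrete with a quick back-of-the-envelope calculation (a minimal sketch; the fps targets listed are just illustrative):

```python
# Frame time for common fps targets. Going from 30 to 60 fps cuts
# frame time by ~16.7 ms, while 60 to 120 only saves another ~8.3 ms,
# which is one reason leaving 30 fps behind feels like the biggest jump.
for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
```

The diminishing returns are built into the math: each doubling of fps halves the frame time, so the absolute gain shrinks every step.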
 

mrcroket

Member
I liked that they used one of the few games without object-based motion blur to show how choppy 30fps is :messenger_smirking: all modern Insomniac games look much smoother at 30fps thanks to their beautiful OBMB implementations, and they have really low input lag due to how they sync frames and process input.

As soon as I 100%'d Ratchet and Spider-Man 2 in fidelity mode, I started a new game/NG+ in performance mode, thought "cool, looks so smooth and feels better, looks blurry af though", and turned them both off.

The whole "75% of players choose performance mode" stat is skewed by most people's setups: at worst, TVs so old they don't even have a game mode (or no ALLM to auto-switch to Game mode for them), and at best, entry-level TVs with trash contrast/HDR that don't show the increase in sharpness that fantastic or even great TVs do. They're also probably sitting 2.5-3.5m from a 50/55" TV, or worse, which is another culprit.

I used to play my PS1 on a 15" Sony CRT at a distance of less than 1m; otherwise I couldn't see shit/read the text, and it's not like those games had a lot of detail in them. Modern games have so much detail; the reason so many say resolution isn't important and upscaling is the same as native is that they aren't viewing on a screen big enough for the distance they're viewing from.

Every day I sell people (read: middle-aged couples for the most part) a 65, 75 or 85" TV with super high contrast/brightness that they sit even closer to than I recommend, and they come back and tell me they can't believe how sharp the image is and how much of a difference 4K and HDR make to their enjoyment of films and TV shows. These are 40-50 year old wives we're talking about; before they left the shop, all they cared about was the colour of the stand, the size of the bezel or the thickness of the TV, and even they come back and say they can see a massive difference.
 