
Fidelity>Framerate. Come at me.

A lot of you are getting caught up on resolution versus framerate. In the OP, they specifically don't say anything about resolution. Fidelity is our key word, and a huge amount goes into graphical fidelity beyond resolution: texture quality, lighting and shadow technology, geometry complexity, etc. OP is saying they'd rather push all of that to the limit than sacrifice whatever is necessary to hit 60 FPS for its own sake. I never thought I'd be one to use Cyberpunk as an example, but the game runs like butt at ultra maximum; you will not get a smooth 60 unless you have an extremely powerful rig with a 3090 or better, and even then only with some level of DLSS.

I guess it raises the question: why give us incredible graphical fidelity if it can't be achieved? If so many people say they won't accept anything less than 60, why do we pursue the technology?

Whenever Sony first-party games come out and push the envelope of graphics presentation, it's all oohs and ahhs. No one ever says, "You know what, I wish they'd used half-poly models so we could get higher framerates. Ray-traced global illumination is nice, but baked lightmaps would be fine."

I want developers to push the boundaries of the technology and of what's possible within a given power envelope. If they can ratchet the framerate up with whatever juice is left, great. You can always turn your settings DOWN to achieve higher framerates, but people who want higher fidelity can never turn fidelity UP past whatever the developers implemented. For now. I'm looking at you, RTX Remix.

Every single person here is guilty of choosing graphics over framerate, and the evidence is in your very profile. Look at your avatars. Each one was chosen because you liked the art, or the screenshot, or how it was presented. Not a single one shows how fast it's being rendered. Even if you have a screencap of something at 120 FPS, guess what, it isn't 120 FPS now. You CAN'T sell FPS in a screenshot, but you can sell fidelity. Higher fidelity is ALWAYS better.

I definitely have some odd criteria when considering a purchase. How long is this game? Is it multiplayer? Local? Does it have a New Game Plus? Controller support? Know what I've NEVER considered? Whether it's 60 FPS or higher. As I've stated in the past, with a good implementation of motion blur and other factors, 30 FPS is perfectly fine. Games tailored to 30 CAN look really good. Your eyes will adjust. You'll be okay.

Edit: I'd like to amend that some games should be designed for high FPS. Online shooters and the like should spec for high framerates; that's a given. My case is specifically for single-player experiences.
 
If you're dumb enough to want 30fps then you shouldn't be bothered about us wanting 60.

Having said that, I'm on PC, so I like the minimum to be 120, and ideally 144 or more.
Look at the badass over here. He's on PC playing at MINIMUM 120fps. What a guy!
The only dumb one here is you, if you'd skip games for not meeting that 120fps minimum of yours.
Guess you're not playing any of the Souls games then (60fps lock).
 
One of the most interesting things I keep reading is '60fps should be the minimum, it's 2022'.

When will these people understand that it doesn't matter what year it is? Games become increasingly complex and require more powerful hardware to run. Sometimes that means they need to run at lower frame rates to be able to run at all. No amount of resolution reduction or graphics tweaking can make much of a difference if the hardware can't pull its weight.
 
One of the most interesting things I keep reading is '60fps should be the minimum, it's 2022'. […]

It's almost like 30fps is the baseline acceptable product for cutting-edge graphics at release, and within a few years new hardware can support it and push the graphics further. It's as if games are meant to be sold over a span of years, not all at once.
 
Look at the badass over here. He is on pc playing at MINIMUM 120fps. […]
Being on PC I'm not forced to 30fps my man. That's the point.
 
Being on PC I'm not forced to 30fps my man. That's the point.
I know. I have a 3080 PC and I play almost everything at 4K120.
But the point is something else. Nobody is forcing you to do anything on console.
These haters take it as an attack... and meanwhile it shouldn't even be a consideration. We should just play games.
That's why I like console gaming despite being a PC gamer. No distractions.

That said, I did just replay Crysis (Remastered) for the first time in years. It ran at 60-80fps at 4K maxed out, with shadows, objects and vegetation set to High. On Very High or the 'Can it run Crysis?' preset, the game shits itself.
It's still 10x better than the PS5 port. What were they thinking? It looks so weird.
 
A lot of you are getting caught up on resolution versus framerate. […]
This.

This forum chose 30fps games over and over again over 60fps shooters for its last-gen GOTY. BOTW, Witcher 3, BB, GOW, Uncharted, GoT: none of those final-bracket games were 60fps. They targeted visual fidelity and cinematic graphics instead of 60fps like Titanfall 2, BF1, CoD and Halo 5 did. I don't even remember those games winning their first bracket, let alone making the final 8.

And I'd argue fidelity isn't even just about graphics. It's about pushing physics, interaction, destruction, NPC simulations, weather simulations, fancy traffic simulations like we saw in the Watch Dogs reveal.

We don't even have desks, tables, chairs and lamps move when you get into a shootout in a room. It's pathetic. Black had that in 2005. Half-Life 2 had full physics integration in 2004. I'd rather have that at 30fps than last-gen games at 60fps.
 
I'm exasperated with this '60fps or nothing' attitude that has infected gaming recently. Sure, it's nice if a game runs at a steady 60, but I'd far, far rather the developers get their game looking as nice as possible at a steady 30fps than sacrifice fidelity. I didn't pay all that money for a PS5 just to play smoother-running PS4 games.

Who's with me? You can't all be framerate fanatics surely?

Doesn't really matter who's with you or against you; everything will always move in the direction you prefer anyway (speaking of consoles). PC tech is constantly getting better and iterating, which encourages (and sometimes directly financially incentivizes) devs to keep moving forward with the tech they use. I had hoped we would continue to see 60fps options, but it looks like devs might choose not to offer them. Something like A Plague Tale: Requiem seems like it should be able to offer the XSS fidelity mode at 60fps on XSX, since the title doesn't appear to be overly CPU-heavy; maybe they'll make that an option in the future.
 
I guess it begs the question, why give us incredible graphical fidelity if it can't be achieved?
It can be achieved but with shittier performance/gameplay. This thread shows that tons of people prefer shittier gameplay but with better graphics.
 
Depends on the game. Fighting games don't need (and mostly aren't) more than 60fps, for competitive reasons. I prefer they keep pushing fidelity in these types of games. Guilty Gear Strive looks fantastic... Tekken 8 as well.

FPS games are all about the FPS - also for competitive reasons, but really to gain some sort of edge. I'll make a list, both to truncate this post and to illustrate that I'm not necessarily in either camp.

Fighting - Mo' Fidelity
FPS - Mo' Framerate
Action Adventure - Mo' Fidelity
Battle Royale - Mo' Framerate
Sports - Mo' Fidelity
Racing - Mo' Framerate
Puzzle Games - Mo' Fidelity
MineCraftian/Builder games - Mo' Framerate (Minecraft's graphics don't really scream fidelity, though RTX is nice)

Basically, if it's a competitive game I prefer framerate. If it's single-player, give me fidelity. I think most will find, if they're honest with themselves, that single-player game andys prefer fidelity and competitive multiplayer gamers prefer framerate.

/thread
 
I know. I have a 3080 pc and I play 4k120 almost everything. […]
Why should console users be locked to 30fps, though? It's possible for devs to offer a 60fps option, but by the looks of it they're going backwards in this regard.

60fps makes a fair difference in modern games. I will say, though, that 30fps on older hardware such as the 360 isn't half as bad as 30fps in modern games.

I think devs have forgotten how to do 30fps properly, and it feels shit and looks juddery.
 
Why should console users be locked into 30 fps though? […]
It's not really possible; we've been in a two-year cross-gen period, which is misleading. If we want the console utilized to 100%, sure, but it's still a $500 box. We either keep games at the level of graphics and tech we've had for the last two years, or we go 30fps...

Your second point is interesting. 30fps can feel weird nowadays because devs rely on system-level vsync. The 360 had good input lag in some games, like Gears of War, and terrible lag in others.
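The v-sync point is the crux: with v-sync, a finished frame waits for the next screen refresh before it appears, so frames delivered unevenly at an average of ~30fps land on refreshes unevenly, and you see alternating short/long on-screen intervals, i.e. judder. A minimal sketch of the effect (my own toy model with made-up frame times, not anything measured from a real game):

```python
import math

VSYNC_MS = 1000 / 60  # one refresh interval on a 60Hz display

def present_times(render_done_ms):
    """Vsync timestamp at which each finished frame actually appears."""
    return [math.ceil(t / VSYNC_MS) * VSYNC_MS for t in render_done_ms]

def gaps(times):
    """On-screen frame-to-frame intervals, rounded to 0.1ms."""
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

# Properly paced 30fps: each frame is ready just before every 2nd refresh.
paced = [i * 2 * VSYNC_MS - 1 for i in range(1, 7)]

# Unpaced ~30fps average: frames alternate between finishing fast (10ms)
# and slow (55ms), so they miss refresh boundaries unevenly.
unpaced, t = [], 0.0
for i in range(6):
    t += 10.0 if i % 2 == 0 else 55.0
    unpaced.append(t)

print(gaps(present_times(paced)))    # [33.3, 33.3, 33.3, 33.3, 33.3] - smooth
print(gaps(present_times(unpaced)))  # [50.0, 16.7, 50.0, 16.7, 50.0] - judder
```

Both runs average roughly 30fps, but only the paced one presents on every second refresh; a good console 30fps mode effectively enforces that half-refresh cadence, which is why some 30fps games feel fine and others judder.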
 
When did fidelity and framerate become opposite ends of the same slider?
It's a coping mechanism, I think. The 60fps cross-gen period is quickly coming to an end, so people will have to make excuses about 30fps being 'fine' again.
 
First impressions are lasting impressions. We're most impressed by something at first sight; the second time never has the same impact, because the first time has the attraction of the unknown and the second has the familiarity of the known. That holds for movies, TV shows, video games, books and so on. That's why I choose fidelity mode for a first playthrough: to experience the visuals in all their glory. If I ever do a second playthrough, then I consider performance mode; since I'm now familiar with the game and know where to go and what to do, I'd like it to be a smoother experience.
 
I think devs have forgot how to do 30fps properly and it feels shit and looks juddery.
Yeah, this is a huge issue that has cropped up recently because devs have been optimizing their games on next-gen consoles at 60fps, whereas in previous gens they would design the whole game around 30fps and ensure it plays smoothly. Some devs like Insomniac, ND and GG have great 30fps modes. Bluepoint's 30fps mode for Demon's Souls is virtually unplayable. Guardians was very playable, but AC Valhalla, omg, just felt awful.
 
Yeah, this is a huge issue that has recently cropped up […]
I think they lazily rely on vsync to do 30fps. It's the slowest method.
Ratchet and Horizon felt better than Demon's Souls at 30.
 
I'm completely fine with 4K@60. With 4K being the standard, I'd much rather make use of my monitor/TV and play at 4K@60. 120Hz is extremely nice as well, but a struggle to reach at 4K in modern games. Personal opinion.
 
I'll take the framerate. Graphics reached a "good enough" point for me quite some time ago. In most cases I'll turn down whatever options I need to reach 144fps. That's a much bigger difference in quality of experience than a couple of shadows could make.
 
Every single person here is guilty of choosing graphics over framerate, and the evidence is in your very profile. […]

Gotta be one of the dumbest things I've read on this forum: "You all like fidelity because of that static image in your profile that doesn't move, and even if it did, you aren't playing it anyway."
 
Reading through this thread, it seems I'm finally not the only fucking sane gamer left on planet Earth. Quite a few in here agree with my sentiment. Usually I'm on an island.
 
You're not a gamer. You're a graphics whore. And you're not sane either. Gtfo 💀
I'm a gamer that appreciates the actual craft and ART that goes into game creation. You are a fucking analytics and numbers nerd. You don't give a fuck about anything but how a game performs. You give no fucks about pushing the medium forward. You're ok with last gen looking/playing games, as long as it's 60fps lol. We are not the same.
 
A lot of you are getting caught up on resolution versus framerate. […]
aint nobody got time for that GIF
 
A lot of you are getting caught up on resolution versus framerate. […]
You had me, right up until that avatar example. I'm an FPS junkie. You almost had me. Your argument was so good and smart, that it defied logic and warped around to bad and dumb.

Therefore, I hate you and will now give you a triggered reaction emoji. You enjoy that
 
I'm a gamer that appreciates the actual craft and ART that goes into game creation. […]
I don't need to take any lip about not being a gamer from a graphics whore who dismisses indies which are far more artistic than most AAA games. I want to push the medium forward which is why I have a PC: to experience the medium in its most uncompromised form. While you're over here talking up a compromise like it's an artistic decision.

You are a graphics whore. You advocate for 30fps because it produces more pretty pictures. Admit it and get the fuck out of my hobby. People like you are actively ruining gaming with your visuals first attitude. Go watch a movie if you want pretty visuals
 
I don't need to take any lip about not being a gamer from a graphics whore […]

I'm a gamer that appreciates the actual craft and ART that goes into game creation. […]
Alright cool it you two. Agree to disagree and move on.
 
I don't agree. Bloodborne's jaggies are more off-putting than its 30fps… which is one of the fastest, least laggy 30fps implementations.

Framerate whoring is just annoying. You can do better. Your brain can do better.
 
A lot of you are getting caught up on resolution versus framerate. […]
This is the best post in this thread!!!
You nicely said what my angry ass couldn't put together.
 
lol, well said. I remember people saying $399 is the most they would pay for a console. Like, gtfo. This is what you get.

Consoles should be $599 minimum. They're a 7-year investment. The $299 PS1 would be about $600 today, but we're cheap fucks who will spend thousands on iPhones, hundreds at bars and clubs, $10 on avocado toast, and $6 on a cup of coffee, but $600 is too much for a console.
Consoles costing $600 and staying there for half of their life cycle would make for some *ahem* interesting sales threads.
The progression of tech means there's too much disparity between consoles and high-end PCs, so having options on consoles should absolutely be the norm.
 
This may seem like a hot take now, but I think as more current-gen-only titles come out over the course of this gen, you're not gonna be able to tell the difference between the 60 FPS and 30 FPS modes when you're not watching a DF/NXG video. In most cases it's gonna be an internal resolution change that people will only notice at 300% zoom.
yeah-whatever.gif
 
Consoles costing $600 and staying there for half of their life cycle […]
What do you mean, options? Options in games, or in hardware?

I don't like multiple hardware SKUs at launch; mid-gen refreshes three years later are fine. Options for gamers to select 30 vs 60fps: also fine. Gotham Knights recently revealed that on PC you need a 5700 XT (a 9.6 TFLOPS GPU) to play at 1080p 60fps on high settings, so the PS5 and XSX clearly could have run it.

But what if the bottleneck isn't GPU power but RAM bandwidth? Both consoles sacrificed RAM bandwidth in one way or another to cut costs and hit that $499 (in Sony's case $399) MSRP. Had they targeted $599, they would have gone not only with bigger GPUs but also with enough VRAM bandwidth to avoid problems like this.
 
Having both is great, sure, but if forced to choose between performance and fidelity, performance should be chosen 90% of the time.

There could be exceptions, ofc: VNs, walking sims, slow puzzle games or slow JRPGs. Anything else should be played at 60fps, since framerate heavily affects gameplay, much more so if the game is fast-paced.

I remember someone made a thread not long ago asking why people were spending $1000+ on new GPUs. Well, this thread proves there's reason enough, since having one of those lets you play a game at its fullest with zero compromises. But yeah, as I said, if compromises have to be made, then performance comes first, always.
 
As a long time console gamer I strongly believe anyone saying 30 FPS over 60 is deluded.

All those 60 FPS patches for PS4 games on PS5 have been transformative. The world feels more alive, and gameplay is much better and more fluid thanks to the low latency.

Switching back to 30 FPS now gives me headaches and just looks disgusting.

In an ideal world I'd want both high fidelity and frame rate. The best option would be to build a high end gaming PC granted one has the finances for such things, which is exactly what I plan on doing. Give me those UE5 games at 60 FPS!
 
lol well said. I remember people saying $399 is the most they would pay for a console. […]

Lmao

Well see, this is where you're wrong, because at 60 FPS the 30 FPS is also included. So why would you want only 30 FPS if the 30 FPS is already in the 60 FPS package? :)

This is why I love Gaf.
 