Is 60 FPS killing overall graphical fidelity?

Lmfao. No it isn't. I, and most console-only gamers, have played 30fps games our entire lives. And suddenly I'm supposed to believe they are "damaging playability"? :messenger_tears_of_joy: Give me what you're on. They play PERFECTLY fine.
Well I have good news! You can have what I'm on because every single game released will have 60 FPS modes included. Enjoy, buddy!
Gamers played at 240p for nearly two decades. They played at 720p/1080p for another 15 years or so. Congrats, you've been playing at a suboptimal framerate your entire life. Now you have the option to leave that behind and play at a much better framerate.
I can't wait to bump this thread when TLOU 3 and GOW 3 drop in 2027 and they run at 30fps but look fucking mindblowing. They will both run at 30fps and then you'll get a 60fps patch on the PS6. You're ignoring history. This is what will happen. It always happens. It's not happening right this second because launch and early-gen games don't push hardware. Stop it.
I love how you're pretending there is precedent here when there is none. At no point in gaming post-16-bit era did nearly every single console release come with a 60 FPS mode or higher. At no point before did every new release run at 60 FPS at the start of the generation, only to be replaced by 30 FPS near the end of it.

All of them will. Except MP-focused devs, of course. But developers like Insomniac, GG, NAUGHTY DOG, SSM, CDProjekt, R* North... basically ALL of the big dogs in the industry, who actually give a shit about visual fidelity and focus on impressive-looking single-player games... come the end of the gen, when they are maxing out the power of these consoles, games won't be targeting 60fps. Because it will take too much of a toll on the visuals. This literally always happens. You are in denial of history. Lmao.
Prove it. Go ahead and prove that those developers are planning to abandon 60 fps modes right after literally every.single.one of them have released a 60 FPS update for their games, including paid updates and rereleases. Again, you keep talking about history which you clearly have no clue about. We've never had a generation before where every game released runs at 60 FPS. You seem to think that past gens launched with 60 FPS games across the board and then scaled back to 30 FPS. This is completely untrue. There is no precedent to what is happening right now.

Again: 60FPS after being patched years later. Not at launch. Big difference. And yes, I do think they are looking at that demo and trying to base their fidelity targets around it. That's the entire fucking point of the demo, to show what's possible. Really bro? And I'm the clown?
They were being patched almost as soon as new hardware became available. New games launched with 60 FPS modes available (with the exception of Watch Dogs Legion, which added it later). LOL, the whole point of the Matrix demo was, first and foremost, to promote The Matrix and to promote Unreal Engine. It makes perfect sense that it runs at 30 fps because it's only a small slice. There's no actual game there that people will play for 9-10 hours.

Who cares. It looks better. And plays fine. Get lost. 30fps is best of both worlds.
Who cares. It plays better. It looks fine. Get lost. 60 FPS is the best of both worlds. Also I don't know what "Best of both worlds" means apparently.

You care about something as insignificant and barely noticeable as input delay over art style, atmosphere and visual fidelity, and I'm the casual? I care about the heart and soul of the game. I care about what's on screen. I want more enemies on screen. I want more destructibility. I want more mindblowing warzones with crazy shit all around me happening all at once. I want enemies to look so real they have me shook. I want to play THIS
You're hilarious. "I care about the heart and soul of the game... the graphics!" You know who cares about playability? Well, as it turns out: most people, as 60 FPS games continue to dominate the play charts, and developers are including a 60 fps mode with every new release and updating old games to match. 60 FPS games play better, and due to the vast increase in motion clarity, they look much better too. Nobody cares what you want to "play". Most people want and expect 60 FPS modes in their games, so that's how it's going to be. Deal with it.
 
[image]
That's kinda what you see when you spin your head fast. And everyone can do that with no consequences.
 
While the base input lag of most games can be around 3-7 frames, let's round that to 80ms at 60fps. If the engine is rendering at 30 fps instead, that rendering input lag could potentially double, as each frame now takes 33ms to render, resulting in a theoretical lag of 160ms.
What are you referring to here? Triple buffered V-sync? Where does the 7 frames come from? That is latency unrelated to framerate if you are getting 7 frames worth.

Having triple-buffered vsync in a game at 60Hz would be ~50ms max. Double-buffered is 33ms. Typically it's 2x the frame time.
You won't get an 80ms difference in latency.
Here's another thing, though: sometimes capping your framerate lower actually reduces your input lag in some games. It all depends on the game loop. Now think about games that are a consistent 60 vs a consistent 30 on consoles too. A consistent 60 is more rare; there are usually more drops and varying latency. For fighting games a consistent 60fps is important, but most 60fps games don't keep a consistent 60 anyway.
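To put rough numbers on the buffering argument above, here's a toy model (my own simplification, not any specific engine or console): the display-queue slice of input lag is roughly the number of buffered frames times the frame time, so halving the framerate doubles that component while a fixed queue depth stays constant in frames.

```python
# Toy model: the display-queue part of input lag is roughly
# buffered_frames * frame_time. Real games add variable CPU/GPU
# work on top, so treat these as lower bounds, not measurements.

def frame_time_ms(fps):
    """Duration of a single frame in milliseconds."""
    return 1000.0 / fps

def queue_latency_ms(fps, buffered_frames):
    """Latency contributed by a swap chain holding `buffered_frames` frames."""
    return buffered_frames * frame_time_ms(fps)

# Double-buffered v-sync (~2 frames queued):
#   60 fps -> ~33 ms, 30 fps -> ~67 ms
# Triple-buffered v-sync (~3 frames queued):
#   60 fps -> ~50 ms, 30 fps -> ~100 ms
```

So a fixed-depth queue costs twice as many milliseconds at 30fps, which is the sense in which dropping to 30 "doubles" that slice of the lag, even though it never gets anywhere near a flat 80ms penalty on its own.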
 
Man, I thank God I never owned a really good gaming pc. The difference between 30 fps and 60 is big and I can't even imagine 120 fps. If not for current consoles, I wouldn't have thought much about fps, but once you experience 60 fps, it's really hard to go back, even with the beauty 30 fps allows currently.

30 fps feels like I'm moving in slow motion now.

I would rather 60 fps be the target this gen than ultra-sharp 4K res. I'm thankful most games give you choice now.

AI upscaling can't come fast enough for consoles.
 
Man, I thank God I never owned a really good gaming pc. The difference between 30 fps and 60 is big and I can't even imagine 120 fps. If not for current consoles, I wouldn't have thought much about fps, but once you experience 60 fps, it's really hard to go back, even with the beauty 30 fps allows currently.

30 fps feels like I'm moving in slow motion now.

I would rather 60 fps be the target this gen than ultra-sharp 4K res. I'm thankful most games give you choice now.

AI upscaling can't come fast enough for consoles.
60 vs 120 is not that big.
30 is still fine. Your brain needs to adjust
 
Man, I thank God I never owned a really good gaming pc. The difference between 30 fps and 60 is big and I can't even imagine 120 fps. If not for current consoles, I wouldn't have thought much about fps, but once you experience 60 fps, it's really hard to go back, even with the beauty 30 fps allows currently.

30 fps feels like I'm moving in slow motion now.

I would rather 60 fps be the target this gen than ultra-sharp 4K res. I'm thankful most games give you choice now.

AI upscaling can't come fast enough for consoles.
I'd rather just have more powerful GPUs.
 
60 vs 120 is not that big.
30 is still fine. your brain needs to adjust


While it's true that we as humans can get used to anything that doesn't kill us, I think we as gamers should desire greatness. If HFW came out only in 30 fps, I would still buy it, but I would want to communicate to Guerrilla Games that maybe they should show the frame rate some love on the next game.
 
Hello Gaf, first time here

I think it's resolution that's killing graphical fidelity, and not 60fps

Devs focus too much on 4K, and that's why we don't have many games at 60fps and max settings on consoles
 
Doesn't matter if the hardware is powerful or weak.

You can ALWAYS squeeze more quality out of a 30fps presentation because you literally have twice the rendering time per frame. With that budget, in 30fps, you can go even bigger than you could at 60fps.

This will ALWAYS be the case.

Doesn't matter if the PS5 had an RTX 3090 level GPU, you can still render more stuff at 30fps.

So those amazing looking games on PS2 that ran at 60fps, could've looked even better had they increased the frame times to 33ms.
That's not the point I was making. My point was that the games I was mentioning looked significantly better than the games of the previous gen while still being 60fps. They didn't have to look anything like previous gen games to hit 60fps.
 
What are you referring to here? Triple buffered V-sync? Where does the 7 frames come from? That is latency unrelated to framerate if you are getting 7 frames worth.

Having triple-buffered vsync in a game at 60Hz would be ~50ms max. Double-buffered is 33ms. Typically it's 2x the frame time.
You won't get an 80ms difference in latency.
Here's another thing, though: sometimes capping your framerate lower actually reduces your input lag in some games. It all depends on the game loop. Now think about games that are a consistent 60 vs a consistent 30 on consoles too. A consistent 60 is more rare; there are usually more drops and varying latency. For fighting games a consistent 60fps is important, but most 60fps games don't keep a consistent 60 anyway.
When Tekken 7 launched it had 7 frames of input lag; there was an outcry from the community and they managed to reduce it to 4 frames, I think.

I'm not an absolute expert on this, but my understanding is that console triple-buffered V-Sync can introduce a lot of input lag due to the frame queue, and 30 fps would likely increase the input lag by more than 1 frame, if not double it, compared to 60 fps.

Also, the game doesn't have to completely render each frame within the time of displaying one frame; it can space multiple steps out over several frames, like reading inputs, game logic, rendering, and post-processing. This can be spread across multiple threads on both the CPU and GPU, hence the multiple frames of input lag in many games.
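That pipelining point can be sketched with a toy model (made up purely for illustration, not any real engine): even if every stage fits within one frame time, the stages for a given frame run on successive frame slots, so input lag measured in frames equals the pipeline depth.

```python
# Toy pipelined game loop: each stage fits in one frame time, but the
# stages for a given frame run on successive frame slots (CPU and GPU
# overlapping), so an input sampled on frame N is visible frames later.

STAGES = ["read input", "game logic", "render submit", "gpu render", "scanout"]

def lag_frames(stages):
    """With one stage per frame slot, lag in frames equals pipeline depth."""
    return len(stages)

def lag_ms(fps, stages):
    """The same pipeline depth, converted to milliseconds at a given fps."""
    return lag_frames(stages) * 1000.0 / fps

# Five stages -> 5 frames of lag at any framerate, but in milliseconds:
#   60 fps: ~83 ms    30 fps: ~167 ms
```

That's why a game can honestly render every frame "on time" and still carry several frames of input lag, and why the same lag-in-frames costs twice as many milliseconds at 30fps.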
 
Hello Gaf, first time here

I think it's resolution that's killing graphical fidelity, and not 60fps

Devs focus too much on 4K, and that's why we don't have many games at 60fps and max settings on consoles
True, but the target audience is playing mostly in their living room on a 40"+ TV; the res needs to be as high as natively possible, otherwise you get some terrible smeary low-res output.
If you want proof of this, look at the 3D audio optimisation on PS5: the first port of call was TV speakers, the lowest common denominator.

Now if these consoles had something akin to DLSS, we'd be talking.
 
When Tekken 7 launched it had 7 frames of input lag; there was an outcry from the community and they managed to reduce it to 4 frames, I think.

I'm not an absolute expert on this, but my understanding is that console triple-buffered V-Sync can introduce a lot of input lag due to the frame queue, and 30 fps would likely increase the input lag by more than 1 frame, if not double it, compared to 60 fps.

Also, the game doesn't have to completely render each frame within the time of displaying one frame; it can space multiple steps out over several frames, as long as each step fits within one frame time, like reading inputs, calculating physics, tessellating geometry, doing transformations, rendering, post-processing, etc. This can be spread across multiple threads on both the CPU and GPU, hence the multiple frames of input lag in many games.
So that input lag was not tied to the framerate, given they could almost halve it while the game ran at 60fps before and after. Input is usually CPU-side. Some games read input multiple times for physics before a frame is rendered. These are framerate-capped games. The added time for the render is constant, though. There are triple-buffered V-sync games where, to avoid screen tearing from the monitor and game refresh rates being out of sync, the GPU renders to a separate buffer before swapping to the one shown. This again would be a constant 2x-3x multiple of the frame time. Never 7 frames. The 7 frames of input lag sounds more like polling, with the game not responding until it determines what action the player wants to commit. To me it sounds like they increased the polling rate independent of the framerate.
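As a quick sanity check on those Tekken 7 numbers (assuming its locked 60fps target): lag quoted in frames converts to milliseconds as frames times frame time, and the improvement happened with the framerate unchanged.

```python
# Input lag is often quoted in frames; at a fixed framerate,
# milliseconds = frames * frame_time.

def frames_to_ms(frames, fps=60.0):
    """Convert a lag measured in frames to milliseconds at a given fps."""
    return frames * 1000.0 / fps

# Tekken 7's reported change, at its locked 60 fps:
#   7 frames -> ~117 ms at launch
#   4 frames -> ~67 ms after the patches
# The ~50 ms saved came from the engine/polling side, not a framerate change.
```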
 
60 fps.

30 fps is a mess. Not only is it sluggish, but all it takes is a few dropped frames to make it much worse. Especially bad for racers, sports games, fighting games and shooters.

Also, at 30 fps this is where game devs potentially add lots of motion blur to help hide it. Arguably the worst graphical effect invented in the modern era. Pan that camera fast and either it's hitchy (no blur) or a smear effect (blur).

When you've got 60 fps, everything looks smooth and there's no need for motion blur. You can also drop some frames and still be fine.
 
Hello Gaf, first time here

I think it's resolution that's killing graphical fidelity, and not 60fps

Devs focus too much on 4K, and that's why we don't have many games at 60fps and max settings on consoles
Well, console users play on big TVs in their living rooms.

Resolution is needed.

The issues we are having this gen didn't happen in previous generations because most games chose the same framerate (in most cases 30fps).

The move to 60fps this gen is causing graphics downgrades.
 
Well I have good news! You can have what I'm on because every single game released will have 60 FPS modes included. Enjoy, buddy!
Gamers played at 240p for nearly two decades. They played at 720p/1080p for another 15 years or so. Congrats, you've been playing at a suboptimal framerate your entire life. Now you have the option to leave that behind and play at a much better framerate.
30 will become the norm again when games begin to push hardware more. It's so obvious. Can't help you if you can't see that.
I love how you're pretending there is precedent here when there is none. At no point in gaming post-16-bit era did nearly every single console release come with a 60 FPS mode or higher. At no point before did every new release run at 60 FPS at the start of the generation, only to be replaced by 30 FPS near the end of it.
"At no point in gaming post-16-bit era did nearly every single console release come with a 60 FPS mode or higher" - Exactly. And they could have. But they didn't. Because 30fps has always taken priority over 60fps. Because developers have historically chosen fidelity over framerate. Thanks for making my point for me.
Prove it. Go ahead and prove that those developers are planning to abandon 60 fps modes right after literally every.single.one of them have released a 60 FPS update for their games, including paid updates and rereleases. Again, you keep talking about history which you clearly have no clue about. We've never had a generation before where every game released runs at 60 FPS. You seem to think that past gens launched with 60 FPS games across the board and then scaled back to 30 FPS. This is completely untrue. There is no precedent to what is happening right now.
"You seem to think that past gens launched with 60 FPS games across the board and then scaled back to 30 FPS" - What in God's hell are you talking about? My whole argument is that, historically, devs have chosen fidelity over framerate, which is why 30fps has always been so popular. You keep making my point for me. Appreciate it.
And time will tell. If you honestly believe games at the end of the gen will all run at 60fps and not 30, then you're in for a rude awakening. What's been happening as of late is they get 60fps patches when new hardware releases. That trend will continue. When new hardware releases.
They were being patched almost as soon as new hardware became available. New games launched with 60 FPS modes available (with the exception of Watch Dogs Legion, which added it later). LOL, the whole point of the Matrix demo was, first and foremost, to promote The Matrix and to promote Unreal Engine. It makes perfect sense that it runs at 30 fps because it's only a small slice. There's no actual game there that people will play for 9-10 hours.
"They were being patched almost as soon as new hardware became available." - Thanks for making my point for me... again :messenger_tears_of_joy:. As I have BEEN SAYING, 30fps will take precedence over 60fps on the current hardware at the end of the gen. When new hardware releases, you will then get your 60fps patches.

And yes, the point of the Matrix demo was to promote what's possible fidelity-wise using Unreal Engine 5. I'm not going to sit here and debate that fact with you. You go right ahead and think it has some other nonsensical meaning.

You're hilarious. "I care about the heart and soul of the game... the graphics!" You know who cares about playability? Well, as it turns out: most people, as 60 FPS games continue to dominate the play charts, and developers are including a 60 fps mode with every new release and updating old games to match. 60 FPS games play better, and due to the vast increase in motion clarity, they look much better too. Nobody cares what you want to "play". Most people want and expect 60 FPS modes in their games, so that's how it's going to be. Deal with it.
Great job minimizing "art style, atmosphere and visual fidelity, more destructibility, more intense warzones" to "graphics". If iNpUt LaG is more important to you than all of those things I mentioned, then we are not the same. You don't appreciate the medium. All you care about is 60fps. You have no future vision, no artistic integrity or credibility, no vision of what this medium could be. You probably think we've tapped out or peaked as a medium. Alllll you care about is 60fps and fucking input lag. Lol. So boring. So tasteless. There's more to gaming than your precious numbers. Give me a fucking break.
 
So that input lag was not tied to the framerate, given they could almost halve it while the game ran at 60fps before and after. Input is usually CPU-side. Some games read input multiple times for physics before a frame is rendered. These are framerate-capped games. The added time for the render is constant, though. There are triple-buffered V-sync games where, to avoid screen tearing from the monitor and game refresh rates being out of sync, the GPU renders to a separate buffer before swapping to the one shown. This again would be a constant 2x-3x multiple of the frame time. Never 7 frames. The 7 frames of input lag sounds more like polling, with the game not responding until it determines what action the player wants to commit. To me it sounds like they increased the polling rate independent of the framerate.
The Tekken 7 input lag stuff is a weird one. It was the first time they used UE4, and they actually ended up receiving help from Epic to reduce the input lag. The game also features at least 1 frame of artificial input delay to hide online latency, as well as to make offline feel similar to online. I think the game still has at least one frame of extra lag on PlayStation vs PC with V-sync on.
 
True, but the target audience is playing mostly in their living room on a 40"+ TV; the res needs to be as high as natively possible, otherwise you get some terrible smeary low-res output.
If you want proof of this, look at the 3D audio optimisation on PS5: the first port of call was TV speakers, the lowest common denominator.

Now if these consoles had something akin to DLSS, we'd be talking.

Yeah, I can't argue with that.

The thing is, we probably won't see a graphical leap as big as games like The Order, Ryse, Uncharted 4, RDR2, etc., because last gen devs were not pushing 4K and/or 60fps.
Unless we go back to 30fps...
 
I'd rather have reduced graphical settings and keep 60fps.

30 is simply not an option...and neither is 1080p.


Sony needs to get that 1440p update out and perhaps developers can find a sweet spot that doesn't force 4k, which puts an unrealistic strain on the hardware at 60fps, and doesn't mean having to accept 30fps.

I don't have a particularly good PC (i5 10400f, 2060, 16GB RAM) and I play at 1440p, yet I still wouldn't pick my PS5 to play any multiplatform title.

I use it purely for those Sony exclusives (ideally those which I don't envisage ever making it to pc - stuff like TLOU 2, Spiderman, GT7 etc...).

Granted, I don't have a really nice TV so maybe I'm not seeing the very best it has to offer.
 
Anyone going on about movies being 24fps should really look into why that is (besides saving money on film), and why a lot of people think high frame rate movies "look fake".
 
That's not the point I was making. My point was that the games I was mentioning looked significantly better than the games of the previous gen while still being 60fps. They didn't have to look anything like previous gen games to hit 60fps.
My point is this... games could look a lot better at 30fps compared to 60fps. And that's never going to not be the case. Doesn't matter if you have the power of 12 RTX 3090s, you'll still see higher quality visuals with a 33ms budget compared to a 16ms budget.

That's objective.
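The budget argument in concrete numbers (the per-effect costs below are invented purely to illustrate the trade-off, not measured from any game):

```python
# At 30 fps a frame gets ~33.3 ms of render time; at 60 fps only ~16.7 ms.
# Whatever the hardware, the 30 fps target always has twice the budget.

BUDGET_30 = 1000.0 / 30   # ~33.3 ms per frame
BUDGET_60 = 1000.0 / 60   # ~16.7 ms per frame

# Hypothetical per-frame costs (made-up numbers, for illustration only):
costs_ms = {
    "base geometry":    6.0,
    "lighting":         5.0,
    "shadows":          3.0,
    "post-processing":  2.0,
    "extra particles":  4.0,
    "high-res buffers": 8.0,
}

def affordable(budget_ms, tasks):
    """Greedily keep tasks, in order, until the frame budget runs out."""
    spent, kept = 0.0, []
    for name, ms in tasks.items():
        if spent + ms <= budget_ms:
            spent += ms
            kept.append(name)
    return kept

# At ~33 ms everything above fits (28 ms total); at ~16.7 ms the last
# items get cut, which is exactly the trade-off being described.
```

Swap in whatever costs you like; the point stands for any hardware, since doubling the frame time doubles what fits.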
 
Really hope 60fps options become the standard. I'm loving it, and like others said here before, I'll think hard before buying a game that doesn't have a 60fps mode. So far both PS5 and XSX have given me almost every game with a 60fps option; even Watch Dogs Legion got it eventually.
 
My point is this... games could look a lot better at 30fps compared to 60fps. And that's never going to not be the case. Doesn't matter if you have the power of 12 RTX 3090s, you'll still see higher quality visuals with a 33ms budget compared to a 16ms budget.

That's objective.
That is a fact.
 
Well, console users play on big TVs in their living rooms.

Resolution is needed.

The issues we are having this gen didn't happen in previous generations because most games chose the same framerate (in most cases 30fps).

The move to 60fps this gen is causing graphics downgrades.

Then, unless devs go back to 30fps, we won't see big jumps in graphics like we've seen before. It's also important to point out that most games so far have been cross-gen
 
Great job minimizing "art style, atmosphere and visual fidelity, more destructibility, more intense warzones" to "graphics", clown. If iNpUt LaG is more important to you than all of those things I mentioned, then we are not the same. You don't appreciate the medium. All you care about is 60fps. You have no future vision, no artistic integrity or credibility, no vision of what this medium could be. You probably think we've tapped out or peaked as a medium. Alllll you care about is 60fps and fucking input lag. Lol. So boring. So tasteless. There's more to gaming than your precious numbers. Give me a fucking break.
and you do so by sounding like a pretentious douche? lmfao
You're minimizing the gameplay aspect in an interactive medium and the visual clarity that comes with 60 FPS.
If you think you can't make games with artistic integrity at 60 FPS, then you're a moron. Quite simple.
The devs chose 30 FPS because of hardware limitations, not because of artistic integrity. If they could've made it 60 FPS, they would've done so.
Give ME a break.
 
My point is this... games could look a lot better at 30fps compared to 60fps. And that's never going to not be the case. Doesn't matter if you have the power of 12 RTX 3090s, you'll still see higher quality visuals with a 33ms budget compared to a 16ms budget.

That's objective.
I think there would be a lot less arguing if the 30fps people would start saying "higher quality assets" instead of visuals, clarity, fidelity, etc.
 
and you do so by sounding like a pretentious douche? lmfao
You're minimizing the gameplay aspect in an interactive medium and the visual clarity that comes with 60 FPS.
If you think you can't make games with artistic integrity at 60 FPS, then you're a moron. Quite simple.
The devs chose 30 FPS because of hardware limitations, not because of artistic integrity. If they could've made it 60 FPS, they would've done so.
Give ME a break.

yeah, i'm with you

it's not just about the numbers, it's about how it feels. And a 60fps game feels a lot better to play
 
I think there would be a lot less arguing if the 30fps people would start saying "higher quality assets" instead of visuals, clarity, fidelity, etc.
But that doesn't cover the difference.
Let's take the latest example… Horizon 2.

Visuals: 30fps is better than 60fps
Clarity: 30fps has way more clarity than 60fps (actually the 60fps mode is the opposite of clarity; it's a blurry hell)
Fidelity: 30fps has more graphical fidelity than 60fps
Etc.: I'd need to know what else to compare.
 
I think there would be a lot less arguing if the 30fps people would start saying "higher quality assets" instead of visuals, clarity, fidelity, etc.
It's not just asset quality though.

30fps leaves room for more resolution, effects, particles, physics, destruction.... basically anything that takes up processing power. At 60Hz, you have half the time to render all of that stuff, so certain things NEED to be cut in order to hit that target.

If a game releases at only 60fps, then you'll never know what was cut to achieve that, but I assure you, there were things that were cut or pared back to hit that framerate target.

It's not always a matter of "Well, the developers didn't try hard enough if they had to cut things to achieve 60fps" or that the game is "badly optimized".
 
and you do so by sounding like a pretentious douche? lmfao
You're minimizing the gameplay aspect in an interactive medium and the visual clarity that comes with 60 FPS.
If you think you can't make games with artistic integrity at 60 FPS, then you're a moron. Quite simple.
The devs chose 30 FPS because of hardware limitations, not because of artistic integrity. If they could've made it 60 FPS, they would've done so.
Give ME a break.
Lmao, how is that pretentious? Get out of your feelings, brother. I find caring more about numbers and input lag tasteless and boring. I'm not sitting here calling him tasteless and boring. It's not pretentious. Sue me.
And no, 30fps is chosen because fidelity takes precedence over framerate. Any dev can make any game run at 60fps. But they don't, because they don't want to sacrifice the fidelity. That is a fact. Stop being in such denial.

Exhibit A:

When Uncharted 4 was announced, Naughty Dog said it would run at 60FPS. It didn't; they stuck with 30, because they couldn't get the visual fidelity up to their standards at 60. What did they do? They went with higher fidelity over the higher framerate. Fidelity always comes first. Stop.
 
The Matrix tech demo was eye-opening for me. It was the best looking thing I've ever seen rendered on a console; it looked sublime, like stepping into a movie... until I actually started "playing it" and all the bad things about the last 2 gens came back like a brick to the head, excluding the awful load times.

Heavy motion blur to disguise the inconsistent low framerate: it's good to watch but felt awful to play. Consistent high framerates and fast loading are what I am loving about this gen; every game I play feels quick, snappy and responsive.

I have no clue why anyone would want to go back, no matter how much postFX you put on the games.
 
And no, 30fps is chosen because fidelity takes precedence over framerate. Any dev can make any game run at 60fps. But they don't, because they don't want to sacrifice the fidelity.
lol, there are plenty of examples that disprove that sentiment.
Most PC releases. Most racing games and sports titles. Most fast-paced games.

They went with higher fidelity over the higher framerate. Fidelity always comes first. Stop.
No, not any dev can make any game run at 60fps. Especially on PS1 and N64 you often had games running at sub-30 because the limited hardware didn't allow them to go much higher. There is a point where devs choose 30 FPS because of hardware limitations, most likely due to the scope of the game being too big to run at 60 FPS. Sometimes they choose things like uncapped framerates to make up for it. Look at MGS2 and MGS3. MGS2 runs at 60 FPS. MGS3 at 30 FPS. MGS3 at 60 FPS was just too much for the PS2 to handle...
But on the other hand you have games like Metroid Prime that ran at 60 FPS and looked better than most games of that generation. The same goes for Star Wars: Rogue Squadron II and plenty of other games from devs that made great graphics AT 60 FPS. They could do it, proving that you can make high-fidelity games at 60 FPS. Devs do not always ask themselves "either a smoother game or better graphics". There are a ton of factors going into that decision.

so yeah, stop it. Framerate was always important. And only someone who can't appreciate good gameplay would say otherwise.
 
As I see it, whether we continue to get 60fps modes as standard will largely depend on whether games pushing fidelity (like Horizon) continue to target 4k or close to it. If they do, then there's so much headroom to trade pixels for 60fps that there's really no reason not to offer such a mode.

So far this gen, 4k30 has been the default choice. That doesn't mean it'll always be, but as far as I know, in previous generations such patterns were also set at the start and then rarely deviated from. The dominant resolution target always seems to just be the native resolution of the prevailing TV standard.

Hence almost all games on PS360 targeted 720p or close to it, and almost all games last gen were 1080p or close to that. Perhaps sub native resolutions were less viable then at lower pixel counts and before good reconstruction methods, so things might change this generation. But it's still only a might.

The evidence so far is that 4kish at 30fps will become the new standard, and so probably the scope for 1440p60 or so will remain right through the generation. That goes double if RT features are widely adopted, because that's another thing that can easily be culled to get 60fps.

Also, I'd say that PlayStation actually having system level options for users to choose their preference for fidelity or performance is also good evidence of a general intention of developers to offer that mode. Do people seriously think such a feature was just offered in isolation? Sony will have been talking to developers about next gen for years, and will likely have learned from them that they intend to provide more options. It's not a coincidence that the option is there at the system level and then every game comes out and offers that option.

I wouldn't be too confident that 1440p30 will ever be that common this generation. It might depend ultimately on how scalable engines like Unreal 5 are to 4k and 60fps.
 
As I see it, whether we continue to get 60fps modes as standard will largely depend on whether games pushing fidelity (like Horizon) continue to target 4k or close to it. If they do, then there's so much headroom to trade pixels for 60fps that there's really no reason not to offer such a mode.

So far this gen, 4k30 has been the default choice. That doesn't mean it'll always be, but as far as I know, in previous generations such patterns were also set at the start and then rarely deviated from. The dominant resolution target always seems to just be the native resolution of the prevailing TV standard.

Hence almost all games on PS360 targeted 720p or close to it, and almost all games last gen were 1080p or close to that. Perhaps sub native resolutions were less viable then at lower pixel counts and before good reconstruction methods, so things might change this generation. But it's still only a might.

Most would be fine if they give you the option to choose between 30 FPS and 60 FPS. If I have the choice, I will always trade slightly better graphics for a better-performing, better-playing game (gameplay remains the most important aspect of a game), and I will always trade any amount of blur in the world for better visual clarity. It's ultimately a better experience.
 
Man, I thank God I never owned a really good gaming PC. The difference between 30fps and 60 is big, and I can't even imagine 120fps. If not for the current consoles I wouldn't have thought much about fps, but once you experience 60fps it's really hard to go back, even with the beauty 30fps currently allows.

30fps feels like I'm moving in slow motion now.

I would rather 60fps be the target this gen than ultra-sharp 4K res. I'm thankful most games give you the choice now.

AI upscaling can't come fast enough for consoles.
In my opinion there are major diminishing returns. 60 to 120 is not the same as 30 to 60. Maybe I just can't see it, but I've done a ton of testing with my machine. I'm lucky and privileged enough to have a great machine with a 3090, a 38-inch 144Hz monitor, etc.

I've tested tons of different games to see how they feel locked anywhere from 30 all the way to 144. If you are an average gamer I don't see the benefit of going over 60fps. It's pretty subtle from 60 to 144. It's more of a feel than something you "see" at 144, and totally not worth losing graphical fidelity over. If you are a competitive gamer, I get it. Every extra frame could give you an edge.

It also depends on the game. MS Flight Sim 2020 feels fine anywhere between 30 and 50fps. I'm not sure why people have such a hard-on for 60fps. I would take 45-50 over 60 to get better-quality visuals. Maybe I'm just old and remember playing a ton of 3D games at like 15-20fps :messenger_tears_of_joy:

I think game makers should be targeting above 30fps, no doubt. But the middle ground between 30 and 60 is pretty good. The "fuck you, 60fps or nothing" stance isn't very pragmatic. Everything should be a balance based on the situation, game type, etc.
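The diminishing-returns point has a simple frame-time explanation: each jump up the framerate ladder saves fewer milliseconds per frame than the last. A quick illustration (plain arithmetic, nothing game-specific assumed):

```python
# Milliseconds saved per frame at each framerate jump. Going 30->60 shaves
# twice as many ms off each frame as 60->120, which is one plausible reason
# the first jump is so much easier to perceive.

steps = [(30, 60), (60, 120), (120, 144)]
for low, high in steps:
    saved_ms = 1000 / low - 1000 / high
    print(f"{low:>3}->{high}fps: {saved_ms:5.2f}ms saved per frame")
```

Running this shows roughly 16.7ms saved going 30→60, 8.3ms going 60→120, and only about 1.4ms going 120→144.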
 
My point is this... games could look a lot better at 30fps compared to 60fps. And that's never not going to be the case. It doesn't matter if you have the power of 12 RTX 3090s, you'll still see higher-quality visuals with a 33ms frame budget compared to a 16ms one.

That's objective.
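The 33ms and 16ms figures are just the per-frame render budgets implied by the target framerate; whatever the hardware, a 30fps target always gives the renderer twice as long per frame. A minimal sketch:

```python
# Per-frame render budget: the whole second (1000ms) divided by target fps.

def frame_budget_ms(fps: float) -> float:
    return 1000 / fps

budget_30 = frame_budget_ms(30)  # ~33.3ms
budget_60 = frame_budget_ms(60)  # ~16.7ms

print(f"30fps budget: {budget_30:.1f}ms per frame")
print(f"60fps budget: {budget_60:.1f}ms per frame")

# The ratio is exactly 2, independent of GPU power -- which is why "12 RTX
# 3090s" doesn't change the argument: halving frame time always costs
# something, however fast the hardware.
assert budget_30 == 2 * budget_60
```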

How good something looks isn't objectively measurable.

The best you can claim is that a frame rendered in 33ms is going to usually look better than a frame rendered in 16ms on the same hardware AS LONG AS THE SCREEN IS STATIC. That's probably valid, but it's not an unqualified statement of fact.
 
Most would be fine if they give you the option to choose between 30 FPS and 60 FPS. If I have the choice, I will always trade slightly better graphics for a better-performing, better-playing game (gameplay remains the most important aspect of a game), and I will always trade any amount of blur in the world for better visual clarity. It's ultimately a better experience.
See the thing is... a blurry image at 60fps looks like a blurry image. A sharp image at 30fps looks much better to me and I don't seem to have a problem with input lag or anything.
 
See the thing is... a blurry image at 60fps looks like a blurry image. A sharp image at 30fps looks much better to me and I don't seem to have a problem with input lag or anything.
It really depends on the type of game. I wouldn't want to play F-Zero GX at 30 fps.
 
How good something looks isn't objectively measurable.

The best you can claim is that a frame rendered in 33ms is going to usually look better than a frame rendered in 16ms on the same hardware AS LONG AS THE SCREEN IS STATIC. That's probably valid, but it's not an unqualified statement of fact.
Not sure why the screen needs to be static. I've played many games at 30fps and they looked jaw dropping and guess what... the picture on the screen was moving!

Why even watch a Blu-ray instead of a DVD if the picture turns to a blur as soon as there's any movement? It's because the Blu-ray looks better and sharper. It's exactly the same with a game. If you're going to complain that a game only looks good when you're not moving, then a movie only looks good when the camera is perfectly still.
 
60fps isn't killing graphical fidelity; 4K resolution is the problem. You aren't going to get a game that looks like the Matrix demo, a stable 30fps, and native 4K all at once. These consoles are only $500, so stop expecting $1500 PC quality.
 
Not sure why the screen needs to be static. I've played many games at 30fps and they looked jaw dropping and guess what... the picture on the screen was moving!

Why even watch a Blu-ray instead of a DVD if the picture turns to a blur as soon as there's any movement? It's because the Blu-ray looks better and sharper. It's exactly the same with a game. If you're going to complain that a game only looks good when you're not moving, then a movie only looks good when the camera is perfectly still.

It needs to be static because otherwise the statement contradicts my direct experience.

In every game with Performance mode I've played so far, the 60fps has looked better than the resolution mode because 60fps displays moving visuals with DRAMATICALLY increased fidelity. That might not be true for you but it's absolutely unquestionably true for me.

Therefore it's simply not objectively the case that 30fps always looks better than 60fps, all things being equal. It's a fundamental confusion about the difference between objectivity and subjectivity.
 
It needs to be static because otherwise the statement contradicts my direct experience.

In every game with Performance mode I've played so far, the 60fps has looked better than the resolution mode because 60fps displays moving visuals with DRAMATICALLY increased fidelity. That might not be true for you but it's absolutely unquestionably true for me.

Therefore it's simply not objectively the case that 30fps always looks better than 60fps, all things being equal. It's a fundamental confusion about the difference between objectivity and subjectivity.
I will agree with you that 60fps increases clarity and makes the low resolution image look even blurrier as a result.

I'm sorry, but 1080p does not look good on a 4K screen, and seeing Dying Light 2 run at 1080p/60 makes that all the more obvious.
 
My point is this... games could look a lot better at 30fps compared to 60fps. And that's never not going to be the case. It doesn't matter if you have the power of 12 RTX 3090s, you'll still see higher-quality visuals with a 33ms frame budget compared to a 16ms one.

That's objective.
Yes, I know that. You can always say any 60fps game could've looked more impressive at 30fps, but what I was saying is that in previous generations the newer consoles could still make a game look next-gen while running at 60fps. Even on the PS2 and PS3, which were supposedly really hard to develop for. If they don't manage that this gen, I have to wonder why. Is it resolution, RT, weak hardware, or just diminishing returns?
 