Is 60 FPS killing overall graphical fidelity?

Isn't that a TV upscaling issue rather than a general one? RTINGS rates the upscaling of high-end LG/Samsung/Sony TVs in the 9s and 10s.
The TV doesn't do any of the upscaling as the PS5 is always set to 4K regardless of the internal resolution of the game.

In other words, you would need to manually set your PS5 to 1080p in order for your TV to upscale the image, and why would anyone set their PS5 to output at 1080p at all times?
 
The TV doesn't do any of the upscaling as the PS5 is always set to 4K regardless of the internal resolution of the game.

In other words, you would need to manually set your PS5 to 1080p in order for your TV to upscale the image, and why would anyone set their PS5 to output at 1080p at all times?
Someone was talking about a 1080p game (DL2) not looking good on a 4K TV. I'm assuming 1080p shouldn't be difficult to upscale well for a console either, since 1080p fits exactly 4x into 2160p.

I'm sure people can watch 1080p Blu-rays fine on their 4K TVs, and I don't see anyone saying those should be watched on a 1080p TV.
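As a rough illustration of the "fits exactly" point (a back-of-the-envelope Python sketch, not a claim about how any console or TV scaler actually works; real scalers use more sophisticated filtering):

```python
# 1080p -> 2160p is an exact 2x scale on each axis, i.e. 4x the pixels.
src_w, src_h = 1920, 1080
dst_w, dst_h = 3840, 2160

print(dst_w / src_w, dst_h / src_h)        # 2.0 2.0 -> exact integer factor per axis
print((dst_w * dst_h) / (src_w * src_h))   # 4.0 -> four times the pixels

# With an exact 2x factor, even the simplest (nearest-neighbour) upscale is clean:
# every source pixel simply becomes a 2x2 block of identical pixels.
def upscale_row_2x(row):
    """Duplicate each value in a row; doing the same per row gives a full 2x upscale."""
    return [p for p in row for _ in range(2)]

print(upscale_row_2x([10, 20, 30]))  # [10, 10, 20, 20, 30, 30]
```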
 
Someone was talking about a 1080p game (DL2) not looking good on a 4K TV. I'm assuming 1080p shouldn't be difficult to upscale well for a console either, since 1080p fits exactly 4x into 2160p.

I'm sure people can watch 1080p Blu-rays fine on their 4K TVs, and I don't see anyone saying those should be watched on a 1080p TV.
You said "Isn't that a TV upscaling issue rather than a general one? RTINGS rates the upscaling of high-end LG/Samsung/Sony TVs in the 9s and 10s."

Unless you specifically set your PS5 to output at 1080p, the TV will not do any upscaling whatsoever, as it will think the picture is already native 4K (i.e., the TV can't read the internal resolution of a given video game; it can only see what the console is telling it, and in this case the console is telling the TV that it's a native 4K image).
 
You said "Isn't that a TV upscaling issue rather than a general one? RTINGS rates the upscaling of high-end LG/Samsung/Sony TVs in the 9s and 10s."

Unless you specifically set your PS5 to output at 1080p, the TV will not do any upscaling whatsoever, as it will think the picture is already native 4K (i.e., the TV can't read the internal resolution of a given video game; it can only see what the console is telling it, and in this case the console is telling the TV that it's a native 4K image).
I was challenging the notion that 1080p (regardless of whether it's upscaled by the console or the TV) won't look good on a 4K TV. It would be interesting if someone tested how well consoles upscale content, but I'm working under the assumption that they upscale well enough.
 
I will agree with you that 60fps increases clarity and, as a result, makes a low-resolution image look even blurrier.

I'm sorry, but 1080p does not look good on a 4K screen, and seeing Dying Light 2 run at 1080p/60 makes that even more obvious.

1080p is another matter. Personally I'm not too keen on that trade-off.

But 1440p or so doesn't look blurry; it looks sharp at rest and much, much clearer than 4K in motion.
 
That is a good example.
Tree and self-shadowing could be on another level in GT7 if it were 30fps.
Even RT would not have been out of the question for gameplay.

But the OP is talking about how this recent trend of making 60fps games is killing fidelity, I think. GT7 falls short in some areas because Polyphony pushed too hard in others, or the game is still unoptimized, not because Polyphony suddenly decided to go with 60fps.

It's strange that console gamers are starting to complain about fidelity, though; it has always been an issue there, even at 30fps.
 
In a couple of years, new games will struggle to maintain 30fps and not even come close to 4K. Just like last generation with 1080p. This whole "fidelity first" argument will look pretty stupid by then.
 
One more thing. The question, "Is 60 FPS killing overall graphical fidelity?" is a profoundly stupid question. It is biased, too broad, and implies that graphical fidelity is the end-all, be-all of gaming. It's not.

Is 60 FPS killing overall graphical fidelity? On consoles, probably.

Is 60 FPS killing gaming as a medium? No, it's improving it.
 
30 will become the norm again when games begin to push hardware more. It's so obvious. Can't help you if you can't see that.
More fact-free bullshit. Next.

"At no point in gaming post-16-bit era did nearly every single console release come with a 60 FPS mode or higher" - Exactly. And they could have. But they didn't. Because 30fps has always taken priority over 60fps. Because developers have historically chosen fidelity over framerate. Thanks for making my point for me.
I'm not making your point for you. I've already destroyed your point several times. They didn't release as many 60 FPS games because the hardware wasn't there to support it. The fact that we are already getting every single release, both cross-gen AND current-gen exclusive, with a 60 FPS mode shows that the historical precedent doesn't apply here. Did previous gens launch with 60 FPS modes included in every single release for the first 2 years? No? Then you can't use historical precedent as an example, because it's obvious that devs are already breaking with that.

"You seem to think that past gens launched with 60 FPS games across the board and then scaled back to 30 FPS" - What in God's hell are you talking about? My whole argument is that historically, devs have chosen fidelity over framerate, which is why 30fps has always been so popular. You keep making my point for me. Appreciate it.
And time will tell. If you honestly believe games at the end of the gen will all run at 60fps and not 30 then you're in for a rude awakening. What's been happening as of late is that games get patches for 60fps updates when new hardware releases. That trend will continue. When new hardware releases, you'll get your 60fps patches then.
Why do you keep repeating the same nonsense? Historically, devs have chosen 30 FPS because of hardware limitations. That's why most games ran at 30 FPS. The fact that every single game already releases with a 60 FPS mode, that we have platform holders confirming framerate to be a priority, and that the most popular games are mostly 60 FPS games is undeniable proof that historical precedent doesn't apply here. Saying that devs will force games to run at 30 FPS based on historical precedent is just as dumb as saying devs will force games to run at 720/1080p because of historical precedent. Also, it's a pretty well-known fact that the only reason 60 FPS wasn't the norm last gen was because both systems were CPU-bound, which caused a bottleneck. As confirmed here by Phil Spencer:
"I think we've reached a point with Xbox One X in the generation where games look amazing, and there's always work we can do to look more amazing. But I want games to feel as amazing as they look. We don't have that in today's generation, mainly because the CPU is underpowered relative to the GPU that's in the box in order to reach a feel and frame rate and kind of consistency or variable refresh rate and other things that we want."
Something also backed up by Bungie when speaking about why Destiny 2 doesn't run at 60 FPS on PS4 Pro:
"I mean, I'm going to wade into this, and you [Mark Noseworthy] can flesh it out," Smith said. "The console, the PS4 Pro is super powerful, but it couldn't run our game at 60. Our game's this rich physics simulation where collision of players, networking, etc, and like, it wouldn't run... [there's] not enough horsepower there. It's on the CPU side."

Okay so back to your crappy arguments:
"They were being patched almost as soon as new hardware became available." - Thanks for making my point for me .. again :messenger_tears_of_joy:. As I have BEEN SAYING, 30fps will take precedence over 60fps on this current hardware at the end of the gen. When new hardware releases, you will then get your 60fps patches.
Only older games that were released before the new hardware were patched. Nearly every single game that was released after the launch of the next-gen consoles, including next-gen-only ones, released with 60 FPS modes present. We're already in the 2nd year of the console and this is still the case. "B-b-but that's because they are early games": at NO POINT IN HISTORY have games released in the first 2 years of a console's lifecycle run at a different framerate than at the remainder of the generation. Never. The closest thing you can possibly get to is the PS2/GC/Xbox era, where there was more variety in framerates. Every single other generation, whether it's PS1/N64, PS3/Xbox 360 or PS4/Xbone, had the same most common/standard framerates at the start of the gen as at the end of it.
And yes, the point of the Matrix Demo was to promote what's possible fidelity-wise using Unreal Engine 5. I'm not going to sit here and debate that fact with you. You go right ahead and think it has some other nonsensical meaning.
The Matrix TECH DEMO is literally just that. It exists exactly to promote what fidelity is possible (as well as to promote The Matrix, obviously), and that's why it makes sense for it to run at 30 FPS, since there's no game there. It's the exact same reason why so many games are shown running in fidelity mode in promotional material such as trailers/gameplay footage despite every single one of them offering a 60 FPS mode. This is a dumb argument and I suggest we move on from it.

Great job minimizing "art style, atmosphere and visual fidelity, more destructibility, more intense warzones" to "graphics". If iNpUt LaG is more important to you than all of those things I mentioned then we are not the same. You don't appreciate the medium. All you care about is 60fps. You have no future vision, no artistic integrity or credibility, no vision as to what this medium could be. You probably think we've tapped out or that the medium has peaked. Alllll you care about is 60fps and fucking input lag. Lol. So boring. So tasteless. There's more to gaming than your precious numbers. Give me a fucking break.

Yes, fluidity in playability and controller response is far more important than slightly higher-resolution textures or a few extra lighting effects. I also really enjoy not having the entire game turn into a blurry mess the moment I turn the camera. R&C: Rift Apart looks fantastic. Spider-Man: Miles Morales looks incredible and Horizon: Forbidden West also looks terrific on a PS5, and thanks to 60 FPS they also play great. If you really want to play at higher fidelity, well, they give you the option to do so. THIS is the actual balance that developers should go for and, fortunately, they are.
 
30 will become the norm again when games begin to push hardware more. It's so obvious. Can't help you if you can't see that.

no it won't. wanna know why... well... 2 words and a letter: Xbox Series S

developers will need to make Series S versions. that means their games are scalable enough to run on a 12TF console and on a 4TF console with a slightly weaker CPU and less RAM.

what does that mean? that means even if the Series S version will not get a 60fps mode, the version the Series S is running is basically automatically ensuring that they have a graphics setting ready to go that can easily run at 60fps on Series X :)

so no... 30fps will not become the norm again. almost every single game on the market is 60fps currently and people will more and more expect it to be there, and there is literally no reason for it not to be there.

and no, games will not max out the CPU, at least not if they are competently made. a game only being able to run at 30fps on a 3.6GHz Zen 2 CPU would be a fucking joke if it doesn't push the most cutting-edge physics or have like thousands of complex NPC AI instances running simultaneously

there will be ZERO cases where bandwidth will be an issue because the Series S has way less than the Series X... there will be ZERO cases where the shader cores will be an issue because the Series X has tons more than the S and there will be ZERO cases where the raytracing will be an issue... because, you guessed it... the Series S is still there and has less Ray Acceleration hardware than the X
 
4K is the only thing that is killing graphical fidelity. Seriously, who even asked for 4K? Make all AAA games run with ultra graphics in 1080p (or slightly better if it doesn't harm performance) and 60 fps.
 
I think there is a disconnect between people who have always gamed on console and who really aren't particularly bothered by 30fps, and those who have experienced higher frame rates on pc or feel that the focus on 60fps modes is the way forward.

Personally, I feel like the 60fps mode on most PS5 games actually looks better even when it's obviously running at reduced resolution and settings.
The smoothness adds something that cannot be beaten by any ray tracing or improvements in lighting etc...

Take Miles Morales, and the 30 and 60fps offerings.

The 30fps mode feels horrendous, like a slideshow, and no amount of graphical improvements can overcome that.

Switch to the 60fps mode and it's like a different game, and it actually looks better.

You've only got to look at something like TLOU 2 and see the 60fps upgrade provides a far superior experience regardless of any purported reduction in settings or resolution.
The new Cyberpunk update offers 30fps or 60fps, and despite the 30 offering having the superior settings it feels awful...and the switch to 60 immediately remedies that.

All of this is, of course, my own opinion.

It's all subjective in the end.

As I said in a previous post, my personal preference is for Sony to hurry up and enable 1440p so developers can offer it as a legitimate option.

That way we can have 1440p60/120 and 4K30 for those who want it.

Having a PC that I play on a 1440p monitor, I firmly believe it is the sweet spot and offers the best of both worlds (i.e. a significant improvement over 1080p and the ability to offer 60fps).

Choice is the buzzword, and as long as the option is there for both parties, via the two game modes, I'm happy with that.
 
The games that benefit from 60fps are fast-paced ones, so lower graphical fidelity isn't really as noticeable.

Usually, if you notice the worse graphics, then you either don't need to be playing at 60fps or you're playing it wrong.
 
The solution to this debacle is simple, and devs already came up with it: options/modes. Hell, many games even get 3 different modes, not 2. So as long as a person was smart and didn't cheap out by buying an XSS (on which games already sometimes don't get 60fps modes and overall look visibly worse to anyone who is even a bit tech-savvy), they can enjoy the 60fps mode (which looks worse than the 30fps/fidelity mode, but feels better to play). So, at least for now, we never actually have to wonder whether, and by how much, a particular game would look better at 30fps. :)
 
I'm not making your point for you. I've already destroyed your point several times. They didn't release as many 60 FPS games because the hardware wasn't there to support it. The fact that we are already getting every single release, both cross-gen AND current-gen exclusive, with a 60 FPS mode shows that the historical precedent doesn't apply here. Did previous gens launch with 60 FPS modes included in every single release for the first 2 years? No? Then you can't use historical precedent as an example, because it's obvious that devs are already breaking with that.
Okay, I'm going to have to disagree with you there. There's no such thing as hardware not supporting 60fps.

However, no matter how fast (or slow) the hardware is, there will ALWAYS be cutbacks to get 60fps, but to say the hardware simply is incapable of 60fps doesn't make any sense. There have been 60fps games ever since I can remember.
 
Okay, I'm going to have to disagree with you there. There's no such thing as hardware not supporting 60fps.

However, no matter how fast (or slow) the hardware is, there will ALWAYS be cutbacks to get 60fps, but to say the hardware simply is incapable of 60fps doesn't make any sense. There have been 60fps games ever since I can remember.

Yeah, but the GPU was bottlenecked by the CPU. So it's actually pretty accurate. The Pro consoles were way more than twice as powerful as the base ones, but basically no games on them had 60fps modes, and that's because of the CPU. God of War tried it but only got to like 40-50fps.

This time is different. In general, 60fps is achievable just by lightening the load on the GPU.
 
60FPS should be the standard. Most games look good enough.

The obsession for 4K should be ignored - 1440p60 is absolutely fine for my tastes.
 
no it won't. wanna know why... well... 2 words and a letter: Xbox Series S
I'm not sure about your point... games on Series S are not being released with 60fps modes... at least not in the examples released lately.
They are being limited to 30fps.
 
I think there is a disconnect between people who have always gamed on console and who really aren't particularly bothered by 30fps, and those who have experienced higher frame rates on pc or feel that the focus on 60fps modes is the way forward.
I think you are close to the truth.

Console-only players are not bothered by 30fps because it is really fine to play games at that framerate.
But PC or hybrid users have a strong hatred for 30fps... they seem to take it to heart.

60fps is better? Of course.
30fps is unplayable? Bullshit excuse... it is perfectly playable.
Is the trade-off for 60fps worth it? Well, in my opinion, no.

Of course some games are focused on 60fps gameplay, like racing, fighting, maybe FPS, etc... even so, these genres have already proven they can be played at 30fps too... well, DriveClub and Destiny quickly come to mind... loved the gameplay in both.

I think console hardware is being pushed too far ahead of its capabilities.
For me, 120fps is out of the question... a no-go... a waste of dev resources, and the end result is ridiculously bad.
60fps can be given as an option, as long as PvP has one fixed mode for everybody (everybody at 60fps or everybody at 30fps).
30fps is where console devs need to focus for games to look next-gen.

That is what I think... so yes, 60fps does kill graphical fidelity on console hardware.
Horizon just released and proves that point again.

After all, console hardware is mid-level (well, the Series S is at the bottom) and compromises need to be made.
If you want a 60fps or even 120fps gaming experience, you have better options for that... on PC you can get it without the heavy compromises and trade-offs you need on consoles.
 
Yeah, but the GPU was bottlenecked by the CPU. So it's actually pretty accurate. The Pro consoles were way more than twice as powerful as the base ones, but basically no games on them had 60fps modes, and that's because of the CPU. God of War tried it but only got to like 40-50fps.

This time is different. In general, 60fps is achievable just by lightening the load on the GPU.
But to say 60fps isn't possible just isn't true. 60fps is possible, but there would be cutbacks. That's my point. You can basically make any game run like crap on good hardware and you can make any game run at 60fps on any hardware... what that game LOOKS LIKE is a whole other story.
 
Some guys were discussing whether gamers prefer graphics or framerate.
At least in Europe, it seems graphics are the bigger winner, even on PC.

"Better graphics" are widely agreed to be important in the European markets, with 68% of all surveyed gamers considering it important. Among console gamers only, this figure moves up to 78%.

"Shorter load times" was the only other feature close to "Better graphics" with 63% All / 71% consoles.

The survey asked what people want most from the next generation.
 
Some guys were discussing whether gamers prefer graphics or framerate.
At least in Europe, it seems graphics are the bigger winner, even on PC.

"Better graphics" are widely agreed to be important in the European markets, with 68% of all surveyed gamers considering it important. Among console gamers only, this figure moves up to 78%.

"Shorter load times" was the only other feature close to "Better graphics" with 63% All / 71% consoles.

The survey asked what people want most from the next generation.

When you've got 49% of respondents replying that 8k support is "important" you've clearly got a measurement/communication problem lol.

The only fair way to really test whether people prefer performance modes or fidelity modes is to find a bunch of very casual gamers and literally make them play at 4k30 for 5 minutes or so and then switch to 1440p60 for a few minutes, and just ask them which is better?

Otherwise all kinds of biases and preconceptions will interfere with their decision.
 
When you've got 49% of respondents replying that 8k support is "important" you've clearly got a measurement/communication problem lol.

The only fair way to really test whether people prefer performance modes or fidelity modes is to find a bunch of very casual gamers and literally make them play at 4k30 for 5 minutes or so and then switch to 1440p60 for a few minutes, and just ask them which is better?

Otherwise all kinds of biases and preconceptions will interfere with their decision.
That, I believe, is still biased.
You need to show 30fps and 60fps with big time intervals between them, imo.

The switch is what makes 30fps look weird.
When you play it directly, without coming from a 60fps mode, you will find it normal.

Let's show 30fps one day and 60fps the other day... or one in the morning and the other in the afternoon.
 
I think you are close to the truth.

Console-only players are not bothered by 30fps because it is really fine to play games at that framerate.
But PC or hybrid users have a strong hatred for 30fps... they seem to take it to heart.

60fps is better? Of course.
30fps is unplayable? Bullshit excuse... it is perfectly playable.
Is the trade-off for 60fps worth it? Well, in my opinion, no.

Of course some games are focused on 60fps gameplay, like racing, fighting, maybe FPS, etc... even so, these genres have already proven they can be played at 30fps too... well, DriveClub and Destiny quickly come to mind... loved the gameplay in both.

I think console hardware is being pushed too far ahead of its capabilities.
For me, 120fps is out of the question... a no-go... a waste of dev resources, and the end result is ridiculously bad.
60fps can be given as an option, as long as PvP has one fixed mode for everybody (everybody at 60fps or everybody at 30fps).
30fps is where console devs need to focus for games to look next-gen.

That is what I think... so yes, 60fps does kill graphical fidelity on console hardware.
Horizon just released and proves that point again.

After all, console hardware is mid-level (well, the Series S is at the bottom) and compromises need to be made.
If you want a 60fps or even 120fps gaming experience, you have better options for that... on PC you can get it without the heavy compromises and trade-offs you need on consoles.
I do agree, and I think perhaps I exaggerated when I said 30 was unplayable.
It's also very much dependent on your personal experience.

I only got my computer last year, so before that I was exclusively gaming on ps4 (pro).
I played TLOU 2 and never felt like it was ruined by 30fps because I knew no different.

It's only since replaying it at 60, and being somewhat spoilt by regular 100fps experiences on pc, that I've felt so strongly about the drawback of 30fps gaming.

That said, I do think there is a distinction to be made between games that hold a rock-solid 30 and those which see fluctuations.
The unreliable fps can often add to the unpleasant nature of the experience... that, and the very restrictive field of vision (FOV) that is often employed in console games.

Playing Warzone, as an example, at 75 fov is, for me, disappointing.

By contrast, 105 fov on pc at 130fps really does feel like you are playing a different game.

But, if you only ever play on console you really won't be any the wiser.

It is entirely down to the individual's expectations and personal experience regarding what they find acceptable, don't you think?
 
I do agree, and I think perhaps I exaggerated when I said 30 was unplayable.
It's also very much dependent on your personal experience.

I only got my computer last year, so before that I was exclusively gaming on ps4 (pro).
I played TLOU 2 and never felt like it was ruined by 30fps because I knew no different.

It's only since replaying it at 60, and being somewhat spoilt by regular 100fps experiences on pc, that I've felt so strongly about the drawback of 30fps gaming.

That said, I do think there is a distinction to be made between games that hold a rock-solid 30 and those which see fluctuations.
The unreliable fps can often add to the unpleasant nature of the experience... that, and the very restrictive field of vision (FOV) that is often employed in console games.

Playing Warzone, as an example, at 75 fov is, for me, disappointing.

By contrast, 105 fov on pc at 130fps really does feel like you are playing a different game.

But, if you only ever play on console you really won't be any the wiser.

It is entirely down to the individual's expectations and personal experience regarding what they find acceptable, don't you think?
Well said.

Things I don't like about framerate...
Framerate fluctuations... no matter if it's 30fps or 60fps, it should be solid... fluctuations feel bad in both.
Changes in framerate during gameplay... for example, playing at 60fps and cutscenes dropping to 30fps... it is terrible... either everything is 60fps or everything is 30fps (I'm looking at you, Square Enix)... the constant shift between framerates is a really bad experience.

Most FPS games are 60fps, but I have yet to play an FPS game on consoles that has a better feel of gameplay response than Destiny at 30fps... to put things in perspective.
 
That, I believe, is still biased.
You need to show 30fps and 60fps with big time intervals between them, imo.

The switch is what makes 30fps look weird.
When you play it directly, without coming from a 60fps mode, you will find it normal.

Let's show 30fps one day and 60fps the other day... or one in the morning and the other in the afternoon.

Well it has to be a comparison, doesn't it? The two things need to be completely fresh in people's minds.

Plus, I said 30fps first. So it's not going to look weird compared to anything else.

My guess is that most people wouldn't even notice the drop in resolution, in most cases. So then it's a question of whether they notice the extra fluidity.
 
Well said.

Things I don't like about framerate...
Framerate fluctuations... no matter if it's 30fps or 60fps, it should be solid... fluctuations feel bad in both.
Changes in framerate during gameplay... for example, playing at 60fps and cutscenes dropping to 30fps... it is terrible... either everything is 60fps or everything is 30fps (I'm looking at you, Square Enix)... the constant shift between framerates is a really bad experience.

Most FPS games are 60fps, but I have yet to play an FPS game on consoles that has a better feel of gameplay response than Destiny at 30fps... to put things in perspective.
Definitely.

I think you have summed it up perfectly.
 
1080p to 4K needs 4x the performance.
30 to 60fps only needs double the performance.
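Roughly where those multipliers come from, counting only pixels shaded per second (a simplified Python sketch; GPU cost doesn't scale perfectly with pixel count, and CPU/geometry work per frame doesn't scale with resolution at all):

```python
# Pixels shaded per second as a crude proxy for GPU load.
def pixels_per_second(width, height, fps):
    return width * height * fps

base       = pixels_per_second(1920, 1080, 30)  # 1080p @ 30
double_fps = pixels_per_second(1920, 1080, 60)  # 1080p @ 60
native_4k  = pixels_per_second(3840, 2160, 30)  # 2160p @ 30

print(double_fps / base)  # 2.0 -> doubling fps doubles pixel throughput
print(native_4k / base)   # 4.0 -> going from 1080p to 4K quadruples it
```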
 
Frame rate every single time.

There is no game that exists (or will exist) where I will willingly move from 60fps to 30fps just to get better graphics whilst playing.

I will switch for a quick look to see what I will be 'missing', but I'll then simply shrug my shoulders and move back to 60fps without a second thought.

However, I am all for sacrificing resolution to achieve better graphics, as long as 60fps is maintained.
 
What's funny about this thread is that when the PS5 was coming out we had people crying about how 60FPS should be standard, and now we get threads like this.
 
If someone seriously says it is, they haven't actually looked at Cyberpunk 2077 on Series X at 60fps with their HDR settings correctly set. The game looks unbelievable.
 
I'd say the focus on graphical fidelity is killing 60fps experiences
The whole point of a new generation is prettier visuals. Right now it's as if we just bought a new graphics card to play the same games as before, but faster.

I'd like to use all that new power for better visual experiences. Stuff that is truly breath-taking.

Horizon Forbidden West kinda nails it in Fidelity mode. It's perfect.
 
If someone seriously says it is, they haven't actually looked at Cyberpunk 2077 on Series X at 60fps with their HDR settings correctly set. The game looks unbelievable.

Agreed, it's just a very tinker-heavy game to get it to look right. Console players call that broken, but tweaking visual settings is something PC gamers have been used to for years. I'd rather have that level of control than be stuck with default settings. It truly does look bad with the wrong HDR settings in the game + uncalibrated Xbox video settings and the wrong uncalibrated factory settings on your TV, so it is kind of a lot to ask of an average Joe who is used to most games just looking fine out of the box.

For me, though, it's less about graphics and more about lackluster physics. I still hate the fact that physics are so horrific at the expense of graphical fidelity... I want to be able to cut through any object with a lightsaber, not just fucking pots and scripted moments with vines. That's still better than invincible crates and boxes in Call of Duty.
 
It's about gameplay. Even if 30fps gives you a net positive on graphics, I'd rather have dialed-down graphics at higher framerates.
 
More fact-free bullshit. Next.


I'm not making your point for you. I've already destroyed your point several times. They didn't release as many 60 FPS games because the hardware wasn't there to support it. The fact that we are already getting every single release, both cross-gen AND current-gen exclusive, with a 60 FPS mode shows that the historical precedent doesn't apply here. Did previous gens launch with 60 FPS modes included in every single release for the first 2 years? No? Then you can't use historical precedent as an example, because it's obvious that devs are already breaking with that.
Why do you keep repeating the same nonsense? Historically, devs have chosen 30 FPS because of hardware limitations. That's why most games ran at 30 FPS. The fact that every single game already releases with a 60 FPS mode, that we have platform holders confirming framerate to be a priority, and that the most popular games are mostly 60 FPS games is undeniable proof that historical precedent doesn't apply here. Saying that devs will force games to run at 30 FPS based on historical precedent is just as dumb as saying devs will force games to run at 720/1080p because of historical precedent. Also, it's a pretty well-known fact that the only reason 60 FPS wasn't the norm last gen was because both systems were CPU-bound, which caused a bottleneck. As confirmed here by Phil Spencer:

Something also backed up by Bungie when speaking about why Destiny 2 doesn't run at 60 FPS on PS4 Pro:


Okay so back to your crappy arguments:

Only older games that were released before the new hardware were patched. Nearly every single game that was released after the launch of the next-gen consoles, including next-gen-only ones, released with 60 FPS modes present. We're already in the 2nd year of the console and this is still the case. "B-b-but that's because they are early games": at NO POINT IN HISTORY have games released in the first 2 years of a console's lifecycle run at a different framerate than at the remainder of the generation. Never. The closest thing you can possibly get to is the PS2/GC/Xbox era, where there was more variety in framerates. Every single other generation, whether it's PS1/N64, PS3/Xbox 360 or PS4/Xbone, had the same most common/standard framerates at the start of the gen as at the end of it.

The Matrix TECH DEMO is literally just that. It exists exactly to promote what fidelity is possible (as well as to promote The Matrix, obviously), and that's why it makes sense for it to run at 30 FPS, since there's no game there. It's the exact same reason why so many games are shown running in fidelity mode in promotional material such as trailers/gameplay footage despite every single one of them offering a 60 FPS mode. This is a dumb argument and I suggest we move on from it.



Yes, fluidity in playability and controller response is far more important than slightly higher-resolution textures or a few extra lighting effects. I also really enjoy not having the entire game turn into a blurry mess the moment I turn the camera. R&C: Rift Apart looks fantastic. Spider-Man: Miles Morales looks incredible and Horizon: Forbidden West also looks terrific on a PS5, and thanks to 60 FPS they also play great. If you really want to play at higher fidelity, well, they give you the option to do so. THIS is the actual balance that developers should go for and, fortunately, they are.
Not gonna waste too much time typing up a response to all this bullshit.

Just know this:

- 30FPS has been popular forever because it's the perfect balance between great graphics and smooth gameplay.
- All developers CAN make their games 60FPS, but they don't, because it would require too much of a sacrifice on everything else - hence they stick to 30. Because, as I said, it's a better-balanced framerate. Thanks again for proving my point. Again. The Bungie quote is basically proof. Genius.
- The UE5 tech demo was used to show off the fidelity that's possible in the near future. If developers want to reach that level of fidelity, it will have to be at 30fps. It will happen.
- You keep ignoring the part where I said I want more destructible environments, more enemies on screen, more effects, more ray tracing, more interactivity with environments, larger battlefields. All of that is fidelity. And it is less likely to happen at 60fps. If you value iNpUt LaG over all of that then more power to you. We're different. I'm glad I don't look at games the way you do. Ew.

Oh, and here's a link proving what I've been saying this whole time. Most gamers prioritize "better graphics" over anything else when buying new hardware. Followed by "shorter load times". Framerate is way down on the list because who the fuck cares but frame nerds.

Better graphics most important to new hardware buyers:

Bye.
 
I hope the trend continues and 60fps becomes the norm. I'd much rather take the hit to graphical detail if it means a stable 60. If not 60, then at least give me a rock-solid 30 that never dips.
 
4K at 30FPS is definitely more demanding.
Though I'm not sure how much more demanding checkerboarded (upscaled) 4K @ 30 is compared to 1080p @ 60.
I'd have to agree.

The other poster thought I was being facetious, but I was actually being genuine.

The question is how much more performance is needed to double the fps whilst the resolution is 4 times lower.
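For the checkerboard question specifically, here's the same back-of-the-envelope pixel math (assuming checkerboard rendering shades roughly half the pixels of native 4K each frame, and ignoring reconstruction cost and the CPU work that doubles at 60fps):

```python
# Shaded pixels per second for the two modes being compared.
checker_4k_per_frame = (3840 * 2160) // 2   # ~4.15M pixels shaded per frame
native_1080p_frame   = 1920 * 1080          # ~2.07M pixels per frame

print(checker_4k_per_frame * 30)   # 124,416,000 pixels/s for checkerboard 4K @ 30
print(native_1080p_frame * 60)     # 124,416,000 pixels/s for native 1080p @ 60
# -> identical raw pixel throughput, so on the GPU side the two modes are in
#    the same ballpark; the 60fps mode still has to do twice the per-frame CPU work.
```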
 
Yes, I've been saying this for years. This lack of next-gen-looking games is the future you frame nerds want. OMG ITS 60FPS ITS SO NEXT GEN BRO WOW HAHA OMG SO SMOOTH...

Fuck outta here with that. Stick to 30. Blow my mind visually.

No one in real life gives a crap about frame rate.

Not once in the medium's history has "steady high frame rate! Holds at 60 fps!" been on the back of a box. Always graphics.
[image: back of a game box listing resolution and 60fps]
 
Read my arguments throughout the thread; the back of that box proves my point. 60 was only added for the remaster, not originally, because the sacrifice to make it run at 60 was too much. So they stuck to 30.

And resolution is still mentioned first :messenger_dizzy:
 