Is 60 FPS killing overall graphical fidelity?

RT is the new flavour of the day. Before that it was the number of strands of hair. Way back it used to be bloom and lens flare effects.

Still waiting for devs to say "Hey, we got an awesome game engine improvement with great physics and AI"... (to be fair, Frostbite's open-map destruction is great).

RT does have the potential to speed up the art pipeline, no? Having to rely less on prebaked lighting could be mutually beneficial to developers and players.
 
Last edited:
Nah, I game on a 77" LG C1 with VRR turned to the max and Riky holding my joystick. It's pure heaven
Which means you don't use BFI, so you shouldn't be claiming people should listen to you.

You need BFI on sample-and-hold displays for smooth motion.

On the C1 it's under the OLED Motion Pro setting.
 
Last edited:
Which means you don't use BFI, so you shouldn't be claiming people should listen to you.

You need BFI on sample-and-hold displays for smooth motion.

On the C1 it's under the OLED Motion Pro setting.
Yep. Can't use VRR and BFI together on the C1.
And I find BFI useless on the C1 anyway. It helps a tiny bit but it dims the picture.
It is sample and hold, but better than any LCD imaginable.
 
Recently we saw GT7 and let's be real, it looks worse than Driveclub. Guess which one is 30 (Driveclub) and which is 60 (GT7). Halo Infinite received huge backlash for its graphics, and again it was 60 FPS. Just look at RDR2's graphics, how jaw-dropping the lighting is, and compare it to a next-gen 60 FPS game like Far Cry 6. I mean, clearly there is a pattern.
If you ask me whether I prefer counting characters' ass hair at 20-30fps, or having slightly less detail for a smooth-as-butter experience, well...
 
Yep. Can't use VRR and BFI together on the C1.
And I find BFI useless on the C1 anyway. It helps a tiny bit but it dims the picture.
It is sample and hold, but better than any LCD imaginable.
I was actually really impressed with the BFI on the C1, and the motion settings in general.

In SDR it's a free upgrade, as you don't need the extra brightness. Although I use BFI on my Sony X900E even in HDR; it looks better than OLED without BFI. Just the setting of 1, not 2, though, as 2 is pretty dim in HDR.

You might want to experiment with it a bit more, but if you really don't see a difference then leave it off, as it adds 8ms of lag.

OLED plus 120fps and good BFI is the best motion you can buy atm.
 
Last edited:
I was actually really impressed with the BFI on the C1, and the motion settings in general.

In SDR it's a free upgrade, as you don't need the extra brightness. Although I use BFI on my Sony X900E even in HDR; it looks better than OLED without BFI. Just the setting of 1, not 2, though, as 2 is pretty dim in HDR.

You might want to experiment with it a bit more, but if you really don't see a difference then leave it off, as it adds 8ms of lag.

OLED plus 120fps and good BFI is the best motion you can buy atm.
But I do need a constant 120fps, right? For the BFI in SDR at 120Hz to look correct?
 
But I do need a constant 120fps, right? For the BFI in SDR at 120Hz to look correct?
You can adjust the slider to find your brightness tolerance, but if you go all the way to max the BFI flickers at 60Hz, which is straining, so you don't want that. It should be apparent whether it's 60Hz or 120Hz, as the former will have visible flicker.

You can use 120Hz BFI with 30fps, 60fps or 120fps, and it doesn't need to be a locked fps.

Some say not to use 120Hz BFI unless the fps is 120, as it will cause image duplication, but I've found that said duplication just blends in with the inherent sample-and-hold blur, so it's a non-issue.
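Rough back-of-the-envelope sketch of why that is (my own toy numbers and a simplified eye-tracking model, nothing official from LG): with 120Hz strobing, each unique game frame gets flashed 120/fps times, and while your eye tracks a pan the repeated flashes of the same frame land a few pixels apart, which is the "duplication" people talk about.

[CODE]
# Toy estimate of BFI image duplication (assumed numbers, simplified eye-tracking model).
# While the eye tracks a pan at pan_speed_px_s, repeated 120Hz flashes of the same game
# frame land pan_speed_px_s / strobe_hz pixels apart on the retina.

def duplication_offset_px(pan_speed_px_s: float, content_fps: int, strobe_hz: int = 120) -> float:
    """Pixel gap between consecutive flashes of the same frame while eye-tracking a pan."""
    if strobe_hz <= content_fps:
        return 0.0  # every flash is a new frame, so no duplication
    return pan_speed_px_s / strobe_hz

for fps in (30, 60, 120):
    gap = duplication_offset_px(1800, fps)  # 1800 px/s = a fairly brisk camera pan
    print(f"{fps} fps on 120Hz BFI: ~{gap:.0f} px gap between duplicate flashes")
[/CODE]

At 120fps the duplication disappears entirely, and at 60fps the ~15px ghost sits in the same ballpark as the blur you'd have anyway, which lines up with the "it just blends in" experience.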
 
But... 1080p and 1440p look like shit on a 4K TV compared to real 4K 4:4:4 RGB quality.
We need mainstream TVs with a native 1440p resolution (1600p would be a better spot imho) if we want to step back from the 4K standard.
Compared to half the framerate and much lower quality in all settings overall (LOD, draw distances, bad AA and reflections) just to be able to reach that res and tick the "4K" checkbox? No, they don't.

I have both a 4K@120 LG55GX6LA and an Epson Pro Cinema 6040UB projector, and I sure don't need to see how "crisp" and "sharp" the staircase-aliased edges are, or how bad 30fps feels compared to 60fps (60 to 120 is not that big of a deal).

The sacrifices they make to reach the "4K" checkmark don't pay off. Unless you are playing a very static game, smooth motion is a lot more important than resolution.
 
Last edited:
I don't really understand your point? My point is that demanding 60fps for all games likely compromises game design, not just graphics. Shadow of the Colossus is the best example I could think of. (At least up to now; maybe these consoles have enough power that the majority of games can be 60 frames without major compromises.)

Whatever happens to the devs after the games ship doesn't seem relevant. Team Ico (not sure why you're bringing up Sony Japan) could still have been disbanded even if they'd shipped stable games; devs can go under for all kinds of reasons.

There are plenty of examples of mega-successful 30fps (or under) games that likely couldn't have been 60fps on consoles without compromising the gameplay, e.g. Dark Souls, Skyrim, The Witcher 3, The Last of Us, MGS3, GTA5, etc. In fact the vast majority of successful single-player games have been 30fps.

Basically, unless a dev decides to do two different games targeting 30fps and 60fps, it's impossible to know what's left out.

Kojima is a good example: MGS2 was 60fps, MGS3 30fps, MGS5 60fps and Death Stranding 30fps. Personally I enjoyed his 30fps games more.

Overall I feel just focusing on framerate can be rather shallow. Yes, the games feel smoother and you get better reactions, but there is a lot more to game design than that.

I already said I don't necessarily disagree with the larger point, just that it's curious you'd bring up Team Ico games in support of it, when it was their ambition that probably killed them off.

Anyway. To actually address the point directly: it's true of course that higher frame rates stress the CPU as well as the GPU, and so in theory some compromises on other uses of the CPU may be necessary. My overall feeling is that this is a price worth paying. I think this can has been kicked down the road long enough: 30fps has never REALLY been acceptable, but for one reason or another it's just been hard to avoid.

This generation we've finally got an opportunity where we can get back to 60fps relatively easily, and to me it's kind of a one-off cost: once we've taken the hit there shouldn't be any more talk about it in the future. We can have the big leap in visuals etc. next time.

If the situation persisted next gen and there were people clamouring for 120fps then I'd be totally on your side, because I think that's just a silly luxury in a way that 60fps just isn't.
 
If you don't consider frame rate as an essential part of the experience you're missing the boat.

I'm so happy with how the hardware is finally getting where we need it to be.

Next gen will continue this trend.

I've seen every generation change since the 80s, and this is one of the biggest paradigm shifts we've had since polygon graphics came along.

I'm not sure why this isn't being stated more.
 
The mandatory "4K" sticker and bad "raytracing" are what's killing quality.

1080p60 or 1440p60 with good AA any day. Fuck the stupid "but it's 4K!!".

God, what were they thinking with that shit? So often the kind of RT the consoles can actually do just looks awful.

Pretty much all they had to do with GT7 was improve the environments, make the trees 3D, etc. It wasn't that far from photorealistic in GT Sport. But instead they waste God knows how much power on RT reflections on the cars that you'll never notice, and that look dopey even if you do.
 
Something I've learned over the PS4 generation is that people would always take Metal Gear Solid (PS1) graphics at 60fps over something truly outstanding graphically at 30fps.
 
Maybe on consoles, but in the land of the RTX 3080 Ti it certainly doesn't.

I look at it this way. Metroid Prime still has some of the best graphics I've ever seen in any video game. It is beautiful yesterday, today and forever. I don't think you need bleeding-edge graphics. But you sure as hell need solid gameplay and play control, and depending on what you're asking the player to do, you need 60fps.
 
To answer the OP's question: currently, NO, 60fps is not killing overall graphical fidelity. First of all, with today's hardware the difference between 30 and 60fps is merely resolution scaling and maybe some tweaking of a few features here and there that have minimal impact on the overall image. It doesn't change the overall ambition and design of the game or its assets to shoot for 60fps. We can see that especially with the current gen (PS5/XBS), as many games have multiple modes with 30 and 60fps output. Ratchet and Clank doesn't look fundamentally different in its 30 vs 60fps modes with RT enabled, just a slightly lower input resolution to the temporal injection scaler and some nips and tucks to the fidelity that are minimal and unnoticeable without side-by-side comparisons. The same can be said about games like Demon's Souls, ACV and many others.

Obviously 60fps games have been around for over 30 years and throughout the "3D" gaming era (i.e. since PS1). It never prevented new generations of games from looking like a huge leap over previous generations (GT1/2 -> GT3 on PS2, MGS1 -> MGS2 on PS2, Tekken 3 -> Tekken Tag on PS2, Call of Duty on Xbox -> Call of Duty 2 on X360, and the list goes on). Your example of GT7 not looking as good as Driveclub is A) a bit premature, since we haven't seen GT7 up close yet, and B) somewhat misguided, since parts of GT Sport looked better than Driveclub on PS4. Hard to imagine they would take a huge step back on new hardware. Halo Infinite is a case of a change in developer goals and a redesign of certain aspects of the game. If they targeted 30fps it wouldn't look totally different, just perhaps run at a more stable framerate and a slightly higher resolution.

Of course, a lot of the 30 vs 60fps load discussion becomes moot with the wider adoption of temporal reconstruction tech (FSR, TSR and others on console), which can easily take a 30fps lock and double the framerate to 60fps with minimal to no impact on the image or the underlying game design :messenger_winking:
 
30fps is also killing fidelity. Devs should make games at 1fps, 1x1i, for the best graphics possible.
 
Last edited:
30fps is also killing fidelity. Devs should make games at 1fps, 1x1i, for the best graphics possible.
I know you're joking, but I wonder what a 1080p/30fps game would look like with top-notch AA.

For example, a Blu-ray movie still looks really good on a 4K TV, so why wouldn't a 1080p game with really good AA look good on a 4K TV? If they didn't focus on frames and added as much detail and lighting quality as possible, it could be pretty mind-blowing.
 
To answer the OP's question: currently, NO, 60fps is not killing overall graphical fidelity. First of all, with today's hardware the difference between 30 and 60fps is merely resolution scaling and maybe some tweaking of a few features here and there that have minimal impact on the overall image. It doesn't change the overall ambition and design of the game or its assets to shoot for 60fps. We can see that especially with the current gen (PS5/XBS), as many games have multiple modes with 30 and 60fps output. Ratchet and Clank doesn't look fundamentally different in its 30 vs 60fps modes with RT enabled, just a slightly lower input resolution to the temporal injection scaler and some nips and tucks to the fidelity that are minimal and unnoticeable without side-by-side comparisons. The same can be said about games like Demon's Souls, ACV and many others.

Obviously 60fps games have been around for over 30 years and throughout the "3D" gaming era (i.e. since PS1). It never prevented new generations of games from looking like a huge leap over previous generations (GT1/2 -> GT3 on PS2, MGS1 -> MGS2 on PS2, Tekken 3 -> Tekken Tag on PS2, Call of Duty on Xbox -> Call of Duty 2 on X360, and the list goes on). Your example of GT7 not looking as good as Driveclub is A) a bit premature, since we haven't seen GT7 up close yet, and B) somewhat misguided, since parts of GT Sport looked better than Driveclub on PS4. Hard to imagine they would take a huge step back on new hardware. Halo Infinite is a case of a change in developer goals and a redesign of certain aspects of the game. If they targeted 30fps it wouldn't look totally different, just perhaps run at a more stable framerate and a slightly higher resolution.

Of course, a lot of the 30 vs 60fps load discussion becomes moot with the wider adoption of temporal reconstruction tech (FSR, TSR and others on console), which can easily take a 30fps lock and double the framerate to 60fps with minimal to no impact on the image or the underlying game design :messenger_winking:
Best post in this discussion. Sadly, some people think we'd get double the graphics if we went to 30 frames per second.
 
I know you're joking, but I wonder what a 1080p/30fps game would look like with top-notch AA.

For example, a Blu-ray movie still looks really good on a 4K TV, so why wouldn't a 1080p game with really good AA look good on a 4K TV? If they didn't focus on frames and added as much detail and lighting quality as possible, it could be pretty mind-blowing.
A game made for an i9 + RTX 3090 could be the most mind-blowing, but at 1080p on a 4K screen it would be a blurry mess.
 
Last edited:
I don't think I can go back to 30fps on a console.
Did you see what you just said? "On a console"... that pretty much guarantees you'll be playing 30fps games. Best to get a PC where you can dictate what it does. Otherwise, just enjoy 30fps games when the real next-gen games start to drop (once we're all done with this cross-gen phase, that is).
 
A game made for an i9 + RTX 3090 could be the most mind-blowing, but at 1080p on a 4K screen it would be a blurry mess.
This really depends, but in most cases it won't look like a blurry mess on a console played on a TV screen. There are many reasons for this:
  1. Most TV screens aren't big enough in the home to really discern between 1080p and 4K at typical viewing distances (you would need to be ~5ft away from an 80in screen to really see the difference between 1080p and 4K (LINK)).
  2. Speaking of viewing distances, most people sit 8+ ft away from their TVs, again reducing the impact of 4K over 1080p.
  3. Most TVs have much better scaling to 4K than any computer monitor. This also mitigates the difference between 1080p and 4K, and is also why some people think that 1080p games and Blu-ray movies still look really good on a 4K display.
Now if you're talking about a computer monitor where the user sits 2ft away from a 27in screen, for example, then yes, the difference is much more noticeable at 1080p (I assume you're speaking more about monitors since you mentioned i9+3090 hardware). But I can attest personally that when switching between 1080p and 4K modes in most games that allow it, such as Doom Eternal, Dirt 5 or Call of Duty Cold War on PS5, the difference is extremely subtle without side-by-side comparisons (which aren't happening during real-world gaming) and/or moving up extremely close to the screen (I generally sit about 12 ft away from my 77in OLED).
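If anyone wants to sanity-check the viewing-distance argument with their own numbers, here's a rough pixels-per-degree calculation (a hypothetical sketch using the common ~60 ppd / one-arcminute 20/20 rule of thumb; the screen size and distances are just examples):

[CODE]
import math

# Rough pixels-per-degree (ppd) estimate for a 16:9 screen. ~60 ppd is the usual
# one-arcminute 20/20 rule of thumb for when extra resolution stops being visible.

def pixels_per_degree(diagonal_in: float, horizontal_px: int, distance_in: float) -> float:
    width_in = diagonal_in * 16 / math.hypot(16, 9)            # panel width for a 16:9 aspect
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horizontal_px / fov_deg

for label, px in (("1080p", 1920), ("4K", 3840)):
    for dist_ft in (5, 12):
        ppd = pixels_per_degree(77, px, dist_ft * 12)
        print(f"77in {label} at {dist_ft} ft: ~{ppd:.0f} ppd")
[/CODE]

On a 77in panel, 1080p only drops below that ~60 ppd threshold once you're inside roughly 10 ft, so at a 12 ft couch distance even 1080p clears it, which matches how subtle the difference looks in practice.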
 
The 120Hz 40fps mode of Ratchet and Clank is a good idea to overcome this... but Insomniac not using it in the Spider-Man PS5 games makes me think it is not as easy to implement!
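The arithmetic is what makes 40fps attractive: its 25ms frame time sits exactly halfway between 30fps (33.3ms) and 60fps (16.7ms), and 40 divides evenly into 120Hz but not into 60Hz, so you only get even frame pacing on a 120Hz output. A quick sketch of just that math (nothing Insomniac-specific):

[CODE]
# Frame-time arithmetic behind a 40fps mode on a 120Hz display (nothing Insomniac-specific).

def paces_evenly(fps: int, refresh_hz: int) -> bool:
    """True if every frame can be held for the same whole number of refreshes."""
    return refresh_hz % fps == 0

for fps in (30, 40, 60):
    print(f"{fps} fps: {1000 / fps:.1f} ms/frame, "
          f"even pacing on 60Hz: {paces_evenly(fps, 60)}, on 120Hz: {paces_evenly(fps, 120)}")
# 40 fps only paces evenly (3 refreshes per frame) on a 120Hz output,
# which is presumably part of why it isn't a drop-in option everywhere.
[/CODE]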
 
This really depends, but in most cases it won't look like a blurry mess on a console played on a TV screen. There are many reasons for this:
  1. Most TV screens aren't big enough in the home to really discern between 1080p and 4K at typical viewing distances (you would need to be ~5ft away from an 80in screen to really see the difference between 1080p and 4K (LINK)).
  2. Speaking of viewing distances, most people sit 8+ ft away from their TVs, again reducing the impact of 4K over 1080p.
  3. Most TVs have much better scaling to 4K than any computer monitor. This also mitigates the difference between 1080p and 4K, and is also why some people think that 1080p games and Blu-ray movies still look really good on a 4K display.
Now if you're talking about a computer monitor where the user sits 2ft away from a 27in screen, for example, then yes, the difference is much more noticeable at 1080p (I assume you're speaking more about monitors since you mentioned i9+3090 hardware). But I can attest personally that when switching between 1080p and 4K modes in most games that allow it, such as Doom Eternal, Dirt 5 or Call of Duty Cold War on PS5, the difference is extremely subtle without side-by-side comparisons (which aren't happening during real-world gaming) and/or moving up extremely close to the screen (I generally sit about 12 ft away from my 77in OLED).
I have my PC plugged into a 4K TV. Games set to 1080p look very blurry compared to 4K. I sit ~2 metres away.
 
Last edited:
I already said I don't necessarily disagree with the larger point, just that it's curious you'd bring up Team Ico games in support of it, when it was their ambition that probably killed them off.

Anyway. To actually address the point directly: it's true of course that higher frame rates stress the CPU as well as the GPU, and so in theory some compromises on other uses of the CPU may be necessary. My overall feeling is that this is a price worth paying. I think this can has been kicked down the road long enough: 30fps has never REALLY been acceptable, but for one reason or another it's just been hard to avoid.

This generation we've finally got an opportunity where we can get back to 60fps relatively easily, and to me it's kind of a one-off cost: once we've taken the hit there shouldn't be any more talk about it in the future. We can have the big leap in visuals etc. next time.

If the situation persisted next gen and there were people clamouring for 120fps then I'd be totally on your side, because I think that's just a silly luxury in a way that 60fps just isn't.
Okay, fair enough, I get you now. I guess I am happy with 30fps because the majority of my favourites have all been 30 at release 🤷‍♂️

You raise an interesting point about the default being 60. People expect a solid 30 as a minimum these days; no reason why it can't be 60.

Do you think all games need 60 though? The most obvious example is turn-based RPGs, where it would seem irrelevant. For slow-paced stealth games as well, it might be worth sacrificing the frames for more ambitious level design.

And dare I say Souls games (people will hate me for that). Gamers love the huge winding, looping level design, and this appears to stress the framerate (Blighttown being the obvious example). Obviously the extra frames help with the combat, but it's not as vital as in, say, Devil May Cry or Bayonetta.

Out of interest, has every game had a 60fps mode on the next-gen consoles so far? That's pretty cool.
 
I don't know if it's because of my new TV, but ever since I bought a PS5 and my new TV together, dropping down from 60fps to 30fps looks awful. I have even given it an hour to see if I get used to 30fps, and I can't. It's strange, as on PS4 I didn't notice 30fps. But I think it's because of my new TV. I bought a 65-inch, which is large for my small room; I sit around 2.5 metres away from the TV. Before, I gamed on a 43-inch TV with my PS4 Pro. The larger the screen size, the easier it is to notice a low framerate.

Now I cannot even handle under 60fps. I'd even take a 1080p 60fps mode over a 4K 30fps RT mode. I'm glad that so far all games give me the option of a 60fps mode, but I am worried about whether this will continue.
 
I actually think 60fps is not natural and gives the soap opera effect, and since games are becoming more realistic it looks weird being overly smooth like a soap opera. 30fps is enough; the real issue is that the screens we use don't have enough refresh, so they're blurry in motion. CRT monitors were perfect at this: perfect clarity and motion, and fps didn't matter.
 
To the topic:
Yes, for every game franchise that was 30 before, 60 is a bit of overkill. You will still get some gains because of new technologies, but PS4 Pro -> PS5 is more or less just a bit more than double the performance. Yes, it's much easier to squeeze the performance out of RDNA2, but RDNA2 shouldn't be that much more efficient (at the same TF) than highly optimized GCN code. With RDNA2 it is much easier to use its full potential, so you can't gain that much more through optimization; only by adopting new technologies can you gain much more efficiency.
The mid-gen refresh consoles just took some of the generation-shift effect away from us. Not that they were bad consoles or anything like that, but they downplayed the achievements of the current gen a bit. Faster loading/no loading is the only thing you can immediately "feel".

Not every game has to go with 60fps. E.g. I guess Naughty Dog games will remain at 30fps because they want to spend as much time as possible rendering each frame for the best possible image quality. The problem with an optional 60fps mode would always be that they would then have to leave CPU resources untouched in the 30fps mode, resources that could be spent on physics effects and so on.

And because we had the mid-gen consoles, I really hoped for a much bigger jump, but that wouldn't have been possible at those prices. I guess we will see in the future what the new technologies bring to the table (e.g. mesh shading (and whatever Sony is calling it; they should have something very similar), ...), as adopting those techs will really take some time.
 
Depends on how you define graphical fidelity. If a game looks 540p any time you move the camera but great when doing nothing, is that better than a game that looks good but not jaw-dropping in stills, yet is smooth and much clearer in motion?
 
Depends on how you define graphical fidelity. If a game looks 540p any time you move the camera but great when doing nothing, is that better than a game that looks good but not jaw-dropping in stills, yet is smooth and much clearer in motion?
What game is 540p in motion?
Do you have an example, or are you another clueless motion blur hater?

It's the LCD's fault.
 
Depends on how you define graphical fidelity. If a game looks 540p any time you move the camera but great when doing nothing, is that better than a game that looks good but not jaw-dropping in stills, yet is smooth and much clearer in motion?
If your wife was 1080p when she was with you but sleeping with another man at 540p, why would you care as long as you get her at 1080p, especially at 60fps?
 
Do you have an example, or are you another clueless motion blur hater?
Motion blur is bad? Am I missing something here - it's probably the most hated nonsensical "feature" added to 30fps games. It literally takes the image and blurs it - hence the 540p reference. It's a distortion of the image when moving around in games - which you do all the time! "Look how beautiful this game is when I'm not touching the controls" - said no one ever.

And that's not an LCD thing. If your OLED does a better job at 'distorting the image', it's probably because your TV adds more filters to smooth things out.
 
Motion blur is bad? Am I missing something here - it's probably the most hated nonsensical "feature" added to 30fps games. It literally takes the image and blurs it - hence the 540p reference. It's a distortion of the image when moving around in games - which you do all the time! "Look how beautiful this game is when I'm not touching the controls" - said no one ever.

And that's not an LCD thing. If your OLED does a better job at 'distorting the image', it's probably because your TV adds more filters to smooth things out.
A Programmer's Guide to Motion Blur

Prologue: Mass Effect
Chapters 1-10: Mass Effect
Conclusion: Mass Effect

If you're having a problem finding it, look for this:

[Mass Effect (Xbox 360) screenshot]
 
A Programmer's Guide to Motion Blur

Prologue: Mass Effect
Chapters 1-10: Mass Effect
Conclusion: Mass Effect

If you're having a problem finding it, look for this:

[Mass Effect (Xbox 360) screenshot]
Haha! Jesus, man, I'd luckily forgotten about that motion blur "pearl".

It's actually funny to see how the PS3 version has a consistently blurrier picture, but far less motion blur.

 
Motion blur is bad? Am I missing something here - it's probably the most hated nonsensical "feature" added to 30fps games. It literally takes the image and blurs it - hence the 540p reference. It's a distortion of the image when moving around in games - which you do all the time! "Look how beautiful this game is when I'm not touching the controls" - said no one ever.

And that's not an LCD thing. If your OLED does a better job at 'distorting the image', it's probably because your TV adds more filters to smooth things out.
Motion blur is essential. 30 or 60fps is not enough with only sharp images. And no, your eyes won't blur the images together. Not at 60; maybe at 240.



Low shutter speed (motion blur) looks way more natural. And you need 240Hz or more to do it without motion blur.
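For what it's worth, the textbook way to get that low-shutter-speed look in a renderer is just averaging several sub-frame renders over the frame's exposure window. A toy sketch below (hypothetical numpy example of the accumulation-buffer idea; real engines approximate it with velocity-buffer post-processing because rendering extra sub-frames is too expensive):

[CODE]
import numpy as np

# Toy accumulation-buffer motion blur: average several sub-frame renders of a moving
# square across one 1/30s frame. Real engines approximate this with a velocity buffer
# instead of actually rendering extra sub-frames.

W, H, SUBFRAMES = 160, 90, 8
SPEED_PX_PER_FRAME = 24            # how far the square travels during one 30fps frame

def render(x_left: int) -> np.ndarray:
    img = np.zeros((H, W), dtype=np.float32)
    img[40:50, x_left:x_left + 10] = 1.0   # a 10x10 white square
    return img

frame = np.zeros((H, W), dtype=np.float32)
for i in range(SUBFRAMES):
    # sample the square's position at evenly spaced instants inside the exposure window
    x = 30 + SPEED_PX_PER_FRAME * i // SUBFRAMES
    frame += render(x)
frame /= SUBFRAMES                 # the average is the smeared, "motion blurred" square

print("blurred width:", int((frame[45] > 0).sum()), "px, vs 10 px for a single sharp render")
[/CODE]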
 
Last edited:
Motion blur is essential. 30 or 60fps is not enough with only sharp images. And no, your eyes won't blur the images together. Not at 60; maybe at 240.



Low shutter speed (motion blur) looks way more natural. And you need 240Hz or more to do it without motion blur.

You are showing a video of a still camera recording moving objects... Now, if it were a moving camera recording moving objects, the motion blur would be instantly noticeable. That's why 24fps works in movies: because the camera holds still for the majority of the running time. You need to educate yourself a little better if you're trying to school someone.

Edit: And the most stupid thing about that video is that it's uploaded to YouTube at 25fps :messenger_tears_of_joy:
 
Last edited:
Motion blur is essential. 30 or 60fps is not enough with only sharp images. And no, your eyes won't blur the images together. Not at 60; maybe at 240.



Low shutter speed (motion blur) looks way more natural. And you need 240Hz or more to do it without motion blur.

Turning fast in COD at 60fps seems pretty good to me.

At 30fps or less, it's terrible. And the lower the frame rate, the more motion blur devs add, making it blurry (and with comet trails if it's really bad).
 
You've got to laugh: even my mobile phone is 120Hz. But no, console gaming has got to be held back by 30 frames per second, which all started in the 3D PlayStation era.

Before that you could play Sonic the Hedgehog, Streets of Rage 2, Mario and Super Mario Kart, all at 60 frames per second.

60 frames has to become the standard; I don't care if you don't understand it.
 
You've got to laugh: even my mobile phone is 120Hz. But no, console gaming has got to be held back by 30 frames per second, which all started in the 3D PlayStation era.

Before that you could play Sonic the Hedgehog, Streets of Rage 2, Mario and Super Mario Kart, all at 60 frames per second.

60 frames has to become the standard; I don't care if you don't understand it.
It won't ever become the standard, my entitled son.

Devs are not so stupid as to sacrifice their vision just because some random guys on a forum are asking for mandatory 60fps.

And just so you know, some great games in the PS1 era were 24fps in PAL territories, and no one complained; everyone loved the games.

60fps is pure entitlement.
 
Last edited:
Forza Horizon looks infinitely better than Driveclub, and it's 60fps… and open world. Driveclub's framerate made it a much worse game.

In the new Ratchet game the 30 FPS mode feels broken compared to 60.
Forza Horizon at 60fps? I think you are confusing Horizon with Motorsport.
 
It won't ever become the standard, my entitled son.

Devs are not so stupid as to sacrifice their vision just because some random guys on a forum are asking for mandatory 60fps.

And just so you know, some great games in the PS1 era were 24fps in PAL territories, and no one complained; everyone loved the games.

60fps is pure entitlement.
I'll put my money on it that you're going to see more 60fps games as time goes on. This gen is already looking better than last gen for 60fps.
 
I'll deal with 30 if I have to, but 60fps should be the standard going forward. Maybe a few genres like JRPGs and more tactical, slower games are fine at 30, but fuck going back to playing stuff like FromSoft games at 30. So glad I'm replaying the Souls series on the SeX.
 