Is 60 FPS killing overall graphical fidelity?

The whole point of a new generation is prettier visuals. Right now it's as if we just bought a new graphics card to play the same games as before, but faster.

I'd like to use all that new power for better visual experiences. Stuff that is truly breath-taking.

Horizon Forbidden West kinda nails it in Fidelity mode. It's perfect.
I don't think this gen will be all about better visuals.
Will games have better graphics? Of course, but I think the focus will be the CPU and SSD (if the developers want that), because they can make worlds feel way more alive and immersive than ever before. The CPU and SSD allow crowds of hundreds of NPCs to act very realistically, and on top of that we might get way better level design than ever before (again, this is all the developers' choice, or whether they have the imagination to do it).
 
Some guys were discussing whether gamers prefer graphics or framerate.
At least in Europe, graphics seems to be the bigger winner, even on PC.
It's really hard to generalise: https://store.steampowered.com/hwsurvey/videocard/ If you look at the most recent Steam survey (from January 2022), the old trusty GTX 1060 is still the most popular GPU (7.54%), and the cards with the highest attach rates after it are a bit more powerful but by no means top-end (and all of them weaker than the XSX or PS5). The first relatively high-end card (not top-end, but stronger than the XSX/PS5, although not by much) is the RTX 3070 at 1.81% (and that's a fall from 1.95% in December 2021).

From that you can see that a big chunk of PC gamers aren't enthusiast graphics whores like most (all? :P) of us here in this thread, but casuals who are perfectly fine with 1080p (67.12%) or 1440p (9.19%). For comparison, 4K is only 2.37%, while 1366x768 sits at 6.93%, which tells you how many of the total Steam users own very weak, outdated laptops. Basic Full HD monitors of decent quality are really cheap now, around 120 to 150 USD, so if a PC gamer isn't at least on Full HD it can only mean one thing: a laptop user, and an old laptop at that :)

Another interesting observation:
9 out of the top 11 cards, with the exception of the very weak GTX 1050 and 1050 Ti (which are around base PS4 power, the Ti version a bit stronger but not by much), are somewhere in between the PS4 Pro and PS5 in power. The first AMD GPU is the RX 580, the 14th card, with a 1.45% attach rate.
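A rough tabulation of the figures quoted above makes the gap concrete (the percentages are the January 2022 survey numbers cited in this post; the comparison itself is just arithmetic):

```python
# Jan 2022 Steam survey figures quoted above (percent of surveyed users).
gpu_share = {"GTX 1060": 7.54, "RTX 3070": 1.81, "RX 580": 1.45}
res_share = {"1080p": 67.12, "1440p": 9.19, "4K": 2.37, "1366x768": 6.93}

# 1080p + 1440p users dwarf 4K users by a wide margin.
mainstream = res_share["1080p"] + res_share["1440p"]
ratio = mainstream / res_share["4K"]
print(f"{mainstream:.2f}% at 1080p/1440p vs {res_share['4K']}% at 4K "
      f"(~{ratio:.0f}x more mainstream-res users)")
```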
 
Read my arguments throughout the thread; the back of that box proves my point. 60 was only added for the remaster, not originally, because the sacrifice needed to make it run at 60 was too much. So they stuck to 30.

And resolution is still mentioned first :messenger_dizzy:
I just replied to your "Never on a boxart" comment with an actual boxart. Nothing more.

Besides, once VR becomes the preferred way to play, the minimum framerate floor will be 90/120fps.
 
I just replied to your "Never on a boxart" comment with an actual boxart. Nothing more.

Besides, once VR becomes the preferred way to play, the minimum framerate floor will be 90/120fps.
It won't be anytime soon. Definitely not this console gen (not enough power), and who knows about next gen, so 2027-2028+, probably not enough power either. But in 3-4 console gens it might be possible; not probable, but possible =D
 
It won't be anytime soon. Definitely not this console gen (not enough power), and who knows about next gen, so 2027-2028+, probably not enough power either. But in 3-4 console gens it might be possible; not probable, but possible =D
With every new generation, the expectations are higher than they were the previous generation.
 
The whole point of a new generation is prettier visuals. Right now it's as if we just bought a new graphics card to play the same games as before, but faster.

I'd like to use all that new power for better visual experiences. Stuff that is truly breath-taking.

Horizon Forbidden West kinda nails it in Fidelity mode. It's perfect.
Graphical fidelity is a diminishing-returns drain... anything breathtaking now will be the norm before the end of the game itself. That's how it works.

A higher framerate actually makes games play better.

That being said, many games are getting different modes, and on PC you can customize your experience, so hopefully in the future everyone will be able to choose.
 
Graphical fidelity is a diminishing-returns drain... anything breathtaking now will be the norm before the end of the game itself. That's how it works.

A higher framerate actually makes games play better.

That being said, many games are getting different modes, and on PC you can customize your experience, so hopefully in the future everyone will be able to choose.
Other way around for me.
I will notice jaggies and bad graphics all the time; I will not get used to them.
But when I had 240Hz, I got used to it, just as I get used to 30 or 60, and stopped noticing it. The graphics, though, sucked all the time.

Of course yes, a higher framerate is better; nobody is arguing that. But for me, not at the cost of graphics (up to a point).
 
I'm not sure about your point... games on Series S are not being released with 60fps modes, at least not in recent examples.
They are being limited to 30fps.

That is not my point. My point is that if a game can run at 30fps on Series S, then 99.9% of the time that same game can run at 60fps on Series X. Worst case, it will run at the same settings as the 30fps Series S version, but usually there will be plenty of headroom above that too.

And because they need a Series S version of every game, every game can therefore pretty easily be given a 60fps mode on Series X.
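The argument above is essentially frame-time arithmetic. A toy sketch, assuming (purely from the consoles' paper specs, not measurements) roughly a 3x GPU gap between Series S and Series X:

```python
# Frame-time budgets: 30fps allows ~33.3 ms per frame, 60fps allows ~16.7 ms.
def frame_budget_ms(fps: int) -> float:
    return 1000.0 / fps

# If a frame uses the full 33.3 ms budget on Series S, and the Series X GPU
# is assumed ~3x faster (rough teraflop ratio, not a measured figure), the
# same GPU work takes ~11.1 ms, comfortably inside the 16.7 ms 60fps budget.
series_s_frame_ms = frame_budget_ms(30)                  # ~33.33 ms
assumed_speedup = 3.0
series_x_frame_ms = series_s_frame_ms / assumed_speedup  # ~11.11 ms
fits_60 = series_x_frame_ms <= frame_budget_ms(60)
print(fits_60)  # True under these assumptions
```

This only holds when the GPU is the limiting factor, which is exactly the caveat raised a few posts below.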
 
that is not my point, my point is if a game can run at 30fps on Series S then 99.9% of the time that same game can run at 60fps on Series X. worst case scenario it will run at the same settings as the 30fps Series S version, but usually there will be plenty of headroom above that too.

and because they need to have a Series S version for every game, every game can therefore pretty easily be given a 60fps mode on Series X
That's really only true when the GPU is the sole limiting factor. We still haven't seen any games that push the new CPUs to a real degree.

The CPU can also be a bottleneck.
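That bottleneck point can be made concrete with a toy throughput model: the achievable framerate is capped by whichever of the CPU or GPU takes longer per frame, so when the CPU is the slow side, a faster GPU buys nothing (all the millisecond figures below are made up for illustration):

```python
def max_fps(cpu_ms: float, gpu_ms: float) -> float:
    # With CPU and GPU work pipelined, throughput is set by the slower
    # of the two per-frame costs; that stage is the framerate ceiling.
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound: halving the GPU time doubles the framerate.
print(max_fps(cpu_ms=10.0, gpu_ms=33.3))   # ~30 fps, GPU-bound
print(max_fps(cpu_ms=10.0, gpu_ms=16.6))   # ~60 fps with a 2x faster GPU

# CPU-bound: the same 2x faster GPU changes nothing.
print(max_fps(cpu_ms=33.3, gpu_ms=16.6))   # still ~30 fps
```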
 
Not gonna waste too much time typing up a response to all this bullshit.
You should have given up long ago. I've absolutely wiped the floor with you in this little debate we've been having. You cite precedent that has already been broken. You omit important parts of quotes to support your poor argument, and you misframe data as well. It's pitiful. No wonder you've been wrong about pretty much everything. You know next to nothing about this industry, and it's baffling that you keep insisting that you do.
Just know this:

- 30FPS has been popular forever because it's the perfect balance between great graphics and smooth gameplay.
Wrong. 60 FPS has been the standard for well over a year now in every new release and most of the most played games of the last 3 years run at 60 FPS. Stop denying the facts.
- All developers CAN make their games 60FPS, but they don't, because it would require too much of a sacrifice on everything else. Hence they stick to 30, because, as I said, it's a better-balanced framerate. Thanks again for proving my point. Again. The Bungie quote is basically proof. Genius.
The Bungie quote specifically mentions that they wanted it at 60 FPS but couldn't because of a CPU bottleneck. That bottleneck isn't present in current systems, which is why D2 now runs at 60 FPS on every system, just like every other game released on current-gen hardware. Spoiler alert: Bungie's next game will be 60 FPS as well. Quit spinning quotes to try to prove your point.
- The UE5 tech demo was used to show off what fidelity is possible in the near future. If developers want to reach that level of fidelity, it will have to be at 30fps. It will happen.
It WiLl HaPpEn hahahahah. You might get one or two games that will still have a 30 FPS mode, but the vast majority of them will offer a 60 FPS mode as well. You're the only one still talking about that demo mate. Everyone else is now talking about Horizon Forbidden West, which looks incredible and can run at 60 FPS as well.
- You keep ignoring the part where I said I want more destructible environments, more enemies on screen, more effects, more ray tracing, more interactivity with environments, larger battlefields. All of that is fidelity, and it is less likely to happen at 60fps. If you value iNpUt LaG over all of that then more power to you. We're different. I'm glad I don't look at games the way you do. Ew.
Ah yes, destructible environments, more enemies on screen, larger battlefields... Oh, you mean like the Battlefield series, which targets 60 FPS on every platform? Or are you talking about Uncharted, God of War or The Last of Us, which have like 4 or 5 enemies on screen at most, all taking place in tiny linear areas? You have such a childish concept of how game design works. Most games don't benefit from "larger battlefields" or "more enemies on screen". That is only relevant in certain video games. "More enemies on screen" won't make Uncharted or TLOU a better game, nor would "larger battlefields". Everything else is just "BeTtEr GWafIx", which don't actually look better because your screen turns into a blurry mess the moment you turn the camera. Yes, playability is more important than all of that.
Oh, and here's a link proving what I've been saying this whole time. Most gamers prioritize "better graphics" over anything else when buying new hardware, followed by "shorter load times". Framerate is way down the list, because who the fuck cares but frame nerds.

Better graphics most important to new hardware buyers:

Bye.
Not surprising to see you try to misframe data to suit your poorly thought-out arguments. The question in the survey is "What new features of a new console is most important to you". Unsurprisingly, the most common answer is "better graphics", and that makes perfect sense. That's what I'm most excited about when I purchase a new system as well. The question, as you have attempted to frame it, is NOT "Do you prefer better graphics at the expense of framerate".

Just because gamers are most excited about the new graphics doesn't mean that they want those graphics to come at the expense of playability. That's not the question being asked, and you're misframing it. Again, it needs to be repeated: just because games now offer a 60 FPS mode doesn't mean that they don't deliver good graphics. R&C, DS Remake, HFW, GT7 and GoW:R all look great, and they all run at 60 fps.
 
You should have given up long ago. I've absolutely wiped the floor with you in this little debate we've been having. You cite precedent that has already been broken. You omit important parts of quotes to support your poor argument, and you misframe data as well. It's pitiful. No wonder you've been wrong about pretty much everything. You know next to nothing about this industry, and it's baffling that you keep insisting that you do.

Wrong. 60 FPS has been the standard for well over a year now in every new release and most of the most played games of the last 3 years run at 60 FPS. Stop denying the facts.

The Bungie quote specifically mentions that they wanted it at 60 FPS but couldn't because of a CPU bottleneck. That bottleneck isn't present in current systems, which is why D2 now runs at 60 FPS on every system, just like every other game released on current-gen hardware. Spoiler alert: Bungie's next game will be 60 FPS as well. Quit spinning quotes to try to prove your point.

It WiLl HaPpEn hahahahah. You might get one or two games that will still have a 30 FPS mode, but the vast majority of them will offer a 60 FPS mode as well. You're the only one still talking about that demo mate. Everyone else is now talking about Horizon Forbidden West, which looks incredible and can run at 60 FPS as well.

Ah yes, destructible environments, more enemies on screen, larger battlefields... Oh, you mean like the Battlefield series, which targets 60 FPS on every platform? Or are you talking about Uncharted, God of War or The Last of Us, which have like 4 or 5 enemies on screen at most, all taking place in tiny linear areas? You have such a childish concept of how game design works. Most games don't benefit from "larger battlefields" or "more enemies on screen". That is only relevant in certain video games. "More enemies on screen" won't make Uncharted or TLOU a better game, nor would "larger battlefields". Everything else is just "BeTtEr GWafIx", which don't actually look better because your screen turns into a blurry mess the moment you turn the camera. Yes, playability is more important than all of that.

Not surprising to see you try to misframe data to suit your poorly thought-out arguments. The question in the survey is "What new features of a new console is most important to you". Unsurprisingly, the most common answer is "better graphics", and that makes perfect sense. That's what I'm most excited about when I purchase a new system as well. The question, as you have attempted to frame it, is NOT "Do you prefer better graphics at the expense of framerate".

Just because gamers are most excited about the new graphics doesn't mean that they want those graphics to come at the expense of playability. That's not the question being asked, and you're misframing it. Again, it needs to be repeated: just because games now offer a 60 FPS mode doesn't mean that they don't deliver good graphics. R&C, DS Remake, HFW, GT7 and GoW:R all look great, and they all run at 60 fps.
You're wrong, that guy is right.
 
Recently we saw GT7, and let's be real, it looks worse than Driveclub. Guess which one is 30 (Driveclub) and which is 60 (GT7). Halo Infinite received huge backlash for its graphics, and again, it was 60 FPS. Just look at RDR2's graphics, how jaw-dropping the lighting is, and compare it to a next-gen 60 FPS game like Far Cry 6. I mean, clearly there is a pattern.
Just no.
GT7 is visually far, far more detailed than Driveclub. Driveclub is legit an arcade racer; of course it'll look more flashy. That doesn't mean it looks better, though.
 
4K is the only thing that is killing graphical fidelity. Seriously, who even asked for 4K? Make all AAA games run with ultra graphics in 1080p (or slightly better if it doesn't harm performance) and 60 fps.
1440p/60FPS across the board.

Linear story-driven games, multiplayer, everything at that resolution and nothing lower than 60FPS. This should be the bare minimum on current gen.
 
I like immersion as much as gameplay, and 30 fps usually serves that purpose better than gamey 60fps, thanks to the added fidelity, and also because it doesn't have the 'soap opera' effect of such a smooth framerate.
I don't need 60fps more than I need fidelity.
No one in this forum has any say over me, or anyone.
There is a lot more that goes into immersion in a game than just frames. The audio has to be great, the visuals have to be great (which can be accomplished without sacrificing the framerate... say hello to my little friend Forza Horizon), the animations have to be realistic, all that other jazz. The process of making a game immersive does not have much to do with framerate.
I can still enjoy a game like GTA at 60, and it doesn't take away from the immersive experience; it improves upon it, because now it looks more realistic.
 
I just want to say that for the PS5, a GPU equivalent to a 6600 XT being able to do native 4K with RT on, even at a stable 30 FPS, means the devs still did an impressive job.

RDNA2's RT performance is not great to begin with.
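In raw pixel-throughput terms the achievement is easy to quantify (simple arithmetic; note that ray-tracing cost doesn't scale purely with pixel count, so this is only a rough comparison):

```python
pixels_4k = 3840 * 2160       # 8,294,400 pixels per frame
pixels_1440p = 2560 * 1440    # 3,686,400 pixels per frame

# Pixels shaded per second in each mode:
throughput_4k30 = pixels_4k * 30        # ~249M pixels/s
throughput_1440p60 = pixels_1440p * 60  # ~221M pixels/s

# Native 4K30 actually pushes MORE pixels per second than 1440p60,
# and every one of them carries the ray-tracing cost.
print(throughput_4k30 > throughput_1440p60)  # True
```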
 
I just want to say that for the PS5, a GPU equivalent to a 6600 XT being able to do native 4K with RT on, even at a stable 30 FPS, means the devs still did an impressive job.

RDNA2's RT performance is not great to begin with.
I forgot to mention I was specifically talking about HFW, whose graphical fidelity is amazing.
 
More proof.
I tried to hint at it in the thread, but people thought I was crazy (I am a bit, with my controversial opinions and way of life).
Welcome to next-generation rendering reality.


With all of this in mind, where do we stand with this first look at a release version of Unreal Engine 5? The quality of the rendering is superb, Lumen and Nanite are delivering the generational leap in fidelity we want from the latest hardware, but I am concerned about CPU performance, where we can easily find limits even with the most powerful processors on the market. Of course, this is just a demo sample and not a final product but even so, I'm surprised at the apparent reliance on single-thread CPU performance - I'm concerned that attaining 60fps in UE5 titles for PC, let alone consoles, is going to be extremely challenging. However, ultimately, this is just a single example of the engine in action - and of course, the technology remains in a constant state of development, with many improvements to come.
 
We're all gonna eat up that PS5 Pro and Xbox Series Y hard. Peasants on standard consoles gonna play at 30 (not to mention Series S, with some Full HD or lower res), while all of us pr0s gonna get full-fat 4K 60 on the new mid-gen upgraded machines ^^
 
Other way around for me.
I will notice jaggies and bad graphics all the time; I will not get used to them.
But when I had 240Hz, I got used to it, just as I get used to 30 or 60, and stopped noticing it. The graphics, though, sucked all the time.

Of course yes, a higher framerate is better; nobody is arguing that. But for me, not at the cost of graphics (up to a point).
I haven't played a game yet this gen that had options for 30/60+ fps where the 60fps mode was graphically potato level.

I say: always provide an option. That way, everyone is happy.
 
I haven't played a game yet this gen that had options for 30/60+ fps where the 60fps mode was graphically potato level.

I say: always provide an option. That way, everyone is happy.
Horizon Forbidden West is one.
Generally, I don't like seeing a nice-graphics mode and a nice-fps mode... in my brain I feel the loss of the other mode.
 
Of course there are cuts to hit 60fps, but when it comes down to actually playing a game, I'd rather play GT7 all day over Driveclub.
 
Dude, HFW is not potato graphics at 60fps.
It's not potato, but it is significantly worse looking.
Anyway, it doesn't matter. There is a topic where I explain why modes are essentially bad... at least in their current form, with no descriptions, the requirement to watch DF to see comparisons, and so on.
It's a console; I want to play the game in the best possible form on that console. I have a PC if I want options.
 
Is 30 fps killing overall gameplay fluidity?
Maybe in some games, but I am playing the UC4 Legacy remaster right now and testing the 4K mode. Surprisingly, 30fps in that game is fantastic.
It's the very good camera response times and good motion blur. It looks very good in motion and feels good.
The 1440p60 mode is great of course, but looks a bit blurry.
But there is no reason they couldn't checkerboard or FSR 1440p up to 4K and do that at 60.
Or a 4K/60 mode... just slapping on 4K30 and 1440p60 modes is lazy as fuck. You need to watch comparisons to see the details of the modes anyway.
 
Maybe in some games, but I am playing the UC4 Legacy remaster right now and testing the 4K mode. Surprisingly, 30fps in that game is fantastic.
It's the very good camera response times and good motion blur. It looks very good in motion and feels good.
The 1440p60 mode is great of course, but looks a bit blurry.
But there is no reason they couldn't checkerboard or FSR 1440p up to 4K and do that at 60.
Or a 4K/60 mode... just slapping on 4K30 and 1440p60 modes is lazy as fuck. You need to watch comparisons to see the details of the modes anyway.
Doesn't matter how many times you say it, or how many people say it.
Some brains just can't conceive of a good, playable 30 fps.
 
Maybe in some games, but I am playing the UC4 Legacy remaster right now and testing the 4K mode. Surprisingly, 30fps in that game is fantastic.
It's the very good camera response times and good motion blur. It looks very good in motion and feels good.
The 1440p60 mode is great of course, but looks a bit blurry.
But there is no reason they couldn't checkerboard or FSR 1440p up to 4K and do that at 60.
Or a 4K/60 mode... just slapping on 4K30 and 1440p60 modes is lazy as fuck. You need to watch comparisons to see the details of the modes anyway.
Naughty Dog did a shit job on the UC4 remaster. The 1440p/60 mode is BS. Shoulda been at least 1600p or more!
 
It's not potato, but it is significantly worse looking.
Anyway, it doesn't matter. There is a topic where I explain why modes are essentially bad... at least in their current form, with no descriptions, the requirement to watch DF to see comparisons, and so on.
It's a console; I want to play the game in the best possible form on that console. I have a PC if I want options.
I disagree. Choice shouldn't be limited to PC users only. I don't have a PC, so I appreciate the options.
 
More proof.
I tried to hint at it in the thread, but people thought I was crazy (I am a bit, with my controversial opinions and way of life).
Welcome to next-generation rendering reality.

If that is the case, I think UE5 will have a tough time competing in the market.
 
I sure as hell will not buy a single game this gen that isn't 60fps.

Elden Ring was right at the edge of acceptable for me, and only thanks to VRR.
Ooof... that stubborn generalization will only bring you misery.
Just get a PC in that case; console is not for you.
30 fps is not always the same. Check out Uncharted 4's 4K30 mode. It feels and looks very good.
 
Ooof... that stubborn generalization will only bring you misery.
Just get a PC in that case; console is not for you.
30 fps is not always the same. Check out Uncharted 4's 4K30 mode. It feels and looks very good.

Uncharted 4 never feels good... also, I have a PC ;)

Still, some games I usually get on console for ease of use. I will not buy 30fps games anymore, though. That shit has to die already, and it should result in a score deduction in reviews.
 
Uncharted 4 never feels good... also, I have a PC ;)

Still, some games I usually get on console for ease of use. I will not buy 30fps games anymore, though. That shit has to die already, and it should result in a score deduction in reviews.
So if a groundbreaking game comes out, like a Bloodborne 2 that's only on PS5... you will not buy it because it's 30?
Putting restrictions like that on yourself is just plain stupid.
You would get used to it if you wanted. Just give it 5 minutes, man.
 
So if a groundbreaking game comes out, like a Bloodborne 2 that's only on PS5... you will not buy it because it's 30?
Putting restrictions like that on yourself is just plain stupid.
You would get used to it if you wanted. Just give it 5 minutes, man.

Yep, I am boycotting everything 30fps. It's about principles here.

Also, with Sony going in on PC, there won't be many console-exclusive games anymore soon, hopefully.
 
Yep, I am boycotting everything 30fps. It's about principles here.

Also, with Sony going in on PC, there won't be many console-exclusive games anymore soon, hopefully.
That's a stupid boycott.
You are just removing yourself from the fun of games...
But whatever; you can play anything you want on a PC.
 
I think it really depends on the game, the engine, and how capable the devs are at optimising.

You have games like Elden Ring, which more often than not looks a bit dated and yet struggles to hit 60fps at 4K even on low settings with a 2070 Super. Or New World, which stutters like crazy no matter the settings once there are more than 10 players in your area, even with a modern 8-core/16-thread CPU. GTA IV, Arkham Knight and AC Unity, all at release?

Then you have games like Doom that easily run above 60fps at 4k on high settings. Or Gears 5 which also performs great or, for an open world game, RDR2 after you've tinkered for an hour with the 50 odd settings at your disposal.

Of course the lines between "poorly optimised" and "not super optimised" become blurry, and things aren't always clear cut. But it's generally not "because of 60fps". In the case of GT7, it probably wouldn't look like the other game if they went for 30fps (also, lol... 30fps for a racing game is a red flag), because they went for a completely different look: less cinematic and mind-blowing, but closer to reality, and hence a little more "stale", so to say, like reality sometimes is. Both games went for a realistic look, but Driveclub tends more to the cinematic side. 60 vs 30fps is not the reason, in my opinion.
 