
Why 30FPS was once acceptable, BUT ISN'T ANYMORE!... mostly...

8BiTw0LF

Consoomer
I am not doing that, only showing the shutter speed motion blur principle.
Launch a 30fps game like Uncharted 4 and do that test there in its 30fps mode, it's the same...
No it's not. Unless you're running Uncharted 4 on a 25Hz screen.
 

PC Gamer

Has enormous collection of anime/manga. Cosplays as waifu.
CEjOIa1.gif
 

Hugare

Gold Member
Any form of image reconstruction not named DLSS is shit. FSR is terrible. From a sub-HD base image? I can't even imagine.

About latency, ChatGPT hit the nail on the head:

"God of War Ragnarök's 30fps graphics mode has an input latency of around 170ms, while Halo 3 had 90ms of latency at 30fps on an Xbox 360. Vsync can also introduce input lag because it pre-renders frames, and some games hold multiple frames in the back buffer before displaying a new one."

Halo 3 had lots of post-processing that could add to the latency. In reality, what makes a huge difference most of the time is vsync.

You would be amazed how many games get vsync wrong, even on consoles. If you game on PC, it's basically mandatory to mess with vsync settings before playing, otherwise you'll get bad frame pacing and extra latency.

Cyberpunk is one example. Playing with the in-game vsync setting, it's downright unbearable due to the latency. And this is a problem you can find even in the console versions, which is absurd.

On PC I had to disable it in-game and then force it through the Nvidia Control Panel, and then, boom, way better latency.

This is probably what's going on with Jedi Survivor, I bet.
 
Last edited:

Filben

Member
Yeah, it hugely depends on the game. I tried to play Rise of the Tomb Raider and Shadow of the Tomb Raider on the PS4 with the 30fps cap, and there was such huge input lag I couldn't hit shit in combat. It felt literally broken to me, and the super weird acceleration curve made it worse, too. It just felt super sluggish, so I had to play them on PC at a minimum of 60fps. Uncharted on PS3, though, or 4 on PS4? No issues! In fact, object and camera movement felt smoother, too.

I've currently been playing Bayonetta Origins on the Switch, which is locked to 30fps, but character movement still feels snappy and immediate.

The Witcher 3, with its already huge input-to-animation delays? It's really bad in 30fps mode. And don't get me started on RDR2.

However, I think that on modern displays a bit of high-quality motion blur is needed at 30fps to prevent ghosting and the perceived stutter of objects moving over bright backgrounds.

Some developers clearly don't have controls in mind, or don't know any better, and put fidelity and animations (and input lag) before good, precise controls.

Cyberpunk is one example. Playing with the in-game vsync setting, it's downright unbearable due to the latency. And this is a problem you can find even in the console versions, which is absurd.
Another game I couldn't play at 30fps due to its ultra-high delay compared to other 30fps games.
 
Last edited:

PillsOff

Banned
If you can't make a game look good and run at 60fps on a 12 TF, Zen 2-powered machine, then you can go fuck yourself. I am not buying your game.

The only exception will be if you are doing some crazily demanding gameplay systems that tear that Zen 2 to pieces.
 
I look forward to TOTK selling 400 million copies because as it turns out the overwhelming majority of gamers don't give a fuck about framerates if the game is good.

Also BOTW feels infinitely smoother than OoT ever did, so there goes that argument
 

Pelao

Member
I don't think there has been a single game that only runs at 30fps on PS5 and Series X that has turned out to be any good, and not just because of the framerate. At this point it strikes me as a reliable red flag for an incompetent developer and a game to avoid.
 
I've played 30fps games just fine for years. The blur did not affect my enjoyment of the game.

Also, if you think about it, if you pause any frame of a movie, there will be motion blur in it, but the way you perceive it in motion is sharp and crisp. Why isn't it an issue in movies? People would literally be getting sick in the movie theater and movies would be unwatchable, right?

Also, I firmly believe 30fps can add a cinematic feel to a game that you simply cannot achieve at higher frame rates. It's like how some people prefer tube amps to solid-state amps: the imperfection is actually preferred over the sterile sound of solid state. Same thing with frame rate. Too much information can be bad and highlight issues like pop-in and other visual glitches that are better masked at 30fps.
 

rofif

Can’t Git Gud
I've played 30fps games just fine for years. The blur did not affect my enjoyment of the game.

Also, if you think about it, if you pause any frame of a movie, there will be motion blur in it, but the way you perceive it in motion is sharp and crisp. Why isn't it an issue in movies? People would literally be getting sick in the movie theater and movies would be unwatchable, right?

Also, I firmly believe 30fps can add a cinematic feel to a game that you simply cannot achieve at higher frame rates. It's like how some people prefer tube amps to solid-state amps: the imperfection is actually preferred over the sterile sound of solid state. Same thing with frame rate. Too much information can be bad and highlight issues like pop-in and other visual glitches that are better masked at 30fps.
Exactly. Each frame at 30fps is 33ms. That frame should contain the motion captured during the last 33ms. That's why movies look so smooth.
If you take a camera and crank the shutter speed way up (so helicopter blades and droplets of water freeze in place, like in a game) and then record at 30fps, it will look like jerky shit.
Frames need to capture motion over time. A frame should not be a still picture.

Of course there can be too much motion blur. The best games adjust the motion blur to the frame time, so at 30fps there is a lot of motion blur and at 240Hz there is almost none.
Doom 2016, for example, does that. At 240Hz + 240fps, enabling and disabling motion blur does nothing, because 240fps frames are so close together that there is enough data to convey full motion.
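
A rough sketch of that idea (hypothetical code, not any engine's actual implementation): scale the blur with the frame time, the way a camera shutter that stays open for a fixed fraction of each frame would.

```python
# Hypothetical sketch: motion blur that scales with frame time, like a
# camera whose shutter stays open for a fixed fraction of each frame
# (shutter_fraction = 0.5 is the classic "180-degree shutter").

def blur_length_px(velocity_px_per_s: float, fps: float, shutter_fraction: float = 0.5) -> float:
    """Length of the blur streak, in pixels, for something moving at
    velocity_px_per_s across the screen at the given frame rate."""
    frame_time_s = 1.0 / fps                      # 30fps -> ~33ms, 240fps -> ~4ms
    exposure_s = frame_time_s * shutter_fraction  # how much motion one frame integrates
    return velocity_px_per_s * exposure_s

# The same camera pan leaves a much longer streak per frame at 30fps than
# at 240fps, which is why per-frame blur matters less as the frame rate rises.
for fps in (30, 60, 240):
    print(fps, round(blur_length_px(1200.0, fps), 1), "px")
```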
 

Soosa

Banned
READ THE FUCKING POST BEFORE POSTING NON-ARGUMENTS PLEASE.

This thread is inspired by some of the comments I read in this thread:
where people talk about playing old games that ran at super low framerates and having no issues with that...
And I too am one of them; I even recently played through Disaster Report on XBSX2, a PS2 game that has a 20fps lock and even drops below that lock in some scenes...
I had no issues playing it... but why do I hate 30fps in modern games, when I play 20fps retro games just fine?

The quick explanation would be:
1: Framerate is not the issue.
2: Input lag and visual noise are.

Modern games feel worse at low framerates than older games did.
Example: Star Wars Jedi Survivor at 30fps feels way worse than Ocarina of Time at 20fps.
And the reasons for this are mainly how modern engines work and how modern games are rendered.


But I will elaborate on these a bit, otherwise what kind of OP would this be? :D


First of all, input lag:
Many modern engines have an insufferable amount of input lag, and the games aren't designed with it in mind either.
Unreal Engine is especially bad with this, but it is far from the only one... God of War's current engine has the same issue, as do CryEngine and many others.
They don't inherently have these issues, but the way almost all developers use them is the problem: they rely on the default ways these engines read inputs, render images and handle vsync.

Ocarina of Time on the N64 had less input lag at 20fps than almost any modern game has at 30fps... and for some games, even the 60fps modes have higher input lag than 20fps Ocarina of Time... this is not an exaggeration, it's absolutely true.

And this is not down to modern TVs either. In fact, if you have a modern Samsung TV from the last two production years, for example, and you play at 120fps on it, the input lag of your screen is lower than the latency of a CRT. Yes, CRTs also have input lag: it comes from how they draw the image from top to bottom 60 times each second, which means one image takes 16.6ms to be drawn. Input lag is usually measured at the center of the image, so by that standard a CRT has at least 8.3ms of input lag, while a modern 120Hz Samsung TV has ~5ms of input lag, and about 9ms at 60Hz.
The CRT input lag advantage has basically been erased now; we are so close to it even on TV screens that it's no longer a factor, let alone on high-end PC monitors.
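
To sanity-check those numbers, here is the arithmetic from the paragraph above as a tiny script (scanout only; it ignores whatever processing the TV adds on top):

```python
# Scanout delay to the middle of the screen for a display that draws the
# image top-to-bottom at a given refresh rate -- the figure the CRT
# comparison above is based on.

def mid_screen_scanout_ms(refresh_hz: float) -> float:
    frame_ms = 1000.0 / refresh_hz   # time to draw one full frame
    return frame_ms / 2.0            # the center of the image is reached halfway through the scan

print(round(mid_screen_scanout_ms(60), 1))    # ~8.3 ms for a 60Hz CRT
print(round(mid_screen_scanout_ms(120), 1))   # ~4.2 ms of scanout at 120Hz
```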

And this increase in latency isn't only obvious when comparing super old games to super new games.
New games still have a shitload of variation, with games like Call of Duty Modern Warfare 2/Warzone 2.0 having input lag so low that they compete with old SNES games!
We are talking ~40ms of latency at 60fps, and below 30ms at 120fps, which is lower than most 8- and 16-bit games ever dreamt of reaching.
We could compare some Xbox 360 games that ran at 30fps to some modern games that run at 60fps, and Halo 3 would win... God of War Ragnarök at 60fps, funnily enough, has about the same latency as Halo 3 at 30fps: around 90ms for both (+/- 5ms).

So even though there are modern games that absolutely crush it when it comes to latency, like Call of Duty, and there are of course old games like Killzone 2 that were infamous for their high latency, the sad pattern is that latency is going up.
We are now in a spot where a bulk of modern titles have the same or even more input lag than Killzone 2 had, a game that was panned at the time for how awful the aiming felt due to its lag.
Coming back to God of War Ragnarök: that game, in its 30fps Graphics Mode, has an input latency of around 170ms. Killzone 2's latency was in the 150ms ballpark!!!

So let that sink in for a moment... during Gen 7, Killzone 2 got massive bad PR for having 150ms of latency... and a modern game like God of War Ragnarök easily exceeds that!
Meanwhile Halo 3 had 90ms of latency at 30fps on an Xbox 360, and Ragnarök has about the same amount of latency in its 60fps Performance Mode.
In God of War's case it seems to be mostly the vsync that is the issue, since as soon as you use the unlocked VRR mode, which deactivates vsync, the latency shrinks down to 90ms, on par with Halo 3 and the game's own 60fps vsync mode.

Why does vsync introduce input lag? Because it pre-renders (buffers) frames. And some games take this to another level, usually to smooth out the framerate because they are extremely demanding on the GPU: they hold multiple frames in the back buffer before displaying a new one, which basically gives them a fallback frame under most circumstances and keeps the perceived framerate consistent at the cost of input lag.
Sea of Thieves is, in fact, the only game I know of that actually lets the user choose how many frames it should pre-render.
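
As a toy illustration of why that buffering hurts so much at 30fps specifically (made-up numbers, just the queueing logic, not any real engine's pipeline):

```python
# Toy model: every frame sitting in the queue between "input sampled" and
# "frame displayed" adds roughly one frame-time of latency.

def queue_latency_ms(fps: float, frames_in_flight: int) -> float:
    frame_ms = 1000.0 / fps
    return frame_ms * frames_in_flight  # each queued frame waits ~1 frame before it is shown

# At 30fps each extra buffered frame costs ~33ms, so "one more frame of
# buffering" hurts far more there than it does at 60 or 120fps.
for fps in (30, 60, 120):
    print(fps, [round(queue_latency_ms(fps, n), 1) for n in (1, 2, 3)])
```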


Secondly, how games build their final image:
Modern games have tons of high frequency details, and modern games are expected to be played on high resolution displays.
These 2 factors are the reason many games now use aggressive TAA, and more and more games use one of the many forms of Upsampling to higher resolutions from a lower base resolution.

These 2 things both lead to the same issue: a muddy, hard-to-parse image in motion.

Then a third factor comes in, MOTION BLUR *scary thunder noises*
And Motion Blur adds even more visual noise and muddies the final image even more.

So a modern game will often look pristine and super clean when you hold the camera completely still, but as soon as your character, and especially the camera, moves at all... MUD AND SMEARING.
And funnily enough, we have an old game as an example here again, one that at the time got a lot of flak for having the same issues that are nowadays so common that no one even really talks about them in reviews anymore...
And that game is Halo Reach.

Halo Reach was an early example of a game that used a form of Temporal Anti-Aliasing... TAA.
And TAA, at the time Halo Reach was made, was still firmly in its infancy, exhibiting a lot of issues and visual noise. So if you stood completely still in Halo Reach, it looked GLORIOUSLY smooth and clean... move the camera and it was a jittery mess.

These days TAA has gotten a lot better, but it still has issues with ghosting and clearly visible fizzle and disocclusion artifacts the moment something on screen moves.
But of course, TAA isn't the shiny new thing anymore... we now have FSR2, TSR and many other proprietary methods from different developers...
And these upsampling methods basically bring back almost all the issues Halo Reach had, and then some!

When you play Ocarina of Time at 20fps on an N64 connected to a CRT, motion is clean! You can see motion clearly, there are no artifacts from anything moving too fast, there is no blur reducing the legibility of the action happening around you, and there's no motion blur trying to make it seem smoother than it is.

"but why is this a 30fps problem mister Y" is what you are now asking I bet!
WELL, TAA, FSR2, TSR, CBR, ADD UPSAMPLING METHOD HERE... and Motion Blur, all of these rely on image data from previous frames!
THE LESS IMAGE DATA, THE MUDDIER THE IMAGE IS IN MOTION!
So, at 30fps these modern upsampling and anti aliasing methods have less temporal data available to them to create their final image!
A game running at 1440p FSR2 Quality Mode will look sharper and have less artifacts when running at 60fps than it would have at the same settings but locked to 30fps.
So the lower the framerate, the more "smear" the more "mud" and the more "fizzle" there is.

So in the end, all of this adds up.
The motion blur, the image reconstruction, the anti-aliasing: all of it gets worse the lower the framerate is.
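
A very rough sketch of why the framerate matters so much for these temporal techniques. This is just an illustrative exponential history blend, not FSR2's or any specific TAA's actual math: the history reaches a fixed number of frames back, so at 30fps it spans twice as much real-world motion as at 60fps, and anything that moved during that window either ghosts or gets rejected and turns into fizzle.

```python
# Illustrative only: a simple exponential history blend like the one at the
# core of most temporal AA / temporal upsamplers. "alpha" is how much of the
# new frame is blended into the accumulated history each update.

def history_span_frames(alpha: float = 0.1) -> float:
    """Rough number of past frames that still contribute meaningfully
    to the history (time constant of the blend)."""
    return 1.0 / alpha

def history_span_ms(fps: float, alpha: float = 0.1) -> float:
    return history_span_frames(alpha) * 1000.0 / fps

# The accumulator reaches ~10 frames back either way, but those frames cover
# twice as much camera motion at 30fps as at 60fps -- roughly double the
# ghosting/disocclusion risk for the same movement.
for fps in (30, 60, 120):
    print(fps, round(history_span_ms(fps)), "ms of accumulated history")
```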

This is why Jedi Survivor in motion looks like this (I purposefully took a shot from a spot where the game locks to 60fps to give it the best chances):
starwarsjedisurvivorm7ifc.png


and F-Zero GX like this (I'm turning sharp left here to give it the worst chances with fast camera and vehicle movement):
fzeroscreend5cph.png



Normalized for same image size so you don't have to zoom as much on mobile (not zoomed in in any way, just cropped)
swscreenzoomxlfwx.png
fzeroscreenzoom7ocx7.png



And the lower the framerate, the worse all the weird grainy, blurry, fizzly artifacts you see in Jedi Survivor will get. Meanwhile, if you ran F-Zero GX at 10fps, it would look literally the exact same in terms of image quality and artifacts (because it has no artifacts).
And we haven't even touched motion blur, which is off in my screenshot here, along with all the other image filters.

If a game like Jedi Survivor runs at a lower framerate, the amount of data the engine has to draw the image from one frame to the next is reduced, the image gets muddier and less readable, and you will see blur in motion even though motion blur is off.


Smaller issues that aren't technical but due to Art Design of games:
Older games have different considerations when it comes to art (textures, models, scale) than modern games.
Older consoles couldn't handle as much detail as newer consoles can. This means textures had less "visual noise", so there is less for your eyes to get hung up on and less for your brain to process.
In motion, when running through a level or turning the camera, this means the things that are important, or could be important, as well as your general environment, are easier to digest.

You will have an easier time spotting where an enemy, a button or a rope is when it's simpler in design, bigger in scale and stands out more. And all of these are features older graphics had!
A red button that is unrealistically big, on a wall with a simple flat texture and no shadows or reflections, is easier to see than a realistically scaled red button on a wall with detailed textures, the shadows of adjacent objects and other lighting applied.

So even if older games had the graphical issues I explained above, the simpler graphics would still make it way easier on your eyes, because there is not as much small detail that can get lost in all the blur, fizzle and breakup.

In short: Less Detail + Scale = Easier to read environments even at low framerates.



SO IN CONCLUSION:
Playing a game like Ocarina of Time will take some getting used to at first, but once your eyes have adjusted to the 20fps output, the image will look pretty decent and clean.
Playing Jedi Survivor at 60fps already makes the whole image look muddy and blurry due to the artifacting of the upsampling. At 30fps this only gets worse; add motion blur to negate the stutter and you'll barely see anything in motion.

Playing Ocarina of Time at 20fps on real hardware will feel... not amazingly responsive, but more responsive than many modern games like... Jedi Survivor... or Redfall... or God of War Ragnarök feel at 30fps, hell some of them even at 60fps!

So you cannot point the finger at an old game and say "LOOK! Back then we had no issues with this! This also ran at a super low framerate!" without taking into account how modern game engines work, how they react to player input, and how they construct their final output image.

Old games had simple graphics; they were easier for your eyes to parse because the important stuff was exaggerated in scale and shape, and they had no motion blur, no reconstruction artifacts, and no TAA or screen-space effects that lag behind and take time to accumulate data. They had simpler engines with less latency.
And that is why they look cleaner and play better at low framerates than new games.

And yes, there are modern games and old games that are outliers here. Some modern ones, like From Software's games, have pretty low latency, and some old games, like Mortal Kombat on the Game Boy, have massive amounts of lag.
And some old games had weird fake motion blur which made the image hard to read, while some modern games have clean images with barely anything muddying them in motion.
The difference is that the bad examples are far more common today, and the good examples were far more common in the past.

CRTs and LCDs are also "easier on the eyes" thanks to their motion blur. 30fps is just fine on them.

On OLED, 30fps is literally unplayable for some, like me. On OLED, 30fps has micro lag and makes me sick almost instantly.
 

01011001

Banned
CRTs and LCDs are also "easier on the eyes" thanks to their motion blur. 30fps is just fine on them.

On OLED, 30fps is literally unplayable for some, like me. On OLED, 30fps has micro lag and makes me sick almost instantly.

Blur kills the readability of the game.

So either you have stutter on a screen with fast pixel response,
or you have blur that smears detail, not only lowering graphical fidelity in motion but also getting in the way of quickly parsing your surroundings.
 

Edgelord79

Gold Member
The older I get, the less visuals impress me. What does impress me is fluidity and something that is snappy and responsive.

There was a time when visuals were all I cared about so I don’t begrudge people this.

All things being equal, 60fps will always play better than 30fps, even if the difference is minute.
 

NeoIkaruGAF

Gold Member
Also, if you think about it, if you pause any frame of a movie, there will be motion blur in it, but the way you perceive it in motion is sharp and crisp.
No it isn't. Not without black frame insertion or frame interpolation. Watch any TV program or a movie on your TV, of course there's blur in motion and it's perfectly visible. Watch people's eyes get blurry when they turn their heads, for example.

It happens at the movie theater too. It's just that the cinema screen is dimmer and less sharp than your 4K TV, and it's not as jerky as an OLED screen. If cinema used giant OLEDs people would be getting motion sickness left and right, count on it. I got it very quickly on my OLED TV with an ultra-shaky cam movie like Man of Steel. Imagine that blown up in an OLED the size of a movie theater screen...
 

hussar16

Member
If we still gamed on CRTs or plasmas it would be very acceptable. The issue is not fps but panels that can't produce clear motion.
 
No it isn't. Not without black frame insertion or frame interpolation. Watch any TV program or a movie on your TV, of course there's blur in motion and it's perfectly visible. Watch people's eyes get blurry when they turn their heads, for example.

It happens at the movie theater too. It's just that the cinema screen is dimmer and less sharp than your 4K TV, and it's not as jerky as an OLED screen. If cinema used giant OLEDs people would be getting motion sickness left and right, count on it. I got it very quickly on my OLED TV with an ultra-shaky cam movie like Man of Steel. Imagine that blown up in an OLED the size of a movie theater screen...
Funny, because movie projectors don't have the poor response times that help blend frames together, so yes, a movie theater would be just like watching an OLED.
 

Ulysses 31

Gold Member
Blur kills the readability of the game.

So either you have stutter on a screen with fast pixel response,
or you have blur that smears detail, not only lowering graphical fidelity in motion but also getting in the way of quickly parsing your surroundings.
Or use some low-level motion interpolation like Game Motion Plus on Samsung TVs. BFI would be the best solution if it didn't come with such a brightness hit. Maybe MicroLED will solve that.
 

Represent.

Represent(ative) of bad opinions
Sorry OP, but you're flat-out wrong.

A Plague Tale: Requiem

30 Fidelity

image



60 Performance
image



60fps mode is missing a whole goddamn forest compared to the 30fps mode.

And it runs at sub 1080p.

This is fucking garbage.

I hope not a single dev moving forward wastes their time on a "performance" mode on these base consoles. Yes, let's sacrifice literally everything else to make our game run at 60fps. Stop.

30fps:

image



60fps:
image




Enough.

This "every game must be 60fps" is nonsense and holding back the industry.

If you MUST HAVE 60fps games, buy a PC. Or wait for PS5 PRO.

Stop forcing such ugliness on the rest of us. I have HAD it :messenger_tears_of_joy:

60fps and everything else is worse.
or
30fps and everything else is better.

The answer is clear and obvious.

Thanks to Bojji for the pics
 
Last edited:

proandrad

Member
Sorry OP, but you're flat-out wrong.

A Plague Tale: Requiem

30 Fidelity

image



60 Performance
image



60fps mode is missing a whole goddamn forest compared to the 30fps mode.

And it runs at sub 1080p.

This is fucking garbage.

I hope not a single dev moving forward wastes their time on a "performance" mode on these base consoles. Yes, let's sacrifice literally everything else to make our game run at 60fps. Stop.

30fps:

image



60fps:
image




Enough.

This "every game must be 60fps" is nonsense and holding back the industry.

If you MUST HAVE 60fps games, buy a PC. Or wait for PS5 PRO.

Stop forcing such ugliness on the rest of us. I have HAD it :messenger_tears_of_joy:

60fps and everything else is worse.
or
30fps and everything else is better.

The answer is clear and obvious.

Thanks to Bojji for the pics
Horizon FW is a massive open world and it has a performance mode that looks better than both of those screenshots. If anything, Unreal Engine is the reason for those busted-looking 60fps games.
 

Represent.

Represent(ative) of bad opinions
Horizon FW is a massive open world and it has a performance mode that looks better than both of those screenshots. If anything, Unreal Engine is the reason for those busted-looking 60fps games.
Horizon FW also looks much better in fidelity mode, especially the new DLC.

And HFW is a first party game, made by arguably the most technically gifted studio on the planet.

Not many run-of-the-mill 3rd party devs are going to be able to match what Guerrilla Games does with their performance modes.

As the gen grows older and games become more visually demanding... most devs are simply going to be wasting our time with their 60fps modes. They will be sub-1080p trash and missing way too many details.

Moving forward, on base consoles, 40fps will be the new performance mode.

60fps will be for the PRO consoles or PC.
 

proandrad

Member
Horizon FW also looks much better in fidelity mode, especially the new DLC.

And HFW is a first party game, made by arguably the most technically gifted studio on the planet.

Not many run-of-the-mill 3rd party devs are going to be able to match what Guerrilla Games does with their performance modes.

As the gen grows older and games become more visually demanding... most devs are simply going to be wasting our time with their 60fps modes. They will be sub-1080p trash and missing way too many details.

Moving forward, on base consoles, 40fps will be the new performance mode.

60fps will be for the PRO consoles or PC.
First, I would argue the fidelity mode only looks slightly better in screenshots and the performance mode looks miles better in motion. Horizon’s performance mode also looked like trash on release and it took months of patching for it to look great; proving that if the game just got delayed for polish both modes would have been great and everyone would have been happy.
 

Represent.

Represent(ative) of bad opinions
First, I would argue the fidelity mode only looks slightly better in screenshots and the performance mode looks miles better in motion. Horizon’s performance mode also looked like trash on release and it took months of patching for it to look great; proving that if the game just got delayed for polish both modes would have been great and everyone would have been happy.
The fidelity mode looks better in motion too; it's way sharper thanks to the higher resolution. There is also better lighting and more effects.

30fps Fidelity mode is the way GG decided to advertise their own game... Because it simply looks better (in screens and in motion).



I'm all for 60fps modes... but NOT if it means games have to be delayed by 6-7 months and have everything else cut back, just for a sub-1080p image with half the detail. It's simply not worth it.

That's what the PRO consoles are for.
 

01011001

Banned
Just jumping back into this thread to highlight Zelda Tears of the Kingdom.

Because it's a perfect example of a game where 30fps does indeed work reasonably well, which is why my thread title has the "... mostly..." part at the end.

Zelda is very old-fashioned:
a classic double-buffered vsync setup, no motion blur, no TAA or any temporal upsampling.

Double-buffered vsync means relatively low input lag; these days you don't see double buffering all that often anymore. You usually see triple buffering, because it improves performance at the cost of latency.

No motion blur means no smearing in motion that further muddies the image during camera turns.

No TAA and no temporal accumulation of any kind means no artifacting during disocclusion, no fizzle, and no additional blur in motion, all of which get worse the lower the framerate goes, hence why modern 30fps usually SUCKS.


Playing Jedi Survivor in resolution mode and then following it up with TotK will instantly show you how much more playable Zelda is in comparison. Your inputs register about twice as fast, your vision isn't constantly filled with blur and artifacts, and the frame pacing feels smoother thanks to the far better, more consistent frame delivery.


Additionally, it helps further if you play it in handheld mode.
In handheld mode your screen will usually cover less of your field of view, which means the steps between frames when something moves look smaller to your eyes, which in turn reduces the feeling of stutter. That's also a factor that comes into play when people talk about PS1-era 30fps games, and one I didn't cover in my OP.

TVs in the 90s were generally much smaller; even if you sat right in front of your TV, the screen-to-FOV ratio was smaller, at which point the effect mentioned above comes into play, just like on the smaller handheld screen.
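
A quick back-of-the-envelope for that (illustrative numbers only): the apparent jump per frame during a pan depends on how many degrees of your field of view the screen covers, so the same 30fps pan takes visibly smaller angular steps on a small handheld screen than on a big TV.

```python
# Illustrative: angular step per frame for a camera pan that crosses the
# whole screen in one second, as seen from the player's viewing position.

def degrees_per_frame(screen_fov_deg: float, fps: float, screen_widths_per_s: float = 1.0) -> float:
    # pan speed in degrees of the viewer's field of view per second, divided by frame rate
    return screen_fov_deg * screen_widths_per_s / fps

print(round(degrees_per_frame(50.0, 30), 2))  # big TV filling ~50 deg of your view: ~1.7 deg jumps
print(round(degrees_per_frame(15.0, 30), 2))  # handheld filling ~15 deg: ~0.5 deg jumps
```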
 
Last edited:

rofif

Can’t Git Gud
Just jumping back into this thread to highlight Zelda Tears of the Kingdom.

Because it's a perfect example of a game where 30fps does indeed work reasonably well, which is why my thread title has the "... mostly..." part at the end.

Zelda is very old-fashioned:
a classic double-buffered vsync setup, no motion blur, no TAA or any temporal upsampling.

Double-buffered vsync means relatively low input lag; these days you don't see double buffering all that often anymore. You usually see triple buffering, because it improves performance at the cost of latency.

No motion blur means no smearing in motion that further muddies the image during camera turns.

No TAA and no temporal accumulation of any kind means no artifacting during disocclusion, no fizzle, and no additional blur in motion, all of which get worse the lower the framerate goes, hence why modern 30fps usually SUCKS.


Playing Jedi Survivor in resolution mode and then following it up with TotK will instantly show you how much more playable Zelda is in comparison. Your inputs register about twice as fast, your vision isn't constantly filled with blur and artifacts, and the frame pacing feels smoother thanks to the far better, more consistent frame delivery.


Additionally, it helps further if you play it in handheld mode.
In handheld mode your screen will usually cover less of your field of view, which means the steps between frames when something moves look smaller to your eyes, which in turn reduces the feeling of stutter. That's also a factor that comes into play when people talk about PS1-era 30fps games, and one I didn't cover in my OP.

TVs in the 90s were generally much smaller; even if you sat right in front of your TV, the screen-to-FOV ratio was smaller, at which point the effect mentioned above comes into play, just like on the smaller handheld screen.
Motion blur would help even more.
And I don't know what TAA games you've been playing, but it's not 2015. TAA can be spotless in motion.
Double-buffered vsync is the slow one.
Triple buffering is the good, fast one.
I doubt it is using any of these if it feels responsive.

Generally, TAA and motion blur do not impact how a game plays. They can only make it look smoother when done well. I don't like 30fps without TAA, personally.

Zelda feels more responsive because they got input lag down to a low level.
 

01011001

Banned
Motion blur would help even more.
And I don't know what TAA games you've been playing, but it's not 2015. TAA can be spotless in motion.
Double-buffered vsync is the slow one.
Triple buffering is the good, fast one.
I doubt it is using any of these if it feels responsive.

Generally, TAA and motion blur do not impact how a game plays. They can only make it look smoother when done well. I don't like 30fps without TAA, personally.

Zelda feels more responsive because they got input lag down to a low level.

Motion blur is disgusting. No game should ever use it... I'm OK with speed-based motion blur in racing games, but that's about it; anything else, fuck that.

And TAA isn't even remotely perfect in any game, which is one of the reasons DLSS on PC often looks "better than native". What people actually mean by that is "better than native + TAA".
TAA is so shit even now that running the game at a lower resolution and letting a deep learning algorithm clean up the image results in better motion clarity.

Just watch the latest Digital Foundry video on Control's enhanced mod, where you can see modern TAA in all its motion-artifacting glory.

Doom Eternal's TAA is so bad that even DLSS Balanced and Performance modes rival native + TAA in motion and image clarity.

And motion clarity is key.
Video games are an interactive medium. If you add things that inhibit the player's ability to make decisions in the name of "presentation", you make the experience worse.
Adding motion artifacts of any kind, and that includes motion blur, makes your vision less clear and therefore lowers your ability to make moment-to-moment decisions.

Unreasonable input lag + motion blur + accumulation artifacts = 🤮
And most modern games have all of these. Zelda doesn't have a single one, and it plays way better as a result, just like the PS1, PS2/Xbox/GC and early PS360-era games that ran at 30fps.
 
Last edited:

Minsc

Gold Member
Sorry OP, but you're flat-out wrong.

A Plague Tale: Requiem

30 Fidelity

image



60 Performance
image



60fps mode is missing a whole goddamn forest compared to the 30fps mode.

And it runs at sub 1080p.

This is fucking garbage.

I hope not a single dev moving forward wastes their time on a "performance" mode on these base consoles. Yes, let's sacrifice literally everything else to make our game run at 60fps. Stop.

30fps:

image



60fps:
image




Enough.

This "every game must be 60fps" is nonsense and holding back the industry.

If you MUST HAVE 60fps games, buy a PC. Or wait for PS5 PRO.

Stop forcing such ugliness on the rest of us. I have HAD it :messenger_tears_of_joy:

60fps and everything else is worse.
or
30fps and everything else is better.

The answer is clear and obvious.

Thanks to Bojji for the pics

I understand there's less foliage, but the solution there is super simple for me: just imagine the world is a bit more desert-like. It's not like I'll play it flipping back and forth - I'd just set it to 60 (because 30 is literally unplayable for me if it's anything like GotG) and ignorance is bliss. The missing foliage just adds to the desert environment. Thank god for options, because GotG is one of my favorite games in a while and I don't think I'd have even played it if it were missing the performance settings.
 

Bo_Hazem

Banned
Yeah, it's input lag. Demon's Souls' 30fps mode has 75ms more input lag than Bloodborne. Seriously, install both and test it. BB feels great.
It's the modes. 30fps needs to be properly implemented, not vsynced crap.
Uncharted 4 does it well: good input lag and heavy, high-quality motion blur. Forbidden West's 30fps also feels very good but could use better motion blur. OLED is great because it has no lag, but it requires great motion blur, otherwise it looks jerky. Uncharted 4 has that good motion blur.

So you are completely wrong about motion blur. It is essential and required for OLED. It adds smoothness in motion. Each frame should be a capture of the time that passed, not a still.
Don't mix this up with FSR and visual artifacts.

Remember that 60 to 30fps is only a ~16.7ms difference in frame time. There's no reason for games to have 100ms worse input lag like some do.

Again: motion blur is essential to avoid jerky motion and stutter on OLED. Unless you are running at 240Hz, then you have enough frame data.


Yeah, mate. Dealing with the 180 shutter rule in videography is a headache with dodgy lighting, but the results are more than worth it.
 

JeloSWE

Member
Sorry OP, but you're flat-out wrong.

A Plague Tale: Requiem

30 Fidelity

image



60 Performance
image



60fps mode is missing a whole goddamn forest compared to the 30fps mode.

And it runs at sub 1080p.

This is fucking garbage.

I hope not a single dev moving forward wastes their time on a "performance" mode on these base consoles. Yes, let's sacrifice literally everything else to make our game run at 60fps. Stop.

30fps:

image



60fps:
image




Enough.

This "every game must be 60fps" is nonsense and holding back the industry.

If you MUST HAVE 60fps games, buy a PC. Or wait for PS5 PRO.

Stop forcing such ugliness on the rest of us. I have HAD it :messenger_tears_of_joy:

60fps and everything else is worse.
or
30fps and everything else is better.

The answer is clear and obvious.

Thanks to Bojji for the pics
I'm so tired of all the 30fps advocates. You think that the 30fps Fidelity mode looks good. Imagine what they could do at 15fps.
 

FunkMiller

Gold Member
I can’t stand 30 fps on Ps5 or Xbox on my LG CX. Feels like a flip book after 60 fps.

Weirdly though, Tears Of The Kingdom feels absolutely fine at 30 fps. Have no idea why. In fact, Switch games in general feel ok.
 

dcx4610

Member
It was acceptable because it was literally a hardware limitation. Over time, people conflated film being 24fps with 30fps games feeling "cinematic". With film, that was also a limitation, but it was a limitation for so long that people just got used to that look. Now anything above 24fps looks weird and fake to them.

With games, though, input and timing are involved. It actually affects the experience of playing. Most games on the NES were 60fps; it was only when we got into the 3D era that it became too hard to push 60fps. There is not a single video game that feels or works better at 30fps vs. 60fps. It would be like saying you prefer the slowdown in older games vs. not having it.
 

Represent.

Represent(ative) of bad opinions
I'm so tired of all the 30fps advocates.
Believe me, I'm WAY MORE sick of all the 60fps-or-bust framerate warriors, willing to sacrifice literally everything else over frame rate. Insanity.
You think that the 30fps Fidelity mode looks good.
It does. Objectively.
Imagine what they could do at 15fps.
No need. 30fps is the best of both worlds. 60fps games are fucking ugly on these consoles, running at sub-1080p... in 2023. Ridiculous.
 

rofif

Can’t Git Gud
I'm so tired of all the 30fps advocates. You think that the 30fps Fidelity mode looks good. Imagine what they could do at 15fps.
And I am tired of all the "oh, I am 12 and I just discovered 60fps" people.
There are more things to gaming than 60fps. Other frontiers some might find important.
I am more accepting of 30fps because I have been playing video games for so long, not because 60fps is some new thing.
I was gaming on a 120Hz CRT over 20 years ago. It's nothing new, and with some time it really doesn't fucking matter. I want to play the game and be wowed by it. I remember the graphics, not that the game ran at 30fps (unless it's some shitty, sluggish 30fps like we get this gen).

You have to realize that people who just want to play the game and be wowed by it, and are happy to accept 30fps, don't always do that because they can't value 60fps. It can be a choice.

If there was a 720p 60fps mode in TLOU2 on PS4? Fuck no, I would play that game at 1080p30.
 

Wildebeest

Member
Interesting. Do game developers ever think about using game engines that don't totally suck and make them look bad?
 
Last edited:

Bry0

Member
I can’t stand 30 fps on Ps5 or Xbox on my LG CX. Feels like a flip book after 60 fps.

Weirdly though, Tears Of The Kingdom feels absolutely fine at 30 fps. Have no idea why. In fact, Switch games in general feel ok.
I've been playing a lot of PS3 and 360 games on my LG CX lately and I notice a similar effect. Horizon FW looks incredibly stop-motion juddery in its 30fps mode, but a lot of these PS3 30fps games don't look quite as low-framerate despite also being 30fps. I kind of wonder if the lack of visual clarity helps too.
 

JeloSWE

Member
And I am tired of all the "oh, I am 12 and I just discovered 60fps" people.
There are more things to gaming than 60fps. Other frontiers some might find important.
I am more accepting of 30fps because I have been playing video games for so long, not because 60fps is some new thing.
I was gaming on a 120Hz CRT over 20 years ago. It's nothing new, and with some time it really doesn't fucking matter. I want to play the game and be wowed by it. I remember the graphics, not that the game ran at 30fps (unless it's some shitty, sluggish 30fps like we get this gen).

You have to realize that people who just want to play the game and be wowed by it, and are happy to accept 30fps, don't always do that because they can't value 60fps. It can be a choice.

If there was a 720p 60fps mode in TLOU2 on PS4? Fuck no, I would play that game at 1080p30.
This discussion will never end. Some are happy with low fps and high resolution. Some can't stand low fps and are willing to sacrifice fidelity for fluidity. It's great that we have the option for both on consoles. I personally will always choose 60fps over visual fidelity. Each to their own. If I can, I play games at 120fps on my monster computer, but I will play a game at 30fps if there is absolutely no other way around it and the game is absolutely unmissable. That said, after buying TOTK, I couldn't stand the 30fps and low resolution on my rather big 85" TV and tried it on PC emulation, and it's freaking glorious when you have the option for high resolution AND 60fps. I remember BOTW being more acceptable on my previous 55", but this time around it was painful. I also come from an FPS and fighting-game background, and the fluidity of a minimum of 60fps is the most important thing to me. Again, it's good that we have options.
 
Last edited:

rofif

Can’t Git Gud
This discussion will never end. Some are happy with low fps and high resolution. Some can't stand low fps and are willing to sacrifice fidelity for fluidity. It's great that we have the option for both on consoles. I personally will always choose 60fps over visual fidelity. Each to their own. If I can, I play games at 120fps on my monster computer, but I will play a game at 30fps if there is absolutely no other way around it and the game is absolutely unmissable. That said, after buying TOTK, I couldn't stand the 30fps and low resolution on my rather big 85" TV and tried it on PC emulation, and it's freaking glorious when you have the option for high resolution AND 60fps. I remember BOTW being more acceptable on my previous 55", but this time around it was painful. I also come from an FPS and fighting-game background, and the fluidity of a minimum of 60fps is the most important thing to me. Again, it's good that we have options.
If the 30fps option wasn't so terrible this gen, it would be better... but graphics enthusiasts are left with 30fps modes with extreme input lag, +75ms compared to what it was on PS4 in some cases.
I am not advocating for that, honestly.
For me, it was best when there were NO MODES at all. I just want to launch the game and go. If I know there are no options, I will admire the game for what it is and how it plays. If there are modes, I will often feel unsatisfied by the lagginess of one mode or the blurriness of another, even if I would have been happy had the game shipped with only the performance mode (in a no-modes scenario).
So for me: no modes, but every game should offer a VRR fps unlock with the ability to set system-level locks to 30, 40 or 60. That way, when new PRO consoles come in the future, you can unlock the fps or lock it to 40, and so on.
 

diffusionx

Gold Member
If the 30fps option wasn't so terrible this gen, it would be better... but graphics enthusiasts are left with 30fps modes with extreme input lag, +75ms compared to what it was on PS4 in some cases.
I am not advocating for that, honestly.
For me, it was best when there were NO MODES at all. I just want to launch the game and go. If I know there are no options, I will admire the game for what it is and how it plays. If there are modes, I will often feel unsatisfied by the lagginess of one mode or the blurriness of another, even if I would have been happy had the game shipped with only the performance mode (in a no-modes scenario).
So for me: no modes, but every game should offer a VRR fps unlock with the ability to set system-level locks to 30, 40 or 60. That way, when new PRO consoles come in the future, you can unlock the fps or lock it to 40, and so on.
The console graphics modes have been a total disaster for console gaming and it's just getting worse. You buy consoles and console games to get away from that garbage. If I want to analyze framerates, IQ and the performance impact of settings toggles, I'll buy the PC version. It's up to the devs to, you know, design the game, and that entails making decisions to make the game as good as possible. This includes how it looks.
 
Last edited:

rofif

Can’t Git Gud
The console graphics modes have been a total disaster for console gaming and it's just getting worse. You buy consoles and console games to get away from that garbage. If I want to analyze framerates, IQ and the performance impact of settings toggles, I'll buy the PC version. It's up to the devs to, you know, design the game, and that entails making decisions to make the game as good as possible. This includes how it looks.
It is to my detriment to play a PS5 game without looking at a DF video first.
Devs are just fucking lazy and treat the console as a mid-range PC... no optimization, no monitoring tools, which would be great if I am supposed to make a choice...
And very often, almost always, the modes are mislabeled or ambiguously named. Some examples:
-Just "Fidelity mode". It can be high resolution without RT... or low res but with all the bells and whistles. No info.
-Fidelity mode can be named Quality mode... or High Resolution mode. None of these mean anything and they can even be misleading.
-Performance mode is sometimes not even 60fps, resulting in worse feel. They do not tell you what you are sacrificing with performance mode. Is it RT? Shadows? Resolution? No idea, play the guessing game.
-Some games have all of these with no info, just labels: Perf, Quality, HFR Quality, HFR Performance, 120Hz mode, VRR mode...
-Some games have HIDDEN modes only accessible if you enable or disable 120Hz and/or VRR in the PS5 system settings and relaunch the game... then suddenly a quality or performance mode is unlocked, or a 40fps mode appears. Like GT7 or the Insomniac games - but in their defense, Insomniac at least has a clear description that this is happening.

Many, many more examples. Of course there are good uses of modes too. I like the 40fps stuff a lot, but it is often labeled "balanced", whatever that fucking means. Am I falling over here?! And while "balanced", it is often just 1800p.
 

diffusionx

Gold Member
It is to my detriment to play a PS5 game without looking at a DF video first.
Devs are just fucking lazy and treat the console as a mid-range PC... no optimization, no monitoring tools, which would be great if I am supposed to make a choice...
And very often, almost always, the modes are mislabeled or ambiguously named. Some examples:
-Just "Fidelity mode". It can be high resolution without RT... or low res but with all the bells and whistles. No info.
-Fidelity mode can be named Quality mode... or High Resolution mode. None of these mean anything and they can even be misleading.
-Performance mode is sometimes not even 60fps, resulting in worse feel. They do not tell you what you are sacrificing with performance mode. Is it RT? Shadows? Resolution? No idea, play the guessing game.
-Some games have all of these with no info, just labels: Perf, Quality, HFR Quality, HFR Performance, 120Hz mode, VRR mode...
-Some games have HIDDEN modes only accessible if you enable or disable 120Hz and/or VRR in the PS5 system settings and relaunch the game... then suddenly a quality or performance mode is unlocked, or a 40fps mode appears. Like GT7 or the Insomniac games - but in their defense, Insomniac at least has a clear description that this is happening.

Many, many more examples. Of course there are good uses of modes too. I like the 40fps stuff a lot, but it is often labeled "balanced", whatever that fucking means. Am I falling over here?! And while "balanced", it is often just 1800p.
When this stuff started rolling out on the PS4 Pro* I knew it would not end well. Like you said, it's just become a complete mess: modes get added post-launch, and now some games are toggling graphics settings like on PC. I don't want ANY of this. I just want one mode, the best mode the devs feel the game runs at, the mode they would play the game at.


*nobody akshully me about the N64 having "high res" modes with the Expansion Pak
 
Last edited:

Rush2112

Member
The easy reason is the popularity of online shooters and high-refresh-rate monitors. Frame rate matters most in shooters, especially competitive ones. No one cares if they're playing Persona at 30fps.
 
Last edited:

JeloSWE

Member
If the 30fps option wasn't so terrible this gen, it would be better... but graphics enthusiasts are left with 30fps modes with extreme input lag, +75ms compared to what it was on PS4 in some cases.
I am not advocating for that, honestly.
For me, it was best when there were NO MODES at all. I just want to launch the game and go. If I know there are no options, I will admire the game for what it is and how it plays. If there are modes, I will often feel unsatisfied by the lagginess of one mode or the blurriness of another, even if I would have been happy had the game shipped with only the performance mode (in a no-modes scenario).
So for me: no modes, but every game should offer a VRR fps unlock with the ability to set system-level locks to 30, 40 or 60. That way, when new PRO consoles come in the future, you can unlock the fps or lock it to 40, and so on.
Again, we will never agree on this as a community. Personally, if I had a genie's wish, I would want all games to be developed with 60fps as the initial target; this would mean the games would look their best and there would be a lot of smart solutions to make them work and still look good. Then they could crank up some extra resolution, textures and draw distance as an afterthought for those who are content with 30fps. Demon's Souls is a stellar example of this. But this is not how everyone likes it, so I'm really happy we get both modes, even if it complicates things a bit.

I actually see 30fps on consoles as a way to cheat, to get a glimpse of how future games will look. 30fps is not an acceptable frame rate in my book, and by halving the frame rate from 60fps you essentially double the per-frame graphics budget. The thing is, you could keep going lower and raising the target; at what point is the frame rate no longer acceptable? Some people would likely be fine with 20fps, and a few would accept 15fps. In the end, the fluidity of 60fps is where I personally draw the line, regardless of how good the input lag is at 30fps.
 
Last edited: