No it's not. Unless you're running Uncharted 4 on a 25Hz screen.

I am not doing that, only showing the shutter speed motion blur principle.
launch 30fps game like uncharted 4 and do that test there in 30fps mode its the same....
That's literally what Shadow of the Colossus did on PS2 and I enjoyed it.
Another game I couldn't play with 30fps due to ultra high delay compared to other 30fps games.

Cyberpunk is one example. Playing with the in-game Vsync settings it's downright unbearable due to the latency. And this is a problem that you can find even on the console versions, which is absurd.
Exactly. Each frame at 30fps is 33ms. That frame should contain the motion captured during the last 33ms. That's why movies look so smooth.

I've played 30fps games just fine for years. The blur did not affect my enjoyment of the game.
Also, if you think about it, if you pause any frame of a movie, there will be motion blur in it, but the way you perceive it in motion is sharp and crisp. Why isn't it an issue in movies? People would literally be getting sick in the movie theater and movies would be unwatchable, right?
Also, I firmly believe 30fps can add a cinematic quality feel to a game that you simply cannot achieve at higher frame rates. It’s like how some people prefer tube amps to solid state amps. The imperfection is actually preferred vs the sterile sound of solid state. Same thing with frame rate. Too much information can be bad and highlight issues like pop-in and other visual glitches that would be more masked with 30fps.
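The "each 33ms frame should contain 33ms of motion" idea maps onto standard shutter-angle math. A quick sketch (toy Python with the classic film figures; not tied to any specific game or engine):

```python
def shutter_exposure_ms(fps: float, shutter_angle_deg: float = 180.0) -> float:
    """Exposure time per frame: the shutter angle sets what fraction of the
    frame interval actually gets captured (180 degrees = half of it)."""
    return (shutter_angle_deg / 360.0) * 1000.0 / fps

# Classic film look: 24fps with a 180-degree shutter
print(round(shutter_exposure_ms(24), 1))         # ~20.8 ms of motion per frame
# A frame containing the full 33ms of motion = a 360-degree shutter at 30fps
print(round(shutter_exposure_ms(30, 360.0), 1))  # ~33.3 ms
```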
READ THE FUCKING POST BEFORE POSTING NON-ARGUMENTS PLEASE.
This thread is inspired by some of the comments I read in this thread:
where people talk about playing old games that ran at super low framerates, and having no issues with that...
And I too am one of them, I even just recently played through Disaster Report on XBSX2, a PS2 game that has a 20fps lock, and even drops below that 20fps lock in some scenes...
I had no issues playing it... but why do I hate 30fps in modern games, when I play 20fps retro games just fine?
The quick explanation would be:
1: Framerate is not the issue.
2: Input lag, and visual noise is.
Modern games feel worse at low framerates than older games.
Example: Star Wars Jedi Survivor at 30fps feels way worse than Ocarina of Time at 20fps.
and the reasons for this are mainly how modern engines work and how modern games are rendered.
But I will elaborate on these a bit, otherwise what kind of an OP would this be?
First of all, input lag:
Many modern engines have an insufferable amount of input lag, and the games aren't designed with that in mind either.
Unreal Engine is especially bad with this, but is by far not the only one... God of War's current engine has the same issue, as does Cryengine and many others.
They don't inherently have these issues, but the way almost all developers use them is the issue, as they rely on many of the default ways these engines read inputs, render images and handle Vsync.
Ocarina of Time on N64 had less input lag at 20fps than almost any modern game has at 30fps... and for some, even the 60fps modes have higher input lag than 20fps Ocarina of Time... this is not an exaggeration, it's absolutely true.
And this is not down to modern TVs either. In fact, if you have a modern Samsung TV from the last 2 production years for example, and you play at 120fps on it, the input lag of your screen is lower than the latency of a CRT. Because yes, CRTs also have input lag: it comes from how they draw an image from top to bottom 60 times each second, which means 1 image takes 16.6ms to be drawn by a CRT. Input lag is usually measured at the center of the image, meaning a CRT measured by that standard would have at least 8.3ms of input lag. A modern 120hz Samsung TV meanwhile has ~5ms of input lag... and about 9ms at 60hz.
The CRT input lag problem has basically been solved now, we are so close to that even on TV screens now that it's not a factor anymore, let alone on high end PC monitors.
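For the curious, the scanout figures above come straight from the refresh rate. A minimal sketch (simple arithmetic, assuming ideal scanout with zero processing delay):

```python
def center_scanout_latency_ms(refresh_hz: float) -> float:
    """Latency to the middle of the screen for a display that scans the image
    top to bottom once per refresh (the usual way CRT 'lag' is quoted)."""
    frame_time_ms = 1000.0 / refresh_hz
    return frame_time_ms / 2.0

print(round(center_scanout_latency_ms(60), 1))   # 60Hz CRT: ~8.3 ms to screen center
print(round(center_scanout_latency_ms(120), 1))  # 120Hz scanout alone: ~4.2 ms
```

A real flat panel adds some processing time on top of pure scanout, which is why a measured ~5ms at 120Hz sits slightly above the ~4.2ms scanout floor.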
And this increase in latency isn't only obvious when comparing super old games to super new games.
New games still have a shitload of variation, with games like Call of Duty Modern Warfare 2/Warzone 2.0 having input lag so low, that they compete with old SNES games!
We are talking ~40ms of latency at 60fps, and below 30ms at 120fps, which is lower than most 8 and 16 bit games ever dreamt of reaching.
We could compare some Xbox 360 games that ran at 30fps to some modern games that run at 60fps, and Halo 3 would win... God of War Ragnarök at 60fps funnily enough has about the same latency as Halo 3 at 30fps, which is around 90ms for both (+/- 5ms)
So even though there are modern games that absolutely crush it when it comes to latency, like Call of Duty, and there are of course old games like Killzone 2 that were infamous for their high latency, it's sadly a pattern that latency is going up.
We are now in a spot where the bulk of modern titles have the same, or even more, input lag than Killzone 2 had, a game that was panned at the time for how awful the aiming feels due to its lag.
Coming back to God of War Ragnarök. That game in its 30fps Graphics Mode, has an input latency of around 170ms. Killzone 2's latency was in the 150ms ballpark!!!
So let that sink in for a moment... during Gen 7, Killzone 2 got massive bad PR for having 150ms of latency... and a modern game like God of War Ragnarök easily exceeds that!
Meanwhile Halo 3 had 90ms of latency at 30fps on an Xbox 360, and Ragnarök has about the same amount of latency in its 60fps Performance Mode.
In God of War's case it seems it's mostly the Vsync that is the issue, since as soon as you use the unlocked VRR mode, which deactivates Vsync, the latency shrinks down to 90ms, on par with Halo 3 and the 60fps Vsync mode of the game.
Why does Vsync introduce input lag? Because it pre-renders (buffers) frames. And some games take this to another level, usually to smooth out framerates due to being extremely demanding on the GPU: they will hold multiple frames in the back buffer before displaying a new one, which gives them a fallback frame under most circumstances, keeping the perceived framerate consistent at the cost of input lag.
Sea of Thieves in fact is the only game I know of that actually lets the user choose how many frames it should pre-render.
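To make the buffering cost concrete, here's a toy FIFO simulation (my own sketch, not any engine's actual pipeline; frame times are idealized and the queue depths are assumed examples):

```python
from collections import deque

def input_to_display_ms(fps: float, prerender_queue: int, n_frames: int = 100) -> float:
    """Each frame samples input when it starts rendering, then waits in a FIFO
    back buffer behind earlier finished frames until a vsync boundary frees it.
    Returns the worst observed input-to-display delay in milliseconds."""
    frame = 1000.0 / fps
    queue = deque()
    worst = 0.0
    for i in range(n_frames):
        queue.append(i * frame)              # input sampled at render start
        if len(queue) > prerender_queue:     # oldest buffered frame gets displayed
            displayed_at = (i + 1) * frame   # next vsync boundary
            worst = max(worst, displayed_at - queue.popleft())
    return worst

print(round(input_to_display_ms(30, 1), 1))  # ~66.7 ms: one render + one queued frame
print(round(input_to_display_ms(30, 3), 1))  # ~133.3 ms: deeper queue, way laggier
```

Note how at 30fps every extra buffered frame costs a full 33ms, which is how these 150ms+ pipelines add up.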
Secondly, how games build their final image:
Modern games have tons of high frequency details, and modern games are expected to be played on high resolution displays.
These 2 factors are the reason many games now use aggressive TAA, and more and more games use one of the many forms of Upsampling to higher resolutions from a lower base resolution.
These 2 things both lead to the same issue, a muddy and hard to parse image in motion.
Then a third factor comes in, MOTION BLUR *scary thunder noises*
And Motion Blur adds even more visual noise and muddies the final image even more.
So a modern game will often look pristine and super clean when you hold your camera completely still, but as soon as your character, and especially the camera, moves at all... MUD AND SMEARING.
And funnily enough we have an old game as an example here again, one that at the time got a lot of flak for having the same issues that nowadays are so common that no one even really talks about them in reviews or anything anymore...
And that game is Halo Reach.
Halo Reach was an early example of a game that used a form of Temporal Anti Aliasing... TAA.
And TAA at the time Halo Reach was made, was still firmly in its infancy, exhibiting a lot of issues and visual noise. So if you stood completely still in Halo Reach, it looked GLORIOUSLY smooth and clean... move the camera and it was a jittery mess.
These days TAA got a lot better, but it still has issues with ghosting and clear fizzle and disocclusion artifacts the moment something on screen moves.
But of course, TAA isn't the shiny new thing anymore... we now have FSR2, TSR and many other proprietary methods by different developers...
And these Upsampling methods basically bring back almost all the issues Halo Reach had, and then some!
When you play Ocarina of Time at 20fps on an N64 connected to a CRT, motion is clean! You can see motion clearly, there's no artifacts from anything moving too fast, there is no blur to reduce legibility of the action happening around you, and there's no motion blur to try to make it seem smoother than it is.
"but why is this a 30fps problem mister Y" is what you are now asking I bet!
WELL, TAA, FSR2, TSR, CBR, ADD UPSAMPLING METHOD HERE... and Motion Blur, all of these rely on image data from previous frames!
THE LESS IMAGE DATA, THE MUDDIER THE IMAGE IS IN MOTION!
So, at 30fps these modern upsampling and anti aliasing methods have less temporal data available to them to create their final image!
A game running at 1440p FSR2 Quality Mode will look sharper and have less artifacts when running at 60fps than it would have at the same settings but locked to 30fps.
So the lower the framerate, the more "smear" the more "mud" and the more "fizzle" there is.
So in the end, all of this adds up.
The Motion Blur, the Image Reconstruction, the Antialiasing, all of it get worse the lower the framerate is.
This is why Jedi Survivor in motion looks like this (I purposefully took a shot from a spot where the game locks to 60fps to give it the best chances):
and F-Zero GX like this (I'm turning sharp left here to give it the worst chances with fast camera and vehicle movement):
Normalized for same image size so you don't have to zoom as much on mobile (not zoomed in in any way, just cropped)
And the lower the framerate, the worse all the weird grainy, blurry, fizzle, artifacts you see in Jedi Survivor will get. Meanwhile if you ran F-Zero GX at 10fps, it would look literally the exact same in terms of image quality and artifacts (because it has no artifacts)
And we haven't even touched motion blur, which is off in my screenshot here, along with all the other image filters.
If a game like Jedi Survivor runs at a lower framerate, the amount of data the engine has to draw the image from one frame to the next is reduced, the image gets less readable and muddy, and you will see blur in motion, even though motion blur is off.
Smaller issues that aren't technical but due to Art Design of games:
Older games have different considerations when it comes to art (textures, models, scale) than modern games.
Older consoles couldn't handle as much detail as newer consoles can. This means textures had less "visual noise", meaning there is less for your eyes to get hung up on, less to process for your brain.
In motion, when running through a level or turning a camera, this means the things that are important, or could be important, as well as your general environment, are easier to digest.
You will have an easier time knowing where enemies are, or where a button or rope is, when it's simpler in design, bigger in scale and stands out more. And all of these features are features older graphics had!
On a wall with a simple flat texture, no shadows or reflections, an unrealistically big red button is easier to see than a realistically scaled red button on a wall with detailed textures, shadows of adjacent objects and other lighting applied to it.
So even if older games had the graphical issues I explained above, the simpler graphics would still make it way easier on your eyes, because there is not as much small detail that can get lost in all the blur, fizzle and breakup.
In short: Less Detail + Scale = Easier to read environments even at low framerates.
SO IN CONCLUSION:
Playing a game like Ocarina of Time will need getting used to at first, but once your eyes are adjusted to the 20fps output, the image will look pretty decent and clean.
Playing Jedi Survivor at 60fps already makes the whole image look muddy and blurry due to the artifacting of the upsampling. At 30fps this will only get worse, add Motion Blur to negate the stutter and you'll see barely anything in motion.
Playing Ocarina of Time at 20fps on real hardware will feel... not amazingly responsive, but more responsive than many modern games like... Jedi Survivor... or Redfall... or God of War Ragnarök feel at 30fps, hell some of them even at 60fps!
So you cannot hold up an old game and point the finger at it saying "LOOK! back then we had no issues with this! this also ran at a super low framerate!", while not taking into account how modern game engines work, how they react to player input, and how they construct their final output image.
Old games had simple graphics, they were easier for your eyes to parse because important stuff was exaggerated in scale and shape, they didn't have motion blur, no reconstruction artifacts, no TAA or Screen Space effects that lag behind and take time to accumulate data. They had simpler engines with less latency.
And that is why they look cleaner and play better at low framerates than new games.
And yes, there are modern games and old games that are outliers from this. Some modern ones, like From Software's games, have pretty low latency, and some old games, like Mortal Kombat on Game Boy, have massive amounts of lag.
And some old games had weird fake motion blur which made the image hard to read, while some modern games have clean images with barely anything to muddy them in motion.
The difference is that the bad examples are more common today, and the good examples were more common in the past.
CRTs and LCDs are also "easier on the eyes" with motion blur. 30fps is just fine.
On OLED 30FPS is literally unplayable to some, like me. On OLED 30fps has micro lag and makes me sick almost instantly.
No it isn't. Not without black frame insertion or frame interpolation. Watch any TV program or a movie on your TV, of course there's blur in motion and it's perfectly visible. Watch people's eyes get blurry when they turn their heads, for example.

Also, if you think about it, if you pause any frame of a movie, there will be motion blur in it, but the way you perceive it in motion is sharp and crisp.
Funny because movie projectors don't have poor response times that help blend the frames together, so yes, a movie theater would be just like watching an OLED.

No it isn't. Not without black frame insertion or frame interpolation. Watch any TV program or a movie on your TV, of course there's blur in motion and it's perfectly visible. Watch people's eyes get blurry when they turn their heads, for example.
It happens at the movie theater too. It's just that the cinema screen is dimmer and less sharp than your 4K TV, and it's not as jerky as an OLED screen. If cinema used giant OLEDs people would be getting motion sickness left and right, count on it. I got it very quickly on my OLED TV with an ultra-shaky cam movie like Man of Steel. Imagine that blown up in an OLED the size of a movie theater screen...
Man, if you'd watch the thread in 30fps you'd have plenty of time
Or have some low level motion interpolation like Game Motion Plus on Samsung TVs. BFI would be the best solution if it didn't come with such a brightness hit. Maybe MicroLED will solve it.

blur kills readability of the game.
so either you have stutter on a fast pixel refresh screen,
or you have blur that smears detail, not only making graphics fidelity lower in motion, but also getting in the way of quickly parsing your surroundings.
Including the gameplay responsiveness, camera motion, animations, right?

30fps and everything else is better.
Bit, it's ok!

OH NO, MODS! there's a typo in the title, BUT not BIT... fuck
Horizon FW is a massive open world and it has a performance mode that looks better than both those screenshots. If anything, Unreal Engine is the reason for those busted looking games at 60fps.

Sorry OP, but you're flat-out wrong.
Plagues Tale: Requiem
30 Fidelity
60 Performance
60fps mode is missing a whole goddamn forest compared to the 30fps mode.
And it runs at sub 1080p.
This is fucking garbage.
I hope not a single dev moving forward wastes their time on a "performance" mode on these base consoles. Yes, let's sacrifice literally everything else to make our game run at 60fps. Stop
30fps:
60fps:
Enough.
This "every game must be 60fps" is nonsense and holding back the industry.
If you MUST HAVE 60fps games, buy a PC. Or wait for PS5 PRO.
Stop forcing such ugliness on the rest of us. I have HAD it
60fps and everything else is worse.
or
30fps and everything else is better.
The answer is clear and obvious.
Thanks to Bojji for the pics
Horizon FW also looks much better in fidelity mode, especially the new DLC.

Horizon FW is a massive open world and it has a performance mode that looks better than both those screen shots. If anything, unreal engine is the reason for those busted looking games at 60fps.
First, I would argue the fidelity mode only looks slightly better in screenshots and the performance mode looks miles better in motion. Horizon's performance mode also looked like trash on release and it took months of patching for it to look great; proving that if the game just got delayed for polish both modes would have been great and everyone would have been happy.

Horizon FW also looks much better in fidelity mode, especially the new DLC.
And HFW is a first party game, made by arguably the most technically gifted studio on the planet.
Not many run of the mill 3rd party devs are going to be able to match what Guerilla Games does with their performance modes.
As the gen grows older, and games become more visually demanding... Most devs are simply going to be wasting our time with their 60fps modes. They will be Sub 1080p trash and missing way too many details.
Moving forward, on base consoles, 40fps will be the new performance mode.
60fps will be for the PRO consoles or PC.
The fidelity mode looks better in motion too, it's way sharper thanks to higher resolution. There is also better lighting and more effects.

First, I would argue the fidelity mode only looks slightly better in screenshots and the performance mode looks miles better in motion. Horizon's performance mode also looked like trash on release and it took months of patching for it to look great; proving that if the game just got delayed for polish both modes would have been great and everyone would have been happy.
Motion blur would help even more.

Just jumping back into this thread to highlight Zelda Tears of the Kingdom.
Because it's a perfect example of a game where 30fps does indeed work reasonably well, which is why my thread title has the "... mostly..." part at the end.
Zelda is very old fashioned: a classic double buffer Vsync solution, no motion blur, no TAA or any temporal upsampling.
Double Buffer Vsync means relatively low input lag. These days you don't see double buffering all that often anymore; you usually see triple buffering, because it improves performance at the cost of latency.
No motion blur means no smearing in motion that further muddies the image during camera turns.
No TAA and no temporal accumulation of any kind means no artifacting during disocclusion, no fizzle, no additional blur in motion, which all get worse the lower the framerate, hence why modern 30fps usually SUCKS.
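The rule-of-thumb difference between the two Vsync schemes, in numbers (an assumed simplification: roughly one extra frame time of queueing per extra buffer, ignoring everything else in the pipeline):

```python
def extra_queue_ms(fps: float, buffers: int) -> float:
    """Approximate extra delay from vsync buffering: about (buffers - 1)
    frame times of queueing before a finished frame reaches the screen."""
    return (buffers - 1) * 1000.0 / fps

print(round(extra_queue_ms(30, 2), 1))  # double buffer at 30fps: ~33.3 ms
print(round(extra_queue_ms(30, 3), 1))  # triple buffer at 30fps: ~66.7 ms
```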
Playing Jedi Survivor in resolution mode, and then following it up with playing TotK, will instantly show you how much more playable Zelda is in comparison. Your inputs register about twice as fast, your vision isn't constantly filled with blur and artifacts, and the framepacing feels smoother due to the way better/more consistent frame delivery.
Additionally, it helps further if you play it in handheld mode.
In handheld mode your screen will usually cover less of your field of view, which means the steps between frames when something moves look smaller to your eyes, which in turn reduces the feeling of stutter. That's also a factor that comes into play when people talk about PS1 era 30fps games, and which I neglected to talk about in my OP.
TVs in the 90s were generally much smaller, so even if you sat right in front of your TV, the screen covered a smaller part of your field of view, at which point the above mentioned effect comes into play, just like on the smaller handheld screen.
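The screen-size point can be put in rough numbers. The same 30fps camera pan produces a smaller frame-to-frame jump, in degrees of your visual field, on a screen that covers less of your view. A toy model; the pan speed, game FOV, screen sizes and viewing distances below are my own assumed examples:

```python
import math

def perceived_step_deg(pan_deg_per_s: float, fps: float, game_fov_deg: float,
                       screen_width_cm: float, distance_cm: float) -> float:
    """Angular size at your eye of one frame-to-frame jump during a camera pan:
    the on-screen displacement is (pan step / game FOV) of the screen width,
    converted into the visual angle the screen subtends."""
    step_frac_of_screen = (pan_deg_per_s / fps) / game_fov_deg
    screen_deg = 2.0 * math.degrees(math.atan(screen_width_cm / 2.0 / distance_cm))
    return step_frac_of_screen * screen_deg

tv = perceived_step_deg(90, 30, 70, 144, 200)      # ~65" TV, 2 m away
handheld = perceived_step_deg(90, 30, 70, 17, 40)  # handheld screen, 40 cm away
print(round(tv, 2), round(handheld, 2))  # the handheld jump is noticeably smaller
```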
Motion blur would help even more.
And I don't know what TAA games you play, but it's not 2015. TAA can be spotless in motion.
Double buffered Vsync is the slow one.
Triple buffering is the good, fast one.
I doubt it is using any of these if it feels responsive.
Generally TAA and motion blur do not impact how a game plays. They can only make it look smoother when done well. I don't like 30fps without TAA personally.
Zelda feels more responsive because they got input lag to a low level.
Sorry OP, but you're flat-out wrong.
Yeah, it's input lag. Demon's Souls' 30fps mode has 75ms more input lag than Bloodborne. Seriously, install both and test it. BB feels great.
It’s the modes. 30fps needs to be properly implemented. Not vsynced crap.
Uncharted 4 does it well. Good input lag and heavy high quality motion blur. Forbidden West at 30fps also feels very good but could use better motion blur. OLED is great because it has no lag, but it requires great motion blur. Otherwise it looks jerky. Uncharted 4 has that good motion blur.
So you are completely wrong about motion blur. It is essential and required for OLED. It adds smoothness in motion. Each frame should be a capture of the time that passed. Not a still.
Don’t mix this with fsr and visual artifacts
Remember that 60 to 30 fps is only a 16ms difference. No reason for games to have 100ms worse input lag like some do.
Again: motion blur is essential to avoid jerky motion and stutter on OLED. Unless you are running 240Hz, then you've got enough frame data.
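One bit of arithmetic backing the 16ms claim (just frame-time math, nothing assumed):

```python
def frame_time_ms(fps: float) -> float:
    """Duration of one frame at a given framerate."""
    return 1000.0 / fps

gap = frame_time_ms(30) - frame_time_ms(60)
print(round(gap, 1))  # ~16.7 ms: the unavoidable part of the 30-vs-60fps latency gap
```

Anything beyond that ~17ms comes from the rest of the pipeline (buffering, Vsync, engine input handling), not from the framerate itself.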
I'm so tired of all the 30fps advocates. You think that the 30fps Fidelity mode looks good. Imagine what they could do at 15fps.

Sorry OP, but you're flat-out wrong.
Believe me, I'm WAY MORE sick of all the 60fps or bust framerate warriors. Willing to sacrifice literally everything over frame rate. Insanity.

I'm so tired of all the 30fps advocates.
It does. Objectively.

You think that the 30fps Fidelity mode looks good.
No need. 30fps is the best of both worlds. 60fps games are fucking ugly on these consoles, running at sub 1080p... in 2023. Ridiculous.

Imagine what they could do at 15fps.
Not control responsiveness, animation fluidity, camera motion fluidity.

Willing to sacrifice literally everything over frame rate
Or 7.5.

I'm so tired of all the 30fps advocates. You think that the 30fps Fidelity mode looks good. Imagine what they could do at 15fps.
And I am tired of all the "oh I am 12 and I just discovered 60fps" people.

I'm so tired of all the 30fps advocates. You think that the 30fps Fidelity mode looks good. Imagine what they could do at 15fps.
I've been playing a lot of PS3 and 360 games on my LG CX lately and notice a similar effect. Horizon FW looks incredibly stop-motion juddery in the 30fps mode, but a lot of these PS3 30fps games don't look quite as low fps despite also being 30fps. I kind of wonder if the lack of visual clarity helps too.

I can't stand 30fps on PS5 or Xbox on my LG CX. Feels like a flip book after 60fps.
Weirdly though, Tears Of The Kingdom feels absolutely fine at 30 fps. Have no idea why. In fact, Switch games in general feel ok.
This discussion will never end. Some are happy with low fps and high resolution. Some can't stand low fps and are willing to sacrifice fidelity for fluidity. It's great that we have the option for both on consoles. I personally will always choose 60fps over visual fidelity. To each their own. If I can, I play games at 120fps on my monster computer, but I will play a game at 30fps if there is absolutely no other way around it and the game is absolutely unmissable. That said, after buying TOTK, I couldn't stand the 30fps and low resolution on my rather big 85" TV and tried it on PC emulation, and it's freakin glorious when you have the option for high resolution AND 60fps. I remember BOTW being more acceptable on my previous 55" but this time around it was painful. I also come from an FPS and fighting game background and the fluidity of minimum 60fps is the most important thing to me. Again, it's good that we have options.

And I am tired of all the "oh I am 12 and I just discovered 60fps" people.
There are more things to gaming than 60 fps. Other frontiers some might find important.
I am more accepting of 30fps because I have been playing video games for so long, not because 60fps is some new thing.
I played games on a 120Hz CRT over 20 years ago. It's nothing new, and with some time it really doesn't fucking matter. I want to play the game and be wowed by it. I remember graphics, not that the game ran at 30fps (unless it's some shitty slow 30fps like we get this gen).
You have to realize that people who just want to play the game and be wowed by it and are happy to accept 30fps, don't always do that because they can't value 60fps. It can be a choice.
If there was 720p 60fps mode in tlou2 on ps4 ? Fuck no, I would play that game in 1080p30
if the 30fps option wasn't so terrible this gen, it would be better... But graphics enthusiasts are left with 30fps modes with extreme input lag, +75ms compared to what it was on PS4 in some cases.
The console graphics modes have been a total disaster for console gaming and it's just getting worse. You buy consoles and console games to get away from that garbage. If I want to analyze framerates and IQ and performance impacts from settings toggles, I'll buy the PC version. It's up to the devs to, you know, design the game, and that entails making decisions to make the game as good as possible. This includes how it looks.

if the 30fps option wasn't so terrible this gen, it would be better.... But graphics enthusiasts are left with 30fps modes with extreme input lag + 75ms to what it was on ps4 in some cases.
I am not advertising for that honestly.
For me, it was the best when there were NO MODES at all. I just want to launch the game and go. If I know there are no options, I will admire the game for what it is and how it plays. If there are modes, I will often feel unsatisfied by lagginess of one mode or blurriness of another even if I would be happy if the game only had performance mode (in no modes scenario).
So for me - no modes but every game should offer VRR FPS unlock with ability to set system level locks to 30, 40, 60. Just so when new PRO consoles comes in the future, you can unlock fps or lock it to 40 and so on
It is to my detriment to play a PS5 game without looking at a DF video first.
When this stuff started rolling out on PS4 Pro I knew it would not end well. Like you said, it's just become a complete mess: modes get added post-launch, and now some games are toggling graphics settings like a PC. I don't want ANY of this. I just want one mode, the best mode the devs feel the game runs at, the mode they would play the game at.

It is to my detriment to play a ps5 game without lookin at a DF video first.
Devs are just fucking lazy and treat consoles as mid PCs... No optimization, no monitoring tools, which would be great if I am supposed to make a choice...
And very often, almost always, the modes are mislabeled or ambiguously named. Some examples:
-Just "Fidelity mode". It can be high resolution without RT... or low res but with all bells and whistles. No info.
-Fidelity mode can be named Quality mode... or High Resolution mode. None of which means anything, and can even be misinforming.
-Performance mode is sometimes not even 60fps, resulting in worse feel. They do not inform you what you are sacrificing with performance mode. Is it RT? Shadows? Resolution? No idea, play the guessing game.
-Some games have all of these with no info, just labels: Perf, Quality, HFR Quality, HFR Performance, 120hz mode, VRR mode...
-Some games have HIDDEN modes only accessible if you enable or disable 120hz and/or VRR in PS5 system settings and relaunch the game... then suddenly a Quality or Performance mode is unlocked, or a 40fps mode appears. Like GT7, or Insomniac games - but in their defense, Insomniac at least have a clear description that this is happening.
Many many more examples. Of course there are good uses of modes too. I like the 40fps stuff a lot, but it is often labeled as "balanced", whatever that fucking means. Am I falling over here?! And while "balanced", it is often just 1800p.
Again, we will never agree on this as a community. I personally, if I had a genie's wish, would want all games to be developed with 60fps as the initial target; this would mean the games would look their best and there would be a lot of smart solutions to make it work and still look good. Then they could crank up some extra resolution, textures and draw distance as an afterthought for those that are content with 30fps. Demon's Souls is a stellar example of this. But this is not how everyone likes it. So I'm really happy we get both modes, even if it complicates things a bit.

if the 30fps option wasn't so terrible this gen, it would be better.... But graphics enthusiasts are left with 30fps modes with extreme input lag + 75ms to what it was on ps4 in some cases.
I do.

Easy reason is the popularity of online shooters and high refresh rate monitors. Frame rate matters most in shooters, especially competitive ones. No one cares if they're playing Persona at 30fps.