
Final Fantasy 16 “Framerate mode” is a joke on PS5

Zuzu

Member
The performance mode is definitely a strange one in this game because it locks to 60fps during combat by tanking the resolution down to 720p, and then out of combat it oscillates between 60fps and below that at a higher resolution.

But the 30fps mode is probably the best 30fps mode I've seen on a new game released for this generation. So at least it has that. Dead Space Remake's 30fps mode is also very good. Both games have great motion blur for this mode.
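As a side note on why an uncapped rate oscillating between 30 and 60fps feels so juddery, here's a rough sketch (illustrative only, not tied to how the PS5 actually presents frames): on a fixed-refresh vsynced display, every game frame is held for a whole number of refresh cycles, so a frame rate that doesn't divide the refresh rate evenly produces an uneven hold-time cadence.

```python
import math

def hold_pattern(fps, refresh=60.0, frames=8):
    """On a fixed-refresh vsynced display each game frame is shown for a
    whole number of refresh cycles, so a frame rate that doesn't divide
    the refresh rate evenly produces uneven hold times, i.e. judder."""
    refresh_time = 1.0 / refresh
    shown_at = []
    t_ready = 0.0
    for _ in range(frames):
        # a frame is scanned out at the first vsync at or after it's ready
        t_show = math.ceil(t_ready / refresh_time - 1e-9) * refresh_time
        shown_at.append(t_show)
        t_ready += 1.0 / fps
    # hold time of each displayed frame, in milliseconds
    return [round((b - a) * 1000, 1) for a, b in zip(shown_at, shown_at[1:])]

print(hold_pattern(30))  # every frame held equally long -> even cadence
print(hold_pattern(45))  # mix of 16.7ms and 33.3ms holds -> judder
```

A locked 30fps gives a uniform 33.3ms hold every frame, which is why a well-paced 30fps mode can feel smoother than an unlocked 45-55fps.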
 

JaksGhost

Member
I’ve had no problem with it. During combat it runs pretty damn well, which is where it matters to me for this type of game. Is there something odd going on with the engine outside of combat? Yes, but I’m still enjoying myself.
 

TheDarkPhantom

Gold Member
Framerate is inconsistent for sure, but it holds 60 FPS where it matters most: combat. As such I kept it in Performance Mode throughout; still less distracting than slow-as-molasses 30 FPS, but to each their own.
 

Kuranghi

Member
Kura, riddle me this:
I'm one of the - apparently - "weird" ones that absolutely CANNOT do 30fps on an OLED due to the judder/stutter (LG C2 here). I've been reading on the internet that 30fps games look even more juddery when you have VRR enabled, since - apparently - there's some behind-the-scenes fuckery that messes with the screen fluidity (?)
Fuck if it makes sense to me, but these posters (different ones in different reddit threads) have been swearing that there is indeed a difference with VRR turned off when it comes to the judder in 30fps games... They mentioned specifically the quality mode in Demon's Souls remake, RDR2 (which is 30 by default), etc. Does that make any sense?

I know that it's an inherent flaw of OLEDs due to their instant response time, but I'm just trying to somewhat remedy the 30fps stutter shit show on my C2, to no avail. I just can't for the life of me get used to it; tried using TruMotion but it's obviously not perfect and the input lag drives me mad. It's just that I don't get how some people say they don't have a problem with 30fps games on their OLEDs and that "it's fine"...

TL;DR: Some people have been saying that turning VRR off from the console can actually help with the 30fps stutter.

Cheers

It's not something I've heard of happening specifically on the C2, or on OLEDs, but it could be true, since apparently the response time tuning (i.e. the "behind the scenes fuckery") of a TV can change when the refresh rate is above or below a certain value. From the rtings Hisense U7K review:

The TV has an excellent response time at 4k @ 60Hz, but the Hisense's response time tuning is different when the TV's refresh rate is below and above 100Hz: it's more aggressive above 100Hz, leading to a faster overall response time but with more overshoot errors. This doesn't cause any issues when running at a fixed refresh rate, but with VRR enabled, the TV's response time behavior rapidly changes as the TV's refresh rate hovers around 100Hz, which is very noticeable. You can see the two response time behaviors here:

Below 100:

[Image: motion-blur-vrr-95-fps-large.jpg]


Above 100:

[Image: motion-blur-vrr-105-fps-large.jpg]



The images indicate the perception of motion blur would be different, so I'm guessing that would change the perception of stutter too. I assume that in FFXVI, when VRR is on, the game is always sitting above 30, so perhaps the pixel response tuning in your C2 is more aggressive above 30 than when VRR is off and it's locked to 30, which increases motion clarity but also the perceived stutter.

As I understand it, VRR generally changes how a panel responds/outputs in a variety of ways, including changing the gamma, which can lead to near-black flickering. I have no hands-on experience of VRR, so my knowledge of it is limited to what I've read from professionals, enthusiasts and owners.
 

Bojji

Member
Honestly it’s ridiculous they didn’t at least add low framerate compensation with the game constantly dipping under 48fps out of combat. Feels like Insomniac Games is the only studio with a clue of how display technology works.

They should, but in reality it should be done on system level like on Xbox or PC.

Why is Sony so incompetent? System-wide 120Hz output with LFC would fix stutters in the majority of games (like fucking ER).
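For anyone unfamiliar, LFC (low framerate compensation) is conceptually simple: when the game's frame rate falls below the display's minimum VRR refresh, each frame is repeated so the effective refresh rate lands back inside the VRR window. A minimal sketch, with the common 48-120Hz window used as an example (actual window values vary per TV):

```python
def lfc_multiplier(fps, vrr_min=48.0, vrr_max=120.0):
    """Smallest whole-number frame-repeat factor n such that the
    effective refresh rate (fps * n) lands inside [vrr_min, vrr_max]."""
    if fps >= vrr_min:
        return 1  # already inside the VRR window, nothing to do
    n = 2
    while fps * n < vrr_min:
        n += 1
    if fps * n > vrr_max:
        raise ValueError("frame rate too low for this VRR window")
    return n

# e.g. a 40fps dip gets each frame shown twice (80Hz effective),
# and a 30fps cap gets doubled to 60Hz - both inside 48-120Hz.
```

This is why a system-level 120Hz output helps so much: with a 120Hz ceiling there's room to double or triple almost any realistic frame rate into the window instead of falling out of VRR entirely.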
 

Kuranghi

Member
They should, but in reality it should be done on system level like on Xbox or PC.

Why is Sony so incompetent? System-wide 120Hz output with LFC would fix stutters in the majority of games (like fucking ER).

Yeah, bizarre really isn't it.

Being on a TV without HDMI 2.1, I find it annoying that on PS5 you can't force the bit depth to 8-bit + dithering when outputting 4K@60Hz with HDR, to maintain 4:4:4 chroma instead of dropping to 4:2:2. That's the default behaviour in Windows, instead of outputting 10-bit at 4:2:2; there it's done to maintain text clarity, but even if that's less of a factor on console, 4:2:2 vs 4:4:4 (greatly) affects the peak brightness of the HDR output according to Vincent Teoh of HDTVTest.

I've seen you can change the bit depth on Xbox Series, though tbf I don't know if turning HDR on takes control and forces it to 10-bit, so maybe it's no different there.
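The bandwidth squeeze described above can be shown with back-of-the-envelope numbers. This is a simplification (real HDMI packs 4:2:2 into a fixed 24-bit container, and the figures for total timing and effective data rate are the commonly cited ones, so treat them as illustrative), but it shows why 4K60 HDR over HDMI 2.0 forces a choice between 8-bit 4:4:4 and 10-bit 4:2:2:

```python
# HDMI 2.0 carries ~18 Gbps raw, ~14.4 Gbps effective after 8b/10b
# encoding; the link must carry the full 4K60 total timing (4400x2250
# including blanking), not just the visible 3840x2160 pixels.
HDMI20_EFFECTIVE_GBPS = 14.4
PIXEL_CLOCK_4K60 = 4400 * 2250 * 60  # pixels per second incl. blanking

# bits per pixel = bits per component * samples per pixel
# (4:4:4 carries 3 samples per pixel, 4:2:2 effectively 2)
def link_rate_gbps(bits_per_component, samples_per_pixel):
    return PIXEL_CLOCK_4K60 * bits_per_component * samples_per_pixel / 1e9

for label, bpc, spp in [("8-bit 4:4:4", 8, 3),
                        ("10-bit 4:4:4", 10, 3),
                        ("10-bit 4:2:2", 10, 2)]:
    rate = link_rate_gbps(bpc, spp)
    verdict = "fits" if rate <= HDMI20_EFFECTIVE_GBPS else "does NOT fit"
    print(f"{label}: {rate:.2f} Gbps -> {verdict}")
```

8-bit 4:4:4 squeaks in at ~14.26 Gbps, 10-bit 4:4:4 needs ~17.8 Gbps and doesn't fit, hence the PS5's drop to 4:2:2 for 10-bit HDR on HDMI 2.0 displays.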
 

Giallo Corsa

Gold Member
It's not something I've heard of happening specifically on the C2, or on OLEDs, but it could be true, since apparently the response time tuning (i.e. the "behind the scenes fuckery") of a TV can change when the refresh rate is above or below a certain value. From the rtings Hisense U7K review:



Below 100:

[Image: motion-blur-vrr-95-fps-large.jpg]


Above 100:

[Image: motion-blur-vrr-105-fps-large.jpg]



The images indicate the perception of motion blur would be different, so I'm guessing that would change the perception of stutter too. I assume that in FFXVI, when VRR is on, the game is always sitting above 30, so perhaps the pixel response tuning in your C2 is more aggressive above 30 than when VRR is off and it's locked to 30, which increases motion clarity but also the perceived stutter.

As I understand it, VRR generally changes how a panel responds/outputs in a variety of ways, including changing the gamma, which can lead to near-black flickering. I have no hands-on experience of VRR, so my knowledge of it is limited to what I've read from professionals, enthusiasts and owners.

What a fine lad you are...thanks mate 😉

Welp, maybe these people on reddit are INDEED onto something, since you've also posted those articles...

Apparently, it differs from TV to TV, so maybe that's why some people in here say that they're fine with 30fps games on their OLEDs.
Maybe it's panel specific, OR their processor combined with each company's custom software - it's gotta be the latter since - correct me if I'm wrong - all companies' OLEDs use LG panels.

Maybe it also has to do with specific 30fps modes in specific games - Demon's Souls remake is especially notorious, since you'll find many posts online about its 30fps/quality mode.

Here's the thing though: why don't I perceive said stutters when watching a 30fps game video on YouTube on my C2? Shouldn't it stutter all the same?

Hell, this is doing my head in. Spent a good hour last night trying various games (PS4 Mafia remake, Hogwarts Legacy PS5 in quality mode, etc.), turning VRR on/off on both the PS5 and the C2, trying OLED BFI on/off, activating/deactivating 120Hz and... nothing. Maybe there's a MINOR difference with VRR turned off, but it's so negligible that it might have been placebo in the end.

Dammit, I should have gone for a miniLED to avoid this friggin' headache 😁

Cheers
 

Kuranghi

Member
Apparently, it differs from TV to TV, so maybe that's why some people in here say that they're fine with 30fps games on their OLEDs.
Maybe it's panel specific, OR their processor combined with each company's custom software - it's gotta be the latter since - correct me if I'm wrong - all companies' OLEDs use LG panels.

All the other companies do/will use LG Display's WOLED panels (Sony and Samsung also use Samsung Display's QD-OLED panels), but there is a factor to consider: LG Display has different OLED panel grades/tiers - I think 4 as of 2023. That's partly why the A2, B2, C2 and G2 have increasingly good near-black handling and faster response times, among other differences. The processor/processing definitely matters too though.

Here's the thing though: why don't I perceive said stutters when watching a 30fps game video on YouTube on my C2? Shouldn't it stutter all the same?

If I had to guess, that's due to the quality of the capture device HW and the encode it performs, then the added compression from the YouTube encode, and maybe also the way the YouTube app handles its output.

Dammit, I should have gone for a miniLED to avoid this friggin' headache 😁

What size is your C2 and what country are you in?

If it's smaller than 65" or you're in the US, there isn't really a great choice for MiniLEDs anyway: the Samsung QN95C has a fucked game mode, and the Sony X95L isn't available below 65" in general and is 85"-only in the US. So that leaves the TCL QM8, but then you have to be in the US, as it's not available in Europe, and besides, it has poor EOTF tracking, giving a similarly fucked image presentation/game mode to the Samsung, as does the Hisense U8K. I think the U8K is definitely better than the QM8, but Hisense has their own problems, which may or may not annoy you, though I don't think the different response time tuning on the U7K I spoke of above affects the U8K at least.

If you're considering changing from OLED to LCD/MiniLED, I would personally wait and see what 2024 brings for Sony, Hisense and TCL - the U8N and QM850 seem very promising on paper, and Sony has said their 2024 flagship is a MiniLED with a (true) Backlight Master Drive successor that can reduce blooming massively.

I don't think Samsung can get their game mode back to how it was pre-2021 without increasing input lag or returning to the backlight HW/algorithm they used prior to the Neo QLED series (or whatever caused the game mode changes since the Neo QLEDs appeared), and Sony always seems to be lagging behind in their TV gaming features (ironically).

Even if Sony got up to speed on gaming features, their relatively recent change of dimming algorithm philosophy - lighting up more adjacent backlight zones to preserve the creator's intent - is an enemy of HDR gaming. Many developers seem to test their HDR output mostly on OLEDs (from what I've seen of BTS videos and from speaking to devs), so they make HUD elements very bright and float them over near-black content, which no FALD LCD can show without blooming unless the creator's intent is ignored or big dimming algorithm/backlight HW improvements are made.

Since they've literally said they have made massive dimming algorithm/backlight HW improvements (a new drive IC in the backlight) this year, here's hoping the Sony flagship is a revolution for LCD backlight control. What sizes it will be available in, and the price, remain to be seen though... the ZD9 (or really any flagship 4K LCD they've made) wasn't available below 65" and cost £4000/£7000 in 65/75" at launch lol. Sony can set their own pricing when almost everyone consistently votes them best TV each year (though not for gaming, but I'd definitely trade off the gaming feature advantages of other brands for the best overall image quality).

Okay that's cola enough for me. Nah wait, I changed my mind 😬
 

Giallo Corsa

Gold Member
All the other companies do/will use LG Display's WOLED panels (Sony and Samsung also use Samsung Display's QD-OLED panels), but there is a factor to consider: LG Display has different OLED panel grades/tiers - I think 4 as of 2023. That's partly why the A2, B2, C2 and G2 have increasingly good near-black handling and faster response times, among other differences. The processor/processing definitely matters too though.



If I had to guess, that's due to the quality of the capture device HW and the encode it performs, then the added compression from the YouTube encode, and maybe also the way the YouTube app handles its output.



What size is your C2 and what country are you in?

If it's smaller than 65" or you're in the US, there isn't really a great choice for MiniLEDs anyway: the Samsung QN95C has a fucked game mode, and the Sony X95L isn't available below 65" in general and is 85"-only in the US. So that leaves the TCL QM8, but then you have to be in the US, as it's not available in Europe, and besides, it has poor EOTF tracking, giving a similarly fucked image presentation/game mode to the Samsung, as does the Hisense U8K. I think the U8K is definitely better than the QM8, but Hisense has their own problems, which may or may not annoy you, though I don't think the different response time tuning on the U7K I spoke of above affects the U8K at least.

If you're considering changing from OLED to LCD/MiniLED, I would personally wait and see what 2024 brings for Sony, Hisense and TCL - the U8N and QM850 seem very promising on paper, and Sony has said their 2024 flagship is a MiniLED with a (true) Backlight Master Drive successor that can reduce blooming massively.

I don't think Samsung can get their game mode back to how it was pre-2021 without increasing input lag or returning to the backlight HW/algorithm they used prior to the Neo QLED series (or whatever caused the game mode changes since the Neo QLEDs appeared), and Sony always seems to be lagging behind in their TV gaming features (ironically).

Even if Sony got up to speed on gaming features, their relatively recent change of dimming algorithm philosophy - lighting up more adjacent backlight zones to preserve the creator's intent - is an enemy of HDR gaming. Many developers seem to test their HDR output mostly on OLEDs (from what I've seen of BTS videos and from speaking to devs), so they make HUD elements very bright and float them over near-black content, which no FALD LCD can show without blooming unless the creator's intent is ignored or big dimming algorithm/backlight HW improvements are made.

Since they've literally said they have made massive dimming algorithm/backlight HW improvements (a new drive IC in the backlight) this year, here's hoping the Sony flagship is a revolution for LCD backlight control. What sizes it will be available in, and the price, remain to be seen though... the ZD9 (or really any flagship 4K LCD they've made) wasn't available below 65" and cost £4000/£7000 in 65/75" at launch lol. Sony can set their own pricing when almost everyone consistently votes them best TV each year (though not for gaming, but I'd definitely trade off the gaming feature advantages of other brands for the best overall image quality).

Okay that's cola enough for me. Nah wait, I changed my mind 😬

Kura, I had my Panasonic plasma for a good 10 years and finally decided to jump on the 4K bandwagon last year with a Philips PUS8887 (VA panel - plus, I always loved Ambilight), and while it was OK-ish, it had the typical problems of most Android TVs with that friggin' MediaTek chip, in that it halved the vertical resolution whenever you tried to output 4K@120Hz, amongst other things.
Had to take it back due to a faulty LED on one of the LED strips and got myself an LG C2 - I was actually about to get a Philips PML9597/9636, but... prices were silly, almost the same price (if not more) as the C2, so here we are.
I had already read about OLED stutter, but if I'd had firsthand knowledge of how bad it'd be, I'd definitely have opted for the miniLED.

Anyway, just to pour more gasoline on the fire:
Read a couple of different topics, and besides the OLED stutter thing, there are people claiming that the PS5 actually outputs the same 30fps games WORSE than an actual PS4 Pro - this is on the same TV, with the same game, switching back and forth; they claim that on PS5 there's a stutter/judder that isn't present on the PS4... Now, this is either the biggest placebo on earth, or there's some dark magic fuckery at play. It sounds like a stupid internet conspiracy theory, and you'd think we would have heard something by now from the "big" tech sites/channels, but I don't know - there are enough people describing the same exact thing that it doesn't sound like a coincidence...

That's all for today, going to wear my tinfoil hat 😉

Thanks for the insights brother, appreciate it !
 
You are correct, though I'm not sure it necessarily warrants a thread. The problems with the game's performance have been well documented. I actually look forward to playing it, but I'm actively waiting on the PC version, as I'm not dealing with 30 fps, and the performance mode is borked. Thank goodness I'm patient.
Or PS5 Pro.
 

Solidus_T

Banned
I agree. It takes a huge hit to the visual quality while the framerate isn't even a stable 60, meanwhile the quality mode is a very stable 30fps and the graphics are incredible - way better than the performance mode.
 
What? I had no problem with performance mode. Although I am on a 1080p TV, so maybe that's why? The fidelity mode was shit though. It was very obviously slower than 30 to my eyes - as bad as Demon's Souls' fidelity mode. Don't know why some games feel fine at 30 and some don't. Maybe it's the 4K downsampling.
 

DenchDeckard

Moderated wildly
To be fair, the 30fps mode is very solid in this game and I had no issue playing it, as it's obviously what the game was designed for.

But we know what happens when you announce a 30fps action game. So they managed to dodge that fire by announcing a 60fps mode and being Final Fantasy.

For me personally, the framerate modes were the least of this game's issues.
 

Go_Ly_Dow

Member
If it’s badly optimized on PS5, expect an even worse optimization on PC.
Just look how bad Square Enix past PC ports have been.
We'll see - this one is coming from the FFXIV team, who are loved on PC, and it seems like they're giving it time. XV on PC was praised a lot and still looks incredible.

My prediction: they'll reveal a PC port and an Xbox port at around the same time. Then they'll release them in line with the PS5 Pro, for which they'll do a free patch and some kind of definitive edition with both DLCs included (I'm going with FFXVI Ultima Edition), and this will do something like native 2160p 30fps / 1440p 60fps - both a good upgrade from the base PS5 version. Whenever Sony reveals the Pro, FFXVI will be one of the games they demo to show the advantages.

Call me crazy but I'm feeling lucky.
 

Akuji

Member
Loved the demo, pre-ordered, but then wanted to wait for the performance mode to get better. I read in this thread it still sucks :(

Well, gotta play FF7 Remake now before Rebirth launches. Maybe I'll just buy it on PC again or wait for the rumored PS5 Pro...
 

Whitecrow

Banned
Loved the demo, pre-ordered, but then wanted to wait for the performance mode to get better. I read in this thread it still sucks :(

Well, gotta play FF7 Remake now before Rebirth launches. Maybe I'll just buy it on PC again or wait for the rumored PS5 Pro...
It's not perfect, but it's perfectly playable. Also, in combat it's pretty much locked. Don't use this thread to guide your decision.
 

Bojji

Member
VRR can make your problems easier.

Good luck when the game drops under 48 regularly. Of course Sony can't extend that window for TVs that have 40 as the lowest supported refresh rate for VRR - they're too incompetent. The only good thing about this company is its developers; Sony games have excellent use of VRR and 120Hz refresh rates. Too bad third parties suck, as always.
 

Punished Miku

Human Rights Subscription Service
Ghostwire Tokyo?

No, it doesn't have a 40Hz option. It does have a stupid amount of ways to play in 30, 60 or a very uneven unlocked frame rate mode though. Should have just focused on 2 modes with that one.
Every game should have unlocked as an option, even if it sucks now - for no extra work, it future-proofs the game. Many of the better BC games just happened to be unlocked and now run fantastically two gens later.
 

Fbh

Member
The presentation in general is underwhelming, IMO.
It looks nice in some spots and the cutscenes are impressive, but the regular gameplay looks just OK and the world feels dated. Nothing about the presentation looks good enough, IMO, to justify having to choose between playing at 30fps or having the resolution drop as low as 720p during combat.

Not to mention that deciding to make it an action game but focusing on a 30fps experience is just another bizarre design decision in a game that's full of them. DMC-style action games have generally aimed for 60fps even as far back as the PS2.
 

Arsic

Loves his juicy stink trail scent
Fidelity mode, VRR TV, and the game runs like butter. Many have said, myself included, that it's the best 30fps mode achieved. I'd be less against 30fps if more games pulled it off like this one did.
 

Akuji

Member
It's not perfect, but it's perfectly playable. Also, in combat it's pretty much locked. Don't use this thread to guide your decision.
True, there was nothing holding it back to a point where I would say it's "unplayable", but the demo was at a performance level where I felt something was off. The reviews/impressions/frame time videos came later and just confirmed what I felt.
I'm pretty sensitive to these things. But as you said, it's in a good enough state to enjoy it. Personally, I would probably go with the fidelity mode as of now. I'll just hold out a while longer - if it gets a PS5 Pro patch, perfect; if not, I'll play it eventually anyway.
 