[Rumor] [MLiD] PSSR 2 Reportedly Doesn’t Have Frame Generation in the Plans: "developers hate it, hate it, hate it."

LectureMaster

Or is it just one of Adam's balls in my throat?
Time stamped: [embedded video]

PS5 Pro PSSR 2 rumor says no frame gen

The new comments come from hardware leaker and insider Moore's Law Is Dead, in a new Broken Silicon episode, where he says he's relaying what he's hearing from developer sources and backend documentation.

When the discussion turned to frame generation for the next version of PSSR, he shared a message he says came from a source familiar with Sony's tooling. Here's what he said: "A source I was talking [to] about this [PSSR 2], then I just asked, 'is there any reference to frame gen?'"

Apparently, the source replied, "not a single one in the next PSSR. I haven't heard a single developer ask Sony to add Frame Gen to PSSR." He then followed with: "not only is Sony not working on it, but they know that developers hate it, hate it, hate it."

The claim is not that frame generation is impossible on PlayStation, but that Sony's next PSSR 2 upgrade is being treated as an upscaling and image-quality project, not a frame-gen initiative. One possible reason is that frame generation can come with tradeoffs like artifacts and inconsistent motion.
 
I'm not a fan of frame generation, but it could be useful for games that reach at least 60fps without it and allow for a 120fps mode, thus taking greater advantage of the 120hz mode.
 
Framegen adds input lag. On PC the best use case is to boost an already great framerate (60 to 120). For consoles it doesn't make enough sense.

I'm not a fan of frame generation, but it could be useful for games that reach at least 60fps without it and allow for a 120fps mode, thus taking greater advantage of the 120hz mode.
Yes, but if you look at all the shit YT coverage, this is shown as a miracle solution for sub-30FPS games on Steam Deck. "It doubles your frame rate!" The fact that you need a high base frame rate seems to be completely ignored by all the content muppets.
 
Gamers hate it too, or at least they should. Frame generation creates significant visual artifacts and input lag. Then companies like Nvidia make tools to try to reduce the input lag, but those end up generating even more ugly artifacts. You're almost better off streaming games rather than using frame gen.
 
yep keep that shit away, it's artifacts galore in every implementation out there.
also you know it'd be abused to hell and back on console by your avg Unreal dev shop
 
Don't the current consoles still have terrible inherent input lag all on their own (much higher than a PC at the same framerate?), and then in addition to that also lack any kind of Nvidia Reflex or AMD Anti-Lag technology?
 
Don't the current consoles still have terrible inherent input lag all on their own (much higher than a PC at the same framerate?), and then in addition to that also lack any kind of Nvidia Reflex or AMD Anti-Lag technology?
They don't lack those things, they just don't need a name for it. On console, if the dev optimises, then all they're doing is implementing "Reflex". On PC it exists because CPU and GPU combinations differ immensely.
 
Smells like another year of crossgen has been added to the next consoles.
Redstone has been a bit of a sad trombone. I've yet to see PSSR in action; I don't know anyone with a PS5 Pro.
 
Gamers hate it too, or at least they should. Frame generation creates significant visual artifacts and input lag. Then companies like Nvidia make tools to try to reduce the input lag, but those end up generating even more ugly artifacts. You're almost better off streaming games rather than using frame gen.
Nvidia uses Multi Frame Gen in conjunction with Reflex tech to do 1080p/360fps streaming on the GFN Ultimate tier with the RTX 5080.

Framegen would be really useful to get next gen console Cloud Gaming to 120 fps.
 
ML FG is a part of Redstone, so they'll be able to use it if they want. I find FG to be quite good with the caveat being you need to start from a high enough frame rate to have the overhead to reliably double the frames.
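
To put toy numbers on that caveat: the generated frames share the GPU with the real ones, so turning FG on lowers the real frame rate and never quite doubles the output. A minimal sketch, assuming an illustrative (not measured) ~3 ms per-generated-frame cost:

```python
# Back-of-the-envelope: what 2x frame gen does to the *real* frame rate.
# GEN_COST_MS is an assumed illustrative figure, not a measurement.

GEN_COST_MS = 3.0  # assumed GPU time to generate one inserted frame

for fg_off_fps in (30, 45, 60, 90, 120):
    render_ms = 1000.0 / fg_off_fps                # GPU time per real frame, FG off
    real_fps = 1000.0 / (render_ms + GEN_COST_MS)  # real frames once FG shares the GPU
    print(f"FG off: {fg_off_fps:>3} fps -> FG on: "
          f"{real_fps:5.1f} real fps, {2 * real_fps:5.1f} displayed fps")
```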
 
In principle, FG is a great option to have, particularly at higher FPS.
In reality, console games have been using it to cover up poor FPS, where it's even worse than just leaving FG off.
 
They don't lack those things, they just don't need a name for it. On console, if the dev optimises, then all they're doing is implementing "Reflex". On PC it exists because CPU and GPU combinations differ immensely.

What I mean is, doesn't, say, Street Fighter 6 or Tekken 7 at 60fps on PS5/SX have inherently higher input lag than any PC at 60FPS, and then Reflex/Anti-Lag widens the gap even further, taking PC even lower? I know that was certainly the case with consoles in the past; I haven't really followed PS5/SX as closely. My understanding was there's some inherent overhead with consoles (or something not exposed to developers) you can't really "optimize" your way around.
 
Framegen adds input lag. On PC the best use case is to boost an already great framerate (60 to 120). For consoles it doesn't make enough sense.
Agreed. If the base fps is 30, frame gen can't really double it properly; it just adds input lag and artifacts.

Consoles should target a nice clean 4K 60fps image rather than pushing for high frame rates.
 
Like any tool, it has its place. FG excels in certain scenarios, while failing in others. As usual, people will share their hyperbolic and generalized hot takes, which mean very little without context.
 
Weird feedback; a lot of games could easily benefit from it tremendously. If you look at some of the first party games with an unlocked frame rate option, they seem to have a lot of GPU headroom. I.e. TLOU2 could have had a 1080p > 4K PSSR + FG mode to hit 120fps, same with GoY, GoWR etc. and many others. That would probably have been a better showcase and advertisement than what they have now with the Pro. Not just playing the PS5 fidelity mode at 60fps, but fidelity mode at 120fps would have been far more appealing at that price tag. They can't blame 3rd parties for lackluster support when their own first party can't utilize their HW to the maximum extent. There is a lot of AMD tech already available they could have integrated into their SDK; how about AFMF 2.1 for all PS4 titles? The PS5 Pro is hardware made for tech enthusiasts, but Sony themselves don't even seem very enthusiastic about the hardware.
 
I only use frame gen when it's absolutely necessary, but it's funny seeing all the people crying about artifacts and latency with FG. If I showed you a video of a fully maxed out Cyberpunk at 4K with FG, not one of these people would be able to point out these egregious "artifacts" or latency issues. The benchmark has FG adding ~5ms of latency in my tests; you're not noticing that.

Yeah, it's not always perfect, and it's generally only good when you have a decent starting point to begin with, but this idea that it's always a net negative is just stupid. Frame gen is also something that needs to exist, much like with DLSS, because developers have gotten fucking lazy and don't optimize games for shit anymore. And even for developers who do push things, and optimize, things like path tracing are expensive so more options to deal with that aren't a bad thing.
 
The developers of Black Myth: Wukong recently released a patch which actually removed FG entirely from the PS5 version of the game. They decided actual performance matters and removed the Lumen lighting from the Performance mode, which allowed it to reach 60 fps natively without FG. The game plays a million times better on PS5 now than before the patch, and the removed lighting hardly matters compared to having a good gameplay experience.

FG is a really bad idea unless you already have 60+ fps and you're just trying to achieve more of a "smoothness" feeling on a high refresh rate display like 120Hz+. It's not a substitute for real frames, but it can be very nice for making the real frames feel "smoother".
 
I only use frame gen when it's absolutely necessary, but it's funny seeing all the people crying about artifacts and latency with FG. If I showed you a video of a fully maxed out Cyberpunk at 4K with FG, not one of these people would be able to point out these egregious "artifacts" or latency issues. The benchmark has FG adding ~5ms of latency in my tests; you're not noticing that.

Yeah, it's not always perfect, and it's generally only good when you have a decent starting point to begin with, but this idea that it's always a net negative is just stupid. Frame gen is also something that needs to exist, much like with DLSS, because developers have gotten fucking lazy and don't optimize games for shit anymore. And even for developers who do push things, and optimize, things like path tracing are expensive so more options to deal with that aren't a bad thing.

on PC it's an entirely different thing.
on Nvidia you have Reflex, which will dramatically decrease input lag. if you compare the input lag of a console version of a game to a PC version with Reflex enabled, Reflex often cuts the latency to around half of the console version, often lower than that.

Fortnite on PS5 at 60fps has somewhere around 90ms to 100ms of lag. on PC with Reflex at 60fps it has between 30ms and 40ms.
this means adding a few milliseconds of lag on top of this insanely reduced lag compared to console will not be an issue.
however, now imagine adding 1 frame of lag on top of the already really bad lag of the console version here. not great.

so first they'd need an equivalent of Nvidia Reflex, and good luck with that. not even AMD has caught up to reflex yet with their own version of lag reduction.
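
rough sketch of that arithmetic in code, using the ballpark Fortnite numbers above and an assumed ~3ms generation cost (illustrative, not measured):

```python
# rough model: interpolation-style FG holds back one real frame before
# it can show the generated one, so at a 60 fps base it adds about one
# 16.7 ms frame plus a small generation cost (assumed ~3 ms here).
# baselines are the ballpark Fortnite figures from the post above.

BASE_FPS = 60
HOLDBACK_MS = 1000.0 / BASE_FPS  # one real frame held back ~= 16.7 ms
GEN_MS = 3.0                     # assumed generation cost

for label, baseline_ms in (("PC + Reflex", 35.0), ("console", 95.0)):
    with_fg = baseline_ms + HOLDBACK_MS + GEN_MS
    print(f"{label:12s}: ~{baseline_ms:.0f} ms baseline -> ~{with_fg:.1f} ms with FG")
```

even with the hold-back, the Reflex setup stays around ~55ms total, still below the console's baseline, while the console version ends up north of 110ms. not great.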
 
Framegen is awful. Glad it's not included.


Which games do you think suffer from it?
 
They'll include it in five years when everyone else will be using it.
It's Sony and Cerny, they are the first to everything in gaming tech after all /s

On a more serious note, FG has limited applications on consoles since you pretty much have to run at 60+ FPS for FG to be usable (there are exceptions where you can make do with a base FPS as low as 40, but they are rare).
For consoles this means a mode which will use the 60 FPS mode to hit a no man's land of 90 to 120. Something which could work with VRR but will require a lot of refinement in areas of frametime health and general compatibility (let's not forget that Sony still can't do 120Hz RGB or LFC in 60 FPS output). So it's expected that they "hate it" - the reasons are their own platform issues though, and not FG itself.
 
I have 0 issues with frame gen if it's used at like 60 plus FPS to take it closer to 120.

Consoles are all about using tech that trickles down from pc to create a good end user experience. I don't get why it's not there for peeps who want it with games over 60 FPS. I'm sure it will get added.

Why do developers hate, hate, hate it?
 
Framegen is already viable and keeps getting better - rather like image reconstruction (remember?). The end point will be perfect 'fake' frames with no added latency. But hey - keep playing catch-up with Nvidia. You'll get there one day.
 
on PC it's an entirely different thing.
on Nvidia you have Reflex, which will dramatically decrease input lag. if you compare the input lag of a console version of a game to a PC version with Reflex enabled, Reflex often cuts the latency to around half of the console version, often lower than that.

Fortnite on PS5 at 60fps has somewhere around 90ms to 100ms of lag. on PC with Reflex at 60fps it has between 30ms and 40ms.
this means adding a few milliseconds of lag on top of this insanely reduced lag compared to console will not be an issue.
however, now imagine adding 1 frame of lag on top of the already really bad lag of the console version here. not great.

so first they'd need an equivalent of Nvidia Reflex, and good luck with that. not even AMD has caught up to reflex yet with their own version of lag reduction.
First of all, the PS5 version is around 60-70 and the PC version is around 40-50 with Reflex on and VSync disabled. The difference is not Reflex, it's the VSync.
 
There are console games this gen where the performance mode doesn't maintain 60fps, so of course frame gen shouldn't be a priority.

It's only gonna add bad input lag unless you're getting into the 80s of fps and are OK taking the hit for the smoother motion of even higher fps. Maybe for turn-based games too.
 
What I mean is, doesn't, say, Street Fighter 6 or Tekken 7 at 60fps on PS5/SX have inherently higher input lag than any PC at 60FPS, and then Reflex/Anti-Lag widens the gap even further, taking PC even lower? I know that was certainly the case with consoles in the past; I haven't really followed PS5/SX as closely. My understanding was there's some inherent overhead with consoles (or something not exposed to developers) you can't really "optimize" your way around.
There isn't anything inherently there on console to increase lag (except maybe Vsync in some games where it can't be force-disabled). They're mostly the same hardware; PC builds often have better CPUs, but they are architecturally near identical.

PC Street Fighter has about the same input lag as PS5 Street Fighter. There is a caveat though: that is if you are using a 120Hz monitor on PC with similar specs. The PC people who tend to complain likely play on 240Hz and 144Hz monitors at higher fps. Then when they switch to a 120Hz TV @ 60fps they see lag.

Reflex/antilag exists on console by default for a well optimised game.
Reflex is something introduced on PC because PCs were getting higher input lag when there was no framerate cap. I remember in the past people would turn on framerate caps to lower lag.

The latency they get rid of on PC is because the CPU would be churning out frames to the render queue faster than the GPU could render them. For console, if you were getting the most out of your game engine, the CPU and GPU would be in sync for a given frametime. There was no name for it and no settings to turn off/on, but you got it by default in a well optimised game.
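
A toy steady-state model of that queue effect, with illustrative numbers (not measurements):

```python
# Toy model of render-queue latency. When the GPU is the bottleneck and
# the driver lets the CPU run `depth` frames ahead, input sampled at
# CPU frame start waits behind (depth - 1) queued frames before its own
# GPU render. Reflex-style pacing effectively forces the depth to 1.

def input_to_display_ms(cpu_ms: float, gpu_ms: float, depth: int) -> float:
    # CPU build time + wait behind queued frames + the frame's own render
    return cpu_ms + (depth - 1) * gpu_ms + gpu_ms

CPU_MS, GPU_MS = 8.0, 16.7  # a GPU-bound game at ~60 fps (illustrative)
for depth in (3, 2, 1):
    print(f"queue depth {depth}: "
          f"~{input_to_display_ms(CPU_MS, GPU_MS, depth):.1f} ms input-to-display")
```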

the difference is reflex.
Try not to rely too much on DF; their input lag tests are usually ass. Especially when Nvidia are involved.
 
I only use frame gen when it's absolutely necessary, but it's funny seeing all the people crying about artifacts and latency with FG. If I showed you a video of a fully maxed out Cyberpunk at 4K with FG, not one of these people would be able to point out these egregious "artifacts" or latency issues. The benchmark has FG adding ~5ms of latency in my tests; you're not noticing that.

Yeah, it's not always perfect, and it's generally only good when you have a decent starting point to begin with, but this idea that it's always a net negative is just stupid. Frame gen is also something that needs to exist, much like with DLSS, because developers have gotten fucking lazy and don't optimize games for shit anymore. And even for developers who do push things, and optimize, things like path tracing are expensive so more options to deal with that aren't a bad thing.
Point out added latency on a video? Well, that's not possible. Seems to me you really don't understand the problem of added input lag. It will ALWAYS be there with Fake Gen, and many people notice it immediately when they play.
 
Is this in reference to the upgrade to PSSR launching next year on the PS5 Pro, or the next gen of PSSR meant for the PS6 family of devices?

Considering Cerny referenced frame gen in one of his SIE/AMD videos not too long ago, it would be odd for it to not be included in the PS6 gen of devices. Especially with HDMI 2.2 going to be a thing by the time the PS6 lands, future frame gen can let the PS6 hit upwards of 240/480 FPS. Plus frame gen on the PS6 Portable could let it punch well above its weight class.

So yeah, I think some sort of future frame gen is in the works for the PS6 family of devices.
 
Is this in reference to the upgrade to PSSR launching next year on the PS5 Pro, or the next gen of PSSR meant for the PS6 family of devices?

Considering Cerny referenced frame gen in one of his SIE/AMD videos not too long ago, it would be odd for it to not be included in the PS6 gen of devices. Especially with HDMI 2.2 going to be a thing by the time the PS6 lands, future frame gen can let the PS6 hit upwards of 240/480 FPS. Plus frame gen on the PS6 Portable could let it punch well above its weight class.

So yeah, I think some sort of future frame gen is in the works for the PS6 family of devices.
I think it's in general, but it wasn't specified. Keep in mind that Black Myth: Wukong already had framegen on a base PS5. It's just not something developers like in general, this rumour is saying, so there seems to be little effort to work on it. Not sure how true that is though, because I remember the same talk where he namedropped it (but didn't say if they're implementing it).
 
There isn't anything inherently there on console to increase lag (except maybe Vsync in some games where it can't be force-disabled). They're mostly the same hardware; PC builds often have better CPUs, but they are architecturally near identical.

PC Street Fighter has about the same input lag as PS5 Street Fighter. There is a caveat though: that is if you are using a 120Hz monitor on PC with similar specs. The PC people who tend to complain likely play on 240Hz and 144Hz monitors at higher fps. Then when they switch to a 120Hz TV @ 60fps they see lag.

Reflex/antilag exists on console by default for a well optimised game.
Reflex is something introduced on PC because PCs were getting higher input lag when there was no framerate cap. I remember in the past people would turn on framerate caps to lower lag.

The latency they get rid of on PC is because the CPU would be churning out frames to the render queue faster than the GPU could render them. For console, if you were getting the most out of your game engine, the CPU and GPU would be in sync for a given frametime. There was no name for it and no settings to turn off/on, but you got it by default in a well optimised game.


Try not to rely too much on DF; their input lag tests are usually ass. Especially when Nvidia are involved.
Yes, God of War had 60ms of latency on PS4 Pro (at 60fps) the last time NXGamer tested it (so less than the PC port). On PlayStation, DF used the 30fps mode against 60fps on PC. Notice they are saying 60Hz screen, not 60fps framerate. Disingenuous reporting and typical DF framing against anything PlayStation.

Destiny 2 has about the same latency on PS5 vs PC at 60fps. We know many COD games also have record low latency on PS4/PS5, often down to ~40ms at 60fps.

I don't want those fake frames adding latency, particularly when combined with FG artefacts + blurring in motion from FSR.
 
Isn't PSSR 2 supposed to come in like 6 months?
Is it normal that the specs are still unknown?

I have no idea of the time frame for this type of endeavor, but I thought the last 6 months are more about tweaking and bug chasing ( :messenger_tears_of_joy: ), so shouldn't we know more about the thing's architecture/inner workings?

Anyway, less ghosting and artifacts is good for sure.
 
x2 frame gen is pretty good if your frames are already 50+
Turn off Vsync and turn on Nvidia Reflex.
I used it with Indy along with Path Tracing and I honestly didn't notice any latency and had 100+ frames almost constantly.
 
Fuck framegen, it's the exact opposite of dlss/fsr: in those, fluidity and responsiveness take priority over image quality, while with framegen u get the worst of both worlds- way worse responsiveness and, on top, all kinds of ai upscaling artefacts. It's basically a blurry mess of epic proportions on top of making u feel like u're playing some "underwater lvl" the whole time :messenger_astonished:
Even then, any meaningful implementation of framegen happens when ur "native" fps is around 60 or, better yet, north of 90, and by that point u don't really need ai upscaling anyways :P
 
Framegen is already viable and keeps getting better - rather like image reconstruction (remember?). The end point will be perfect 'fake' frames with no added latency. But hey - keep playing catch-up with Nvidia. You'll get there one day.

No it won't. Any sort of frame generation needs original frames to work from, hence it definitionally has to take additional time after the source frame is completed to create it.

It's just image smoothing, and does not affect how the actual game-code INCLUDING THE INPUT/LOGIC LOOP is running, hence actual latency is always going to be relative to what the source frame rate is. A rate usually lowered to allow for smooth frame-pacing and hence consistent numbers of generated frames.
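
A minimal timeline sketch of that point, assuming a 60fps base and ignoring the generation cost itself:

```python
# Timeline sketch: 2x interpolation at a 60 fps base. The frame between
# real frames N and N+1 cannot be displayed until N+1 exists, and real
# frame N itself is held back so the output pair is evenly paced --
# that hold-back is the extra latency. Generation cost ignored.

FRAME_MS = 1000.0 / 60.0  # real frame cadence

for n in range(3):
    done_n = (n + 1) * FRAME_MS        # real frame N finishes rendering
    done_n1 = (n + 2) * FRAME_MS       # real frame N+1 finishes rendering
    show_mid = done_n1                 # earliest the in-between frame can appear
    show_n = show_mid - FRAME_MS / 2   # real frame N, delayed for even pacing
    print(f"frame {n}: ready {done_n:5.1f} ms, shown {show_n:5.1f} ms "
          f"(held {show_n - done_n:.1f} ms); generated frame at {show_mid:5.1f} ms")
```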
 
As a side curiosity: Tom of MLID commented in his last video that FSR4 ("Redstone") for RDNA3 would come early next year.
Let's see if that's true, or just more of the usual crap...
 
I have very few issues with 2x generation, but even though I have a 5090, I don't really rely on it other than for Cyberpunk, probably. It helps if the implementation is reasonable, but I am glad to hear about this, as I'm listening to the latter half of that podcast at this very moment.

And as somebody who recently got a PlayStation 5 Pro, I'm definitely excited to see these advancements and glad that Sony seems to be taking this very seriously in order for it to run on different hardware. I'm excited to see the implementation for the console and a potential handheld. I'd say that's a good thing, though supporting it would be a good thing as well, just to have options. The APIs are there, so I think Nvidia has done a lot of the groundwork on this and AMD is playing catch-up, while Sony seems to have surpassed AMD if you believe the rumors.
 
Frame gen can be good; going from a 50-60 base to a 100-120fps target produces nice results if implemented correctly by developers (and they fuck it up quite often, mainly HUD elements). Personally I try it in most games (to see how it works) but have actually used it (for a whole playthrough) in just a few.
 
They'll include it in five years when everyone else will be using it.
It's Sony and Cerny, they are the first to everything in gaming tech after all /s

On a more serious note, FG has limited applications on consoles since you pretty much have to run at 60+ FPS for FG to be usable (there are exceptions where you can make do with a base FPS as low as 40, but they are rare).
For consoles this means a mode which will use the 60 FPS mode to hit a no man's land of 90 to 120. Something which could work with VRR but will require a lot of refinement in areas of frametime health and general compatibility (let's not forget that Sony still can't do 120Hz RGB or LFC in 60 FPS output). So it's expected that they "hate it" - the reasons are their own platform issues though, and not FG itself.
Never change, dude, never change. I guess for you it's fundamental to remind everyone that Sony and Cerny are always the last in gaming tech, because it seems you never miss the chance to make similar subtle insinuations in whatever tech discussion they are somehow involved in. Though it's a rumour from an anonymous source, and who the hell knows how reliable such a comment is, but sure, arrogant incompetent Sony as always. 😉
 
idk. With Lossless you can actually play PS2 games (RE4 was the one I played the most) at 60 FPS without any weird artifacts; it's seriously incredible. That and DLSS are the true revolutions in AI usage imho.
 
My understanding was there's some inherent overhead with consoles (or something not exposed to developers) you can't really "optimize" your way around.
That exists on PC (and 'Reflex' is a workaround for it); on consoles it's a developer's choice all the way. The fact that it's been low priority for developers in general isn't the fault of the platforms.
E.g. during the VR launches, 'late data latching' (which is really just another name for doing things like Reflex) was something you could 'actually' do on a console. On PC it was just a name for a theoretical optimization that some driver hacks made possible, but it was vendor- and GPU-model-specific for each implementation, so no one ever used it in practice.
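
A rough sketch of what late latching means in practice; the frame budget and CPU cost below are assumed illustrative numbers:

```python
# Sketch of late data latching / Reflex-style pacing. Instead of reading
# input at the start of a 16.7 ms frame, the loop idles first and reads
# input as late as the budget allows, so the sampled input is only
# ~CPU_WORK old when the frame is submitted.

import time

FRAME_BUDGET = 1.0 / 60.0  # 16.7 ms target frame time
CPU_WORK = 0.004           # assumed 4 ms to simulate and submit a frame

def sample_input():
    return {"stick": (0.0, 1.0)}  # stub controller read

def simulate_and_submit(inputs):
    time.sleep(CPU_WORK)          # stand-in for game logic + render submission

def run_frames(n=3):
    frame_start = time.monotonic()
    for _ in range(n):
        latch_at = frame_start + FRAME_BUDGET - CPU_WORK
        time.sleep(max(0.0, latch_at - time.monotonic()))  # idle, don't latch yet
        simulate_and_submit(sample_input())  # input is now at most ~4 ms old
        frame_start += FRAME_BUDGET

run_frames()
```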

No it won't. Any sort of frame generation needs original frames to work from, hence it definitionally has to take additional time after the source frame is completed to create it.
It's not quite that bad.
You can extrapolate just using the original frame and future motion vectors (not wait for two full frames), so it's possible to reduce latency (relative to source framerate) - though not by the same amount as actually running at 2x native framerate.
But there are no current middleware frame-gen solutions that do this (Intel was working on it but that was before they nuked their company so I doubt we'll see it now).

This has been done in VR already though - just so we're clear it's not just 'theoretical'.
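
A minimal sketch of the extrapolation idea, with 2D points standing in for pixels and made-up motion vectors:

```python
# Extrapolation-style frame gen: shift the contents of frame N forward
# along their motion vectors instead of waiting for frame N+1 and
# interpolating. No future frame is needed, so no extra frame of
# latency -- at the cost of disocclusion artifacts this toy ignores.

def extrapolate(points, motion_vectors, dt):
    """Project each point forward by dt frames using its last known velocity."""
    return [(x + vx * dt, y + vy * dt)
            for (x, y), (vx, vy) in zip(points, motion_vectors)]

frame_n = [(10.0, 5.0), (20.0, 8.0)]    # positions in real frame N
vectors = [(2.0, 0.0), (-1.0, 0.5)]     # px per frame, from the engine
half_step = extrapolate(frame_n, vectors, dt=0.5)  # the "in-between" frame
print(half_step)  # [(11.0, 5.0), (19.5, 8.25)]
```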
 