
Why doesn't Sony or Microsoft add frame generation as a system-level option?

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Why not just give users the option? Many games could benefit.

I'm talking about something like AFMF or Lossless Scaling frame generation (yes I already know there is one game on console using FSR3 FG, I don't care about it for current gen games on console).



It would be a game changer for me to be able to use frame-generation in my backwards compatible titles.
 

Killer8

Member
Frame generation is not free performance. It still takes some hardware power to run it. On a PC where you know you have extra to spare, it is fine. On a console where certain amounts of power have already been allocated to developers and to the OS, you can't really go back on that.
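Rough back-of-the-envelope sketch of the problem (the numbers below are made-up assumptions, just to illustrate how tight the budget already is):

```python
# Hypothetical frame-time budget at 60 fps output on a console.
# All numbers are illustrative assumptions, not measured costs.
TARGET_FPS = 60
frame_budget_ms = 1000 / TARGET_FPS      # ~16.7 ms per displayed frame

game_render_ms = 15.5                    # assume the game already uses most of the budget
framegen_cost_ms = 1.5                   # assumed cost of synthesizing one extra frame

used_ms = game_render_ms + framegen_cost_ms
print(f"budget {frame_budget_ms:.1f} ms, used {used_ms:.1f} ms")
if used_ms > frame_budget_ms:
    print("over budget -> the real frames start missing their deadline")
```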

Maybe PS5 Pro will leave room for it as an option.
 

BennyBlanco

aka IMurRIVAL69
The amount of artifacts it creates is probably not a trade-off they would want to make. Nvidia's is the best one and they still get motion artifacting.

Well it beats their current solutions to get to 60 fps



The little bit that I’ve messed around with Lossless Scaling, it was surprisingly good, and DLSS3 is great
 

BennyBlanco

aka IMurRIVAL69
The best in the business is proprietary tech that would cost money to implement?

Lossless Scaling was made by 1 nerd in Europe and is really quite impressive.



Obviously it’s not perfect and not as good as DLSS3, but it works on nearly everything and doesn’t touch the games’ files. It’s better to have this option than not to have it. Using FSR to achieve 60 fps on consoles has been a shitshow.
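Roughly how I understand these game-agnostic tools work - they only ever see finished frames at presentation time, which is why they don't need to touch any game files. A minimal sketch, where every function name is a placeholder rather than the actual app's code:

```python
# Conceptual sketch of game-agnostic frame interpolation. Every function passed in here
# is a hypothetical placeholder; the point is only that the interpolator works purely on
# already-presented frames, never on the game's own data.
def present_with_interpolation(capture_frame, estimate_motion, warp, display):
    prev = capture_frame()                  # real frame N-1, grabbed at the OS/driver layer
    while True:
        curr = capture_frame()              # real frame N
        flow = estimate_motion(prev, curr)  # optical-flow-style motion estimate between the two
        mid = warp(prev, curr, flow, 0.5)   # synthesize an in-between frame at t = 0.5
        display(mid)                        # show the generated frame first...
        display(curr)                       # ...then the real one, doubling the output rate
        prev = curr                         # note: curr was held back one interval -> added latency
```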
 

squidilix

Member
Why not just give users the option? Many games could benefit.

I'm talking about something like AFMF or Lossless Scaling frame generation (yes I already know there is one game on console using FSR3 FG, I don't care about it for current gen games on console).



It would be a game changer for me to be able to use frame-generation in my backwards compatible titles.

Enjoy your shitty input lag.

 

damidu

Member
Never saw a problem-free implementation of this even on the 4000 series.
And the moment it's introduced at system level, you can bet shit devs will abuse it and release 20fps games,
just like they are doing with FSR-souped-up 700p games now.
 

Audiophile

Member
They should on PS6, by having hooks in the SDK right from the beginning so that every game released can relay the appropriate information at the system level for system-wide, universal functionality. Mandate a 40fps mode for all games not targeting 60fps, which by 2028 should be enough frames to provide solid frame-gen results. Let users toggle it on or off on a game-by-game basis.
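Something along these lines is what I'd picture the hooks looking like - purely hypothetical, every name below is invented for illustration and isn't any real SDK's API:

```python
# Hypothetical sketch of the per-frame data a game could relay to a system-level
# frame generator. All names are invented for illustration only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FrameGenInputs:
    color_buffer_id: int                # the finished frame
    depth_buffer_id: int                # depth helps with disocclusions
    motion_vectors_id: int              # per-pixel motion - the big quality win over pure interpolation
    ui_layer_id: Optional[int] = None   # UI composited separately so HUD elements aren't warped

def submit_frame(inputs: FrameGenInputs, framegen_enabled: bool) -> None:
    """System-side stub: only generate frames when the user has toggled it on for this game."""
    if framegen_enabled:
        pass  # hand the buffers to the OS-level frame generator here
```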
 

squidilix

Member
You won't notice it for non-competitive twitch games. You will however notice input lag at 30 FPS.

Also, have you actually tried it?

Did you know that, for most people, more Hz means lower input delay, not higher?

What's the point if your 120fps is laggier than 30fps? It's just nonsense.
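Rough numbers (simplified, with an assumed generation cost) for why the base framerate matters so much here:

```python
# Toy latency math with simplified, assumed numbers. Interpolation needs the *next* real
# frame before it can show the in-between one, so the newest image lags roughly one extra
# base-rate frame behind your input, plus the generation cost itself.
base_30_ms = 1000 / 30       # ~33.3 ms between real frames at a 30 fps base
base_60_ms = 1000 / 60       # ~16.7 ms at a 60 fps base
gen_cost_ms = 3.0            # assumed cost of generating one frame

print(f"extra delay on a 30 fps base: ~{base_30_ms + gen_cost_ms:.0f} ms on top of already-laggy 30 fps")
print(f"extra delay on a 60 fps base: ~{base_60_ms + gen_cost_ms:.0f} ms, much easier to hide")
```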
 

Fafalada

Fafracer forever
You won't notice it for non-competitive twitch games. You will however notice input lag at 30 FPS.
Hard disagree - DLSS frame-gen input lag is already 'quite' noticeable in anything that has camera control to aim, and somewhat unpleasant in 3rd person action games that aren't latency tolerant.
Basically I equate it to the difference between playing a game locally and remotely - and if the game is already remote, adding DLSS lag on top just gets - well, not fun at all.
If those numbers for Lossless Scaling are true, it actually gets considerably worse than the above.

Why not just give users the option? Many games could benefit.
One problem is that game-agnostic post-processing aggravates the latency issues significantly - the latency mitigation techniques that vendor PR is parading around aren't something you can 'toggle' in drivers; they need game integration to make a real difference (and can't help/be used in every game even if there were a magic game-agnostic toggle).
The other is the cost, of course - like Auto HDR, this is something you could offer for legacy/BC titles relatively easily - but it's not free, so software that already pushes performance hard would be impacted differently.
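For what it's worth, this is roughly why the mitigation has to live inside the game loop - a sketch using a made-up marker object, not any vendor's actual interface:

```python
# Why latency mitigation needs in-game integration: the win comes from sampling input as
# late as possible, and only the game controls when that happens. The 'markers' object is
# a made-up stand-in for a Reflex-style integration, not a real API.
def game_loop(markers, sample_input, simulate, render, present):
    while True:
        markers.wait_for_low_latency_slot()  # game stalls itself so input is read just-in-time
        user_input = sample_input()          # <- only the game knows where this call sits
        state = simulate(user_input)
        markers.begin_render()
        frame = render(state)
        present(frame)                       # a driver-level toggle only ever sees this point,
                                             # so it cannot move sample_input() any later
```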
 

StereoVsn

Member
Lossless Scaling was made by 1 nerd in Europe and is really quite impressive.



Obviously it’s not perfect and not as good as DLSS3, but it works on nearly everything and doesn’t touch the games’ files. It’s better to have this option than not to have it. Using FSR to achieve 60 fps on consoles has been a shitshow.

Yep, and also the XSX/S and PS5 have AMD APUs, so they could implement AMD’s tech for them.
 

RoboFu

One of the green rats
Frame generation doesn't make a game play better. It's a stupid parlor trick. The whole point of higher frame rates is to make a game play better.
 

MaKTaiL

Member
The number of uninformed people here who know nothing about frame generation and its benefits is quite staggering. Most here dismiss the idea without even using it.
 

Mr.Phoenix

Member
It will happen eventually. On the PS side it won't be just frame gen, it will also be PSSR. First PSSR will come only on the PS5 Pro, and then so will FG. I expect that on the PS6 these will be system-level features.

Then the real abuse will happen... just watch devs settle for 30fps games and exclusively use FG to level out performance. You give devs that kinda built-in cheat code... they will abuse it.

Frame generation doesn't make a game play better. It's a stupid parlor trick. The whole point of higher frame rates is to make a game play better.
It's a trick, not a stupid one. And no, while higher framerates have better input latency as one of their benefits, that's not the whole point. We have to acknowledge that under 80ms of latency there are a lot of people who can't notice or tell the difference in non-competitive games, which is where I expect these techs to mostly be used.

Have you ever panned the camera 360 degrees in a 30fps game? Or hell, made any kind of significant camera motion? That is the point of higher framerates.
 

RoboFu

One of the green rats
It will happen eventually. On the PS side it won't be just frame gen, it will also be PSSR. First PSSR will come only on the PS5 Pro, and then so will FG. I expect that on the PS6 these will be system-level features.

Then the real abuse will happen... just watch devs settle for 30fps games and exclusively use FG to level out performance. You give devs that kinda built-in cheat code... they will abuse it.


It's a trick, not a stupid one. And no, while higher framerates have better input latency as one of their benefits, that's not the whole point. We have to acknowledge that under 80ms of latency there are a lot of people who can't notice or tell the difference in non-competitive games, which is where I expect these techs to mostly be used.

Have you ever panned the camera 360 degrees in a 30fps game? Or hell, made any kind of significant camera motion? That is the point of higher framerates.

Here's a hint then... don't make 30fps games. This whole fake AI-generated stuff is exactly the wrong way to be going. If the hardware cannot handle something, then it can't handle it. We are actually paying more money to cover it up than we used to pay for actual performance improvements. 🤣
 

winjer

Gold Member
The problem is that frame generation uses up some GPU power and VRAM.
So there is a chance that a system-wide feature like this would cause some games to have performance issues.
 

hinch7

Member
You kind of want at least 60fps as a base for frame gen to be a good experience, which neither the Series consoles nor the PS5 are really tooled for. Frame gen also isn't free and takes up resources. Another thing is input lag, which is already an issue in the console-to-display chain. Adding FG on top would be horrible unless the base is capped at 60fps. IIRC it's around 100ms of game input latency just for the PS5 at 60fps native - from DF's findings.
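Stacking it up roughly (the ~100ms base is the DF figure mentioned above; the other numbers are assumptions):

```python
# Rough button-to-photon stack-up. The 100 ms base is the DF figure cited above;
# the other numbers are assumptions for illustration only.
base_game_latency_ms = 100.0       # PS5 title at 60 fps native
framegen_hold_ms = 1000 / 60       # one held-back real frame at a 60 fps base (~16.7 ms)
framegen_cost_ms = 3.0             # assumed cost of generating the in-between frame
display_ms = 10.0                  # assumed TV processing even in game mode

total = base_game_latency_ms + framegen_hold_ms + framegen_cost_ms + display_ms
print(f"~{total:.0f} ms with system-level frame gen layered on top")
```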

Next generation we'll have vastly more powerful CPUs and GPUs, and hopefully with that we'll get more 120Hz experiences on console without them looking like a pixelated mess.
 

Mr.Phoenix

Member
Here's a hint then... don't make 30fps games. This whole fake AI-generated stuff is exactly the wrong way to be going. If the hardware cannot handle something, then it can't handle it. We are actually paying more money to cover it up than we used to pay for actual performance improvements. 🤣
Unfortunately, this is a pipe dream... hell, it's even borderline naive lol. There will always be 30fps games. As long as you have finite power in computer hardware and ever-growing ambitions and standards... there will always be 30fps games. Even the 4090, today the pinnacle of gaming hardware, will sometime in the future only be able to run games at, at best... 30fps.

And that thing you said about paying more money to cover it up... couldn't be further from the truth. The issue is that performance improvements as we traditionally knew them are a thing of the past. We no longer get performance doubling from a simple node shrink, and those node shrinks are becoming prohibitively expensive. Think about it: going all the way from 90nm down to 28nm barely raised chip prices by more than 10%. Going from 28nm to 7nm saw chip prices rise by over 200%, and going lower is still getting more expensive.

The writing was clearly on the wall... we cannot advance the industry anymore by simply throwing more cores at the problem. We have to throw "smarter cores" at it. And hence... AI.

Oh, and then there is the other thing... better-looking games, even at 30fps, generate more buzz than better-performing games at 60fps. So if a dev ever had to choose between pretty grafixxxx at 30fps and bleeding-edge performance at 60fps... we know what they would choose. Case in point: just watch GTA6 be the best-selling game of this gen again... all the while running at 30fps.
 

King Dazzar

Member
The whole point of higher frame rates is to make a game play better.
Improved input latency is always welcome, but I love the increased motion smoothness and loss of judder even more. It's not input latency that turns me away from 30fps the most; it's the awful motion handling of the image.

However, if we start introducing lots of input latency then the improved image quality trade-off isn't worth it - otherwise we'd all be using our TVs in non-game modes.
 