
Should AI Frames be considered legitimate in FPS Performance Comparisons

Should AI frame-generated frames be considered?

  • Yes - A frame is a frame man
    Votes: 55 (17.0%)
  • No - Fake frames should not be considered
    Votes: 206 (63.6%)
  • Other/Depends
    Votes: 24 (7.4%)
  • Both should be considered
    Votes: 39 (12.0%)

  • Total voters: 324

Makoto-Yuki

Gold Member
if it makes the numbers go up then yes. it might have issues but it has been improved and will continue to improve.

AI is the future. Rasterization is archaic.

Never had any issues with "fake frames" or even DLSS (alright maybe the early days but now it's just as good if not better than native).

people are just mad they can't afford a 4090 or 5090.
 
We have been doing comparisons for years where the native resolution is always listed along with the rescaling method used, so why would that change now? With frames it would be the same: what the real performance is, and what it is with AI.
 

BennyBlanco

aka IMurRIVAL69
In the few games I've used it for, like FH5, AW2, Cyberpunk… it's been quite good. I also didn't notice any added latency. But I wouldn't use it for something like Tekken or CoD. If their new tech, and especially the new Reflex, really is a big improvement, then it's going to be very good.

But they are obfuscating normal performance numbers with it. It’s a good thing to have but I would only use it if I felt like I had to.
 

Buggy Loop

Member
They'll have to compare without frame gen and then with. Which is fine.

It depends on the games you play. I think for competitive multiplayer it's not a good option, but you're already running those games at insane FPS anyway. A fully path traced game that is single player and is a showcase of graphics? I would rather use frame gen to get the full eye-candy package than lower settings. You don't have to use the full 4x either; 2x or 3x will probably satisfy 99% of people.

The models will only get better. It's inevitable that a few gens from now we won't even be nitpicking this anymore.

It's the same discussion we had for upscalers when they first showed up. They're in all benchmark conversations now, with FSR/XeSS/DLSS vs. off, etc.

I don't worry about raw raster anymore. That is already blazing fast and would cap my monitor's framerate. 🤷‍♂️ Why even use frame gen for those games?

This is really for the "WTF, how can it even run" path traced games. Reminder that in 2019 we had Quake 2 RTX, with very simplified geometry and light sources, and even that was kneecapping GPUs. A city like Cyberpunk 2077 being path traced in 2023 is close to a miracle in such a short time.
 

a'la mode

Member
Framegen nonsense should just stay in Nvidia PowerPoint presentations so they can claim they're making huge generational leaps while pretending to render. Performance comparisons should be raw numbers without framegen, but there should be numbers for both native and upscaling - so for example, 4K native and 1080p/1440p upscaled to 4K are both useful and good metrics.
 

Makoto-Yuki

Gold Member
my 4080 can't max out my 360Hz monitor.

i'm getting a 5090 and if it can hit 240+fps in Cyberpunk then i'll be very happy!! even if it is "fake frames". i played through all of Cyberpunk at max settings with frame gen and enjoyed every single frame of it.
 

TintoConCasera

I bought a sex doll, but I keep it inflated 100% of the time and use it like a regular wife
If it makes the game run better then yeah. It really helped me with Dragon's Dogma 2, and I got a much better experience thanks to it.

My only gripe is when the game runs like shit by default and someone tries to use the "but it runs fine with framegen" argument, because in those cases the game won't feel nearly as smooth as it should.

At least for now, anyway; framegen is still in its early days imo and it could get much better soon.
 

foamdino

Member
Frame gen generates frames through a different set of operations compared to the normal process - I think it's silly to compare two separate pipelines and pretend they are the same (they are not). Also, input latency in the traditional process scales with rasterised frames, which is why (traditionally) high framerates feel better - now we can have frame-gen 100+ fps that feels like 30fps (because that's what it actually is). And generating multiple frames from a single frame is *bound* to lead to terrible artifacts in image quality. This whole set of Nvidia GPUs is designed first and foremost for AI/ML training in datacentres; they're gaslighting gamers and cooking up weird "gaming scenarios" for AI tech.
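
As a rough illustration of the latency point above (not from the post itself), here is a minimal Python sketch under the simplifying assumption that responsiveness tracks the rendered frame time:

```python
# Rough arithmetic sketch of the latency argument above (illustrative only).
# Assumption: input latency roughly tracks the *rendered* frame time, which is
# the post's premise; real pipelines have additional latency sources.

def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

def framegen_summary(base_fps: float, gen_factor: int) -> dict:
    """Displayed frame rate scales with the generation factor, but
    responsiveness still follows the base (rendered) frame rate."""
    return {
        "displayed_fps": base_fps * gen_factor,
        "rendered_frame_time_ms": round(frame_time_ms(base_fps), 1),
        "displayed_frame_time_ms": round(frame_time_ms(base_fps * gen_factor), 1),
    }

print(framegen_summary(base_fps=30, gen_factor=4))
# {'displayed_fps': 120, 'rendered_frame_time_ms': 33.3, 'displayed_frame_time_ms': 8.3}
```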
 

StereoVsn

Gold Member
Frame Gen and techniques such as DLSS/FSR are trickier to test since they involve a multitude of other parameters.

What about motion blur, visual artifacts, image stability, latency, CPU/VRAM overhead and all the other side effects of the modern enhancement techniques?

IMO, general rasterization performance should be measured, with RT off and RT on. Then we would want to see DLSS/FSR with analysis, and on top of that Frame Gen with analysis.

Thing is, the above is more laborious/time-consuming to perform, so more expensive for content creators. It's not an easy thing to do.
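
To make the "more laborious" point concrete, here is a hypothetical sketch of how the test matrix grows once upscaling and frame gen are layered on top of raster and RT settings (the mode names are made up for illustration, not tied to any particular benchmarking tool):

```python
# Hypothetical sketch of the kind of test matrix described above.
from itertools import product

rt_modes = ["RT off", "RT on"]
upscalers = ["native", "DLSS Quality", "FSR Quality"]
frame_gen = ["FG off", "FG 2x", "FG 4x"]

test_matrix = [
    {"ray_tracing": rt, "upscaler": up, "frame_gen": fg}
    for rt, up, fg in product(rt_modes, upscalers, frame_gen)
]

print(len(test_matrix), "configurations per game")  # 2 * 3 * 3 = 18
# And that's before repeat runs for variance, latency capture, or image
# quality analysis, which is why this gets expensive for reviewers fast.
```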
 

DonkeyPunchJr

World’s Biggest Weeb
Voted no in the poll, but I agree with this. Shouldn't be an either/or. As long as we are comparing like for like then it is valid.
There are only a handful of really demanding games that push high-end GPUs to their limit, e.g. Cyberpunk 2077 with path tracing.

If I'm playing a game like that, you can bet I'm going to use DLSS/frame generation. I don't really give a shit whether it runs at 15 FPS or 20 FPS at native resolution with no frame gen. That data is more of a curiosity than a useful data point that's going to influence my purchase.
 

Buggy Loop

Member
Ain’t it all fake frames? It’s just 1s and 0s…

 
I put Other/Depends. This is just for me personally, but if the image quality of the generated frames is not noticeably degraded during gameplay, and the latency is not noticeably worse, then I'll take fake frames any day.

But that's very subjective, so I don't have an answer as to how you would objectively define at what point it reaches "parity" with native for an apples-to-apples comparison. So... I guess to get a clear picture you'd need: 1) native vs. native, and then 2) for fake frames/DLSS, you'd have to do your own benchmarking based on your own perception.
 

I won’t because of latency.
 

Mithos

Member
Both a 4090 and 5090 can do DLSS so it can be excluded/ignored.
Both a 4090 and 5090 can do FG so it can be excluded/ignored.

So non DLSS/FG should be the base.

Comparing FG to MFG could be an extra thing to do though.
 
Who cares that much? It's optional in 99% of games.

Personally, I will use it if it's implemented properly, because input lag can be minimised and you get more frames - it's a win imo.

I really believe the focus should be on removing input latency as much as possible. It's one of the most important technical aspects of having a good experience. Stable frame times with ultra-low latency make for a super fun, responsive game. Once you get used to it, it's almost impossible to go back. Everything feels off.
 

Kataploom

Gold Member
No, it literally reduces performance; in the case of LSFG it's around 20% even. It is great for smoothing animations, camera movements, etc. But it's not a performance improvement, it's just animation smoothing, that's it.
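
Putting hedged numbers on that claim (treating the ~20% figure above as the poster's estimate rather than a measurement), the arithmetic looks roughly like this:

```python
# Illustrative arithmetic for the overhead argument above. The ~20% figure is
# the poster's estimate for LSFG, not a measured number.
base_fps = 100                             # what the GPU renders without frame gen
overhead = 0.20                            # assumed cost of running frame generation
gen_factor = 2                             # 2x frame generation

rendered_fps = base_fps * (1 - overhead)   # 80 real frames per second
displayed_fps = rendered_fps * gen_factor  # 160 frames shown per second

print(f"rendered: {rendered_fps:.0f} fps, displayed: {displayed_fps:.0f} fps")
# The screen shows more frames, but the game still simulates and samples input
# at the lower rendered rate -- hence "animation smoothing" rather than a
# performance gain.
```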
 

kevboard

Member
soon we will need a 1 to 1 comparison and an AI enabled comparison.

DLSS Super Resolution already made native resolution less important, DLSS Frame Gen could do this for frame rates as well.
Nvidia Reflex 2 seems to massively reduce latency, which means one of the main downsides of frame gen is mitigated.

so ignoring it during reviews is also not the way forward.
it should be part of the review and highlighting the amount of latency and the motion clarity/integrity will be important.
 

SScorpio

Member
One thing I don't see talked about in the discussion around generated frames is the additional CPU power that would be required to hit those same higher frame rates with pure rasterization.

There is bottlenecking caused by older CPUs paired with higher-end new GPUs. In most games you have a single thread, or maybe two or three, where the main processing takes place, with some additional work divided out to other cores, which fly through it and then sit idle. The frame rate you can hit will still be capped by those main threads. The existing single-frame generation we've had this last generation of video cards could optionally use motion vector data to help it decide how objects were moving and optimize the generated frame. Now it can have input data as well, which affects everything in the scene and will mimic the smooth-motion benefit we get from genuinely higher frame rates.

It will be interesting to see if the market moves the way NVIDIA thinks it will. I bought the original GeForce 256 at launch and it was a game changer: being able to offload things the CPU previously had to do suddenly made games play so much better. But will this AI rendering do the same to the market? Remix being able to replace a 3D object in a game in real time, without modifying the executable, leads me to think we might be seeing something along the lines of the original Talisman project Microsoft was undertaking during the extremely early days of Direct3D vs OpenGL vs GLIDE.

With raster you are pumping out all the draw calls to form a full scene every frame. Talisman was object-based: it could take a 3D object and create a 2D sprite of it, and if the next frame had the same angle, lighting, etc., it just output the saved sprite and didn't spend any time re-rendering it. So this isn't just taking a full image and applying a fancy Photoshop filter; you could have things closer to the camera moving faster than things further away to get a proper parallax effect. We could be on the verge of extremely fluid motion, especially if the rendering is aware of rigging and animation, and animation was one of the things Jensen mentioned. People forget the early days of 3D were 15-20 FPS, and the "magic" the Voodoo 1 brought was a solid 30 FPS. Solid 60 FPS on PC only recently became the thing that elevated it over consoles. In simpler e-sports games made to run on everything you can get 240 or 360 FPS. Hitting those rates in extremely complex scenes needs something more than straight brute force.
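
For readers unfamiliar with the Talisman idea, a minimal, hypothetical sketch of the "cache a 2D sprite of a 3D object and reuse it while the view hasn't changed" concept might look like this (the class and keys are invented for illustration; real Talisman-style or impostor systems were far more involved):

```python
# Minimal sketch of the object-based / impostor idea described above: reuse a
# cached 2D sprite of an object while its view parameters are unchanged, and
# only re-render when they change.

class ImpostorCache:
    def __init__(self):
        self._cache = {}  # object id -> (view_key, sprite)

    def get_sprite(self, obj_id, view_key, render_fn):
        """view_key captures angle/lighting/distance buckets; render_fn is the
        expensive 3D render we want to skip when nothing has changed."""
        cached = self._cache.get(obj_id)
        if cached and cached[0] == view_key:
            return cached[1]              # reuse: no render cost this frame
        sprite = render_fn()              # re-render only when the view changed
        self._cache[obj_id] = (view_key, sprite)
        return sprite

# Per frame, per visible object (render_object is a stand-in for a real renderer):
# sprite = cache.get_sprite(obj.id, (obj.angle_bucket, obj.light_bucket),
#                           lambda: render_object(obj))
```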
 

Kilau

Member
As someone planning to get a 5000 series card: no way. Fake frames should be considered for playability, but they're not real performance.
 

Edgelord79

Gold Member
If people can’t tell the difference then absolutely not. It doesn’t matter at that point.

If we aren’t at that point then yes they should be kept separate.
 

Buggy Loop

Member
No, it literally reduces performance; in the case of LSFG it's around 20% even. It is great for smoothing animations, camera movements, etc. But it's not a performance improvement, it's just animation smoothing, that's it.

So say we remove AI upscaling, frame gen and Reflex. You're left with native.

How's that performance and latency?

 