analog_future
Resident Crybaby
I stole this from Reddit, but it highlights very effectively how stupid people sound about these new RTX 5000 cards.
I stole this from Reddit, but it highlights very effectively how stupid people sound about these new RTX 4000 cards.
4000 cards?
It actually made games look better and feel the same as before.
FG introduces lag and artifacts in many scenarios. Stupid comparison.
Is it fake, though?
AI is just a smarter calculator.
AI already knows the answer (because it's pre-calculated), and all it needs to do is enhance the work already done the hard way, giving raster calculations more time to breathe.
"Fake frames" are just as real as calculated ones, only produced at a much faster rate.
Eventually, AI-driven cards are the future, and with luck they'll make GPUs considerably cheaper.
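To make the "fake frames" idea concrete, here is a deliberately naive sketch (my illustration, not how DLSS Frame Generation actually works; the real thing uses motion vectors and an optical-flow field rather than a plain blend) of producing an in-between frame from two rendered ones:

```python
import numpy as np

# Two frames the GPU actually rendered (stand-ins: HxWx3 float images).
frame_a = np.zeros((720, 1280, 3), dtype=np.float32)
frame_b = np.ones((720, 1280, 3), dtype=np.float32)

# The generated frame is computed from existing frames, not rendered
# from game state. Real frame gen replaces this naive blend with
# motion-vector and optical-flow guided warping, but the principle
# (produce a frame instead of rendering it) is the same.
generated = 0.5 * frame_a + 0.5 * frame_b
```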
The funny thing about this comment is tech like Reflex didn't exist back in the GeForce 3 days, and latency was higher back then than DLSS+FG+Reflex is today.
The funny thing about this picture is that it's misleading: it combines upscaling, which genuinely improves performance, with frame gen. And even in this picture you can see how much lag FG introduces, given that it needs almost three times the framerate just to match native latency (112 fps vs 42 fps).
It has nearly half the latency of native (62.6 ms vs 101.5 ms).
Thanks to FG? No, thanks to Reflex. So what are you talking about?
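Rough arithmetic on why the displayed framerate overstates responsiveness with FG (a toy model of mine, assuming 2x frame gen and ignoring every other stage of the input chain): to show an interpolated frame, the interpolator has to hold the newest real frame back for about one real-frame interval.

```python
# Toy model: 2x frame gen displays 112 fps but only renders ~56 real
# frames per second, and holds each new real frame back ~one interval.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

native_fps = 42                     # the "native" bar in the chart
fg_display_fps = 112                # displayed fps with DLSS + FG
fg_real_fps = fg_display_fps / 2    # assumed real render rate behind 2x FG

print(frame_time_ms(native_fps))        # ~23.8 ms per native frame
print(frame_time_ms(fg_real_fps) * 2)   # ~35.7 ms: render + FG hold-back
```

That held-back interval is the lag FG adds, which Reflex then has to claw back elsewhere in the pipeline for the FG bar to come out even.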
I stole this from Reddit
Frame gen is something completely different. They are showing games running at 22 fps being boosted by frame gen.
DF literally took a shit on Black Myth Wukong's PS5 performance mode because they used framegen to go from 30 to 60.
Before Reflex, people ran an fps cap to reduce latency, since PC games were just bad at balancing CPU and GPU demand. However, I'd still take no frame gen with Reflex over Reflex with frame gen, especially multi-frame gen. And to think some people complained about Killzone Shadow Fall targeting 60 fps on PS4 using "fake pixels" and even started lawsuits over it.
Well, are we talking native or not? Or are we picking and choosing which technologies on top of "native" count towards our final calculations?
But your whole point is to praise FG, and Reflex is not part of it. I can use Reflex without any DLSS.
Again, Reflex was not something that existed in the Geforce 3 days.
So an apples-to-apples comparison would be to have Reflex always on, and then compare DLSS, FG, and native on top of that.
In my original post I said that FG introduces lag, and your screenshot confirms it: every bar with FG on shows higher latency than the same setup without FG.
My whole point is people are bitching about the latency that FG adds while ignoring the fact that we all played with similar latency before Reflex existed, and no one had a problem with it.
Don't forget about rendering artifacts; it's not only lag.
We all played OoT at 15 fps too. Just look at people bitching about 30 fps today and its latency, usually just ~16 ms more per frame. Multi-frame gen adds even more than 16 ms. It's going to be the same sort of complaint as 30 fps vs 60 fps: some people just don't like the increased latency.
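(For reference, that ~16 ms figure is just frame-time arithmetic, ignoring the rest of the input chain:)

```python
# Frame-time difference between 30 fps and 60 fps:
print(1000 / 30 - 1000 / 60)   # 33.3 ms - 16.7 ms = ~16.7 ms per frame
```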
Frame gen isn't boosting any games running at 22 fps. DLSS is bringing the framerate and input latency up to an acceptable level first, and then frame gen takes it from there. Not the same.
DLSS cannot take a 22 fps game to 60+ fps unless you go from native to DLSS Ultra Performance, which renders at 720p.
For me it's not the latency. I see ghosting and TV-quality motion-interpolation effects. It's so obvious they are fake frames that my mind rejects it.
I was smart and bought a GeForce 4 instead. Oh wait, it was an MX, which was a rebranded GeForce 2. Nvidia, you did it again!
Or like the 500 series, which was just the 400 series with some of the power-management improvements that some non-reference 400 cards already had, or even had a better version of. Basically re-selling a generation to people (à la GameCube and Wii).
DLSS cannot take a 22 fps game to 60+ fps unless you go from native to DLSS Ultra Performance, which renders at 720p.
In heavy RT games, DLSS Quality can indeed double the frame rate. I saw this happen several times on my RTX 4080. In raster games, however, DLSS Quality does not improve fps that much.
Their own examples showed Cyberpunk path tracing running at 22 fps at native 4K without DLSS. DLSS Quality does not double performance there. Balanced might get you to 44 fps; Performance might get you to 50 fps. Then frame gen takes over. That's still below the 60 fps base that Nvidia, AMD, and DF recommend for frame gen.
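For reference on the 720p claim, the per-axis render-scale ratios commonly documented for DLSS's upscaling modes work out like this at a 4K output (the Balanced ratio is approximate):

```python
# Internal render resolution for DLSS upscaling modes at 4K output.
MODES = {
    "Quality": 1 / 1.5,            # ~0.667 per axis
    "Balanced": 0.58,              # approximate documented ratio
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,    # ~0.333 per axis
}

out_w, out_h = 3840, 2160
for mode, scale in MODES.items():
    print(f"{mode}: {round(out_w * scale)}x{round(out_h * scale)}")
# Quality: 2560x1440, Balanced: 2227x1253,
# Performance: 1920x1080, Ultra Performance: 1280x720
```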
Too late
The GeForce 3 was a revolutionary GPU thanks to programmable shaders, and it also powered the very first Xbox console. I will never forget how blown away I was when I first saw 3DMark2001, especially the "Nature" test. Gaming performance was amazing too: my previous GeForce 2 MX 32MB managed around 15 fps in Quake 3 at 1600x1200x32, while my GeForce 3 128MB got around 80 fps, a 5x increase. Good times.
It's crazy how dinky the fans and heatsinks were back then, and how much power consumption has blown up since.
In heavy RT games, DLSS Quality can indeed double the frame rate. I saw this happen several times on my RTX 4080.
If it's over 60 fps, then fine. But even then, I'm comparing it to what I'm running with DLSS Quality right now; it's not like I'm upgrading from an AMD GPU. Even if heavy RT games get better DLSS scaling, the ~20% upgrade we're seeing is still not enough.
What is the white connector on the lower left side? I know the blue one.
DVI.
Man, the future is now.
I'm one of the dumdums that cannot grasp what is wrong with faux frames. Is it OK if I claim that everyone else is a dumdum?
Absolutely.
Reduced image quality, motion artefacts, and increased input lag.
Why are more people acting as if 60 fps isn't good enough nowadays? I've seen people say 120 fps or bust.
Probably for the same reason people began to act like 1080p sucks too: they got new 120 Hz displays and anything else became "unplayable". Harder, better, faster, stronger.
I'm fine with 60 on console. On PC, I aim for 90 fps minimum.
The GeForce 3 was a revolutionary GPU thanks to programmable shaders... I will never forget how blown away I was when I first saw 3DMark2001, especially the "Nature" test.
Ah man, I used to run 2001 long after it was relevant just because I liked seeing those tests get maxed out, lol. Nostalgia.