First RTX 5090 Benchmark: FG vs No FG

Looks like I am buying a 5090... I think.

I just don't feel like spending that much, lol. I still feel like if I can get a used 4090 for half that price or a bit more, then it's worth it? But the AI thing... I am 100% sure the 5090 will have the hardware needed for whatever the 6090 offers in terms of AI, while the 4090 will struggle.

Sigh..
They will gimp the 5090 when the 6090 comes out. No reason multiframe gen can't run on a 4090 when it can run on a 5070.
 
The game seems to run at over 60fps all the time even with just DLSS Quality; that's very impressive. I have similar results, but at 1440p 😂
 
Most complaints will be from people who won't be buying one. :messenger_winking_tongue:
Could I buy a TV for FG instead?

Or this thing:
 
Keep living in the past. Tensor cores don't have free real estate on the silicon. Specialized AI hardware is literally giving us exponential increases.
Let alone, new nudal radioptican neurowave technique for path diffusial reprojection.

Or the new terminator vs CNN/BBC (AMD is just on CNN, Intel also BBC-ed) vs competitors.
 
That latency. Welcome to console gaming, PC gamers.
 
Honestly, the new Transformer model for DLSS is the biggest difference here, and it's coming to all RTX GPUs. Making Performance mode equivalent to the old Balanced (and, I assume, Balanced to the old Quality) brings new life to the 40 series.

Also, did anyone else notice that TERRIBLE ghosting and halo effect on the person on the bike?


P.S. Love the PC-centric content.
 
Keep living in the past. Tensor cores don't have free real estate on the silicon. Specialized AI hardware is literally giving us exponential increases.
I guess in your future, artifacts and ghosting are the new normal. I'd rather play at 60fps with none of those things than 240fps with them.
 
Creating 209fps from actual rendered 28fps is pretty insane.
I want to see how it looks though.
 
Last edited:
First benchmark of the 5090 using CP2077:



Seems to raise latency from about 35ms to about 45ms going from DLSS only to DLSS + 4x FG, at max settings including path tracing. Not bad…

Tested at 4K:
No DLSS: ~28fps, ~65ms latency
DLSS Quality: ~66fps, ~33ms latency
DLSS frame gen 3x: ~168fps, ~46ms latency
DLSS frame gen 4x: ~209fps, ~48ms latency

Nvidia did work on frametime consistency with FG. No graphs provided though.


 
So I'm getting 100+ additional FPS at the cost of only 13ms of additional latency on top of the DLSS Quality performance.

That's... I mean, dude, I used to play on console, which was already way above this kind of latency running natively. 13ms is nothing; that's essentially free performance as far as I can tell.
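Quick back-of-the-envelope, just re-doing the arithmetic on the numbers from the benchmark post above (a rough illustrative sketch; I'm assuming the FG rows are on top of DLSS Quality, as described there):

```python
# FPS and reported latency at 4K max settings + PT, copied from the post above.
results = {
    "DLSS Quality":      (66, 33),   # (fps, latency in ms)
    "DLSS frame gen 3x": (168, 46),
    "DLSS frame gen 4x": (209, 48),
}

base_fps, base_lat = results["DLSS Quality"]
for mode, (fps, lat) in results.items():
    if mode == "DLSS Quality":
        continue
    # Extra frames and extra latency relative to plain DLSS Quality.
    print(f"{mode}: +{fps - base_fps} fps for +{lat - base_lat} ms")
# -> FG 3x: +102 fps for +13 ms; FG 4x: +143 fps for +15 ms
```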


That's an easy yes to toggling that on, holy crap.
 
Looks like I am buying a 5090... I think.

I just don't feel like spending that much, lol. I still feel like if I can get a used 4090 for half that price or a bit more, then it's worth it? But the AI thing... I am 100% sure the 5090 will have the hardware needed for whatever the 6090 offers in terms of AI, while the 4090 will struggle.

Sigh..
The AI model is coming to the 4090. If you can get one, go for it; the 5090 isn't amazing for anything other than AI workloads and raytracing. 6000 series is going to be good.
 
I don't understand the latency.
Why is there 65ms of latency at 28fps native? 30fps latency should be 33ms, and 60fps should be 16ms, so I'm a bit confused about the numbers here.
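For reference, here's the frametime math I'm working from (a rough sketch; I'm assuming the quoted figures are end-to-end PC latency, which covers more than one frame's worth of work):

```python
# Frametime alone (1000 / fps) next to the latency figures quoted in the thread.
# Assumption: those figures are end-to-end PC latency (input sampling +
# CPU/GPU pipeline + presentation), which spans more than one frame of work,
# so it will always come out larger than a single frametime.
for label, fps, measured_ms in [("No DLSS", 28, 65), ("DLSS Quality", 66, 33)]:
    frametime_ms = 1000 / fps
    print(f"{label}: frametime {frametime_ms:.1f} ms, "
          f"reported latency {measured_ms} ms "
          f"(~{measured_ms / frametime_ms:.1f}x one frametime)")
```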
 
Just don't use path tracing.
Path tracing gives you low fps for trivial lighting gains over ray tracing.
25fps with path tracing, or 90+ fps with ray tracing and no DLSS4.
IMO that extra shading you can't see 85% of the time isn't worth losing 50+ fps.
 
Just don't use path tracing.
Path tracing gives you low fps for trivial lighting gains over ray tracing.
25fps with path tracing, or 90+ fps with ray tracing and no DLSS4.
IMO that extra shading you can't see 85% of the time isn't worth losing 50+ fps.


Did you read the thread, watch the video, or just decide to post first?
 
Actually, the perf could have been deduced without even checking benchmarks.

Mr Huang J. has clearly stated: "The More You Pay...".

That approach has been very consistent: any major perf bump comes with a proportional $$$ bump, keeping perf/$ constant.

Half-tier upgrades, like the "Supers", came with slightly better perf/$.


And we can see that with the 5000 series:
5090 - full-tier buff, +$400 MSRP
5080/5070 Ti/5070 - half a tier or lower buff, but a bit better perf/$ than last gen

On top of it, Lord is giving us FG with more Fs.

What is there to b*tch about, b*tches?

Path tracing gives you low fps for trivial lighting gains over ray tracing.
Buggy Loop apparently wanted to open a thread highlighting the difference between RT cores (found even in oldies like the 2060) and PT cores: new, groundbreaking tech from recent developments.

But hater f*cks from that A-company fanbase frustrated him so much, he decided not to bother. :(
 
Path tracing is transformative in something like Indiana Jones and IMO worth the extreme hit to performance. I couldn't imagine playing without it, as I greatly appreciated the serious enhancement to the game's visuals. 60fps with PT in a single-player experience is my preference over 100+ fps without it, without a doubt.
 
Just don't use path tracing.
Path tracing gives you low fps for trivial lighting gains over ray tracing.
25fps with path tracing, or 90+ fps with ray tracing and no DLSS4.
IMO that extra shading you can't see 85% of the time isn't worth losing 50+ fps.
Everything depends on the settings. In Black Myth: Wukong full PT tanks performance, but medium PT still looks very good and runs comparably to Lumen (at least on my PC). I get 123fps with medium PT and 127 with Lumen. In my case it doesn't make sense to turn off PT.
 
Everything depends on the settings. In Black Myth: Wukong full PT tanks performance, but medium PT still looks very good and runs comparably to Lumen (at least on my PC). I get 123fps with medium PT and 127 with Lumen. In my case it doesn't make sense to turn off PT.
My thoughts exactly.
A lot of times "ultra" looks practically the same as high, and in some cases medium.
But not everyone thinks like that.
It's either ultra or nothing.
So in that sense, if ultra PT isn't an option for them, then just go with RT ultra.

As for me, if DLSS4 lets me get away with PT ultra and I get very satisfying frame rates, you can bet my sweet ass I'm going to take FULL advantage of it.
 
The AI model is coming to the 4090. If you can get one, go for it; the 5090 isn't amazing for anything other than AI workloads and raytracing. 6000 series is going to be good.

I think people will be disappointed when the 6000 series gets announced.
The way I see it, more and more die space will go towards ray/path tracing and AI hardware, and less and less towards raster.
With the tech now, Nvidia has no incentive to ever again push raster performance by 50% or anything crazy like that, because they don't care about native performance anymore, especially at 4K.
DLSS is there to lower the needed brute-force (raster) power, and then frame gen boosts the FPS even further, with Reflex lowering input latency.

I am fairly confident that the 6000 series will only be about a 20-25% increase as well, with more and more die space used for Tensor and CUDA cores to boost ray tracing and path tracing performance instead of raster.
 
People may hate on Nvidia for their pricing, but you can clearly see they are putting a lot of effort into features and advancing their systems.

I think if the results hold up across the board, these are some pretty great advancements. And if their hardware actually performs while they're dropping prices (all of this is still to be determined), then you have to give them a little bit of credit.

If it weren't for the pricing of the last generation, and obviously their high end being really expensive, I don't think people would be bitching as much.

This isn't like Intel sitting on their hands and AMD coming along and eating their lunch, deservedly so. Nvidia continues to innovate and advance their strengths, and I can't recommend anything else, especially if price isn't as much of an issue as wanting the best quality and features from your GPU. I can say all that and still give AMD credit for the strides they are making, but Nvidia is the best, and for many reasons.

Edit

I have some opinions regarding Cyberpunk and its DLSS 3 implementation. I liked it, but I did not like the smear I noticed on text specifically. Maybe it's the neon nature and the coloring, or the way the game handles things, but that's one of the main reasons I turned off frame generation and just went with standard DLSS 2. And that game is a showcase and runs incredibly on the hardware that I have, as it should. I have used frame generation in other games, and I believe in A Plague Tale: Requiem it ran great and I had no complaints. Of course you can zoom your camera in and somebody with microscopic analyzers will call it all out, but if I barely notice it, and I look at this stuff all the time, the regular consumer won't give a damn. And as long as latency is kept in check, and it seems like Nvidia is making amazing strides in this department, then whether people like it or not, we have some cool hardware advancements if you want to spend the money on the newest-generation stuff.

I'm actually going to echo what I said coming into the PS5 generation and say that this is one of the most interesting times in hardware and graphical advancements since the advent of 3D. I think there is some cool new technology, and both software and hardware are really pushing this kind of stuff. I think the AI boom has clearly had an impact, but right now I really am liking some of the things we are seeing.

On the back end, when it comes to developing games, I do hope developers continue to make strides in efficiency and in getting the most out of the hardware without using artificial intelligence, or use it in a smart way to offload rendering or do jobs that are traditionally handled in certain ways, changing them to reduce overhead. So if we can get AI to work not on producing fake frames but on making something more efficient, versus just allowing the hardware to handle it, I think we may have something.

Even if it requires a separate chip or an additional thing on the motherboard to do this, I'm all about spreading the load and not straining the main processors as much. It's kind of like having a dedicated graphics card to handle all the graphics versus onboard: you're not straining the CPU and everything on the SoC, and you can offload intensive tasks to a dedicated card. I think we are in for an interesting time if developers and hardware manufacturers can figure out how to implement some of this and take the strain off, so that games actually feel more optimized without requiring AI.

But as you can tell from most of my post, I fully embrace this and I don't criticize it. I just want developers and creators to figure out ways to optimize it more, so that we can run Cyberpunk natively, or a path-traced game similar to Black Myth, at a minimum of 60 frames native, shooting up to 300 or 400 frames with all the AI bells and whistles. Maybe one day we could be there.
 
This is what I wanted to see. This kicks the living balls off Lossless Scaling. Granted, Lossless Scaling is only 10 dollars.
But Lossless Scaling can be updated.

Luckily, latency is harder to notice with a controller, so it gets masked a bit on console. I assume Reflex can mask a bit of this as well.
Unluckily, it is not harder to notice; consoles support M+KB in some games, and Reflex doesn't mask anything, it literally reduces latency.
 
I think people will be disappointed when the 6000 series gets announced.
The way I see it, more and more die space will go towards ray/path tracing and AI hardware, and less and less towards raster.
With the tech now, Nvidia has no incentive to ever again push raster performance by 50% or anything crazy like that, because they don't care about native performance anymore, especially at 4K.
DLSS is there to lower the needed brute-force (raster) power, and then frame gen boosts the FPS even further, with Reflex lowering input latency.

I am fairly confident that the 6000 series will only be about a 20-25% increase as well, with more and more die space used for Tensor and CUDA cores to boost ray tracing and path tracing performance instead of raster.
I don't disagree. I can see raster stagnating while AI and RT perf keep making large leaps. I wonder if there will be ramifications for old games.
 
But that is amazing performance.
Without looking at the hardware and power draw numbers, yes.
If you figure those in, there really isn't much progress at the architectural level over Ada. This gen is seriously just about brute-forcing it for the 90 and faking it on anything below with MFG. The classic scaling and node-shrink gains per generation we're used to are completely absent... Moore's law is currently dead.
 
But that is amazing performance.
Not really. It's basically just a 4090 Ti. They increased CUDA cores by 30% and power draw by 30% and got a 30% boost. Nothing special about that. A new architecture is supposed to bring performance gains on its own, while actually trying to keep power draw the same or even reduce it.
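To put that in numbers, a rough sketch using the same illustrative percentages (not measured data):

```python
# The argument in ratio form: +30% cores, +30% power, +30% performance.
cores_scale = 1.30
power_scale = 1.30
perf_scale = 1.30

# If performance only scales with cores and power, efficiency doesn't move.
print(f"perf per watt vs 4090: {perf_scale / power_scale:.2f}x")  # ~1.00x
print(f"perf per core vs 4090: {perf_scale / cores_scale:.2f}x")  # ~1.00x
```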
 
Ngl, a little bit of each.
I still maintain that path tracing isn't all that worth it if you're not going to use DLSS4.

Couldn't disagree more. Path tracing is an absolute game changer and completely transforms the look of the game. The most recent example is Indiana Jones, where it's literally a generational leap.
 
Just don't use path tracing.
Path tracing gives you low fps for trivial lighting gains over ray tracing.
25fps with path tracing, or 90+ fps with ray tracing and no DLSS4.
IMO that extra shading you can't see 85% of the time isn't worth losing 50+ fps.
Nah, at 4K path tracing is definitely worth running; even at DLSS Performance it will provide a better presentation than 4K native with no path tracing.
 