Doom: Dark Ages… oofff 5000 Series

MikeM

Member
0ocZHEP.png


The 9070 XT beating out the RTX 5080 in an RT-mandatory game was not on my bingo card.

Season 1 Nbc GIF by The Good Place
 
Not much info to go on here, but id's latest engine is one weird beast. Indiana Jones ran terribly maxed out on a 4080S I used to have, like we're talking 10 fps. But that was VRAM related, and the 9070 XT has 16GB too.
Were there any theories from Hardware Unboxed in the video?
 
With PT it will be quite different. What is funny to me is that Ada beats Blackwell at 1080p and 1440p (basically a match at 4K).
 
Not much info to go on here, but id's latest engine is one weird beast. Indiana Jones ran terribly maxed out on a 4080S I used to have, like we're talking 10 fps. But that was VRAM related, and the 9070 XT has 16GB too.
Were there any theories from Hardware Unboxed in the video?

That's because you used the wrong settings. You should have set your texture pool to Medium, which results in barely any visual difference but would have given you 100+ fps on near-max settings...

I got 60+ fps on a 3060 Ti at 1440p on pretty high settings... with DLSS Quality mode (which looks better than native in id Tech games) I got 80 fps.
 
Not much info to go on here, but id's latest engine is one weird beast. Indiana Jones ran terribly maxed out on a 4080S I used to have, like we're talking 10 fps. But that was VRAM related, and the 9070 XT has 16GB too.
Were there any theories from Hardware Unboxed in the video?

These results are consistent across all PC-focused outlets.

performance-upscaling-2560-1440.png
 
No source posted, so I can't tell if this is native or with FSR enabled. Seems like a fanboy win at the moment. Nvidia also doesn't have a driver yet.
 
Nvidia did a great job fooling everyone with the initial RT games, which likely used Nvidia APIs and drivers.

I remember this started changing around the time Far Cry 6 came out and RT performance was roughly identical between Nvidia and AMD cards; we all chalked it up as a one-off.

But then the City Sample came out a few months later, and hardware Lumen ray tracing basically ran the same on both Nvidia and AMD GPUs despite AMD's inferior RT cores. It's since gotten a small ~7% boost, giving Nvidia cards a slight edge, but nothing like Cyberpunk.

This has held true for all UE5 games with hardware Lumen, and most recently AC Shadows. Even Avatar's performance wasn't as bad as Cyberpunk or Control and the other original RTX showcases. IIRC, the 6800 XT was within 5-10% of the 3080, whereas in Cyberpunk they aren't even fucking close.

I bet GTA will be the same since it's being developed on the base PS5 and XSX.
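For anyone wondering how gaps like "within 5-10%" are usually reckoned, it's just relative average fps. A minimal sketch, using made-up placeholder numbers rather than real benchmark results:

```python
# Minimal sketch: how a "within 5-10%" style gap is computed from average fps.
# The numbers below are placeholders for illustration, not real benchmark data.

def percent_gap(card_a_fps: float, card_b_fps: float) -> float:
    """How far card A trails (or leads) card B, as a percentage of B's fps."""
    return (card_a_fps - card_b_fps) / card_b_fps * 100.0

# Hypothetical averages: if the 6800 XT did 57 fps and the 3080 did 60 fps,
# the 6800 XT would be trailing by 5%.
print(f"{percent_gap(57, 60):+.1f}%")   # -5.0%

# Versus a Cyberpunk-style split, e.g. 30 fps vs 60 fps, which is -50%.
print(f"{percent_gap(30, 60):+.1f}%")   # -50.0%
```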
 
What's your point? :

performance-2560-1440.png
Still no updated driver from Nvidia.

Please note that the NVIDIA driver (576.40) that we used is newer than the first press driver for the game (576.31). The new build includes performance improvements that will be included in an upcoming public driver. We measured an impressive 5% performance improvement across a wide range of cards and generations.
 
I would like to see it running on my PC. I don't believe these tests. You can probably change some small settings and get much better results.
 
So now we need to wait for drivers to cover for new Nvidia cards?

The Office Someone GIF
We shall see. I'm not a fanboy and will go where the performance takes me. If it turns out the 5080 blows dick over time, then I won't buy it. Simple. I can roll with my 3070 Ti for a while or go AMD. It's just unlikely that Nvidia would perform this badly, but there's always a first time.
 
We shall see. I'm not a fanboy and will go where the performance takes me. If it turns out the 5080 blows dick over time, then I won't buy it. Simple. I can roll with my 3070 Ti for a while or go AMD. It's just unlikely that Nvidia would perform this badly, but there's always a first time.

Dude, the 5000 series being a shitshow is nothing new.
 
That's because you used the wrong settings. You should have set your texture pool to Medium, which results in barely any visual difference but would have given you 100+ fps on near-max settings...

I got 60+ fps on a 3060 Ti at 1440p on pretty high settings... with DLSS Quality mode (which looks better than native in id Tech games) I got 80 fps.
I know the tweaks; I played through it long ago. But the engine behaved badly when VRAM usage peaked above 16GB.
 
I know the tweaks; I played through it long ago. But the engine behaved badly when VRAM usage peaked above 16GB.

Well... this doesn't happen if you use the correct settings 🤷 The texture pool setting is specifically there to adjust this.
You'll never really need to set it above Medium, as the differences there are already super minimal compared to the max texture setting.
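Roughly speaking, that pool setting reserves a fixed slice of VRAM for streamed textures, so cranking it past what the card can spare is what causes the cliff. A minimal sketch of the budgeting involved; the pool sizes and the "other allocations" figure below are made-up assumptions, not id Tech's actual values:

```python
# Minimal sketch of why a fixed texture-pool size matters on a 16GB card.
# Pool sizes and the non-texture overhead are hypothetical placeholders.

VRAM_GB = 16.0
OTHER_ALLOCATIONS_GB = 7.5   # assumed: framebuffers, geometry, RT structures, OS overhead

# Hypothetical pool sizes per texture-pool setting
texture_pool_gb = {"low": 2.0, "medium": 4.0, "high": 6.0, "ultra": 10.0}

for setting, pool in texture_pool_gb.items():
    total = pool + OTHER_ALLOCATIONS_GB
    status = "fits" if total <= VRAM_GB else "spills (expect a huge fps drop)"
    print(f"{setting:>6}: {total:4.1f} GB used -> {status}")
```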
 
Nvidia did a great job fooling everyone with the initial RT games, which likely used Nvidia APIs and drivers.

I remember this started changing around the time Far Cry 6 came out and RT performance was roughly identical between Nvidia and AMD cards; we all chalked it up as a one-off.

But then the City Sample came out a few months later, and hardware Lumen ray tracing basically ran the same on both Nvidia and AMD GPUs despite AMD's inferior RT cores. It's since gotten a small ~7% boost, giving Nvidia cards a slight edge, but nothing like Cyberpunk.

This has held true for all UE5 games with hardware Lumen, and most recently AC Shadows. Even Avatar's performance wasn't as bad as Cyberpunk or Control and the other original RTX showcases. IIRC, the 6800 XT was within 5-10% of the 3080, whereas in Cyberpunk they aren't even fucking close.

I bet GTA will be the same since it's being developed on the base PS5 and XSX.
It's because the RT implementations in these games are designed to run well on console hardware, for example by reducing the roughness of traced reflections, or changing the "roughness cutoff" that decides which surfaces get RT reflections at all.
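As a rough illustration of what such a cutoff does (not actual shader code from any of these games; the threshold value and fallback path are assumptions):

```python
# Illustrative sketch of a roughness cutoff for RT reflections.
# Not real game/shader code; the threshold and fallback are assumptions.

ROUGHNESS_CUTOFF = 0.4  # hypothetical: a console-tuned profile might use a lower cutoff

def reflection_technique(surface_roughness: float) -> str:
    """Pick a reflection technique for a surface based on its roughness."""
    if surface_roughness <= ROUGHNESS_CUTOFF:
        # Smooth, mirror-like surfaces: trace a reflection ray (expensive).
        return "ray-traced reflection"
    # Rough surfaces: reflections are blurry anyway, so a cheaper fallback
    # (screen-space reflections or cubemaps) is visually close enough.
    return "SSR / cubemap fallback"

for r in (0.05, 0.3, 0.6, 0.9):
    print(f"roughness {r:.2f}: {reflection_technique(r)}")
```

Lowering that cutoff means fewer surfaces pay the ray-tracing cost, which is one way a game can keep RT affordable on console-class GPUs.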
 
These results are consistent across all PC-focused outlets.

performance-upscaling-2560-1440.png
1440p wins again, I'll max out the screen lol
Very nice performance here. And interesting to see AMD cards perform so well, or maybe Nvidia cards perform so badly, idk what's more correct.
 
They usually run a 9800X3D. See the screenshots below for the visual differences between settings:

RaakIll.png


MlCidTa.png



7ITfPQN.png

OK... these look almost identical lol.
I wonder what actually changes between these presets. But then again, presets are a terrible way to benchmark games, as presets don't always do what you'd expect.

You'd expect the "medium" preset in a game to set everything to medium, but that's not always the case; some games mix settings in their presets.
Which makes me wonder if they made sure that everything was on the lowest possible setting for their low benchmark.
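One way to check that sort of thing is to export the game's settings with each preset applied and diff the two dumps. A generic sketch, assuming the settings end up as plain-text "key = value" lines; the actual config format and file locations for this game may differ:

```python
# Generic sketch: diff two exported settings files to see what a preset actually changes.
# Assumes simple "key = value" lines; real config formats vary by game.
import sys

def load_settings(path: str) -> dict[str, str]:
    settings = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith(("//", "#")) or "=" not in line:
                continue
            key, _, value = line.partition("=")
            settings[key.strip()] = value.strip()
    return settings

if __name__ == "__main__":
    a, b = load_settings(sys.argv[1]), load_settings(sys.argv[2])
    for key in sorted(a.keys() | b.keys()):
        if a.get(key) != b.get(key):
            print(f"{key}: {a.get(key, '<missing>')} -> {b.get(key, '<missing>')}")
```

Run it against, say, a medium-preset dump and a low-preset dump (hypothetical filenames), and anything that isn't actually at its lowest value shows up immediately.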
 
Well... this doesn't happen if you use the correct settings 🤷 The texture pool setting is specifically there to adjust this.
You'll never really need to set it above Medium, as the differences there are already super minimal compared to the max texture setting.
I know. I only commented on it because if you just start the game and lazily flick everything to max, like these benchmark people usually do, you'll hit a brutal wall. But DOOM TDA seems to be a different scenario: hitting the VRAM ceiling in Indy caused a drop down into the 10s, while here they're in the mid-40s at least.
 
The lack of preset scaling is irritating. It's the same thing with Indiana Jones, where they locked PC out of the console settings. This game runs on a Series S; why isn't there a preset close to that for low-end hardware?
 
Is this using the Nvidia beta driver that was meant to be used with the game? Mortismal said in his review that the driver fixed some of the issues. Would make sense that the 9070 XT outperforms the 5080...
 