Not much info to go on here but id's latest engine is one weird beast. Indiana Jones ran terribly maxed out on a 4080S I used to have, like we're talking 10fps. But that was VRAM related, and the 9070 XT has 16GB too.
Were there any theories from Hardware Unboxed in the video?
> These results are consistent across all PC focused outlets.

Not native.
> 4090 chuds rise up

The 4090 will go down as one of the best purchases I've ever made. What a legendary card.
> What's your point? :
>
> > Please note that the NVIDIA driver (576.40) that we used is newer than the first press driver for the game (576.31). The new build includes performance improvements that will be included in an upcoming public driver. We measured an impressive 5% performance improvement across a wide range of cards and generations.

still not updated drivers from nVidia.
Still not most recent nvidia driver. They
> So now we need to wait for drivers to cover for new Nvidia cards?

We shall see. I'm not a fanboy and will go where the performance takes me. If it turns out the 5080 blows dick over time, then I won't buy it. Simple. I can roll with my 3070TI for a while or go AMD. It's just unlikely that nVidia would perform this badly, but there's always time for a first.
> So now we need to wait for drivers to cover for new Nvidia cards?

It's the hidden benefit to having shit drivers, you get to use them as an excuse.
> 4090 chuds rise up

So good, expensive but probably my best purchase after the 1080ti.
> that's because you used the wrong settings. you should have set your texture pool to medium, which will result in barely any visual differences but would have given you 100+ fps on near max settings...
>
> I got 60+ fps on a 3060ti at 1440p on pretty high settings... with DLSS quality mode (which looks better than native in id Tech games) I got 80fps

I know the tweaks, played through it long ago, but the engine behaved badly when VRAM peaked above 16GB.
> i would like to see it running on my PC. I don't believe in these tests. You probably change some small settings and will get much better results

Doesn't seem that way with this game:
> Doesn't seem that way with this game:
>
> [image]

lol wtf is this
> lol wtf is this

Apparently the game doesn't have many visual differences between all the settings. Low is in no way the potato mode that we have come to expect.
> which CPU are they using? this is clearly CPU limited if the framerate stays this similar

They usually run a 9800X3D. See below screenshots of the setting visual differences:

[settings comparison screenshots]
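For what it's worth, the "CPU limited" read is easy to sanity-check with a toy model: a frame can't finish faster than the slower of the CPU and the GPU, so if very different GPUs land on nearly the same fps, you're looking at the CPU (or engine) cap. The sketch below just illustrates that clamp; the names and numbers are made up, not taken from the benchmark.

```cpp
// Toy bottleneck model (illustrative numbers only): effective fps is capped by
// whichever of the CPU or GPU takes longer per frame.
#include <algorithm>
#include <cstdio>

int main() {
    const double cpuFps = 150.0; // hypothetical CPU/engine limit at these settings

    struct Gpu { const char* name; double gpuFps; };
    const Gpu gpus[] = {          // hypothetical GPU-only throughput
        {"mid-range card", 160.0},
        {"high-end card",  240.0},
        {"flagship card",  320.0},
    };

    for (const Gpu& g : gpus) {
        // The frame is only done when both the CPU and GPU work for it is done,
        // so the effective rate is the minimum of the two.
        double effective = std::min(cpuFps, g.gpuFps);
        std::printf("%-14s -> %.0f fps (GPU alone could do %.0f)\n",
                    g.name, effective, g.gpuFps);
    }
    return 0;
}
```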
> Nvidia did a great job fooling everyone with the initial RT games that likely used Nvidia APIs and drivers.
>
> I remember this started changing around the time Far Cry 6 came out and the RT performance was roughly identical between Nvidia and AMD cards, and we all chalked it up as a one-off.
>
> But then the City Sample came out a few months later, and hardware Lumen ray tracing basically ran the same on both Nvidia and AMD GPUs despite AMD's inferior RT cores. It's since gotten a small 7% boost, giving Nvidia cards a small edge, but nothing like Cyberpunk.
>
> This has held true for all UE5 games with hardware Lumen, and most recently AC Shadows. Even Avatar's performance wasn't as bad as Cyberpunk, Control, or the other original RTX showcases. IIRC, the 6800 XT was within 5-10% of the 3080, whereas in Cyberpunk they aren't even fucking close.
>
> I bet GTA will be the same since it's being developed on the base PS5 and XSX.

It's because the RT implementations in these games are designed to run well on console hardware, for example by reducing the roughness of reflections, or changing the "roughness cutoff" where RT reflections come into play.
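Since the "roughness cutoff" is doing the heavy lifting in that argument, here is a rough sketch of what it means in practice: reflections are only ray traced for sufficiently smooth materials, and everything rougher falls back to cheaper screen-space or cubemap reflections. All names and threshold values below are hypothetical; the point is only that a console-tuned preset can pick a lower cutoff so fewer pixels pay the ray tracing cost.

```cpp
// Illustrative sketch of a roughness cutoff for RT reflections.
// Names and the example thresholds are hypothetical, not from any real engine.
enum class ReflectionPath { RayTraced, ScreenSpaceOrCubemap };

// Smooth (mirror-like) surfaces get traced reflections; rough surfaces, whose
// reflections are blurry anyway, use a cheaper approximation.
ReflectionPath chooseReflectionPath(float surfaceRoughness, float roughnessCutoff) {
    return surfaceRoughness <= roughnessCutoff ? ReflectionPath::RayTraced
                                               : ReflectionPath::ScreenSpaceOrCubemap;
}

int main() {
    const float consoleCutoff = 0.25f; // hypothetical console-tuned preset
    const float ultraCutoff   = 0.60f; // hypothetical PC "ultra" preset

    float glossyFloor = 0.4f;
    // Same material: traced on the higher cutoff, cheap fallback on the lower one.
    (void)chooseReflectionPath(glossyFloor, consoleCutoff); // -> ScreenSpaceOrCubemap
    (void)chooseReflectionPath(glossyFloor, ultraCutoff);   // -> RayTraced
    return 0;
}
```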
> These results are consistent across all PC focused outlets.

1440p wins again, I'll max out the screen lol
> Dude, the 5000 series being a shitshow is nothing new.

Maybe, but the shit show is only exacerbated by fanboys looking for at least 1 game to prove them right.
> Maybe, but the shit show is only exacerbated by fanboys looking for at least 1 game to prove them right.

It shouldn't happen at all.
> well... this doesn't happen if you use the correct settings. the texture pool settings are specifically there to adjust this. you'll never even really need to set it above medium as the differences there are already super minimal compared to the max texture settings

I know I only commented on it because if you just start the game and lazily flick everything to max like these bench people usually do, you'll hit a brutal wall. But DOOM TDA seems to be a different scenario. Hitting the VRAM ceiling on Indy caused a drop down to the 10s; here they're in the mid 40s at least.
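Since the texture pool keeps coming up: the reason that one setting matters so much is that it is a fixed VRAM budget, and the failure mode when total usage exceeds the card's VRAM is a cliff (assets streaming over PCIe), not a gentle slope. The sketch below just adds up invented numbers to show the idea; the pool sizes and overhead figures are not the game's real values.

```cpp
// Back-of-the-envelope VRAM budget check with invented numbers, to show why a
// fixed texture pool either fits comfortably or falls off a performance cliff.
#include <cstdio>

int main() {
    const double vramGiB = 16.0;      // e.g. a 16GB card like the ones discussed
    const double otherUsageGiB = 5.0; // hypothetical: render targets, geometry, etc.

    struct PoolPreset { const char* name; double sizeGiB; };
    const PoolPreset presets[] = {    // hypothetical pool sizes per preset
        {"Low",    2.0},
        {"Medium", 4.0},
        {"High",   8.0},
        {"Ultra", 12.0},
    };

    for (const PoolPreset& p : presets) {
        double total = p.sizeGiB + otherUsageGiB;
        std::printf("%-6s pool %5.1f GiB -> total %5.1f GiB : %s\n",
                    p.name, p.sizeGiB, total,
                    total <= vramGiB
                        ? "fits in VRAM"
                        : "over budget -> textures spill over PCIe, frame rate tanks");
    }
    return 0;
}
```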