Warhammer 40,000: Darktide recommended specs (features RTXGI)

hlm666

Member
Here are the system requirements and a link to the article discussing various things about the game.

[Image: official system requirements chart]


"Partnering up with NVIDIA we opted to support ray tracing in our renderer and ended up implementing both RTXGI and raytraced reflections to boot. This also lays the groundwork needed for us to continue experimenting with additional ray tracing features down the line which carries the promise of further improving things like shadows, transparency rendering and VFX (visual effects such as particles). We also decided to support other RTX features like DLSS and Reflex to further improve frame times and response times of the game."

"Another added benefit of the RTXGI implementation we ended up going with is actually that we decided to replace our baked ambient light solution with baked RTXGI probe grids. This allows us to use RTX cards on our development machines to quickly bake GI that can be applied to our scenes even for gpus that do not have enough power to push advanced ray tracing features like this. You won't get the added benefit of the GI being fully dynamic that you get if you have a powerful gpu in your machine of course but the static GI still retains the nice dark feeling in our scenes that would otherwise be very flat and boring."

 
Why are they only targeting medium settings, 1080p and 60fps for "recommended" specs?

Those are some high ass specs. I didn't think the game looked that good?
 
The beta had performance that absolutely didn't fit the visuals... looks like this won't change in the final game.
 
I'm all over this... it'll probably take a year of patching like Vermintide 2 did to be truly amazing, but I'm still in. My 4090 is ready.
 
This game looks awesome, and it looks like running it on medium will be no issue for me, but it sucks that it's a co-op focused game. It doesn't look like it has a solo mode.

 
The beta had performance that absolutely didn't fit the visuals... looks like this won't change in the final game.
RTGI on high applied to all dynamic lights, whereas low only covered static lights, so high made some of the fights in darker areas look pretty good. However, the performance was terrible; I was getting around 30 fps in the worst case with DLSS on, rendering at 1080p. They say in those requirements that they expect 60-70 fps on a 3080, which is what I was using, so for that to be true there have to be some big performance improvements. We'll see how that plays out at release.

Edit: to clarify my DLSS settings, I was on a 1440p panel with DLSS on Quality. I didn't mean I was running at 1080p with DLSS on.
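For reference on what DLSS actually renders internally, here's a back-of-the-envelope sketch only, using the commonly cited DLSS 2 per-axis scale factors (roughly 0.67 for Quality, 0.58 for Balanced, 0.5 for Performance):

// Back-of-the-envelope DLSS internal render resolution (illustrative only;
// the scale factors are the commonly cited DLSS 2 per-axis presets).
#include <cstdio>

int main() {
    struct Mode { const char* name; float scale; };
    const Mode modes[] = {
        {"Quality",     0.667f},
        {"Balanced",    0.580f},
        {"Performance", 0.500f},
    };
    const int outputs[][2] = {{1920, 1080}, {2560, 1440}, {3840, 2160}};

    // Print the approximate internal resolution for each output/mode pair.
    for (const auto& out : outputs) {
        for (const auto& m : modes) {
            std::printf("%dx%d %-11s -> ~%dx%d internal\n",
                        out[0], out[1], m.name,
                        int(out[0] * m.scale), int(out[1] * m.scale));
        }
    }
    return 0;
}

So 1440p output with DLSS Quality works out to roughly 1707x960 internally, and "DLSS 1080p" on Quality is roughly 1280x720 before upscaling.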
 
Next gen bbe! Lol

If it warrants it, it's good we are finally getting some next-gen games.

Have you actually seen this game?

Having poor performance and then passing it off as "it's next gen" doesn't make it next gen.

Let's call a spade a spade, it looks like it runs like crap.
 
it looks like it runs like crap.
Well, I was in the closed beta and had no issues with 1440p + DLSS Quality @ High Settings with an RTX 2080 + Ryzen 5800x.
The fps were mostly in the 50s and 60s with the classic Fatshark issue, "horde drops", so VRR is a must.
It's far from polished, that much is clear given the visual-to-performance ratio, but overall it plays well if you have mid-tier hardware or above.
Forget about RT though. Worst perf hit of any RT game I've played so far, with basically invisible changes due to the fuckton of action on your screen most of the time.
 
During the beta the game ran like shit on my 5800X and RTX 3070; it couldn't even hit 60fps during gameplay, and it doesn't exactly have amazing graphics.

Fuck this game. I doubt it will be fixed or optimised for launch.
 


For anyone wondering what the RTX looks like.... well yeah... probably one of the least impressive implementations.

I would point out to everyone saying "performance last beta sucked" that Nvidia literally released drivers yesterday with Darktide as one of the specific games targeted by that release...

The performance is still going to suck though...
 
What's DLSS 1080p? Like 720p native or some shit? And this is medium settings.

Game doesn't even look that good.

Have you actually seen this game?

Having poor performance and then passing it off as "it's next gen" doesn't make it next gen.

Let's call a spade a spade, it looks like it runs like crap.

I was having a bit of a joke at Fatshark, as the game will probably look and run great in like 1 to 2 years.
 
I played the last beta at a solid 60+ fps at 1080p, mostly medium settings (had to fiddle with some, like TAA), on my 5600X and 6700 XT. Game was super fun, and that's the most important part.
 
Those specs are ouch.
I suspect there is some incentive for studios to make games choke with RTX, and perhaps Nvidia encourages this to justify pushing their high end and make their cards a better value proposition. That's just me with my conspiracy hat on.
In most games I can't tell if RTX is on or not; the most prominent effect is to halve your framerate. I think until we have more RTX-only games, which may be a console generation away, I will switch to AMD due to better price-to-performance.
 
As a guy who currently games on Nvidia cards, when the fuck did a 6800 XT become equal to a 3060?

1080p medium without ray tracing on a 3060 with DLSS on... what a joke.

A Plague Tale: Requiem hits 60fps at 1440p on an RTX 2060 at console settings with Quality DLSS, so wtf is this shit?
 
The 4080 just came out and games already list it as the recommended spec for high?
 
As a guy who currently games on Nvidia cards, when the fuck did a 6800 XT become equal to a 3060?

1080p medium without ray tracing on a 3060 with DLSS on... what a joke.

A Plague Tale: Requiem hits 60fps at 1440p on an RTX 2060 at console settings with Quality DLSS, so wtf is this shit?

Nvidia-sponsored game. Preferential driver treatment. Happens all the time with sponsored games, in both directions.
 
Optimisation is pretty poor.

I love L4D and Deep Rock Galactic and played a lot of Vermintide 2, but this game felt very disappointing on the first run. The diversity is just not there: diversity in classes, in environments, in enemies. It did not click with me the way Vermintide did, where it felt instantly gratifying.
 
When Nvidia sponsors a game, you can be sure optimization will go out the window, both for GPUs from the competition and for GPUs of older Nvidia generations.
Nvidia always has to push some setting so as to force people to upgrade, be it tessellation, physics, HairWorks, ray tracing, etc.
 
I've been playing the pre-order beta for a while now. Glad to say that my initial performance issues are gone.
1440p with everything cranked up (no RT) with DLSS Quality, and I'm basically never dropping below 60fps anymore; the "horde drops" are gone for me, too.
Still rather heavy for how it looks, but much better than the first beta.
 


For anyone wondering what the RTX looks like.... well yeah... probably one of the least impressive implementations.



The funny part is that the only things most people are actually able to discern here are the RT reflections and RT ambient occlusion, but the game probably does screen-space reflections and screen-space ambient occlusion, which they disabled for the comparison.
It's been 4 years since the release of the RTX 20 series, and IMO they still haven't shown a compelling enough case for real-time ray tracing over screen-space techniques in any game other than Lego and Minecraft.
The emperor still has no clothes.
 
The funny part is that the only things most people are actually able to discern here are the RT reflections and RT ambient occlusion, but the game probably does screen-space reflections and screen-space ambient occlusion, which they disabled for the comparison.
It's been 4 years since the release of the RTX 20 series, and IMO they still haven't shown a compelling enough case for real-time ray tracing over screen-space techniques in any game other than Lego and Minecraft.
The emperor still has no clothes.

Screen-space reflections are disgusting, and replacing those dogshit graphical glitches alone makes RT worth it.
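For what it's worth, the reason screen-space reflections glitch in the first place is baked into how they work: the reflected ray is marched against the depth buffer, so anything off-screen or hidden behind other geometry simply isn't there to reflect. A generic textbook-style sketch (nothing Darktide-specific, all names made up):

// Simplified screen-space reflection march (generic sketch, not engine code).
// The key limitation: if the reflected ray leaves the screen or the hit point
// was never rendered, there is no data to reflect, which is what causes the
// smearing and disappearing-reflection artifacts.
#include <optional>
#include <vector>

struct Vec2 { float x, y; };

struct GBuffer {
    int width, height;
    std::vector<float> depth;  // linear depth per pixel
    float DepthAt(int x, int y) const { return depth[y * width + x]; }
};

std::optional<Vec2> TraceSSR(const GBuffer& gb,
                             Vec2 originPx,    // reflection origin in pixels
                             Vec2 stepPx,      // per-step offset of the reflected ray in screen space
                             float rayDepth,   // depth of the ray at the origin
                             float depthStep,  // how much the ray's depth grows per step
                             int maxSteps) {
    Vec2 p = originPx;
    for (int i = 0; i < maxSteps; ++i) {
        p.x += stepPx.x;
        p.y += stepPx.y;
        rayDepth += depthStep;

        // Off-screen: the information we want to reflect was never rendered.
        if (p.x < 0 || p.y < 0 || p.x >= gb.width || p.y >= gb.height)
            return std::nullopt;

        // The ray has passed behind the geometry stored in the depth buffer:
        // treat it as a hit and reuse that pixel's colour as the reflection.
        if (gb.DepthAt(int(p.x), int(p.y)) < rayDepth)
            return p;
    }
    return std::nullopt;  // no hit: fall back to a cube map or show nothing
}

Ray-traced reflections query the actual scene geometry instead, so they don't fall apart when the reflected object is off-screen or occluded.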
 
Game looks great but it is utterly unplayable for me. I have a 3080 and an i9-12900K, and for some reason it runs like dog shit no matter what settings I use. Is there a reason for this? Is there something on my rig that is causing such poor performance?
 
Game looks great but it is utterly unplayable for me. I have a 3080 and an i9-12900K, and for some reason it runs like dog shit no matter what settings I use. Is there a reason for this? Is there something on my rig that is causing such poor performance?
I have your exact same setup and the game has run perfectly fine for me. Everything high, DLSS on, RTX off. That's odd.
 
Game looks great but it is utterly unplayable for me. I have a 3080 and an i9-12900K, and for some reason it runs like dog shit no matter what settings I use. Is there a reason for this? Is there something on my rig that is causing such poor performance?

It's an Nvidia-sponsored game.

 
Game looks great but it is utterly unplayable for me. I have a 3080 and an i9-12900K, and for some reason it runs like dog shit no matter what settings I use. Is there a reason for this? Is there something on my rig that is causing such poor performance?
Maybe you need to update some drivers.
 
The last patch improved the RT performance substantially. I've been playing at 1440p, DLSS Quality, with RTGI on high and RT reflections on low, and staying above 60fps 95% of the time, normally between 70-80 fps. I've had one drop into the low 50s when I seemed to get CPU bound in one level during a pretty low-intensity part, so I'm not sure what was going on there.

Edit: using a 3080, that's probably handy to know.
 
My drivers are up to date as far as I know. I will look again when I get home. I wonder if it is because I am playing the Game Pass version? lol
Whoa, didn't realise it was on Game Pass. Installing it now to check it out on my new 4090 and see if I can power through it.
 
What the hell! My tower is up to date and this runs like dog shit, but my laptop with an i7-10870H and a 3080 runs it like a dream. FML 🤦🏾‍♂️
 