
Doom: The Dark Ages will require a GPU with ray tracing support to be playable on PC

yamaci17

Member
Core requirements are probably bullshit (for the most part). I doubt the 5600/12400 CPUs will have any problems vs. the 3700X (which is a slower CPU but has more cores).
Yeah mate, I hope you're right. I finally ditched my 2700X and went for a cheap 5600 a couple of months ago, so I'd be furious if this game somehow runs faster on a 2700X or 3700X or something :messenger_tears_of_joy:
 

Crayon

Member
Any RT requirements for the foreseeable future are going to be very lightweight. Games will have to run on PS5 for like 5 or 6 more years. It's kind of interesting to see how far they can stretch the little RT capability that's there. It's just soooo little, and that's a downer.

The real story here is that VRAM. Releasing an 8GB card should be illegal at this point. We're already at the point where 12 is just getting you by, as if it had been available at a good value for years. But there were relatively few 12GB cards to bridge the gap.

And now the reality lands that 16GB is not some lavish, future-proofing spec. It's enough to make some comfortable wiggle room for those who don't necessarily need the best of the best all of the time.

Going back to PS5 as the baseline for development, it might be able to anchor requirements at more like 12GB for a while. Even then, that's going to be pegging some GPUs that should bury the PS5 at min spec.
 
Doom '16 had a min spec of a GTX 670, a GPU that was 4 years old at the time (I don't know if a 1.2GB GTX 570 was enough).


Doom Eternal had a min spec of a GTX 1050 Ti/1060, a GPU that was 4 years old at the time (much older was not advisable due to VRAM).


Doom: The Dark Ages has a min spec of an RTX 2060S, a GPU that will be 6 years old at the time (8GB of VRAM is likely the actual reason it's not the base 2060).






So looking back, this is the oldest GPU Doom has supported as min spec in the new generation.
If I may be so bold as to hazard a guess: if you have anything 3060 Ti class, you are gonna be running the game in the 90fps range at "Supreme" settings.
If you are at min spec, then 60 should be no problem.
VRAM is gonna be our biggest enemy, I would guess.







P.S. What GPU are you using to run Doom Eternal at 4K120, and at what settings?
RTX 4080, maxed out (ultra nightmare settings I believe).
 
Last edited:

Shifty1897

Member
They're doing it because otherwise you have to build two lighting models for your game. It's less labor-intensive to just build for ray tracing. These moves aren't popular, but it's the kind of thing that will help reduce the cost of development.
 

//DEVIL//

Member
You mean any card from the last 6-7 years?

how dare they ??

I should be able to play my game on a toaster GPU if I wanted.

Fuck MS I knew them buying Doom was a bad idea.


...

Anyone who is below a 2000-series card deserves not to play the game, to be honest. Cheap bastards -_-
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
RTX 4080, maxed out (ultra nightmare settings I believe).

Your GPU came out 2 years and a full generation after the game (Eternal)... of course it should be blazing through it.
The GPU will be 3 years old by the time this game comes out... if the game were still pushing native 4K120, wouldn't you be disappointed that games seemingly haven't improved since you bought your GPU?
Worse still, wouldn't you be sad that a 4-year-old game runs the same as a brand new game?

You are almost certainly still going to be above 60, probably close to 90; with some optimized settings, 120 should be doable.
I'd probably just turn on DLSS to hit 120fps... or turn down some settings.

Don't you actually get excited when you can give your GPU a hard workout?


Note: Doom Eternal maxed out at 4K uses about 12GB of VRAM; I'm guessing this game will be around that level if not higher.
You might find yourself VRAM-limited before your GPU actually runs out of horsepower.


<---- Same GPU; can't lock native 4K120. I'm already resigned to the fact that the 4080 is old and is not going to be doing 4K120 in every game.
Tech gets better, games get heavier; why you would want it to be stagnant is beyond me.
 

Crayon

Member
A lot of this is going to depend on how good the game looks in minimum settings. It could end up looking better than many games on high. These guys know what they're doing.
 
It does not affect me personally, but I can't say I'm a fan of games going down this path that will lock some people out of playing them. I know RTX cards are about to be in their 4th generation, but a great pro of PC gaming is the customisation. Being able to lower settings on ageing systems helped with longevity and was more accessible to people. Even people with decent AMD cards are out of luck because of one setting being forced.

I know they said ray tracing helps with things like damage calculations, but I don't know how important that will be, especially in a game like Doom where you are firing your weapons nonstop all over the screen into hordes of enemies.

PC gaming is again falling into the trap of pricing a lot of people out (if it hasn't already on the hardware side). Prices of games and hardware are just going up and up, and with this trend of forced settings requiring certain GPUs, sales will probably slow down, and then these companies will cry about profits because people can't afford to play catch-up with the technology and pricing advancing so much. Game Pass cloud gaming isn't the only option I would want for gaming, at all.

That said, I really enjoyed the previous 2 Doom games, and this looks promising, so hopefully it's good.
 

SlimySnake

Flashless at the Golden Globes
That's right, keep on alienating gamers with these ridiculous specs, then go complain about 'no one wants to play PC games'!!!

Also, RT in doom? Get a life, dev.
What? The first RTX card launched in 2018. It was like $250 and is still 2x faster than the Series S. The first AMD RT card launched in 2020.

If you don't have an RTX card in 2025, you should not be calling yourself a PC gamer. Just play games on mobile.
 

Bojji

Member
It does not affect me personally, but I can't say I'm a fan of games going down this path that will lock some people out of playing it. […]

There were always locks like this with DX versions, shader support, etc. But these locks were far more frequent in the past: DX10 was ~2007, DX11 2009, DX12 2015 and DX12U 2020.

Hardware supporting DX12U dates to 2018. This game uses Vulkan, but it has the same requirement as DX12U in this case (RT).

Nvidia, AMD and Intel have only released DX12U-capable cards since 2020, and consoles with RT were released in 2020 as well. I think it's time to drop pre-DX12U support - after 5 long years...
 
Last edited:
Your GPU came out 2 years and a full generation after the game (Eternal)... of course it should be blazing through it. […]
In fairness, my RTX 2080 Ti (released 2 years prior to the game) was also able to run Doom Eternal at 4K120, with DLSS.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Thats right keep on alienating gamers with these ridiculous specs then go complain about ‘no one wants to play PC games 1!!!”

Also, RT in doom? Get a life, dev.

Bro, if you don't have a DXR-capable card in 2025... you deserve everything you get.
We've had DXR-capable cards since 2018... 2018, mate... that's 6 fucking years ago.


If you saved a dollar every day since DXR cards were a thing, you would be able to afford a 5090 today!

So not having an RTX 2060S-class card today is fully on you.
Get fucked and left behind.


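For what it's worth, the dollar-a-day math roughly checks out. A quick sanity check in Python (the specific launch dates and the ~$1,999 5090 MSRP are assumptions on my part):

```python
from datetime import date

# Assumed dates: RTX 20-series (first DXR-capable consumer cards) launched
# September 2018; Doom: The Dark Ages lands in May 2025.
dxr_launch = date(2018, 9, 20)
game_release = date(2025, 5, 15)

saved = (game_release - dxr_launch).days  # one dollar per day
print(saved)  # 2429 dollars saved

# Assumed RTX 5090 launch MSRP: $1,999
print(saved >= 1999)  # True: enough for a 5090
```

So even a fairly generous MSRP assumption leaves a few hundred dollars to spare.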
 

Mister Wolf

Member
Bro, if you don't have a DXR-capable card in 2025... you deserve everything you get. […]

They can always buy a used Series S.
 

Buggy Loop

Gold Member
Eh, it's fine.

Hardware for it has been out since 2018.

Programmable shaders were a lot more brutal: support was forced within the span of a year or so.

RT has had 7 years. Indiana Jones and Doom are the only ones forcing RT hardware that I can think of.
 

Zathalus

Member
Core requirements are probably bullshit (for the most part). I doubt the 5600/12400 CPUs will have any problems vs. the 3700X (which is a slower CPU but has more cores).
FPS might be fine, but it could cause micro-stutter, like what happened with 4-core CPUs. That is just speculation though; I guess we will see when the game comes out.
 

Bojji

Member
And how do you know this?

I guess it would have been in Indiana Jones? If they had it, they would most likely have used it.

FPS might be fine, but it could cause micro-stutter, like what happened with 4-core CPUs. That is just speculation though; I guess we will see when the game comes out.

While VRAM and feature requirements are very specific and there is (usually) truth to them, CPU requirements most of the time are just "insert random CPU from year 20xx here". But of course in this case it might be different.

6-core CPUs are still killing 90+% of games:

[benchmark chart image]


 
Last edited:

viveks86

Member
That's right, keep on alienating gamers with these ridiculous specs, then go complain about 'no one wants to play PC games'!!!

Also, RT in doom? Get a life, dev.
How in the world is this ridiculous? These seem like the most scalable specs I've seen in a while, if you start from the minimum requirements. Sounds to me like you are like me and just need to switch to console gaming, where the cycles are longer and the rapid evolution isn't frustrating to keep up with. The dev is doing just fine. We need to accept this as the nature of tech evolution instead of shackling devs with backwards compatibility beyond a certain point.
 
Last edited:

Spukc

always chasing the next thrill
As a soon-to-be 5090 owner...

This is fucking bullshit.
Made to push new GPUs. GTFO with all that bullshit.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
And how do you know this?

The game/engine has no fallback for sun/moon light.
So if you turn off the ray tracing, there will be no lighting.
Since most of the light in the game is from the sun/moon, with that off you'll only get light from artificial lights, if at all.


How do we know this? You can edit the .ini to turn ray tracing off.
As there is no fallback, the engine doesn't even know what you are talking about.
 

IntentionalPun

Ask me about my wife's perfect butthole
The game/engine has no fallback for sun/moon light. […]
Gotcha thanks.
 

Fbh

Member
I'm not a PC gamer, but this doesn't seem like such a big deal.
Wouldn't most GPUs released in the past 5 years or so still be compatible?

How far back are AAA PC games supposed to be backwards compatible? This reads as if console gamers were mad it's not going to launch on PS4/XB1.
 

marjo

Member
It's hard for me to empathize with those complaining about this. Ray Tracing cards have been around since 2018. You could probably pick up a used 2060 for about $100.
 

viveks86

Member
Well, to be fair to those complaining, I can't think of a time in the last 15 years when a technical leap in graphics was as disruptive as ray tracing. So we should expect a phase like this when people are unhappy that they have fallen on the wrong side of a hard cut-off. Is the ray tracing worth such a hard cut-off? That's probably debatable from game to game. But it is certainly the way to go in the long term.
 

phant0m

Member
Well, to be fair to those complaining, I can't think of a time in the last 15 years when a technical leap in graphics was as disruptive as ray tracing. […]
It's not. RT is the least impressive graphical leap in my lifetime, and it costs so much (in compute/performance). Honestly, in like 80% of games I need side-by-side screenshots to tell.

And baked lighting still looks better half the time.

Mediocre-as-hell tech. DLSS and frame gen are WAY better and cooler tech for gaming.
 

Filben

Member
Indiana Jones has only gotten 12k simultaneous players on Steam, and there have been some reports that it didn't sell very well, but this could be because of the first-person debacle
Because MS doesn't sell games with Game Pass. It's hard to gather data. We will have to wait for the PS5 version.
 

viveks86

Member
It's not. RT is the least impressive graphical leap in my lifetime, and it costs so much (in compute/performance). […]
This is a short-term perspective, though. There is a reason why devs are flocking to ray tracing, and it's not just trend-chasing. The sheer amount of dev time and compute needed to bake lighting is enormous. Game levels are literally designed with baked-lighting constraints in mind. For example, most game worlds are not interactive to this day because their lighting (and the resulting shadows) is baked. Spider-Man requires entire server farms running round the clock to bake its lighting during development, because it is impossible to implement RTGI even with current hardware at the speeds that the character moves through a scene. If a dev or artist changes certain objects in the world, they literally have to re-run the baking process. All of this will eventually be a thing of the past when real-time path tracing becomes the de facto standard (maybe another 10-15 years from now). Until then, we just have to go through these growing pains. There is no other aspect of rendering hardware that is expected to scale for the next 20 years other than RT and AI. As long as we are stuck with semiconductor-based computing, this will be our reality.

With regard to what looks better, that is not an issue of the tech as much as it is an issue of art and where the lights are placed in a scene with respect to a subject. All the tech is doing is simulating light bounces and intersections. For it to look good is the job of an artist. This will improve with time. Right now people are just swapping out baked lights for ray tracing and expecting everything to work automatically. There is a reason movies use fake lights even in real life to shoot a scene. Light placement is everything. RT will allow you to change those placements at any point in time, which you can never afford with baked lighting. That's the game-changer.
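The baked-vs-real-time trade-off described above can be sketched with a toy example (this is only an illustration of the general idea, not how idTech or any actual engine works):

```python
# Toy 1D "level": cells 0..9, one light, one occluding wall between cells.
# Baked lighting: answer the visibility question for every cell at build time.
# Ray-traced lighting: answer the same question on demand, every frame.

def trace_lit(cell, light, wall):
    """A cell is lit if the straight path to the light doesn't cross the wall."""
    lo, hi = sorted((cell, light))
    return not (lo < wall < hi)

def bake(light, wall, n=10):
    """Offline pass: evaluate every cell once and store the results."""
    return [trace_lit(c, light, wall) for c in range(n)]

lightmap = bake(light=2, wall=5)  # the expensive offline step (server farms, at scale)
print(lightmap[8])                # runtime is a cheap lookup -> False (cell 8 is in shadow)

# Move the light and the baked data is stale: the whole map must be re-baked.
# The "ray-traced" query, by contrast, just re-evaluates with the new light:
print(trace_lit(8, light=7, wall=5))  # True: cell 8 now sees the light at 7
```

Scale the bake step up to millions of texels per level and the server-farm point becomes obvious, as does why moving a single light invalidates the whole precomputation.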
 
Doom Eternal ran like a dream on the RTX 4080, even at 4K native (over 200fps without RT, and over 150 with RT).



In Doom: The Dark Ages, however, the RTX 4080 will 'only' get 60fps at 4K Ultra settings according to the requirements, so I expect to see some big improvements in lighting quality. Doom Eternal had amazing art direction and fairly detailed levels, but the lighting was flat :p, so no wonder the game ran so well.

RT in Doom Eternal was also very limited, although RT reflections still looked 10x better than screen-space reflections.

 
Last edited:

Filben

Member
I'm talking about Steam; the consoles may sell better, but if it fails on Steam, I think that will be enough to stop the trend.
Yeah, but Game Pass is available on PC, too. So many PC players sub for a month for 12 bucks to play the game instead of paying 60 bucks on Steam. Hence the low concurrent player numbers and sales.
 

Landr300

Neo Member
This is a short-term perspective, though. There is a reason why devs are flocking to ray tracing, and it's not just trend-chasing. […]
Never going to happen, not at a reasonable cost at least,

and nobody (or very few) will buy strong hardware just to make devs' lives easier.
 

Landr300

Neo Member
Yeah, but Game Pass is available on PC, too. So many PC players sub for a month for 12 bucks to play the game instead of paying 60 bucks on Steam. Hence the low concurrent player numbers and sales.
But there are games that are available on GP and have far bigger numbers on Steam; one of them is Forza Horizon.

Something indeed happened with Indy, either the FPS debacle or being limited by RT. Doom will answer this question: if the game doesn't sell well, we have a trend.
 
Last edited:
I don’t understand the big push for ray tracing. It’s nice, sure, but it’s not as nice as others make it out to be
GPU makers need RT to sell new HW by artificially making existing HW appear inadequate.
See, this is just another pointless use of RT.
I can understand they want to have some RT features, but at least give the option to turn it off for people that care more about performance.
Performance and gameplay aren't the focus of GPU makers who run the show.
4k/RT are bad for gaming/gameplay which makes them good for Nvidia/AMD because newer GPUs will always improve things slightly without making them too good.
PS and Nintendo doomed gaming to "whatever Nvidia and AMD say it is" when they abandoned esoteric HW.
 

Gaiff

SBI’s Resident Gaslighter
GPU makers need RT to sell new HW by artificially making existing HW appear inadequate.
GPU makers don’t make games.
Performance and gameplay aren't the focus of GPU makers who run the show.
4k/RT are bad for gaming/gameplay which makes them good for Nvidia/AMD because newer GPUs will always improve things slightly without making them too good.
PS and Nintendo doomed gaming to "whatever Nvidia and AMD say it is" when they abandoned esoteric HW.
Nonsense.
 

Rockman33

Member
Feels like the same people who complain about the Series S being too weak are also complaining that their 8-year-old, mid-tier-at-the-time graphics card won't play a new game...
 

viveks86

Member
No, unless they find a way to reduce the cost on new nodes, which doesn't appear to be the case anytime soon.
Hardware ray tracing architectures are in their infancy. I don’t expect node size to be the gating factor for a while. I expect at least 10x RT performance in that timeframe. Perhaps we should revisit this exchange in 15 years, if GAF and the both of us survive it :)
 
Last edited:

Landr300

Neo Member
Hardware ray tracing architectures are in their infancy. […]
How do you think they're going to make less expensive hardware?
 

viveks86

Member
How do you think they're going to make less expensive hardware?
I’m not an expert on the chip side of things. Not expecting much on that front other than better economies of scale (unless there is some technological breakthrough we don’t know about yet). RT and ML applications will be everywhere at that point and not just gaming and crypto. I do expect better RT and ML architectures though. And way more AI acceleration for RT and AI denoising.
 