
Doom The Dark Ages will require a GPU with Ray Tracing support to be playable on PC

Magic Carpet

Gold Member
I hope the game settings will do a better job with memory detection this go around.
It was possible to crash Indiana Jones: The Great Circle by pushing the settings beyond the available memory.
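For what it's worth, on Windows the budget is queryable, so a settings menu could clamp itself. A minimal sketch using DXGI's memory-budget API (the ~70% pool cap is a made-up policy, not anything from id Tech):

```cpp
// Sketch only: query how much local VRAM the OS will actually let us use
// before applying texture-pool settings. Link against dxgi.lib.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

UINT64 QueryLocalVramBudget()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 0;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 0;

    ComPtr<IDXGIAdapter3> adapter3;                // exposes the budget query
    if (FAILED(adapter.As(&adapter3))) return 0;

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 0;

    return info.Budget;                            // bytes we may use right now
}

int main()
{
    const UINT64 budget = QueryLocalVramBudget();
    // Hypothetical policy: cap the streaming texture pool at ~70% of the
    // budget so render targets and the OS compositor keep some headroom.
    const UINT64 texturePoolCap = budget * 7 / 10;
    std::printf("VRAM budget: %llu MiB, texture pool capped at %llu MiB\n",
                budget >> 20, texturePoolCap >> 20);
}
```

A settings screen that greys out pool sizes above that cap would have avoided those crashes.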
 

winjer

Member
No, but if they find new and innovative ways to enhance the experience, why not? RT can be used for so much more than just lighting and reflections. Glad to see different approaches.

See, this is just another pointless use of RT.
I can understand they want to have some RT features, but at least give the option to turn it off for people that care more about performance.
 

kiphalfton

Member
You’ll need an 8GB graphics card to even run the game, but that’s a fairly low bar for a title releasing in 2025.

Arguably yes, that is a low bar.

Nvidia is screwing people though, by limiting VRAM.

By the time Nvidia releases a generation of graphics cards with 16GB minimum, 16GB won't be far off from the bare minimum. And the cycle will keep repeating.

VRAM shouldn't be the component that limits performance. However, who's to say Nvidia won't use that to their advantage to push more people to buy the xx70 or xx80 models (which historically have had barely any more VRAM).
 
Last edited:
Mandatory RT?

Doom 2016 and Eternal were probably popular partly due to the fact that they would run on every single GPU. Both had an excellent performance-to-visuals ratio (if that is even a thing).

Eternal feels superb at 4K120; it's going to be a step back to be limited to 4K60.
 

Kataploom

Gold Member
Well, this is necessary. You don't even need an expensive post-2020 card for that, and games with RT as the exclusive lighting method are way better optimized than those with the hybrid, gimmicky approach we've seen since 2018. AMD is gonna be just fine, don't let Nvidia marketing blind you. Full RT games are totally feasible these days for both brands, as long as they don't focus too much on Nvidia tech (mostly sponsored titles).
 

Gaiff

SBI’s Resident Gaslighter
See, this is just another pointless use of RT.
I can understand they want to have some RT features, but at least give the option to turn it off for people that care more about performance.
Wait until you experience it before knocking it. Besides, RT has been available for 7 years now. I'm unsure how this will even affect performance. It could be relatively light but simply murderous for non-RT-capable cards, at which point I'd ask: why should we worry about 7-year-old hardware?
 
Last edited:

winjer

Member
Wait until you experience it before knocking it. Besides, RT has been available for 7 years now. I'm unsure how this will even affect performance. It could be relatively light but simply murderous for non-RT-capable cards, at which point I'd ask: why should we worry about 7-year-old hardware?

RT always causes a performance hit. The difference is only if it's big or huge.
 
Seems like a perfectly sane requirement in 2025. If you're playing modern (AAA) games on PC and don't currently have at least an RTX 20XX card then... uh, good luck. There's still games for you, just not those pushing technology forward a bit. Technology that's been on the market for years.
 

yogaflame

Member
Off topic: the base PS5 or Xbox Series X can run the game with RT (maybe at a lower frame rate), and the PS5 Pro with better RT than the base PS5, since by the time the game launches PSSR's ML will be more mature. For me, as long as the game can run at 4K and 60 fps, even deactivating RT will be fine.
 
Last edited:

Bojji

Member
Seems like a perfectly sane requirement in 2025. If you're playing modern (AAA) games on PC and don't currently have at least an RTX 20XX card then... uh, good luck. There's still games for you, just not those pushing technology forward a bit. Technology that's been on the market for years.

Pretty much. Same way we should have had games built with mesh shaders in mind for years now, but of course old GPUs and consoles (the long cross-gen period) are blocking this...
 
Last edited:
I don’t understand the big push for ray tracing. It’s nice, sure, but it’s not as nice as others make it out to be.
It depends on how it's implemented in the game and what hardware you're playing on. It looks amazing in some games and really adds to the experience.

Some people only care about FPS, but for me I want to play with raytracing on at all times and I am happy with a minimum 60 FPS.

But I also think from a business perspective they are using it as an excuse to get people to upgrade their GPUs/consoles.
 
Last edited:

poppabk

Cheeks Spread for Digital Only Future
Seems perfectly fine. My laptop won't be able to play it (1650), but if Indy is anything to go by, my main PC (6900XT) will have no problems other than some poorly lit croissants.
 

Kataploom

Gold Member
Do you really think we need RT for hit detection?
I was thinking about this a few days ago. They could just use raycasting as they've always done, but raycasting only allows very simple per-object detection (more specifically, per-physics-collider detection). If you want more detailed per-object detection, it becomes hellishly expensive in hardware terms and a hell of a headache in the production pipeline, because you'd now need an impractically large number of colliders per object. It would add too much complexity, and games are complex enough already.

RT has that solved: hit a point and you get the material information for that specific point. I'm not even sure devs would need a single collider, since it would run on the GPU; you'd then make that data available to the CPU so it's accessible in the main logic, and you wouldn't need any complex in-house architecture for it.

That's as far as my thinking goes.
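Roughly the contrast I mean, as a toy sketch (every type and name here is invented for illustration, not from id Tech or any real engine):

```cpp
// Toy sketch of colliders vs. RT hits; all types here are made up.
#include <cstdio>
#include <string>

// Classic physics raycast: resolves to one hand-authored collider per object.
struct ColliderHit { std::string object; };

// RT-style hit: the exact triangle, from which the material follows for free.
struct RayHit { std::string object; int triangle; std::string material; };

// Telling "helmet plate" from "visor gap" this way needs two colliders
// authored on every enemy; the cast can only distinguish what exists.
ColliderHit RaycastColliders() { return {"enemy_head"}; }

// The BVH the GPU already has for lighting carries per-triangle data, so the
// material at the hit point comes along without extra colliders.
RayHit TraceRay() { return {"enemy_head", 4821, "visor_gap"}; }

int main()
{
    ColliderHit c = RaycastColliders();
    std::printf("raycast: hit %s (no material info)\n", c.object.c_str());

    RayHit h = TraceRay();
    std::printf("trace: hit %s, triangle %d, material %s -> weak point!\n",
                h.object.c_str(), h.triangle, h.material.c_str());
}
```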
 

MikeM

Member
RT always causes a performance hit. The difference is only if it's big or huge.
That's not true at all. It all depends on how much RT is being performed.

Resident Evil 4 only has a 2fps (83fps to 81fps) hit to performance on my 7900xt at 4k for example.
 
RT always causes a performance hit. The difference is only if it's big or huge.
In 99% of games RT does not make any memorable visual difference, but always tanks the frame rate. I always turn it off.

But hey, if RT becomes mandatory for games going forward perhaps we'll see better usage of it (since games will be developed with RT from the ground up).
 
Last edited:

poppabk

Cheeks Spread for Digital Only Future
See, this is just another pointless use of RT.
I can understand they want to have some RT features, but at least give the option to turn it off for people that care more about performance.
Why is accurate, location-specific damage pointless? Hitboxes have limitations; being able to shoot through a chink in someone's armor naturally, shoot them in the eye through a helmet, etc. sounds cool and like a solid progression if they pull it off.
 

winjer

Member
Why is accurate, location-specific damage pointless? Hitboxes have limitations; being able to shoot through a chink in someone's armor naturally, shoot them in the eye through a helmet, etc. sounds cool and like a solid progression if they pull it off.

That is not how armour for gunfire works.
It's just a plate of metal or ceramic, on the chest. There are no chinks, like on medieval armour.
 

poppabk

Cheeks Spread for Digital Only Future
That is not how armour for gunfire works.
It's just a plate of metal or ceramic, on the chest. There are no chinks, like on medieval armour.
[Image: Doom: The Dark Ages armor]

This is medieval armor though.
But even in regular armor there are gaps.
[Image: modern body armor]
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Mandatory RT?

Doom 2016 and Eternal were probably popular partly due to the fact that they would run on every single GPU. Both had an excellent performance-to-visuals ratio (if that is even a thing).

Eternal feels superb at 4K120; it's going to be a step back to be limited to 4K60.

Doom '16 had a min spec of a GTX 670, a GPU that was 4 years old at the time (I dunno if a 1.2GB GTX 570 was enough).

Doom Eternal had a min spec of a GTX 1050 Ti/1060, a GPU that was 4 years old at the time (much older was not advisable due to VRAM).

Doom: The Dark Ages has a min spec of an RTX 2060 Super, a GPU that will be 6 years old at the time (8GB of VRAM is likely the actual reason it's not the base 2060).

So, looking back, this is the oldest GPU Doom has supported as a min spec in the new generation.
If I may be so bold as to hazard a guess: if you have anything 3060 Ti class, you are gonna be running the game in the 90fps range at "Supreme" settings.
If you are at min spec, then 60 should be no problem.
VRAM is gonna be our biggest enemy, I would guess.

P.S What GPU are you using to run Doom Eternal at 4K120 and at what settings?
 
Last edited:
Bullshit reasons probably.

Every game uses the tracing of rays for gameplay purposes, to measure distances to a collision object for instance.
That's not true; ray casting is not the same as ray tracing, and ray tracing is more involved. With the former you are calculating what the ray might hit (generally a hitbox), and when it does, the calculation stops; you can tie it to physics updates or frame updates, making it even less taxing on the CPU.
With ray tracing you trace the ray until it hits something; then, when it comes to light colour, more rays are sent from the hit point to determine its colour, and the ray recursively casts itself from the object it hit, or "bounces". This can make hit detection more accurate once you move away from hitboxes and just use material information instead.
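In sketch form (a hypothetical scene and made-up numbers, just to show the shape of each query):

```cpp
// Toy sketch of ray casting vs. ray tracing; the scene here is invented.
#include <cstdio>
#include <optional>

struct Hit { float distance; float reflectivity; float emitted; };

// Stand-in for "intersect ray with scene"; a real engine traverses a BVH here.
std::optional<Hit> Intersect(int bounce)
{
    if (bounce > 2) return std::nullopt;       // toy scene: three surfaces deep
    return Hit{10.0f * (bounce + 1), 0.5f, 0.2f};
}

// Ray CAST: take the first hit and stop. This is the classic hitscan query.
std::optional<Hit> Cast() { return Intersect(0); }

// Ray TRACE: keep bouncing recursively, accumulating light along the path.
float Trace(int bounce)
{
    auto hit = Intersect(bounce);
    if (!hit) return 0.0f;
    return hit->emitted + hit->reflectivity * Trace(bounce + 1);
}

int main()
{
    if (auto h = Cast())
        std::printf("cast: first hit at %.1f units, done\n", h->distance);
    std::printf("trace: accumulated radiance %.3f after all bounces\n", Trace(0));
}
```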
 

Trogdor1123

Gold Member
It depends on how it's implemented in the game and what hardware you're playing on. It looks amazing in some games and really adds to the experience.

Some people only care about FPS, but for me I want to play with raytracing on at all times and I am happy with a minimum 60 FPS.

But I also think from a business perspective they are using it as an excuse to get people to upgrade their GPUs/consoles.
I’d agree with everything in this post. It looks very nice if used right, but the performance hit is staggering. I guess it needs to start somewhere, though, and it will get better and better.
 

Mortisfacio

Member
Doom '16 had a min spec of a GTX 670, a GPU that was 4 years old at the time (I dunno if a 1.2GB GTX 570 was enough).

Doom Eternal had a min spec of a GTX 1050 Ti/1060, a GPU that was 4 years old at the time (much older was not advisable due to VRAM).

Doom: The Dark Ages has a min spec of an RTX 2060 Super, a GPU that will be 6 years old at the time (8GB of VRAM is likely the actual reason it's not the base 2060).

So, looking back, this is the oldest GPU Doom has supported as a min spec in the new generation.
If I may be so bold as to hazard a guess: if you have anything 3060 Ti class, you are gonna be running the game in the 90fps range at "Supreme" settings.
If you are at min spec, then 60 should be no problem.
VRAM is gonna be our biggest enemy, I would guess.

P.S What GPU are you using to run Doom Eternal at 4K120 and at what settings?

VRAM is my primary concern going forward. I plan on getting a 5090, but for those getting the 5080 with 16GB of VRAM, there are already games out today whose max settings use more than that. It's concerning that GPUs which aren't even out yet might already see VRAM bottlenecks. Indiana Jones, for example, sees 18GB+ utilization at max settings/4K.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
VRAM is my primary concern going forward. I plan on getting a 5090, but for those getting the 5080 with 16GB of VRAM, there are already games out today whose max settings use more than that. It's concerning that GPUs which aren't even out yet might already see VRAM bottlenecks. Indiana Jones, for example, sees 18GB+ utilization at max settings/4K.

Alan Wake at 4K native with PT maxes out my 4080's 16GB of VRAM.
Outlaws at 1440p maxed out drowns my 4080.

So yeah, I wouldn't be shocked if more games, maxed out, make 16GB look like easy work.
 

yamaci17

Member
I have no problem with ray tracing requirements, but I hope the game doesn't destroy 6-core CPUs in terms of frametime stability or something.
Insisting on that 8-core/16-thread part even in the minimum requirements is a bit weird. Indiana Jones lists the 3600 as minimum, and I'd say that's fair. No idea why they went with the 3700X here.
 

Bojji

Member
I have no problem with ray tracing requirements, but I hope the game doesn't destroy 6-core CPUs in terms of frametime stability or something.
Insisting on that 8-core/16-thread part even in the minimum requirements is a bit weird. Indiana Jones lists the 3600 as minimum, and I'd say that's fair. No idea why they went with the 3700X here.

Core requirements are probably bullshit (for the most part). I doubt 5600/12400 CPUs will have any problems vs. the 3700X (a slower CPU, but with more cores).
 