DF: Doom: The Dark Ages "Forced RT" Backlash - Is It Actually Justified?

DLSS 4 introduces a whole slew of artifacts, whereas I can run Eternal at 200+ FPS native with RT on my 4080.

So again, is this tradeoff even worth it? The Dark Ages looks great, but everything I saw in my 2 playthroughs could have been achieved with GI baking and other fast yet conservative methods.

And despite owning a 4080, it's a problem for my 240 Hz QHD OLED. I can barely run TDA at 120 Hz in decent quality, let alone shoot for 240. RT is a burden for fast-paced action games.

Who the fuck is going to sit there and bake the levels as big as the levels in Doom DA?

You want the game to be a PS6 launch title with a 500GB download?
 
That's way worse actually. They are arbitrarily preventing non-RT cards from running the game?

Are they getting paid by Nvidia for this? If not, then why are they creating this artificial barrier to playing their game?
IDK, but that would be an actual valid complaint, not that RT is mandatory, because it's also being used for projectile impact calculations against enemies and for destruction physics, and there's a great reason for them to do that. Gatekeeping people from running the game even though their hardware can run it, albeit slower, is shit; it's like they don't want people's money lol
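For illustration only (this is not id's code; presumably id Tech 8 runs these queries on the GPU against the same BVH it already builds for RT rendering): the "shot impact" use boils down to ray queries, and the core of any such query is a ray/triangle test like this minimal C++ sketch of Möller-Trumbore.

```cpp
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  cross(Vec3 a, Vec3 b) { return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x}; }
static float dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Möller-Trumbore ray/triangle intersection: returns the hit distance along the ray, if any.
// A hitscan weapon would run this (via a BVH) against candidate triangles and keep the nearest hit.
std::optional<float> rayTriangle(Vec3 origin, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2) {
    const float kEps = 1e-6f;
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 p  = cross(dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < kEps) return std::nullopt;      // ray is parallel to the triangle
    float invDet = 1.0f / det;
    Vec3 s = sub(origin, v0);
    float u = dot(s, p) * invDet;
    if (u < 0.0f || u > 1.0f) return std::nullopt;       // outside the triangle (barycentric u)
    Vec3 q = cross(s, e1);
    float v = dot(dir, q) * invDet;
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;   // outside the triangle (barycentric v)
    float t = dot(e2, q) * invDet;
    if (t < kEps) return std::nullopt;                   // intersection is behind the ray origin
    return t;
}
```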
 
Who the fuck is going to sit there and bake the levels as big as the levels in Doom DA?

You want the game to be a PS6 launch title with a 500GB download?
Big? They are medium-sized at best and we have essentially pre-baked open-worlds like Forbidden West with full weather and day-night cycles.

Absolutely nothing shown in TDA is RT-exclusive, even the so-called damage model, which is actually pretty toy-like.

But you can eat PR BS as much as you want. The only reason why RT is forced in both TDA and Indy is because it saves dev time and maybe budget, it has nothing to do with damage model or location size.
 
8800 GT in 2007 didn't game at 1280x1024.
We might have a different story. I remember most of my friends buying an 8800 GT while having 1280x1024 monitors, me included. Some of them were on 1024x768.
I know a guy in real life who got a 5070 Ti for 1080p@120.
 
But you can eat PR BS as much as you want. The only reason why RT is forced in both TDA and Indy is because it saves dev time and maybe budget, it has nothing to do with damage model or location size.
It's not a secret, the developers themselves explicitly said so in interviews. Billy Kahn, the engine director, basically said that without this approach they couldn't have delivered a game at this scale as fast as they did. Gamers have been complaining about how long and how expensive games have become, but when solutions are found we also don't like them. I don't know, something has to give in the end.
 
It's not a secret, the developers themselves explicitly said so in interviews. Billy Kahn, the engine director, basically said that without this approach they couldn't have delivered a game at this scale as fast as they did. Gamers have been complaining about how long and how expensive games have become, but when solutions are found we also don't like them. I don't know, something has to give in the end.
Oh, I assure you R&D for id Tech 8 alone burned more money and man-hours than proper light baking would have.
 
Big? They are medium-sized at best and we have essentially pre-baked open-worlds like Forbidden West with full weather and day-night cycles.

Absolutely nothing shown in TDA is RT-exclusive, even the so-called damage model, which is actually pretty toy-like.

But you can eat PR BS as much as you want. The only reason why RT is forced in both TDA and Indy is because it saves dev time and maybe budget, it has nothing to do with damage model or location size.
The mech levels are huge; you only don't think they are huge because of the size of the mech and the speed it's moving at.
The dragon levels?
Even the Doom Slayer actually travels at ridiculous speeds; untether the camera and see the distances he is actually covering.


But that's neither here nor there.


The size of the level directly ties to budget/time to iterate.
If they baked everything it would have taken them longer to release the game and it would have a much larger footprint on disk.
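To put a rough number on the disk-footprint point, here's a back-of-envelope sketch. Every figure in it (lightmapped surface area, texel density, bytes per texel, level count) is a made-up assumption for illustration, not anything from id:

```cpp
#include <cstdio>

int main() {
    // All of these values are assumptions purely for illustration.
    const double surface_area_m2  = 2.0e6;  // assume ~2 km^2 of lightmapped surface per level
    const double texels_per_meter = 16.0;   // assumed lightmap texel density
    const double bytes_per_texel  = 8.0;    // assumed cost of several baked layers (irradiance + direction)
    const int    level_count      = 22;     // assumed number of levels

    const double texels       = surface_area_m2 * texels_per_meter * texels_per_meter;
    const double gib          = 1024.0 * 1024.0 * 1024.0;
    const double per_level_gb = texels * bytes_per_texel / gib;

    std::printf("~%.1f GiB of baked lighting per level, ~%.0f GiB across %d levels\n",
                per_level_gb, per_level_gb * level_count, level_count);
    return 0;
}
```

With those assumed numbers it lands around 4 GiB of baked lighting per level and roughly 80+ GiB for the whole game; the real figures would differ, but that's the shape of the argument.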

Serendipitously, Forbidden West is one of my go-to games to show people the benefits of RTGI.

Take Aloy into any forested area in the shade and watch her shine.
She's NOT planted in the world around her... your eyes just get used to the inaccuracy, so you give it a pass.
You shouldn't have to settle when we have the hardware to do it and GG is a AAA studio.

This close to greatness; if only they had a decent real-time GI pass, even some accurate, fast-updating probes would have helped.
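For context on what "fast-updating probes" means, here's a minimal C++ sketch of a regular irradiance-probe grid with trilinear sampling. It's a deliberate simplification (real systems such as DDGI store spherical harmonics or octahedral maps per probe and re-trace a small ray budget per probe each frame); none of this is Guerrilla's or id's actual implementation.

```cpp
#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 lerp(Vec3 a, Vec3 b, float t) {
    return { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t, a.z + (b.z - a.z) * t };
}

// Minimal regular grid of irradiance probes. Each probe holds a single RGB irradiance value here;
// keeping it "fast updating" would mean re-tracing a few rays per probe every frame elsewhere.
struct ProbeGrid {
    Vec3  origin{0.0f, 0.0f, 0.0f};  // world-space position of probe (0,0,0)
    float spacing = 2.0f;            // metres between probes
    int   nx = 8, ny = 4, nz = 8;    // probe counts per axis
    std::vector<Vec3> probes = std::vector<Vec3>(nx * ny * nz, Vec3{0.1f, 0.1f, 0.1f});

    const Vec3& at(int x, int y, int z) const { return probes[(z * ny + y) * nx + x]; }

    // Trilinearly blend the 8 probes surrounding a shaded point to get its diffuse GI term.
    Vec3 sample(Vec3 p) const {
        float fx = std::clamp((p.x - origin.x) / spacing, 0.0f, float(nx - 1));
        float fy = std::clamp((p.y - origin.y) / spacing, 0.0f, float(ny - 1));
        float fz = std::clamp((p.z - origin.z) / spacing, 0.0f, float(nz - 1));
        int x0 = std::min(int(fx), nx - 2), y0 = std::min(int(fy), ny - 2), z0 = std::min(int(fz), nz - 2);
        float tx = fx - x0, ty = fy - y0, tz = fz - z0;
        Vec3 c00 = lerp(at(x0, y0,     z0),     at(x0 + 1, y0,     z0),     tx);
        Vec3 c10 = lerp(at(x0, y0 + 1, z0),     at(x0 + 1, y0 + 1, z0),     tx);
        Vec3 c01 = lerp(at(x0, y0,     z0 + 1), at(x0 + 1, y0,     z0 + 1), tx);
        Vec3 c11 = lerp(at(x0, y0 + 1, z0 + 1), at(x0 + 1, y0 + 1, z0 + 1), tx);
        return lerp(lerp(c00, c10, ty), lerp(c01, c11, ty), tz);
    }
};
```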
 
We might have a different story. I remember most of my friends buying an 8800 GT while having 1280x1024 monitors, me included. Some of them were on 1024x768.
I know a guy in real life who got a 5070 Ti for 1080p@120.
Well sure, I'm not claiming they don't exist, but people buying cards like that generally have upgrades elsewhere, like a 1440p monitor. I myself had an 8800 GT back in 2008 with a 1680x1050 monitor.
 
The only reason why RT is forced in both TDA and Indy is because it saves dev time and maybe budget

Both of these are still extremely important though.

Budget balloons and the increased costs get added onto the retail price - people complain.
Game takes longer to make - people complain.
Bigger download size due to the baked lighting - people complain.
 
Meanwhile the Series S version looks :messenger_fire: by all accounts :messenger_sunglasses:
Like melting pixels?

Accurate.

And why is inflation being brought up when virtually everyone owns an RT capable GPU in 2025?
Just stop. I normally agree with you 90% of the time, but you're just being disingenuous about this one.

There's nothing shader wise that is any more special than past games. You, yourself even ragged on them for their downgrade in that department.

So the whole 'RT required' thing is a marketing ploy, nothing more. They even got their Nvidia card bundles out of it.
 
shut-up-you-need-to-shut-the-fuck-up.gif
 
1080ti was released in 2017. It's over 8 years old. That is longer than a console generation. It was a great card for the time, but the feature set has moved on, not just RT, but mesh shader support is missing as well. It's also stuck with the shitty version of FSR.

The Demon's Souls remake launched a mere 7 years after the PS4 did. We all know it could have run on the lowly PS4 if they sacrificed some visual fidelity. Devs and Sony didn't want to do that. Fair play to them. The developers at id obviously want to leverage RT, looked at the current console and PC market, and decided the next generation of their engine will not support traditional lighting without RT. Fair play to them as well.
 
It's not a secret, the developers themselves explicitly said so in interviews. Billy Kahn, the engine director, basically said that without this approach they couldn't have delivered a game at this scale as fast as they did. Gamers have been complaining about how long and how expensive games have become, but when solutions are found we also don't like them. I don't know, something has to give in the end.
This.

Gamers are absolute morons. The anti-RT movement is especially comical. You now have id and Ubisoft explaining exactly why RT is beneficial from a development and time standpoint, yet idiots push against it.

People want better looking games, shorter dev time, bigger scope and scale, high performance, AND don't want to pay more. What the fuck do they think this is? There has to be trade-offs somewhere and when you tell them ray tracing cuts dev time significantly and enhances visuals, they bitch that their shitty hardware could run older games at twice the frame rate.
 
1080ti was released in 2017. It's over 8 years old. That is longer than a console generation. It was a great card for the time, but the feature set has moved on, not just RT, but mesh shader support is missing as well. It's also stuck with the shitty version of FSR.
The big problem is that the GTX 1080 Ti can still run a lot of new games easily, UE5 games included, and will keep running them until the PS6.
 
Are they getting paid by Nvidia for this? If not, then why are they creating this artificial barrier to playing their game?
Because using RT removed an artificial barrier in the development process that saved them time and money.

And as much as I empathize with people still gaming on a GTX 1000 or RX 5000 card, this is something we're probably going to start seeing more of. And that's a good thing. Developers won't be able to learn how to optimize and apply RT in more creative ways if they're half-assing it with every release.

Also worth pointing out that support for Pascal cards is ending relatively soon.
 
The big problem is that the GTX 1080 Ti can still run a lot of new games easily, UE5 games included, and will keep running them until the PS6.
Sure, the power is there. The feature set isn't. My X850 was a powerful card back in the day, but it completely lacked support for Shader Model 3.0, so when Splinter Cell released in 2006 (a mere year after I bought the card), I was shit out of luck. Weaker but newer GPUs ran it just fine.
 
Still early. That should only start with the PS6 generation.
Because Sony is using AMD hardware, 100% of new PlayStations have RDNA and dedicated RT cores that *75% of their owners don't want and will never use.
Sony needs to double down on R&D and build a proprietary GPU that doesn't waste money and space on HW that's effectively worthless for most of its users.
In reality, some non-zero percentage of the 25% of PS5 users with RT 'turned ON' is the direct result of the PS5's complicated UI causing users to pick the wrong setting - creating a poor UX.

*According to Cerny.
 
My X850 was a powerful card back in the day, but it completely lacked support for Shader Model 3.0
Different story, different time. Back then, graphics improved very fast. Today, graphics change very, very slowly, more like once per console generation.

Sony needs to double down on R&D and build a proprietary GPU that doesn't waste money and space on HW that's effectively worthless for most of its users.
The PS6 will be a different story compared to the PS5. Neural stuff/RT/PT etc. will be heavily forced. Lots of users with cards like the 4070/7800 XT and below will be left behind.
 
Different story, different time. Back then, graphics improved very fast. Today, graphics change very, very slowly, more like once per console generation.
Sure, technology was evolving more rapidly, but we are still talking one year vs eight years here. You would have a point if RT support had only come out last year, but it's been around on PC for ages at this point. The console lifecycle is over the halfway mark as well.

Plus, RT is actually a large leap forward for rendering. The feature at issue isn't some insignificant extra; it's a complete overhaul of the GI system.


 
Who cares? Play video games and have fun. This is just the usual daily reminder of why I maintain that gamers have ruined gaming. Imagine caring what people think about lighting and reflections in games that already look amazing as is. This is the segment of the community at large that is embarrassing. Bring on the hate.

That's the thing. Tons of people cannot play. Broadly speaking, most people agree that they could have achieved the look they ended up with in TDA without using RT the way they did.

The DF video makes it sound like gaming doesn't exist outside of North America and Western Europe. I can see how they arrived at that conclusion living in that bubble. RT cards are old hat to them. But the value prop really hasn't been there. I was rocking a 1080 until a couple of months ago and only had 1 or 2 games that I couldn't find satisfying settings for. Money isn't even a problem for me, but for people who do have that problem, it makes that terrible value prop even harder to overlook.
 
A developer should be left to make the game they want with the feature set they want.

If they target what they want and hit solid performance, who the fuck is anyone to tell them otherwise?
 
Like melting pixels?

Accurate.


Just stop. I normally agree with you 90% of the time, but you're just being disingenuous about this one.

There's nothing shader wise that is any more special than past games. You, yourself even ragged on them for their downgrade in that department.

So the whole 'RT required' thing is a marketing ploy, nothing more. They even got their Nvidia card bundles out of it.
Maybe I don't get what you are trying to say. I just have no idea why inflation is being brought up when Nvidia stopped making non-RT GPUs in 2018, and AMD in 2020. We are 7 years in. When do you want devs to start using this built-in hardware feature?

And yes, Doom could look better, and maybe it would have had they used the mesh shader features in the consoles, tech that has been in both AMD and Nvidia GPUs since 2016.
 
No, I'm surprised it took this long. For 5 years now, every new GPU series has supported RT, even low-end parts.



What about stability?



Meanwhile PC gaming xx years ago:



And games required pixel shaders to run very early, not years after...
Even gaming PCs have trouble running certain games, with stuttering and bad optimization; forced RT spells even more trouble for gaming PCs. The majority of PC setups are still using old GPUs.

These people with old GPUs are going to get left out. Don't you care about your PC brothers? :(
 
People think the game could have run at 8K 240 fps if there was no "forced" RT because RT is expensive for a GPU. That's not how the game was designed. People have no idea how game design works.
OP is also an idiot for the OG title
 
even gaming PC has trouble running certain games with stuttering and bad optimizations, forced RT spells even more trouble for gaming PC. Majority of the PC setups still using old GPUS.
Doom The Dark Ages had zero stutters. Not shader nor traversal. It doesn't even need a shader compilation step. It runs perfectly well on a $329 GPU from over 4 years ago.
 
Doom The Dark Ages had zero stutters. Not shader nor traversal. It doesn't even need a shader compilation step. It runs perfectly well on a $329 GPU from over 4 years ago.
Yeah, but does the same apply to other games? id Tech is incredible, I know.
 
Even gaming PCs have trouble running certain games, with stuttering and bad optimization; forced RT spells even more trouble for gaming PCs. The majority of PC setups are still using old GPUs.

These people with old GPUs are going to get left out. Don't you care about your PC brothers? :(

I'm playing Clair Obscur now on PS5 Pro and it's stuttering just like PC version that I 100% completed before. I'm really tired of this "there is no stuttering on consoles" bullshit.
 
I'm playing Clair Obscur now on PS5 Pro and it's stuttering just like PC version that I 100% completed before. I'm really tired of this "there is no stuttering on consoles" bullshit.
I didn't say there's no stuttering on console, but I do believe that if stuttering is present on all platforms, the PC version will be the worst.
 
I didn't say there's no stuttering on console, but I do believe that if stuttering is present on all platforms, the PC version will be the worst.

Traversal stutter is mostly present on all platforms. Shader stutter is PC-exclusive, but even if shaders are not correctly cached at startup (most games do that), it fixes itself as you play.
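The "cached at startup" part usually means persisting the driver's pipeline cache between runs, so pipelines built behind a loading screen reuse already-compiled shader binaries. A minimal Vulkan-flavoured sketch of that idea (file path, error handling, and header/UUID validation are simplified assumptions, not how any specific game does it):

```cpp
#include <cstdio>
#include <vector>
#include <vulkan/vulkan.h>

// Load a previously saved pipeline cache so pipeline creation reuses compiled shader
// binaries instead of recompiling them (the usual source of first-run hitches).
VkPipelineCache loadPipelineCache(VkDevice device, const char* path) {
    std::vector<char> blob;
    if (FILE* f = std::fopen(path, "rb")) {
        std::fseek(f, 0, SEEK_END);
        blob.resize(static_cast<size_t>(std::ftell(f)));
        std::fseek(f, 0, SEEK_SET);
        std::fread(blob.data(), 1, blob.size(), f);
        std::fclose(f);
    }
    // A real implementation would validate the cache header / device UUID before reuse.
    VkPipelineCacheCreateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_PIPELINE_CACHE_CREATE_INFO;
    info.initialDataSize = blob.size();                      // empty on first run: fresh cache
    info.pInitialData = blob.empty() ? nullptr : blob.data();
    VkPipelineCache cache = VK_NULL_HANDLE;
    vkCreatePipelineCache(device, &info, nullptr, &cache);
    return cache;
}

// After building all pipelines (ideally during a loading screen), persist the cache to disk.
void savePipelineCache(VkDevice device, VkPipelineCache cache, const char* path) {
    size_t size = 0;
    vkGetPipelineCacheData(device, cache, &size, nullptr);   // query required size
    std::vector<char> blob(size);
    vkGetPipelineCacheData(device, cache, &size, blob.data());
    if (FILE* f = std::fopen(path, "wb")) {
        std::fwrite(blob.data(), 1, blob.size(), f);
        std::fclose(f);
    }
}
```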
 
Because Sony is using AMD hardware, 100% of new PlayStations have RDNA and dedicated RT cores that *75% of their owners don't want and will never use.
Sony needs to double down on R&D and build a proprietary GPU that doesn't waste money and space on HW that's effectively worthless for most of its users.
In reality, some non-zero percentage of the 25% of PS5 users with RT 'turned ON' is the direct result of the PS5's complicated UI causing users to pick the wrong setting - creating a poor UX.

*According to Cerny.
It's because most people would rather play at 60fps and RT/Fidelity mode usually drops things down to 30FPS. If more games did what Spider-Man 2 did, some level of RT available in every mode - fidelity and performance - then everyone would benefit from RT. By the time the PS6 comes along, I'm guessing more games will be doing that.

The base PS5 isn't exactly an RT powerhouse, and Sony creating their own proprietary GPU would be a waste of money and a terrible idea. The partnership with AMD is the second best hardware decision they've made. Moving away from a wholly new architecture with every generation was the first.
 
The only reason why RT is forced in both TDA and Indy is because it saves dev time and maybe budget.

Sounds like a fair trade-off; we're often complaining here about hyper-inflated dev budgets and development times. If it helps devs put out quality games faster, it's a worthy endeavor.
 
RT is actually a large leap forward for rendering. The feature at issue isn't some insignificant extra; it's a complete overhaul of the GI system.
Except it's not. It's basically PhysX all over again. You didn't have to study comparison screenshots and videos to see large leaps forward. Half of gamers weren't disagreeing about the incredible visuals in Medal of Honor: Allied Assault. No one had to point out where to look for the subtle shading in Mario 64. You didn't need to go back and forth with the screenshot slider to be amazed by Gears of War on an HDTV. It was self-evident. Maybe people who didn't experience those leaps in real time think ray tracing is a big deal, but there are Xbox 360 games that hold their own against Dark Ages.
 
Doom The Dark Ages had zero stutters. Not shader nor traversal. It doesn't even need a shader compilation step. It runs perfectly well on a $329 GPU from over 4 years ago.
It absolutely does not. I have a 6750 XT and it runs like dog shit at 1080p with a 7800X3D. Can't even maintain 60 fps no matter what settings I use.
 