DF: Doom: The Dark Ages "Forced RT" Backlash - Is It Actually Justified?

What does RT have to do with bigger levels, more enemies, and load times?
Incoming: explanation on how RT saves devs TONS of time because they don't have to pre-bake the lighting, yet game dev cycles are still taking 2-3x longer than previous gens
 
What does RT have to do with bigger levels, more enemies, and load times?
The devs mentioned that having RT allowed them to bypass baking, which is a time-consuming process with larger environments. Ubisoft said the same thing for Shadows. Without RT, lighting of that quality on such a scale wouldn't have been possible.

The load time part is just about advancements in technology. You can't have one of these discussions without mouth breathers asking about gameplay as if technology somehow hampered it.
 
Item 2 is a sticking point for your arguments, and it's clear you don't really understand what's happening in the development landscape. id, the creators of Doom: The Dark Ages, explained that moving to fully real-time lighting via ray tracing eliminated the need for baking completely - not just in lighting, but in numerous effects as well. This saved approximately two years of development. So, instead of needing the standard seven years for a AAA game, DTDA took around five. That's a massive saving. There's a reason many developers shifted to Unreal Engine 5: its real-time lighting model is "good enough" that they don't need to rely on heavily baked lighting and effects. Saving two years of development can be the difference between shipping your game and going out of business, so even though UE5 is garbage performance-wise, that's a tradeoff they're happy to make.
Games that still use the baked approach to their lighting and effects are self-evidently taking longer - you're just falling afoul of survivorship bias by only noting the games that have already released... which already took seven years to make. Notice people complaining about a lack of next-gen games, a lack of Sony delivering first party, etc. That's because a good number of developers are only using ray tracing for reflections, or to speed up probe lookups. They're still baking most of their lighting and effects work, so their development schedules have ballooned again this generation - meaning many of them will only release one game this generation, right as it winds down.
Your third point is pretty funny given that id delivered real-time ray-traced lighting at 60FPS on every console - including the Series S. Other games like Metro Exodus show that RT is perfectly usable on current hardware at good frame rates - but it's down to the developers to actually make it work. And frankly, most developers simply aren't good enough.

More on topic, back in the day id lived on the bleeding edge. Their games often coincided with the release of new hardware that demanded users upgrade just to play their games. RT-capable hardware released seven years ago, and current-gen consoles released five years ago. Bitch about progression if you want, but you forfeit the right to complain about anything technology related if you honestly believe developers should lag nearly a decade behind. As far as I'm concerned, mandatory RT should have arrived with the current generation of consoles, to save us from garbage like UE5's software Lumen ruining an entire generation of video games.
FWIW, my criticisms of RT were not specific to id's implementation but about RT in general. I would happily agree it is better than most, especially the shoddy Lumen ones.

Also, 2016 -> Eternal took 4 years, not 7, and did not require RT.
 
It's still early. That should only start with the PS6 gen.
The PS6 will be about path tracing, not basic ray tracing like the current-gen consoles have. The PS6 should have at least as much power as a 4080 from 2022, and the 4080 is an excellent path tracing card. Would be nice if the PS6 reached 4090 level, but seeing as the PS5 Pro basically only reached the 3070, idk that Sony can pull it off this time.
 
Does it run on Series S? With 4 TFLOPs?

TFLOPs-wise it's the same as a 1060.
It does, because it supports RT, unlike the 1060. Resolution-wise it's really low, though. The PS5 targets 1080p 60fps with its 10 TFLOPs; that's basically what the 4060 is going to give you.
 
The problem is the "forced" part. All that id had to do was give the user an option to turn it on/off. Just like all other games.
Doesn't it have no other lighting option? Like, it would just be a black screen if it was turned off.
 
Thread reminds me of the upset moms when Nintendo released the SNES and they couldn't play the new games on their NES.

"People market things to make you spend more money."
 
Yeah, I'm just not buying that. If you consider a PS5 a competent console, then you can easily get a PS5-equivalent GPU for $200-300. The 4060 Alex is talking about in this video is $300-350 and should offer equivalent performance to a PS5.

I do understand that compared to last gen, when you could get a 970 or a 1060 for $300 and have double the PS4 performance, nowadays you need to spend roughly double that to double the PS5 performance. And that's fucked up, but the PS5 also had a shortage for years, and yet no one complained that Sony made Demon's Souls, Ratchet and Spider-Man 2 exclusive to the PS5. We WANT devs to invest in next-gen tech in the console space, and we are supposedly the peasants. It's time the PC userbase stopped complaining and embraced next gen.
Why would you waste money on just a PS5-equivalent card when you can get a hassle-free experience on a console?

Putting in all that work just for the bare minimum. Doesn't sound fun.
 
Who cares? Play video games and have fun. This is just the usual daily reminder of why I maintain that gamers have ruined gaming. Imagine caring what people think about lighting and reflections in games that already look amazing as is. This is the segment of the community at large that is embarrassing. Bring on the hate.


Is this part of the gloriousness of Doom?

 
FWIW, my criticisms of RT were not specific to id's implementation but about RT in general. I would happily agree it is better than most, especially the shoddy Lumen ones.

Also, 2016 -> Eternal took 4 years, not 7, and did not require RT.
Never said Eternal took seven years. I explained that id said DTDA would've taken seven years if they'd tried to keep using their baked solutions. This is due to the step up from Eternal in terms of scope, scale, and variety. Eternal used offline baked GI via software-based RT on id's render farms. It looks as good as it does because id have some of the best baked solutions in the industry. Which makes sense - they literally invented the technique.
 
They are more likely to lose their jobs because it takes 15 years to make a large game cuz they have to bake all the lighting with every iteration and the lighting data is 100GB by itself.

Not because they lost 4% of gamers because said gamers are too poor to buy a 200-dollar GPU.
Cuz those gamers were likely not your customers anyway.
lol imagine believing that devs take 15 years to make a game only because of the type of lighting they use

I mean, story doesn't exist, they don't run into other barriers along the way, it's all about the graphics

They will continue to take 15 years to make a game anyway, and now with far fewer consumers able to buy it, and the ones who CAN buy will be angry and won't buy, because the game runs worse than it should and doesn't seem to offer that much change on the graphical side to justify it.

When you guys want to know what happened to this industry, just take a look at this post. The lengths people go to to justify the unjustifiable are the root of all evil in the shit show the video game industry has become.
 
People that were around for Doom 3's launch will tell you that requiring a decent GPU from the last 5 years to play the new Doom ain't that bad.

My first PC was a Pentium 4 with a GeForce 2 MX... and I remember when DirectX 8 became the minimum spec 😭
Meanwhile people are crying that their shitty old AMD cards or their 8-year-old Nvidia cards can only run 90% of modern AAA games...

funny times we live in now.
 
Gonna pull an onQ123 and quote myself:

Moore's Law has hit a wall, and rasterization has also run into a limit of its own: it's increasingly costly to implement.

The future will be through improvements in ray tracing (to deliver leading image fidelity at less cost to implement) and AI-enhanced upscaling and rendering (to cover ray tracing's performance hit without losing image quality). This requires different thinking around architecture design and a culture shift amongst end users to make ray tracing capability the baseline (otherwise you're in the current situation where most big games essentially have to be built twice, in a sense).

The 1st player in graphics to understand this was Nvidia, Intel realized it 2nd, and AMD is waking up to it too late.

Raytracing has provided a massive improvement to fidelity while requiring far fewer dev resources compared to baked-in lighting.

I'm willing to bet that at least some of the problems with dev costs have to do with the requirement of having BOTH an RT and non-RT solution to lighting in every situation. You're essentially doing double work for no real reason at this point.
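
To make that double-work point concrete, here's a tiny hypothetical sketch (Python, with made-up class and method names - not any real engine's API) of what maintaining both paths means: every lighting feature ends up written, tested, and tuned twice.

```python
# Hypothetical sketch of the dual-path maintenance burden. None of these
# class or method names come from a real engine, and real renderers are
# vastly more complex. The point: every feature below exists twice.

class Renderer:
    def __init__(self, hardware_supports_rt: bool):
        self.rt = hardware_supports_rt

    def global_illumination(self, scene: str) -> str:
        if self.rt:
            return f"traced GI for {scene}"       # path 1: write, test, tune...
        return f"baked lightmaps for {scene}"     # path 2: ...and again, plus the
                                                  # offline bake pipeline behind it

    def reflections(self, scene: str) -> str:
        if self.rt:
            return f"traced reflections for {scene}"
        return f"screen-space reflections for {scene}"  # separate fallback with
                                                        # its own artifacts to handle

# Shipping on mixed hardware means BOTH branches must be production quality:
for rt_capable in (True, False):
    r = Renderer(rt_capable)
    print(r.global_illumination("level_01"), "|", r.reflections("level_01"))
```

And the fallback path isn't free even after it ships - it drags the whole offline bake pipeline along with it.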


Item 2 is a sticking point for your arguments, and it's clear you don't really understand what's happening in the development landscape. id, the creators of Doom: The Dark Ages, explained that moving to fully real-time lighting via ray tracing eliminated the need for baking completely - not just in lighting, but in numerous effects as well. This saved approximately two years of development. So, instead of needing the standard seven years for a AAA game, DTDA took around five. That's a massive saving. There's a reason many developers shifted to Unreal Engine 5: its real-time lighting model is "good enough" that they don't need to rely on heavily baked lighting and effects. Saving two years of development can be the difference between shipping your game and going out of business, so even though UE5 is garbage performance-wise, that's a tradeoff they're happy to make.
Games that still use the baked approach to their lighting and effects are self-evidently taking longer - you're just falling afoul of survivorship bias by only noting the games that have already released... which already took seven years to make. Notice people complaining about a lack of next-gen games, a lack of Sony delivering first party, etc. That's because a good number of developers are only using ray tracing for reflections, or to speed up probe lookups. They're still baking most of their lighting and effects work, so their development schedules have ballooned again this generation - meaning many of them will only release one game this generation, right as it winds down.
Your third point is pretty funny given that id delivered real-time ray-traced lighting at 60FPS on every console - including the Series S. Other games like Metro Exodus show that RT is perfectly usable on current hardware at good frame rates - but it's down to the developers to actually make it work. And frankly, most developers simply aren't good enough.

More on topic, back in the day id lived on the bleeding edge. Their games often coincided with the release of new hardware that demanded users upgrade just to play their games. RT-capable hardware released seven years ago, and current-gen consoles released five years ago. Bitch about progression if you want, but you forfeit the right to complain about anything technology related if you honestly believe developers should lag nearly a decade behind. As far as I'm concerned, mandatory RT should have arrived with the current generation of consoles, to save us from garbage like UE5's software Lumen ruining an entire generation of video games.
Excellent reply.

The only part I'd differ with is the notion of devs not being good enough.

Some are, but RT is generally a newer technology to deal with, and their job isn't helped by AMD being the main console supplier, given how badly AMD has lagged behind Nvidia in RT performance and AI upscaling for years - to the point where Sony had to join forces with them on hardware development to help them get a leg up and catch up.
 
People that were around for Doom 3's launch will tell you that requiring a decent GPU from the last 5 years to play the new Doom ain't that bad.
People that were around for Doom 3 will tell you that the difference between Doom 3 and whatever came before at least justified the requirements at the time.

Plus, people who couldn't play Doom 3 at launch were playing at max settings less than 4 years later, at less than half the price, with an 8600 GT. That's not happening anymore; Moore's Law is dead, and it will only get worse.
 
People that were around for Doom 3 will tell you that the difference between Doom 3 and whatever came before at least justified the requirements at the time.

but it was in the same situation tho.
Plenty of games back then still offered modes for older shader models, while others didn't - just like with this "forced RT" discussion.
Doom 3 was a forced Pixel Shader 1.1 game, same as SC: Chaos Theory. Doom 3 had a Shader Model 2.0 mode, and Chaos Theory a Shader Model 3.0 mode. And then, eventually, Shader Model 2.0 was "forced"... and then 3.0.

And this happened rapidly.

So even if you saw a bigger jump, it still meant that the card you bought maybe 2 years ago couldn't play some new games anymore.
 
DF is completely oblivious to the pushback. Comparisons to landmark, generation-defining titles like Half-Life 2 are moot. Those games raised an obvious visual bar and justified their new tech baseline.

Doom, on the other hand, looks no better - if not worse - next to its predecessor, hence the outcry. Why do I need bigger hardware to run this at lower fidelity?

This is the crux of the matter: it's not the tech - it's the final perception.
 
but it was in the same situation tho.
Plenty of games back then still offered modes for older shader models, while others didn't - just like with this "forced RT" discussion.
Doom 3 was a forced Pixel Shader 1.1 game, same as SC: Chaos Theory. Doom 3 had a Shader Model 2.0 mode, and Chaos Theory a Shader Model 3.0 mode. And then, eventually, Shader Model 2.0 was "forced"... and then 3.0.

And this happened rapidly.

So even if you saw a bigger jump, it still meant that the card you bought maybe 2 years ago couldn't play some new games anymore.
It was not the same situation, in the sense that they were trying to improve graphics and make something that was not possible in any way before - not trying to make devs' lives easier.

The discourse here is not "we can't do this with this type of lighting", it is "we can do it but it will take more time", and then "you need to accept this because that is how things have always been".

Except it wasn't.
 
DF is completely oblivious to the pushback. Comparisons to landmark, generation-defining titles like Half-Life 2 are moot. Those games raised an obvious visual bar and justified their new tech baseline.

Doom, on the other hand, looks no better - if not worse - next to its predecessor, hence the outcry. Why do I need bigger hardware to run this at lower fidelity?

This is the crux of the matter: it's not the tech - it's the final perception.
The moron Alex Battaglia must be one of the most idiotic human beings I've ever seen in my life. The way he's pushing the devs' narrative is absolutely not journalism of any kind. The guy from Threat Interactive, despite whatever flaws he has himself, was dead right about him. This guy should be ashamed of himself.
 
Just looked it up: the game took about 3.5 years to make. Doesn't look like the devs' claims that RT helped cut down development time were exaggerated.
 
That's the thing. Tons of people cannot play. Broadly speaking, most people agree that they could have achieved the look they ended up with in TDA without using RT the way they did.
With pre-computed lighting, no one would be playing the game (as it is) right now, either because it would not have been released, or because the scope would have been changed.

DF is completely oblivious to the pushback. Comparisons to landmark, generation-defining titles like Half-Life 2 are moot. Those games raised an obvious visual bar and justified their new tech baseline.

Doom, on the other hand, looks no better - if not worse - next to its predecessor, hence the outcry. Why do I need bigger hardware to run this at lower fidelity?

This is the crux of the matter: it's not the tech - it's the final perception.
Alex's main comparison was with titles that required Shader Model 3.0. This happened in 2006/2007 and did not deliver any significant increase in visual quality as compensation, while locking out cards that had been released 2 years before. In comparison, requiring RT locks out cards that were released 6+ years before (slightly less in the case of the 5700 XT). And enabling physics-based destruction (with lighting) is arguably raising the visual bar in some respects.


It was not the same situation, in the sense that they were trying to improve graphics and make something that was not possible in any way before - not trying to make devs' lives easier.

The discourse here is not "we can't do this with this type of lighting", it is "we can do it but it will take more time", and then "you need to accept this because that is how things have always been".

Except it wasn't.
I would argue that in 2006/2007 you could have made a similarly good-looking game with SM 2.0 as you could have with SM 3.0. It just made developers' lives easier by enabling more complex shaders.
 
Just looked it up: the game took about 3.5 years to make. Doesn't look like the devs' claims that RT helped cut down development time were exaggerated.
This is actually insane considering it's an entire playstyle overhaul too.
 
lol imagine believing that devs take 15 years to make a game only because of the type of lighting they use

I mean, story doesn't exist, they don't run into other barriers along the way, it's all about the graphics

They will continue to take 15 years to make a game anyway, and now with far fewer consumers able to buy it, and the ones who CAN buy will be angry and won't buy, because the game runs worse than it should and doesn't seem to offer that much change on the graphical side to justify it.

When you guys want to know what happened to this industry, just take a look at this post. The lengths people go to to justify the unjustifiable are the root of all evil in the shit show the video game industry has become.

Far fewer?
Have you seen the Steam Survey?
Like 90% of PCs on Steam are DXR capable.
The 10% that aren't likely wouldn't meet min spec anyway and/or would be too poor to afford a 70-dollar game, cuz a 70-dollar game would be halfway to them getting a DXR GPU.

RTX 3060s easily run this game above 60 using Ultra Nightmare settings... can't imagine how fast the thing goes using medium or high settings.



And yes, iterating on baked lighting adds an incredible amount of time, data, and compression time to a game's development, especially for large games.

If you are making a static corridor shooter, you could easily get away with baking, but a dynamic, large open world which also has indoor areas or large swaths of space adds a lot more work/time to the game.
The level design could be finished, gameplay systems on lock, and now you are just waiting for lighting artists to finish what they are doing... oh, the baked lighting makes an area not so fun to play cuz it's too dark? Well, iterate again... and bake again. Rinse and repeat till we get there.

Or use real-time GI: lighting artists, environment artists, and texture artists can work in the same repo without waiting for each other to upload the latest build.
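
Here's a rough back-of-the-envelope sketch of why that iteration loop hurts (Python; every number in it is made up purely for illustration, not from id or any real production):

```python
# Back-of-the-envelope model of artist time lost waiting on lighting re-bakes.
# All numbers are hypothetical; the point is that the cost scales with
# iterations x levels, while real-time GI drops the per-iteration wait to ~zero.

BAKE_HOURS_PER_ITERATION = 8   # hypothetical render-farm bake per lighting pass
ITERATIONS_PER_LEVEL = 25      # hypothetical "too dark, tweak, bake again" cycles
LEVEL_COUNT = 20

def wait_hours(bake_hours_per_iteration: float) -> float:
    """Total hours spent waiting on bakes across the whole project."""
    return bake_hours_per_iteration * ITERATIONS_PER_LEVEL * LEVEL_COUNT

print(f"Baked workflow: {wait_hours(BAKE_HOURS_PER_ITERATION):,.0f} hours waiting on bakes")
print(f"Real-time GI:   {wait_hours(0):,.0f} hours waiting on bakes")
```

Scale those made-up numbers however you like - the baked column grows with every extra level and every extra tweak, and the real-time column doesn't.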


Do you guys also complain about DX9.0c features being used in video games? Would you like devs to go back to pre-DX9.0c techniques so that more people can play the game?


You guys are fighting for a team that doesn't actually exist... you are complaining about a nonexistent problem simply because complaining is what gamers do these days.

Forward-looking techniques in PC gaming have been a thing for as long as PC gaming has been a thing.
People who get left behind don't actually cry foul; they upgrade their machines or buy a Series S.

There is no actual Anti-Forced-Ray-Tracing team... you aren't on that team, no one is on that team... why are you fighting for it?
 
People that were around for Doom 3 will tell you that the difference between Doom 3 and whatever came before at least justified the requirements at the time.

Plus, people who couldn't play Doom 3 at launch were playing at max settings less than 4 years later, at less than half the price, with an 8600 GT. That's not happening anymore; Moore's Law is dead, and it will only get worse.
And we're forgetting Crysis, which didn't run that well on anything at launch and never ran as well as it should have even with later upgrades, because the devs failed to predict industry trends. The only people who want to return to this are true pixel maniacs, who are better off in the demo scene than in a commercial games industry.
 
DF is completely oblivious to the pushback. Comparisons to landmark, generation-defining titles like Half-Life 2 are moot. Those games raised an obvious visual bar and justified their new tech baseline.

Doom, on the other hand, looks no better - if not worse - next to its predecessor, hence the outcry. Why do I need bigger hardware to run this at lower fidelity?

This is the crux of the matter: it's not the tech - it's the final perception.

Anyone who thinks Dark Ages looks the same as or worse than Eternal needs help.





 
And we're forgetting Crysis, which didn't run that well on anything at launch and never ran as well as it should have even with later upgrades, because the devs failed to predict industry trends. The only people who want to return to this are true pixel maniacs, who are better off in the demo scene than in a commercial games industry.

Do you think the industry trend of RayTracing is wrong?

Crysis expected faster and faster CPUs, instead of (at first) slower ones with more cores/threads.
Sure, simple mistake to make.

But do you really think the PS6 is gonna, like, cut down on RT and ML support?
You think Nvidia, AMD, and Intel aren't gonna be leveraging more and more RT and ML support in their next-gen cards?


This isn't a single- vs multi-thread bet... the RT/ML future is all but guaranteed. Devs have given you 8 years to get ready for moments like these, with no change in trends in sight - not preparing would be an oversight on your part.
If you didn't think more and more devs were going to start using RT features to help with development, you are a fool.

 
Too many people refuse to let go of their own GTX 1060 and RX 580. My friend was like this until I gave him my old RTX 3060 at a discount because he was crying about not being able to play FF16.
 
Do you think the industry trend of RayTracing is wrong?

Crysis expected faster and faster CPUs, instead of (at first) slower ones with more cores/threads.
Sure, simple mistake to make.

But do you really think the PS6 is gonna, like, cut down on RT and ML support?
You think Nvidia, AMD, and Intel aren't gonna be leveraging more and more RT and ML support in their next-gen cards?


This isn't a single- vs multi-thread bet... the RT/ML future is all but guaranteed. Devs have given you 8 years to get ready for moments like these, with no change in trends in sight - not preparing would be an oversight on your part.
If you didn't think more and more devs were going to start using RT features to help with development, you are a fool.
There is nowhere near the same amount of skin in the game. Corporations have to invest heavily in the production of these games, while a consumer can just hold off on hardware upgrades and wait to see if they care at all. It is totally up to the corporations to convince the wider player base that the content is really so good that the pain of the high-end hardware upgrades is worth it.

The problem for the industry is not whether real-time ray tracing can work if the investment is high enough, but whether the market can be disrupted by games that are more compelling to users on more accessible hardware and that don't take advantage of it. I suppose it is possible that real-time ray tracing can be disruptive tech in that it makes development less annoying, but it shifts hardware requirements from the developer to the consumer, and it remains to be seen whether the majority of consumers really see the image quality changes as essential to their enjoyment of games. It is a similar problem for client-based AI: whether people really see paying more often for very expensive GPUs for upscaling, "fake frames", and future buzzwords as a killer feature.
 
There is nowhere near the same amount of skin in the game. Corporations have to invest heavily in the production of these games, while a consumer can just hold off on hardware upgrades and wait to see if they care at all. It is totally up to the corporations to convince the wider player base that the content is really so good that the pain of the high-end hardware upgrades is worth it.

The problem for the industry is not whether real-time ray tracing can work if the investment is high enough, but whether the market can be disrupted by games that are more compelling to users on more accessible hardware and that don't take advantage of it. I suppose it is possible that real-time ray tracing can be disruptive tech in that it makes development less annoying, but it shifts hardware requirements from the developer to the consumer, and it remains to be seen whether the majority of consumers really see the image quality changes as essential to their enjoyment of games. It is a similar problem for client-based AI: whether people really see paying more often for very expensive GPUs for upscaling, "fake frames", and future buzzwords as a killer feature.

I don't know anyone with an RTX card who doesn't use DLSS or DLAA when the option presents itself.

As for RT, mate, what the fuck graphics card are you using that isn't DXR capable? You've already made the investment into the tech even if NOT all devs are actually using it.
You are already in, which is why I say there is no Anti-Ray-Tracing team, because someone without a DXR-capable device shouldn't even be in the AAA game conversation.
 
This is the same PC lot who bang on about how they miss Crysis-style jumps, where you needed a new GFX card that supported DirectX 10, but now take issue with Doom?
 
I don't know anyone with an RTX card who doesn't use DLSS or DLAA when the option presents itself.

As for RT, mate, what the fuck graphics card are you using that isn't DXR capable? You've already made the investment into the tech even if NOT all devs are actually using it.
You are already in, which is why I say there is no Anti-Ray-Tracing team, because someone without a DXR-capable device shouldn't even be in the AAA game conversation.
This is a different question. With a franchise like Doom you are not really aiming only at people who are generic "full-price-game-a-month AAA enjoyers". You might be aiming at people whose main gaming years were the 90s and who don't really give a shit anymore because they have a life. You might be aiming at some gym bro who was stoked by Doom 2016 and otherwise only plays EA FC or something. The "AAA conversation" is supposed to be as wide a market as possible because they need to make big numbers.
 
For me: 2080 Ti, 3090, 4090, 5090… I'm all in for RT.

Mine is similar and I still have most of mine.

2080ti, 3090, 3090ti, 4090, 5090

I still own 1080ti, 2080ti, 3090ti, 4090, 5090

I really should do something with the 1080ti and 2080ti but they're only worth a few hundred combined.

My 3090ti needs to go in my flight simulator

My 4090 is in my driving simulator

I sold my 3090

5090 is in my main pc.
 
It absolutely does not. I have a 6750 XT and it runs like dog shit at 1080p with a 7800X3D. Can't even maintain 60fps no matter the settings.
I have a 6700 XT and it runs smooth as butter with a 5600X at 1440p with FSR, at around 70 fps, never going below 60, ever... You sure as hell have a lot of junk in your Windows installation; get rid of all of it.
 
The kicker is I think Doom Eternal looks better. I don't remember thinking Far Cry looked better than Crysis. It's the performance loss relative to the fidelity gained which is different now. You can't compare to the 90s and early 2000s.
 
Not really bothered about forcing RT. It was going to happen eventually, as is what happens every so often with PC.

Just wish the performance was there without the need of up-scalers and frame generation.
 
Most people agree? We have the dev literally saying that the game couldn't have been made without RTGI, because the levels are too big (5-10x larger) and too dynamic to be done without RT. At least not with the resources they had on hand.

Timestamped:


People are just salty that their crusade against RT might end with them eating crow because of how beneficial it actually is - they'd be forced to recognize it. At least that's the case for some of them.

I am against RT in games only when it's done as a gimmick, because making it optional and shipping a half-assed "solution" is what gave it the bad reputation it has among gamers. The way Doom implements it is the right way and should be mimicked by everyone.
 
I asked ChatGPT to calculate how many RT and non-RT users are on Steam:
According to the Steam Hardware & Software Survey for April 2025, about 55% of Steam users have graphics cards that support hardware Ray Tracing. This means that approximately 45% of players are using GPUs without Ray Tracing support.
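
FWIW, the split itself is trivial to sanity-check without ChatGPT once you have the survey share (sketch below; the total user count is a made-up placeholder, since Valve doesn't publish an absolute number alongside the survey):

```python
# Sanity check of the RT vs non-RT split. The 55% RT-capable share is the
# figure quoted above; TOTAL_USERS is a hypothetical placeholder, not Valve data.
TOTAL_USERS = 100_000_000
RT_SHARE = 0.55

rt_users = round(TOTAL_USERS * RT_SHARE)
non_rt_users = TOTAL_USERS - rt_users
print(f"RT-capable users: {rt_users:,} ({RT_SHARE:.0%})")
print(f"Non-RT users:     {non_rt_users:,} ({1 - RT_SHARE:.0%})")
```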
 