DF: Doom: The Dark Ages "Forced RT" Backlash - Is It Actually Justified?

the actual real problem is that most budget GPUs perform similarly to, or worse than, the PS5
people are just so used to doubling or tripling PS4 performance with incredibly cheap GPUs

from a certain perspective, the game is just fine. the PS4 ran Doom Eternal at 1080p 60 fps. the PS5 runs Doom: The Dark Ages at 1080p 60 fps (but reconstructed to 4K, which is roughly 30% more demanding and looks much better)

did anyone complain about Doom Eternal running at 1080p 60 fps on PS4? or would people have complained that it should have stuck to PS3-level graphics instead of PS4-level graphics to hit 1080p 120 fps, for example? no, I don't think so... that is why console users are mostly happy with what they get. however, 1060 users were able to get 120 fps in Doom Eternal. now they are on GPUs such as the RTX 4060 or RX 6700 XT and they cannot even get 1080p 60 fps at medium settings without upscaling

people should just get used to getting console-like performance with console-like hardware
 
They are right. Back in the day, PC gaming was for enthusiasts. He brought up Half-Life 2 and Crysis, and it made us go, ok, we need to upgrade or build a PC. Now the PC space is filled with cheap bastards who buy shitty 8GB GPUs while refusing to upgrade their CPUs, ganging up on guys like Todd Howard who dare suggest that maybe you should move on from your 6-year-old CPUs.

PC Master Race was annoying, but this new PC Peasant Race is the worst. Ragging on devs who are embracing tech that's actually 7 years old, so not even new anymore. Shitting on devs who try to use the very tech in the GPUs they paid for. They aren't just cheap, they are not gamers. They are posers who want to keep gaming stuck in the 50s. I look at the Steam top ten and am blown away by how many shitty dated fucking games are on there. They are barely better than mobile gamers. Go jerk off to Genshin Impact and leave hardcore gaming to hardcore gamers, thank you.
 
Who cares? Play video games and have fun. This is just the usual daily reminder of why I maintain that gamers have ruined gaming. Imagine caring what people think about lighting and reflections in games that already look amazing as is. This is the segment of the community at large that is embarrassing. Bring on the hate.

On Fire GIF by DefyTV

Exactly right. That embarrassing segment has even sucked the joy out of being fascinated by the tech. I used to love a good technical breakdown to teach me about what was causing my strange tech tingles.
 
They are right. Back in the day, PC gaming was for enthusiasts. He brought up Half-Life 2 and Crysis, and it made us go, ok, we need to upgrade or build a PC. Now the PC space is filled with cheap bastards who buy shitty 8GB GPUs while refusing to upgrade their CPUs, ganging up on guys like Todd Howard who dare suggest that maybe you should move on from your 6-year-old CPUs.

PC Master Race was annoying, but this new PC Peasant Race is the worst. Ragging on devs who are embracing tech that's actually 7 years old, so not even new anymore. Shitting on devs who try to use the very tech in the GPUs they paid for. They aren't just cheap, they are not gamers. They are posers who want to keep gaming stuck in the 50s. I look at the Steam top ten and am blown away by how many shitty dated fucking games are on there. They are barely better than mobile gamers. Go jerk off to Genshin Impact and leave hardcore gaming to hardcore gamers, thank you.
Back in the day, upgrading hardware didn't mean financing a car. Cost of living was lower across the board, and people were making roughly the same money as they are now. In some instances, even more.

Envelope-pushing games like Crysis and the like were outliers. Quake 3 games ran on everything. Unreal games ran on everything. Half-Life 2 (Source) games ran on everything in short order.

But I do partially blame the Fortnite, etc. boom bringing normie$ into the medium 100-fold. And the endless quest to answer to shareholders' pockets.
 
The problem is the "forced" part. All that id had to do was give the user an option to turn it on/off. Just like all other games.

And the question is not whether we have had RT hardware for 7 years already. The problem is that RT causes huge performance drops.
For example, Doom Eternal on a 4080 can do 175 fps at native 4K.
But in Doom: The Dark Ages, the same 4080 can only do 51 fps.
Losing roughly 3.4x the performance for marginal gains in image quality is not a good trade-off.
But if the option were there to turn it on/off, there would be no problem, as each person could choose what they wanted.
And this is the crux of the matter: choice.

[Attached benchmark charts: doom-eternal-3840-2160 and performance-3840-2160]
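For reference, here is the back-of-the-envelope math behind those figures as a minimal Python sketch; the 175 fps and 51 fps numbers are simply the ones quoted in the post above, not independently verified.

```python
# Rough math on the figures quoted above (175 fps vs 51 fps on a 4080 at native 4K).
# Purely illustrative; the fps numbers come from the post, not from fresh benchmarks.

eternal_fps = 175.0
dark_ages_fps = 51.0

ratio = eternal_fps / dark_ages_fps        # ~3.4x fewer frames per second
eternal_ms = 1000.0 / eternal_fps          # ~5.7 ms per frame
dark_ages_ms = 1000.0 / dark_ages_fps      # ~19.6 ms per frame

print(f"fps ratio: {ratio:.1f}x")
print(f"frame time: {eternal_ms:.1f} ms -> {dark_ages_ms:.1f} ms "
      f"(+{dark_ages_ms - eternal_ms:.1f} ms per frame)")
```

In frame-time terms the gap is roughly 5.7 ms versus 19.6 ms per frame, which is where the ~3.4x figure above comes from.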
Dude, one is a last-gen game designed around a 1.8 TFLOPS GPU while the other is designed around 10 TFLOPS. Of course it's going to be heavier. It's got way bigger environments, way more enemies, destruction physics, hair strands, and a bunch of other next-gen features. RT is only a small part of it.

KCD2 I have to run at 4K DLSS Quality to Balanced to get 60 fps maxed out. Sometimes all the way down to Performance in the heavy areas like the big city. The first game I can run at native 4K 60 fps without dips. Should I be upset at them for pushing visuals?
 
Dude, one is a last-gen game designed around a 1.8 TFLOPS GPU while the other is designed around 10 TFLOPS. Of course it's going to be heavier. It's got way bigger environments, way more enemies, destruction physics, hair strands, and a bunch of other next-gen features. RT is only a small part of it.

KCD2 I have to run at 4K DLSS Quality to Balanced to get 60 fps maxed out. Sometimes all the way down to Performance in the heavy areas like the big city. The first game I can run at native 4K 60 fps without dips. Should I be upset at them for pushing visuals?

Yes, because we can turn down a lot of settings to improve performance. Except RT.
 
They are right. Back in the day, PC gaming was for enthusiasts
They're not right.
Radeon 9800 Pro - $400, high-end card
Radeon X800 XL - $300, high-end
8800 GT - $350, high-end card
Lower cards cost $100-200.
If those were the prices today, people could upgrade easily.
 
Back in the day, upgrading hardware didn't mean financing a car. Cost of living was lower across the board, and people were making roughly the same money as they are now. In some instances, even more.

Envelope-pushing games like Crysis and the like were outliers. Quake 3 games ran on everything. Unreal games ran on everything. Half-Life 2 (Source) games ran on everything in short order.

But I do partially blame the Fortnite, etc. boom bringing normie$ into the medium 100-fold. And the endless quest to answer to shareholders' pockets.
I don't know, man. I was working a minimum wage job as a cashier in high school when HL2 was revealed, and this was back when minimum wage was $5.80 and nobody was paying $12-15 like they do now at Walmart and McDonald's. The best I got was working on Sundays and getting time and a half. I had to save maybe 2 months of paychecks to build a $1,000 PC as a high school student working part time.

You can easily build a capable $1,000 PC today. Alex shows that a shitty $300 4060 can max out this game at 1440p. Just turn down some settings to High and you have a 60 fps experience.

I read that something crazy like 80 million OnlyFans subscribers are men. That's like half the men in the U.S. Maybe stop giving your money to whores and buy a decent graphics card. A 5070 can be bought for $549 today in stores and it's easily 2x more powerful than the PS5. That's only $200 more than the GTX 570 in my second PC from 2011, a build that also cost $1,000 total. You don't have to spend $1,000+ on a GPU to get good performance.
 
I don't know, man. I was working a minimum wage job as a cashier in high school when HL2 was revealed, and this was back when minimum wage was $5.80 and nobody was paying $12-15 like they do now at Walmart and McDonald's. The best I got was working on Sundays and getting time and a half. I had to save maybe 2 months of paychecks to build a $1,000 PC as a high school student working part time.

You can easily build a capable $1,000 PC today. Alex shows that a shitty $300 4060 can max out this game at 1440p. Just turn down some settings to High and you have a 60 fps experience.

I read that something crazy like 80 million OnlyFans subscribers are men. That's like half the men in the U.S. Maybe stop giving your money to whores and buy a decent graphics card. A 5070 can be bought for $549 today in stores and it's easily 2x more powerful than the PS5. That's only $200 more than the GTX 570 in my second PC from 2011, a build that also cost $1,000 total. You don't have to spend $1,000+ on a GPU to get good performance.
Inflation and the cost of living have ballooned far past people's pay grades now on average. Minimum wage, etc., has not caught up like it used to.

It used to scale smoothly; now it has shot straight through the damned stratosphere. Even more so knowing that companies can just leverage more debt onto the consumer with easy-access financing programs (Klarna, Affirm, etc.), giving them the ability to charge three times or more for their products.

In 2006 the 8800 GTX was $599:

So ~$1,000 GPUs are not exactly new.
Yes, inflation went way, way up. Most minimum wages and pay grades did not follow like they used to. Thus people feel the pressure of being worse off.
 
They're not right.
Radeon 9800 Pro - $400, high-end card
Radeon X800 XL - $300, high-end
8800 GT - $350, high-end card
Lower cards cost $100-200.
If those were the prices today, people could upgrade easily.

In 2006 the 8800 GTX was $599:

$953.16 in Apr. 2025 equals $599 of buying power in 2006 (Average).

So ~$1,000 GPUs are not exactly new.
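For anyone who wants to sanity-check that conversion, here is a minimal Python sketch of the same inflation math. The ~1.59 factor is simply implied by the $599 to $953.16 figures quoted above (it is not an official CPI series), and the other launch prices are the ones posters gave earlier in the thread, from different years, so treat the results as rough.

```python
# Back-of-the-envelope inflation check for the 8800 GTX figure above.
# The ~1.59 cumulative factor is implied by the $599 -> $953.16 numbers quoted
# in the post; it is an assumption here, not an official CPI value.

CPI_FACTOR_2006_TO_APR_2025 = 953.16 / 599.0   # ~1.59

def price_in_2025_dollars(price_2006_usd: float) -> float:
    """Convert an approximate 2006-era launch price into April 2025 dollars."""
    return price_2006_usd * CPI_FACTOR_2006_TO_APR_2025

# Launch prices as the posters stated them (different launch years, so the
# single 2006 factor is only a rough approximation for the older cards).
for name, price in [("8800 GTX", 599), ("Radeon 9800 Pro", 400),
                    ("Radeon X800 XL", 300), ("8800 GT", 350)]:
    print(f"{name}: ${price} then ~= ${price_in_2025_dollars(price):.0f} today")
```

Under that factor the $400, $300, and $350 cards land at roughly $640, $480, and $560 in today's money, though the real adjustment would differ card by card since they launched in different years.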
 
Does not mean that everybody wants it on and takes the performance hit

that's like saying "not everyone wants to turn on physically based rendering and take the performance hit", or "not everyone wants to turn on parallax occlusion mapping and take the performance hit"

how exactly is raytracing different here? hint: it's not.
 
16xx users can play and the game runs fine for them


in terms of driver support, technically the 16xx and 10xx both support raytracing. but sadly, support on the game side has been declining.
you can play nearly all early "RTX" titles on the GTX 10xx series with RT enabled. on a 1080 Ti you even get performance out of them that almost rivals current-gen consoles.

someone tested Watch Dogs 3 on a 1080 Ti, and it legit beats the PS5/Series X with RT enabled.
 
I dislike forced RT cause it should always be an option. While I have the hardware for it, not everybody does.

PC gaming was always known for the customisation of settings to cater to your liking. Don't force shit on anybody
it also doesn't have software rendering for your dad's 733 MHz Pentium 3

personally, I want RT forced up everyone's butt so hard that Nvidia/AMD actually release performant RT GPUs.
what we have now sucks.
 
Maybe people who have a negative view of RT should check out the DF interview with the id Tech guy on this. He basically said that using RT instead of baked lighting greatly sped up their development and allows them to adjust the lighting and see in real time how it looks.
Hopefully, now that RT hardware has been around for some time, developers can start to really use it in an artistic and creative way, and maybe even have AI that responds to changes in light and shadow. That would be cool
 
I dislike forced RT cause it should always be an option. While I have the hardware for it, not everybody does.

PC gaming was always known for the customisation of settings to cater to your liking. Don't force shit on anybody
You got too used to those terrible hybrid lighting-with-optional-RT approaches we've gotten since the first RTX cards came out, the ones that gave RT as a whole a very bad look, didn't you?

The only way to properly implement and optimize RT, or any modern technology, is to make it exclusive; otherwise all the resources that should go there end up going to fake rasterized lighting, which is expensive in money and probably in hardware too. That's partly why, imo, Lumen is so shit: Epic wants to bite off more than it can chew with it. The reason RT is necessary is to avoid having to implement unnecessarily expensive, time-consuming and design-limiting static lighting.

People need to upgrade already; GTX cards are for last-gen gaming.
 
I dislike forced RT cause it should always be an option. While I have the hardware for it, not everybody does.

PC gaming was always known for the customisation of settings to cater to your liking. Don't force shit on anybody
I mean, features will be forced if all the hardware that comes out supports them. Arguing that it's too early is one thing, but it's always gonna be forced eventually.
 
The advantages of RT are too good to pass up. From a development standpoint: less disk space wasted on baked lighting, and faster development time from not having to bake lighting, place light sources, or wait for a scene to update when doing the previous two. From a gaming standpoint, RT is simply better and more accurate than prior solutions.

The main negatives so far have been the performance cost and noise. The former is becoming less of a problem as RT techniques and hardware advance, and the latter is being resolved by ML denoising (Ray Reconstruction from Nvidia and Ray Regeneration from AMD/Sony).

As for this specific game, GPUs that can use the technology are almost 7 years old, and you could buy a GPU 2 years ago for $299 that plays the game at 1080p/60 (or 1440p/60 with DLSS).

From the late 90s to the late 00s you were lucky if a 7-year-old GPU could play anything at all; heck, even 2-year-old GPUs could struggle, especially the mid to low range. I really don't remember that many complaints when games forced the usage of hardware T&L or programmable shaders.
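To make the development-side argument a bit more concrete, here is a deliberately toy Python sketch of the two workflows. Everything in it (the texel grid, the stand-in GI function) is invented for illustration and has nothing to do with id Tech's actual pipeline; the point is only that baking pays its cost, and its disk footprint, up front and again after every lighting edit, while a real-time query pays a bounded cost per frame instead.

```python
# Toy illustration of baked lighting vs a real-time lighting query.
# Conceptual sketch only; not any engine's real pipeline.

def solve_gi(texel):
    """Stand-in for an expensive global-illumination solve for one texel."""
    x, y = texel
    return (x * 31 + y * 17) % 255 / 255.0   # pretend this took real compute time

# Baked workflow: precompute the whole lightmap, ship it on disk,
# and rebake after every lighting edit.
def bake_lightmap(texels):
    return {t: solve_gi(t) for t in texels}

# Real-time workflow: nothing precomputed or stored; evaluate when the frame is drawn.
def shade_realtime(texel):
    return solve_gi(texel)

level = [(x, y) for x in range(256) for y in range(256)]   # 65,536 texels
lightmap = bake_lightmap(level)                            # data that ends up on disk
print(f"baked: {len(lightmap)} texels stored")
print(f"real-time sample at (3, 7): {shade_realtime((3, 7)):.3f}")
```

That is roughly the point the thread keeps circling: no rebake step on every lighting edit, and no lightmap data shipped on disk.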
 
I can get the game to run and look better than DF's Series X capture, at 1080p DLSS Quality.
Just gotta tinker with the settings.
All of this on a 2060 Super. I am going to upgrade my GPU soon though.
 
I remember when the first patch for GTA3 on PC arrived, which removed the flying newspapers because a lot of users had huge performance problems and couldn't hold the framerate
 
I dislike forced RT cause it should always be an option. While I have the hardware for it, not everybody does.

PC gaming was always known for the customisation of settings to cater to your liking. Don't force shit on anybody

I take it you are new to PC gaming then?
Cuz hardware used to be made obsolete on an almost generational basis.
Your PC would still feel new when a game came out that required a feature set you didn't have.

If anything, gamers these days have it good, cuz generally your PC will meet min spec; there are what, 5 games that absolutely need DXR?

The latest and greatest AAA games would/should always use the latest feature sets available.




RTGI is a net good cuz it lets devs, err, dev faster: they don't have to bake lights, and they also don't have to store lighting information on disk, so that's a boon for us.
Using pure probes would be inefficient, and expecting artists to hand-place them for large worlds is a joke.

When textures got better, devs didn't need to bake lighting into textures; it made development better/easier/quicker and, as a byproduct, games looked better.

Over the years a lot of techniques have come around and used new feature sets; they used to be lauded and applauded for being forward-thinking... now gamers complain about new features being implemented, and I'm really lost.


If you see how Naughty Dog does their cutscenes, and I believe photo mode, to make them look that good, you'll understand why RTGI is a net positive, cuz so few studios have the time/talent/budget to hand-place and hand-adjust lights to make the game look right.



P.S. What the fuck GPU are you using that DXR is off the table?

P.P.S. Dark Ages looks leagues better at both the micro and macro scale compared to Doom '16 and even Eternal; I dunno how people can say it doesn't look much better.

P.P.P.S. Weren't people complaining early this gen that the Geometry Engine/mesh shaders weren't being used even though they were a feature of the new hardware, and then a game comes out that requires them and you bitch about it?
Pick a lane... do you want them to use new features or not?
 
I'm talking about the 8800 GT, much cheaper and not much lower perf
[Attached benchmark screenshots]
44 fps at the endgame resolution (for 2007) of 1200p in Bioshock, with zero AA and AF mind you. Adjusting for inflation, the equivalent card today would be the 9070, which clears 50 fps at 4K native in Doom. With FSR4 that goes even higher. Doesn't seem that RT is a problem here.
 
Inflation and the cost of living have ballooned far past people's pay grades now on average. Minimum wage, etc., has not caught up like it used to.

It used to scale smoothly; now it has shot straight through the damned stratosphere. Even more so knowing that companies can just leverage more debt onto the consumer with easy-access financing programs (Klarna, Affirm, etc.), giving them the ability to charge three times or more for their products.

Yes, inflation went way, way up. Most minimum wages and pay grades did not follow like they used to. Thus people feel the pressure of being worse off.
Literally every GPU released since 2020 supports ray tracing though.
 
Problem is… actually good, artist-driven baked GI trashes a forced RT-only renderer almost every time performance-wise, and sometimes artistically too.

Does Doom TDA look good? Yeah, kinda. Does it look two GPU generations and a 100 fps performance loss better than Eternal? I highly doubt that. Plus, soft IQ on all platforms is just a sad thing to see, and many are rightfully asking for a justification of such sacrifices for so little tangible visual gain in a fast-paced shooter.
 
Problem is… actually good, artist-driven baked GI trashes a forced RT-only renderer almost every time performance-wise, and sometimes artistically too.

Does Doom TDA look good? Yeah, kinda. Does it look two GPU generations and a 100 fps performance loss better than Eternal? I highly doubt that. Plus, soft IQ on all platforms is just a sad thing to see, and many are rightfully asking for a justification of such sacrifices for so little tangible visual gain in a fast-paced shooter.
No soft IQ on PC with DLSS4 quality
 
When devs start to lose their jobs, this will change fast and all this shit this imbecile wants to pass will be gone
They are more likely to lose their jobs because it takes 15 years to make a large game, cuz they have to bake all the lighting with every iteration and the lighting data is 100GB by itself.

Not because they lost 4% of gamers because said gamers are too poor to buy a $200 GPU.
Cuz those gamers were likely not your customers anyway.
 
Developers should absolutely invest millions of dollars and engineering years into fallback rasterized lighting so people with pre-2018 hardware get to still feel relevant.
 
No soft IQ on PC with DLSS4 quality
DLSS4 presents a whole slew of artifacts, whereas I can run Eternal at 200+ FPS native with RT on my 4080.

So again, is this tradeoff even worth it? The Dark Ages looks great, but everything I saw in my 2 playthroughs could have been achieved with GI baking and other fast yet conservative methods.

And despite owning a 4080, it's a problem for my 240Hz QHD OLED. I can barely run TDA at 120Hz in decent quality, let alone shoot for 240. RT is a burden for fast-paced action games.
 
For years, on PlayStation or PC, the first thing we have been programmed to do is turn ray tracing off.

You think that might have something to do with why people don't like the fact that you can't turn it off?

"Get with the times, guys," or "don't you own a phone?" works really well on the gaming community.

I don't give a fuck that they want to be the pioneers of saving money on my ass by eliminating a feature.
 
the actual real problem is that most budget GPUs perform similarly to, or worse than, the PS5
people are just so used to doubling or tripling PS4 performance with incredibly cheap GPUs

from a certain perspective, the game is just fine. the PS4 ran Doom Eternal at 1080p 60 fps. the PS5 runs Doom: The Dark Ages at 1080p 60 fps (but reconstructed to 4K, which is roughly 30% more demanding and looks much better)

did anyone complain about Doom Eternal running at 1080p 60 fps on PS4? or would people have complained that it should have stuck to PS3-level graphics instead of PS4-level graphics to hit 1080p 120 fps, for example? no, I don't think so... that is why console users are mostly happy with what they get. however, 1060 users were able to get 120 fps in Doom Eternal. now they are on GPUs such as the RTX 4060 or RX 6700 XT and they cannot even get 1080p 60 fps at medium settings without upscaling

people should just get used to getting console-like performance with console-like hardware

The 6700 XT easily runs the game way better than consoles; mine gives me 1440p with XeSS/FSR and 65-80 fps, without dips below 60. RDNA 2 cards are good for that. A friend of mine beat the game on a 6650 XT with mostly the same story, both of us using High settings.

The game is amazingly optimized; that's what you get when devs focus on optimizing RT. People here asking for a fallback option, or for RT to be optional, don't know how this works on the development side. The sole reason RT has been optional so far is that those were shitty, gimmicky implementations running alongside rasterized lighting, which gave RT a very bad reputation.

The only thing I'll concede is that id Software shouldn't have prevented old cards from running the game; the RT clearly has an automatic software mode, because some people are bypassing the RT verification on the GPU and running the game anyway, just way slower.

Yes guys, it's not that RT is being disabled and the game looks exactly the same; RT is still running, people are just bypassing the RT check, that's all.
 
Yeah. The main reason is it gives more fps, and people can sit on a GPU much longer than 4K users.
That has always been the case with higher resolutions. Most people that buy a 9070 today probably don't game at 1080p, just like most people buying an 8800 GT in 2007 didn't game at 1280x1024.
 
The 6700 XT easily runs the game way better than consoles; mine gives me 1440p with XeSS/FSR and 65-80 fps, without dips below 60. RDNA 2 cards are good for that. A friend of mine beat the game on a 6650 XT with mostly the same story, both of us using High settings.

The game is amazingly optimized; that's what you get when devs focus on optimizing RT. People here asking for a fallback option, or for RT to be optional, don't know how this works on the development side. The sole reason RT has been optional so far is that those were shitty, gimmicky implementations running alongside rasterized lighting, which gave RT a very bad reputation.

The only thing I'll concede is that id Software shouldn't have prevented old cards from running the game; the RT clearly has an automatic software mode, because some people are bypassing the RT verification on the GPU and running the game anyway, just way slower.

Yes guys, it's not that RT is being disabled and the game looks exactly the same; RT is still running, people are just bypassing the RT check, that's all.
That's way worse, actually. They are arbitrarily preventing non-RT cards from running the game?

Are they getting paid by Nvidia for this? If not, then why are they creating this artificial barrier to playing their game?
 
The advantages of RT are too good to pass up. From a development standpoint: less disk space wasted on baked lighting, and faster development time from not having to bake lighting, place light sources, or wait for a scene to update when doing the previous two. From a gaming standpoint, RT is simply better and more accurate than prior solutions.

The main negatives so far have been the performance cost and noise. The former is becoming less of a problem as RT techniques and hardware advance, and the latter is being resolved by ML denoising (Ray Reconstruction from Nvidia and Ray Regeneration from AMD/Sony).

As for this specific game, GPUs that can use the technology are almost 7 years old, and you could buy a GPU 2 years ago for $299 that plays the game at 1080p/60 (or 1440p/60 with DLSS).

From the late 90s to the late 00s you were lucky if a 7-year-old GPU could play anything at all; heck, even 2-year-old GPUs could struggle, especially the mid to low range. I really don't remember that many complaints when games forced the usage of hardware T&L or programmable shaders.
Quick clarification: they can still place light sources even with RTGI. And most still do, for cinematic reasons and in case light isn't illuminating an area like it should. It's just, like you said, completely real-time, and they can see the results immediately.
 