DF: Doom: The Dark Ages "Forced RT" Backlash - Is It Actually Justified?

Which cards on the Steam Survey do you think would really benefit from this game NOT having RT.
And those gamers would have actually bought Dark Ages and would have played it.

Cuz that 1650, 1050 or 1060 guy was never gonna be able to play this game with or without RT.

The 1080 guys make up 0.sum%.
And don't even talk to me about the 970 guys.

So those lost sales would be a rounding error.

Their argument was that Doom 2016 and Eternal ran fine.

But yeah, I doubt it would have been a good experience without RT anyway.
 
We are talking about 8-year-old GPUs here... you can get a used 160€ GPU that will outperform them, and it'll probably be in better shape as well.

If you have a non-RT Nvidia card, the best you'll have is a 1080 Ti. Buy a used 2070 and you'll have a GPU that can play anything modern.
But if you're someone who bought a 1080 Ti, you've probably upgraded a long time ago.

So we are probably mainly talking about people here with a 1070 or below, at which point even a 2060 Super would be an upgrade, and one can be had for 140€ used.

I don't think a 140€ upgrade after 8 years, to be able to play all modern games, is bad.
It's good that you're pointing to used cards, because good luck trying to find new ones that outpace the old ones at the same price. Like I said, Moore's Law is dead; there's no such thing as advancing graphical technology at any cost anymore. Devs have to, and will, work with what they have, either by learning this or by force when people don't buy their games.
 
4.) Shader Model 3 removed limitations on shader instructions, and so saved optimization time, as per 3.)
But it increased complexity; suddenly devs had to account for things that weren't there before, they could do more effects, and so on.

That's not what RT does.
 
Which cards on the Steam Survey do you think would really benefit from this game NOT having RT.
And those gamers would have actually bought Dark Ages and would have played it.

Cuz that 1650, 1050 or 1060 guy was never gonna be able to play this game with or without RT.

The 1080 guys make up 0.sum%.
And don't even talk to me about the 970 guys.

So those lost sales would be a rounding error.
I'm talking about the types of people they might need to reach for a game like this to be a hit. The peak on Steam was 32k, so that is how much Steam hardware survey data is worth as an argument. Could they have made the game differently? Overall, not my problem. Furthermore, I don't give a fuck.
 
Forced difficulty setting in any From Software title: players raise their flame shields and defend it in a battle to the death because "iTs tHeIr viSiOn" and you should "giT gUd".
Forced graphics settings in two games: players be mad and like "yOuR gAme sHoUld bE aCCessiBle tO AnYoNe's hArdWarE".
 
Forced difficulty setting in any From Software title: players raise their flame shields and defend it in a battle to the death because "iTs tHeIr viSiOn" and you should "giT gUd".
Forced graphics settings in two games: players be mad and like "yOuR gAme sHoUld bE aCCessiBle tO AnYoNe's hArdWarE".
[Dog reaction GIF]
 
I'm talking about the types of people they might need to reach for a game like this to be a hit. The peak on Steam was 32k, so that is how much Steam hardware survey data is worth as an argument. Could they have made the game differently? Overall, not my problem. Furthermore, I don't give a fuck.

Those GTX 10/16/900 series users aren't moving the needle with this game.
It would be a rounding error with or without their peasant-class GPUs.
 
But it increased complexity; suddenly devs had to account for things that weren't there before, they could do more effects, and so on.

That's not what RT does.
Developers didn't have to do any of that. They could have made the same quality water shader, say, that they would have made with SM 2.0, but in less time. The point is that, for a given complexity level, you can decide either to reinvest the time saved by new technology, or you can decide not to. If artists don't have to wait for their assets to be properly lit, then they can spend more time on those assets and increase their complexity. Or they can simply finish their work sooner. Equally, maybe using SM3.0 as the minimum simply means not spending a bunch of time making sure everything works in SM2.0.
 
You can have lighting in the game or not. You want an option like that?
Bro, they made this shit work on a POS Series S. They could've given an option for people who don't have RT cards or don't want to take the fps hit. It's not hard to understand why some people would prefer having the option.
 
The complaints are probably more because the game looks basically the same as Doom 2016 and Eternal but requires far more powerful hardware. That's why it's difficult to be sold on this technology right now.
This! It barely looks any different than Eternal! And they made it work on a significantly weaker Series S so it's clearly doable on weaker hardware but they just gave no fuck about offering people options!
 
Developers didn't have to do any of that. They could have made the same quality water shader, say, that they would have made with SM 2.0, but in less time. The point is that, for a given complexity level, you can decide either to reinvest the time saved by new technology, or you can decide not to. If artists don't have to wait for their assets to be properly lit, then they can spend more time on those assets and increase their complexity. Or they can simply finish their work sooner. Equally, maybe using SM3.0 as the minimum simply means not spending a bunch of time making sure everything works in SM2.0.
So why is there a big difference between SM3 and SM2 games that people don't see between RT and raster?
 
This! It barely looks any different than Eternal! And they made it work on a significantly weaker Series S so it's clearly doable on weaker hardware but they just gave no fuck about offering people options!

How in fuck's name is this a thing? TDA looks way better and has way more gameplay and much larger worlds. Keep malding. This isn't any better than the steam forums on FF7 rebirth complaining about mesh shaders.
 
In PS6 generation we will see this thread all over again.

"Why are they forcing PATH TRACING in these new games? RTGI is all we need! Lazy greedy devs!"
 
Series S has hardware RT support.
Yeah, my bad, I checked now and it's the equivalent of an RTX 2060. Still, it's weak as fuck and weaker than a GTX 1660 or 1070 Ti, for example, so I can see why it stings for such players, especially in today's age where new GPUs are priced on a "highway robbery" basis and games like this basically force players to get new hardware even though the GPUs they have are better than a Series S, which the game works on.

Can you blame people for being mad or disappointed when the industry is fucking them over!?
 
How in fuck's name is this a thing? TDA looks way better and has way more gameplay and much larger worlds. Keep malding. This isn't any better than the steam forums on FF7 rebirth complaining about mesh shaders.
It's a thing when you have functioning eyeballs and you don't sit 50 feet from your tv. You should try it!
 
Yeah, my bad, I checked now and it's the equivalent of an RTX 2060. Still, it's weak as fuck and weaker than a GTX 1660 or 1070 Ti, for example, so I can see why it stings for such players, especially in today's age where new GPUs are priced on a "highway robbery" basis and games like this basically force players to get new hardware even though the GPUs they have are better than a Series S, which the game works on.

Can you blame people for being mad or disappointed when the industry is fucking them over!?

People who are still on Pascal cards and RX 5000 series are fucking themselves over.

Outside of RT, mesh shaders are starting to be required after being present in DX12 Ultimate for many, many years.

You have an old GPU? Don't whine when new games don't work on it. Same story with people who want everything to work on 8GB of VRAM...
 
People who are still on Pascal cards and RX 5000 series are fucking themselves over.

Outside of RT, mesh shaders are starting to be required after being present in DX12 Ultimate for many, many years.

You have an old GPU? Don't whine when new games don't work on it. Same story with people who want everything to work on 8GB of VRAM...
The funny thing is that Doom works just fine on an 8GB card. This is like the worst game they could've chosen to bitch about. It literally does 1440p 60 fps at high settings on an 8GB 4060.

I'm easily able to run it at 4K 60 fps with DLSS Quality on my 10GB 3080. I only had to drop it to Balanced in the 7th level, which is completely open world. Even the flying level didn't do anything to my GPU. Hell, I wish they had pushed the visuals more.
 
People who are still on Pascal cards and RX 5000 series are fucking themselves over.

Outside of RT, mesh shaders are starting to be required after being present in DX12 Ultimate for many, many years.

You have an old GPU? Don't whine when new games don't work on it. Same story with people who want everything to work on 8GB of VRAM...
Eh, I feel you're being a bit too hard on this for no reason.

Not everyone has the means nowadays to get a newer card. The whole point of a PC is that it offers options.

I personally have a 4070 so I'm not fazed by this stuff yet, but if high VRAM and hardware requirements keep going up, PC gaming will become a luxury hobby. Is that something you'd want?

I'm more defensive about this because I can see how a person who doesn't have the means to upgrade would feel.

My best friend still has a GTX card, and while he managed to play and finish Clair Obscur at 30 fps, a game that came out two weeks prior and runs on UE5, "somehow" he can't even start Doom: The Dark Ages, a game built by a prolific team of top devs whose previous games from just a couple of years back he could run at 100 fps, and now he can't even launch the new one at low settings.

That's pretty fucked up in my book, but whatever.
 
This! It barely looks any different than Eternal! And they made it work on a significantly weaker Series S so it's clearly doable on weaker hardware but they just gave no fuck about offering people options!

? You can play it at Series S settings on GPUs of the same performance level.

What exactly else do you want?
 
The funny thing is that Doom works just fine on an 8GB card. This is like the worst game they could've chosen to bitch about. It literally does 1440p 60 fps at high settings on an 8GB 4060.

I'm easily able to run it at 4K 60 fps with DLSS Quality on my 10GB 3080. I only had to drop it to Balanced in the 7th level, which is completely open world. Even the flying level didn't do anything to my GPU. Hell, I wish they had pushed the visuals more.

Yeah, Doom is not exactly very demanding, and just like Indiana Jones it runs well on console-level RT hardware and xx60-class GPUs. People think "ray tracing" = very demanding, but that's not the case for many games (like the Resident Evil titles).

Eh, I feel you're being a bit too hard on this for no reason.

Not everyone has the means nowadays to get a newer card. The whole point of a PC is that it offers options.

I personally have a 4070 so I'm not fazed by this stuff yet, but if high VRAM and hardware requirements keep going up, PC gaming will become a luxury hobby. Is that something you'd want?

I'm more defensive about this because I can see how a person who doesn't have the means to upgrade would feel.

My best friend still has a GTX card, and while he managed to play and finish Clair Obscur at 30 fps, a game that came out two weeks prior and runs on UE5, "somehow" he can't even start Doom: The Dark Ages, a game built by a prolific team of top devs whose previous games from just a couple of years back he could run at 100 fps, and now he can't even launch the new one at low settings.

That's pretty fucked up in my book, but whatever.

Pascal cards are from 2016; 9 years is enough time... I had to change GPUs far more often in the past (DX9, DX11, etc.).

You can't play the Demon's Souls remake on a PS4, and the PS4 was 7 years old when that game released on PS5.
 
This is not the hardware manufacturers; this is Moore's Law ending.

If there is a physical barrier that stops prices from decreasing, it's the devs who need to understand this.

The thing is they don't care; they live in their own reality and only start to care when their jobs are at risk.

That talk is false, so much so that Jensen himself stopped saying it and now says that GPUs for AI have improved with each iteration.

What happens is that with each generation, Nvidia uses less and less of the main chip to make mid-range cards. If it had kept the same cut-down ratio as before, the improvement would be much greater. However, by saving on silicon, it can dedicate many more production lines to AI chips.

AMD even improved with RDNA 4, but it stagnated for about two generations and lost market share.


[Chart: share of the full die's CUDA cores used per GPU tier, by generation]
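To put rough numbers behind that chart, here is a minimal sketch in Python. The chip pairings and CUDA core counts are approximations from public spec listings (my assumptions, not figures from the post), so treat the exact percentages as illustrative only:

```python
# Approximate share of the full flagship die's CUDA cores that each
# xx60-class card received, per generation (core counts are approximate).
cards = {
    "GTX 1060 (GP106) vs full GP102": (1280, 3840),
    "RTX 2060 (TU106) vs full TU102": (1920, 4608),
    "RTX 3060 (GA106) vs full GA102": (3584, 10752),
    "RTX 4060 (AD107) vs full AD102": (3072, 18432),
    "RTX 5060 (GB206) vs full GB202": (3840, 24576),
}

for name, (cores, full_die) in cards.items():
    share = 100 * cores / full_die
    print(f"{name}: {share:.1f}% of the flagship die's cores")
```

With these assumed counts, the share falls from roughly a third of the big die in the Pascal/Ampere era to well under 20% in the last two generations, which is the trend the post describes.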
 
That talk is false, so much so that Jensen himself stopped saying it and now says that GPUs for AI have improved with each iteration.

What happens is that with each generation, Nvidia uses less and less of the main chip to make mid-range cards. If it had kept the same cut-down ratio as before, the improvement would be much greater. However, by saving on silicon, it can dedicate many more production lines to AI chips.

AMD even improved with RDNA 4, but it stagnated for about two generations and lost market share.


[Chart: share of the full die's CUDA cores used per GPU tier, by generation]
No, it's not; I am not talking about Huang's talk here.

Moore's Law indeed stopped at 28nm.

 
So why is there a big difference between SM3 and SM2 games that people don't see between RT and raster?
Did Rainbow Six Vegas, which required an SM3.0 card, look notably better than Gears of War, which used the same engine but didn't? Did Splinter Cell: Double Agent, Hellgate: London and Medal of Honor: Airborne, which were all SM3.0-only titles, look notably better than other games released in 2006/2007?

I would say the "biggest" titles graphically in 2007 were Crysis, Stalker, CoD 4 and BioShock, and of these only BioShock required an SM3.0 card to run.

And in general regarding RT, when all RT effects are enabled, there is a big difference. Just look at Indiana Jones as a recent example. Or, from the footage we've seen, an even better example would be GTA 6. It's just that, with The Dark Ages, id pretty much just used RT to replace the existing baked lighting system, with "full" RT coming in the PC path tracing patch.
 
I think it is legitimate for a developer to say, "we're using this technology, and we require hardware that properly supports it for our game." Is that a bad move for maximizing the potential customer base? Probably. But games in the 90s (and early 2000s) did that shit, too. And RT cards aren't new tech by now. And ray tracing itself is even older than rasterization. It makes certain things significantly easier for developers, while also more accurate. From shit like dynamically blending decals to more known stuff like global illumination or real-time reflections to just making parts of the renderer easier to understand.

The real culprits here are Nvidia and their fucked-up pricing and bullshit marketing.
 
In PS6 generation we will see this thread all over again.

"Why are they forcing PATH TRACING in these new games? RTGI is all we need! Lazy greedy devs!"
Nah, I don't think we will.
The "issue" is that there are people who can't afford a GPU with h/w RT right now so there's some vocal group of people who are against RT becoming a requirement.
It's not a big demographic even now (current Steam data has the non-RT GPU share at less than 15% IIRC, which is also why we are seeing games that require RT h/w now), and by the time the PS6 launches it'll be invisible.
Eventually no one will care that a game has RT just like no one cares now if a game has tessellation or some shader model 6.5 - because most games do, by default, and there is no way any of this could be "optional", especially RT which is baked right into content production process.
 
And in general regarding RT, when all RT effects are enabled, there is a big difference. Just look at Indiana Jones as a recent example. Or, from the footage we've seen, an even better example would be GTA 6. It's just that, with The Dark Ages, id pretty much just used RT to replace the existing baked lighting system, with "full" RT coming in the PC path tracing patch.
And how much does it cost to activate full RT? Do you really believe that we're ever gonna get to the point where GPUs capable of full RT will be budget products?
 
Please, anyone here, screenshot this comment and then ask me when the time comes.

THERE WON'T BE A PS6; THEY WILL CANCEL WHATEVER THEY HAVE BEEN COOKING MIDWAY THROUGH THE JOURNEY
 
Tell me, sir: when, when will those GPUs be less expensive?

Moore's Law is over; they can't even put in more than 8GB without taking a hit on every GPU sold, like Intel with the recent Battlemage cards.
They are getting less expensive all the time.
5060 is 25% less expensive than a similar performing 4060Ti.
9070XT is 33% less expensive than a similar performing 7900XT.
The myth that GPUs aren't getting less expensive is just that - a baseless myth.
You could say that the *speed* with which perf/price is improving has decreased considerably. But the same thing affects every market, and through that it affects game production as well, which makes advancements like RT take waaaaaay longer to become required too.
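As a rough sanity check on those percentages, here is a quick sketch using launch MSRPs that I believe apply (the dollar figures are my assumptions, not numbers from the post: RTX 4060 Ti 8GB at $399 vs RTX 5060 at $299, and RX 7900 XT at $899 vs RX 9070 XT at $599):

```python
# Check the "X% less expensive at similar performance" claims
# using assumed launch MSRPs (not figures from the post).
pairs = [
    ("RTX 4060 Ti 8GB -> RTX 5060", 399, 299),
    ("RX 7900 XT -> RX 9070 XT", 899, 599),
]

for name, old_price, new_price in pairs:
    drop = 100 * (old_price - new_price) / old_price
    print(f"{name}: about {drop:.0f}% cheaper")
```

With those MSRPs the drops come out to roughly 25% and 33%, matching the post; street prices will of course shift the exact numbers.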
 
They are getting less expensive all the time.
5060 is 25% less expensive than a similar performing 4060Ti.
9070XT is 33% less expensive than a similar performing 9700XT.
The myth that GPUs aren't getting less expensive is just that - a baseless myth.
You could say that the *speed* with which perf/price is improving has decreased considerably. But the same thing affects every market, and through that it affects game production as well, which makes advancements like RT take waaaaaay longer to become required too.
The 5060? The one with half the VRAM?

And what is a 9700XT? You mean the 7900XT? The one that is less expensive overall right now on Newegg?
 
And how much does it cost to activate full RT? Do you really believe that we're ever gonna get to the point where GPUs capable of full RT will be budget products?
Well, I would hope that the next-gen consoles would be capable of it, if GTA 6 can already do ray traced reflections and GI. If we compare the 5700 XT to the 9060 XT, which is about to release, then in almost 6 years we managed a ~60% performance improvement at $350 (RRP), plus added RT that is probably around PS5 Pro levels. Another 60% and we would be at 9070 XT levels, which is where I would expect the next-gen consoles to be. So if the consoles release in 2028, then 2-3 years into the next console generation, I expect there will be a ~$350 card that can deliver "full RT" at console settings. At prices above that, we would expect to see much more significant improvements, as the size of the improvement between generations scales as you go up the stack.

Edit: Actually, the 5700 (non-XT) launched at $349, so we're looking at an 80% improvement in 6 years.
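To make the compounding in that estimate explicit, here is a minimal sketch; the ~80% and ~60% figures are the post's own estimates, and the constant-yearly-pace assumption is mine:

```python
import math

# The post's estimates: ~80% gain at the ~$350 price point over 6 years,
# then asking how long another ~60% would take at the same yearly pace.
total_gain = 1.80   # 5700 ($349) -> 9060 XT (~$350), per the post
years = 6
annual_rate = total_gain ** (1 / years) - 1
print(f"Implied yearly improvement: {annual_rate * 100:.1f}%")

target_gain = 1.60  # further jump to roughly 9070 XT level, per the post
years_needed = math.log(target_gain) / math.log(1 + annual_rate)
print(f"Years for another ~60% at that pace: {years_needed:.1f}")
```

That works out to roughly 10% per year and about five more years for the next 60%, which lines up with the post's "2-3 years into the next console generation" if the consoles land around 2028.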
 
And how much does it cost to activate full RT? Do you really believe that we're ever gonna get to the point where GPUs capable of full RT will be budget products?
The $500-bracket Nvidia GPUs can already run even Cyberpunk's heavy PT rather well thanks to DLSS, and there's no doubt that the PS6-gen consoles will be at least as capable as that, too.
Your "ever" is just absolutely silly. We're talking about one, max two GPU generations here for the level of PT we have at the moment to become viable on entry-level hardware.
 
Well, I would hope that the next-gen consoles would be capable of it, if GTA 6 can already do ray traced reflections and GI. If we compare the 5700 XT to the 9060 XT, which is about to release, then in almost 6 years we managed a ~60% performance improvement at $350 (RRP), plus added RT that is probably around PS5 Pro levels. Another 60% and we would be at 9070 XT levels, which is where I would expect the next-gen consoles to be. So if the consoles release in 2028, then 2-3 years into the next console generation, I expect there will be a ~$350 card that can deliver "full RT" at console settings. At prices above that, we would expect to see much more significant improvements, as the size of the improvement between generations scales as you go up the stack.
First, GTA 6 is not even out yet, it has already received a massive downgrade, and leakers have said that another delay is probably on the way.

Second, we have to wait and see if the 9060 XT will maintain its MSRP; it probably won't.

And last, a console is not made of only the GPU; there's memory, CPU, storage, the whole lot. This thing won't be cheap by any means, if it even happens to exist. The PS5 Pro has tanked badly, and that is a very bad signal for them.
 
Nah, I don't think we will.
The "issue" is that there are people who can't afford a GPU with h/w RT right now so there's some vocal group of people who are against RT becoming a requirement.
It's not a big demographic even now (current Steam data has the non-RT GPU share at less than 15% IIRC, which is also why we are seeing games that require RT h/w now), and by the time the PS6 launches it'll be invisible.
Eventually no one will care that a game has RT just like no one cares now if a game has tessellation or some shader model 6.5 - because most games do, by default, and there is no way any of this could be "optional", especially RT which is baked right into content production process.

But then the people who jumped to RT-capable cards at the bottom of the stack will complain that path tracing is too hard to run, that we should just have RTGI, etc. I agree with Landr300.

There's no winning with these kinds of whiners.

Personally, I'm holding off on playing Doom: The Dark Ages until the path tracing patch; I want it.

[Saturday Night Live GIF by The Lonely Island]
 
The $500-bracket Nvidia GPUs can already run even Cyberpunk's heavy PT rather well thanks to DLSS, and there's no doubt that the PS6-gen consoles will be at least as capable as that, too.
Your "ever" is just absolutely silly. We're talking about one, max two GPU generations here for the level of PT we have at the moment to become viable on entry-level hardware.
500 bucks? And that is only the GPU, lol.

And yes, there is a lot of doubt.
 
First, GTA 6 is not even out yet, it has already received a massive downgrade, and leakers have said that another delay is probably on the way.

Second, we have to wait and see if the 9060 XT will maintain its MSRP; it probably won't.

And last, a console is not made of only the GPU; there's memory, CPU, storage, the whole lot. This thing won't be cheap by any means, if it even happens to exist. The PS5 Pro has tanked badly, and that is a very bad signal for them.
You're full of positivity! My point is that GTA 6 proves that transformative RT effects are possible even on the current limited console hardware. Yes, the next generation consoles will not be cheap, but I am talking about a 2.5X improvement in raster performance, which is super low by historical standards. And finally, graphics cards generally tend towards their MSRP over time as supply increases, which we are already seeing with the 5070. The 5700 XT actually launched at $399 as I corrected, with the 5700 at $349, so there is some room for the price to be out.
 
The problem is the "forced" part. All that id had to do was to give the user an option to turn in on/off. Just like all other games.

And the question is not about we having RT hardware for 7 years already. The problem is that RT causes huge performance drops.
For example, Doom Eternal on a 4080 can 175 fps, at native 4K.
But the 4080 on Doom the Dark Ages, it can only do 51 fps.
Losing 3.5 times the performance for marginal gains in image quality is not a good trade off.
But if the option was there to turn it on/off, there would be no problem, as each person could choose what they wanted.
And this is the crux of the matter, choice.

[Benchmark chart: Doom Eternal, 3840×2160]

[Benchmark chart: Doom: The Dark Ages, 3840×2160]
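For reference, the ratio quoted above falls straight out of those benchmark numbers; a trivial check (the fps values are the ones cited in the post):

```python
# The performance gap quoted above, from the benchmark numbers in the post.
eternal_fps = 175     # Doom Eternal, RTX 4080, native 4K
dark_ages_fps = 51    # Doom: The Dark Ages, RTX 4080, native 4K
ratio = eternal_fps / dark_ages_fps
print(f"Eternal runs about {ratio:.1f}x faster on the same card")
```

That prints about 3.4x, which rounds to the roughly 3.5x figure cited.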
Basically, id didn't feel like spending money on baked lighting. It's unfortunate for sure, considering the performance downsides.
 
You're full of positivity! My point is that GTA 6 proves that transformative RT effects are possible even on the current limited console hardware. Yes, the next generation consoles will not be cheap, but I am talking about a 2.5X improvement in raster performance, which is super low by historical standards. And finally, graphics cards generally tend towards their MSRP over time as supply increases, which we are already seeing with the 5070. The 5700 XT actually launched at $399 as I corrected, with the 5700 at $349, so there is some room for the price to be out.
Boy, you're using a game that is not even out yet, and you think it's a matter of being positive or negative? Seems to me that people are just in full denial right now.

There is no such thing as "not being cheap"; we're talking about simple economics here. Depending on the price point, we're talking about a few million units sold at best. How would that low an install base be able to sustain a AAA game with a $100 million budget, even at $100 full price? It would be a tremendous risk to launch a game for that.
 