DF: Doom: The Dark Ages "Forced RT" Backlash - Is It Actually Justified?

Stop being stupid for the sake of it.

The CEO brought it up in an interview and in their earnings report. They were going to release it Nov/Dec of this year. The team asked for more time to polish things up and said they only needed 6 more months, and the CEO said that delaying would not cost them any more money compared to doing the polish.

That is the official information we have. Not the gamer tinfoil doom & gloom every online autist wants to propagate. If it gets delayed again, so be it.

But the fact of the matter is that, officially, it was not "delayed a whole year."
The CEO said the game would not be delayed the first time around, back when the "tinfoil" doom-and-gloom crowd and insiders said it would be.

But again, I am not saying that it's wrong; I am just saying that I personally have never seen people bring up internal release dates on other games, even though we know that a game getting delayed doesn't mean it would have launched tomorrow.
 
The CEO said the game would not be delayed the first time around, back when the "tinfoil" doom-and-gloom crowd and insiders said it would be.

But again, I am not saying that it's wrong; I am just saying that I personally have never seen people bring up internal release dates on other games, even though we know that a game getting delayed doesn't mean it would have launched tomorrow.
I only brought it up because you were wrong about saying it was delayed "a whole entire year." It would have to have released this month.

It wasn't. The official on-record information is here for all to see.
 
People are going to believe whatever they want, apparently.
 
It's always the same thing: back then people complained about games requiring Pentium CPUs, they complained about needing a hardware-accelerated video card, they complained about needing a GPU with hardware T&L, about Crysis requirements, about mandatory DX11, about ambient occlusion doing nothing significant visually yet incurring huge performance losses, about Nvidia PhysX (and now we're going to lose it), etc.
 
I finally watched that long-ass interview with the Doom developer. Even if his claims were exaggerated, he has a really good case for the RT requirement.

It runs fast as fuck anyway. It's not like you need to spend $500 on a card; it will run on some fairly cheap stuff. Today $200 is cheap for a card, though. Ten years ago $100 was. When I got back into PC gaming, I picked up a 750 Ti for $110 and just gamed away for like 3 years with it, not worrying about settings too much. Even then, though, $200 was still in the impulse-buy window.
 
PC space was always filled with those people - it's just that back when OG Doom released, it was only playable as intended on like... half a million PCs - which is about all it ended up selling after a few years too, and no one cared.
Now a game sells less than 20M in its first week* and every forum/social media platform is full of people screaming about how it bombed and how the studio should just die, etc. (and it happens to everyone - apparently every studio has people wanting them to fail). And yeah - if you want massive sales (and to keep selling for years to come) - you have to cater to 5-10 years of hardware tail; there's no way around it.

I was just thinking back to that time and remembering how many kids packed into the school computer lab to play Doom. So few of us had computers that could play the game at home (and those of us who did have a "powerful" home PC had a Mac, which meant we didn't get Doom until Doom II came out).

And remember how having a semi-decent version of Doom was a bragging right in the 16-bit console race? Was the 108x144x2 resolution of the SNES version good enough for you, or were you going to have to pony up for a 32X or a Jaguar or a 3DO for a port with less of a postage stamp slideshow gameplay experience?

[Image: 3DO Doom screenshot]


We tend to think of Doom as historically being such a simple engine that even a toaster can run it, but the thing is, Doom didn't change - toasters got better. Every device in your home capable of doing math now has at least the minimum capacity that Doom 1 was shooting for back in '93. Doom isn't about needing "power", it's about having support for the basic but specific capabilities that its incredibly efficient and intelligent code runs on.

I don't know that Doom TDA will get to that same point of being considered rudimentary (although hey, even toasters are powered by AI now...), but using RT as the base-level functionality for a game, rather than as a piecemeal add-on for certain effects, will pretty clearly be the way games get made wherever possible moving forward. It's a little ahead of its time, and some gamers will unfortunately miss out if they don't have the PC for it and can't change their situation in this economy, but it's a proper (and verging on necessary) way to make games for this generation of game platforms.
 
New? Shit has been around for years. Don't blame the devs because you bought a sub-standard card for the task.
Not sure if you read my post, but it said that ray tracing is not new. Looking at perf charts and card prices, the only cards that can do ray tracing are the 4090 and 5090. Value prop just isn't there for my use. If more and more games start running like TDA without upscaling I'll just start playing less. For what I would expect, anyone with a 5080 has a substandard card, and they paid 2x the price of a flagship for it.
 
Not sure if you read my post, but it said that ray tracing is not new. Looking at perf charts and card prices, the only cards that can do ray tracing are the 4090 and 5090. Value prop just isn't there for my use. If more and more games start running like TDA without upscaling I'll just start playing less. For what I would expect, anyone with a 5080 has a substandard card, and they paid 2x the price of a flagship for it.

Eh, resolution? Most RTX cards will do a decent job at 1440p and below. You'll obviously have to adjust settings. If the consoles can run this shit, then so can most PCs. The PC master race (I fucking hate that term) has turned into the PC whiny race. This is no different than Rebirth, when people were crying about mesh shaders.
 
Watched it, and the thing that annoyed me the most was the ego in some of the statements, followed by sarcastic smiles. These DF folks need to realize they are in a position where they get access to GPUs and are sent free GPUs, versus the everyday person.

Forced RTX is forced when they remove the option to even play the game on some systems and have no option to adjust it or turn it off in game, but I kinda agree it has been like 7 years' worth of RTX GPUs now. The problem with the last few gens is the crazy extortionate pricing that simply blocks people out of it.

I still think the performance hit is something I can live without, even 7 years later. If I generally have to use DLSS (and the RTX version) and Frame Gen to get the fps high enough to be enjoyable, maybe they should try to optimize it instead of seemingly focusing on making it more extreme with something like Path Tracing.

Very few games have blown me away with RTX, and half the time the DF videos are showing a reflection in a puddle or how a light bounces underneath a park bench.... who actually cares, or more importantly, who sees this when playing a game for the most part? I know Doom has RTX built into the game in other ways that help the developer, e.g. hit boxes, but RTX is still a heavy feature.
 
Eh, resolution? Most RTX cards will do a decent job at 1440p and below. You'll obviously have to adjust settings. If the consoles can run this shit, then so can most PCs. The PC master race (I fucking hate that term) has turned into the PC whiny race. This is no different than Rebirth, when people were crying about mesh shaders.
Were you here for the COD sub-native rendering gate in the 360 era?
 
Watched it, and the thing that annoyed me the most was the ego in some of the statements, followed by sarcastic smiles. These DF folks need to realize they are in a position where they get access to GPUs and are sent free GPUs, versus the everyday person.

Forced RTX is forced when they remove the option to even play the game on some systems and have no option to adjust it or turn it off in game, but I kinda agree it has been like 7 years' worth of RTX GPUs now. The problem with the last few gens is the crazy extortionate pricing that simply blocks people out of it.

I still think the performance hit is something I can live without, even 7 years later. If I generally have to use DLSS (and the RTX version) and Frame Gen to get the fps high enough to be enjoyable, maybe they should try to optimize it instead of seemingly focusing on making it more extreme with something like Path Tracing.

Very few games have blown me away with RTX, and half the time the DF videos are showing a reflection in a puddle or how a light bounces underneath a park bench.... who actually cares, or more importantly, who sees this when playing a game for the most part? I know Doom has RTX built into the game in other ways that help the developer, e.g. hit boxes, but RTX is still a heavy feature.
lol I buy my own shit and I still stand by what he said in the video
 
Forced RTX is forced when they remove the option to even play the game on some systems and have no option to adjust it or turn it off in game....

Sorry, but no, you are not understanding how games are made.

They didn't remove anything.

They decided not to build the thing that would have given you the option, ie all the baked-in lighting data and tricks that would need to be rendered and baked and packaged into the product you buy.

Could they have done all the work to produce a fallback? I suppose, depending on what the new engine supports. (And if so, they still could; now that it's done and out they could put a team on making a baked version for several months then offer it as an optional 100+GB patch on top of the 100GB game file.) But RT is fundamentally how the game works. Even shooting uses their RT system.

You're used to thinking of RT as an add-on and enhancement (and one that, in your opinion, has not been worth the performance tradeoff, with the effects slathered on top of the existing rendering systems). This is not an added-in option. It's what the game is.
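To make the "even shooting uses their RT system" point concrete: a hitscan shot is ultimately just a ray cast against scene geometry, the same kind of query the RT lighting is built around. A minimal sketch of the sort of test such a system rests on (a hypothetical illustration, not id's actual code):

```python
# Minimal sketch of ray-based hit detection (Moller-Trumbore ray/triangle test).
# Hypothetical illustration only -- not id Tech's actual implementation.
import numpy as np

def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    """Return the hit distance t, or None if the ray misses the triangle."""
    edge1, edge2 = v1 - v0, v2 - v0
    pvec = np.cross(direction, edge2)
    det = np.dot(edge1, pvec)
    if abs(det) < eps:                   # ray parallel to the triangle's plane
        return None
    inv_det = 1.0 / det
    tvec = origin - v0
    u = np.dot(tvec, pvec) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    qvec = np.cross(tvec, edge1)
    v = np.dot(direction, qvec) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(edge2, qvec) * inv_det
    return t if t > eps else None        # only hits in front of the muzzle count

# "Hitscan shot": fire straight down -Z at a triangle 10 units away.
origin    = np.array([0.0, 0.0, 0.0])
direction = np.array([0.0, 0.0, -1.0])
tri = [np.array([-1.0, -1.0, -10.0]),
       np.array([ 1.0, -1.0, -10.0]),
       np.array([ 0.0,  1.0, -10.0])]
print(ray_hits_triangle(origin, direction, *tri))   # ~10.0
```

Once the renderer is already built around tracing rays through the scene, reusing the same structures for gameplay queries like this is the natural move, which is presumably the sense in which RT is "what the game is" rather than a toggle.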
 
Sorry, but no, you are not understanding how games are made.

They didn't remove anything.

They decided not to build the thing that would have given you the option, ie all the baked-in lighting data and tricks that would need to be rendered and baked and packaged into the product you buy.

Could they have done all the work to produce a fallback? I suppose, depending on what the new engine supports. (And if so, they still could; now that it's done and out they could put a team on making a baked version for several months then offer it as an optional 100+GB patch on top of the 100GB game file. Or they could just offer a flat-shaded option with no lighting and you could just play it like an N64 game.) But RT is fundamentally how the game works. Even shooting uses their RT system.

You're used to thinking of RT as an add-on and enhancement (and one that, in your opinion, has not been worth the performance tradeoff, with the effects slathered on top of the existing rendering systems). This is not added in. It's what the game is.
Well, I did say at the end that Doom does use RTX for other things like hit boxes, etc., so why are you preaching at me like I ignored it? I didn't go into detail about it because the topic was forced RTX, not just Doom. Doom was just the latest game to enforce it / demand it, and the first time they discussed the comments.

Try not to talk to people when you can't read what they wrote.
 
All I know is, this is the first game I cannot run at 4K/60 with decent settings on my 3080 10GB. I can get close, but it averages in the upper 50s, with dips into the 40s. No bueno.

That said, I'm getting by just fine at 1440, getting a locked 60 at max settings with DLSS on Quality.

The difference is noticeable if you're looking for it, but it's not diminishing the experience at all. Actually, the fact that I can run it maxed makes it worth it.

Fucking love the game, btw.
 
Maybe people who have a negative view of RT should check out the DF interview with the id Tech guy on this. He basically said that using RT instead of baked lighting greatly cut down their development iteration time and lets them adjust the lighting and see how it looks in real time.
Hopefully, now that RT hardware has been around for some time, developers can start to really use it in an artistic and creative way, and maybe even have AI that responds to changes in light and shadow. That would be cool.
Why can't they use the GPU hardware to calculate the lighting values and bake those values into the textures?

If the scene is not dynamic, I fail to see why baked lighting would take much time; they could even use the RT scene as a reference.
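To be concrete about the bake I have in mind, it amounts to something like this toy sketch (purely hypothetical, nothing like a production baker): evaluate the lighting once per lightmap texel and write the result to a texture.

```python
# Toy sketch of an offline lightmap bake: direct diffuse lighting per texel.
# Hypothetical illustration only -- not any engine's actual baker.
import numpy as np

def bake_lightmap(texel_positions, texel_normals, lights, visible):
    """Accumulate each light's diffuse contribution at every lightmap texel."""
    lightmap = np.zeros(len(texel_positions))
    for i, (p, n) in enumerate(zip(texel_positions, texel_normals)):
        for light_pos, intensity in lights:
            to_light = light_pos - p
            dist = np.linalg.norm(to_light)
            if not visible(p, light_pos):      # shadow/visibility query (itself a ray cast)
                continue
            n_dot_l = max(np.dot(n, to_light / dist), 0.0)
            lightmap[i] += intensity * n_dot_l / (dist * dist)
    return lightmap   # written into a texture; rerun whenever lights or geometry change

# Tiny example: 4 texels on an upward-facing floor, one light, no occluders.
positions = [np.array([x, 0.0, 0.0]) for x in (0.0, 1.0, 2.0, 3.0)]
normals   = [np.array([0.0, 1.0, 0.0])] * 4
lights    = [(np.array([0.0, 2.0, 0.0]), 10.0)]
print(bake_lightmap(positions, normals, lights, visible=lambda a, b: True))
```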
 
Not sure if you read my post, but it said that ray tracing is not new. Looking at perf charts and card prices, the only cards that can do ray tracing are the 4090 and 5090. Value prop just isn't there for my use. If more and more games start running like TDA without upscaling I'll just start playing less. For what I would expect, anyone with a 5080 has a substandard card, and they paid 2x the price of a flagship for it.

If you play Doom without DLSS you literally play it at a lower quality... that's an objective fact.
But even if you want to use TAA and play at a lower quality, even a mid-range GPU can play the game at 1080p native.
 
Nvidia can bin and price things however they please. There was absolutely a precedent from the 200 series, through 400, 500, 600, 700, 900, 1000, 2000 and 3000. Every consumer could expect a certain performance profile for a xx60, 70 or 80. Nvidia changed the terms of that in the 4000s.
There is no "performance profile" either. Your past expectations have zero relation to new products, which are made in new production realities.
 
objective fact


Yes! Now we're getting to the good stuff. "DLSS is better than native rendering". Amazingly, when DLSS4 released, DLSS3 all of a sudden looked terrible and smeary to a lot of people. Grousing occurred when games launched without transformer support, when just the week before DLSS3 looked "better than native". Makes ya wonder, when DLSS5 releases, if the same people will have bad things to say about DLSS4.

How do you come to terms with that? "Oh, we just thought DLSS3 was great. Now we really know it wasn't. It's not gonna happen again though. DLSS4 will stand the test of time."

Stay tuned. I'm going to link to this very post when DLSS5 comes out and people are bitching about "x new release" only having DLSS4 (which is totally better than native).

What I really want to know is how you measure the objective superiority of it.
 
Yes! Now we're getting to the good stuff. "DLSS is better than native rendering". Amazingly, when DLSS4 released, DLSS3 all of a sudden looked terrible and smeary to a lot of people. Grousing occurred when games launched without transformer support, when just the week before DLSS3 looked "better than native". Makes ya wonder, when DLSS5 releases, if the same people will have bad things to say about DLSS4. How do you reckon that? "Oh, we just thought DLSS3 was great. Now we really know it wasn't. It's not gonna happen again though. DLSS4 will stand the test of time."

Stay tuned. I'm going to link to this very post when DLSS5 comes out and people are bitching about "x new release" only having DLSS4 (which is totally better than native).

What I really want to know is how you measure the objective superiority of it.
DLSS 3 still looks similar to "native" rendering in quality mode. What DLSS 4 does is achieve the same thing with performance mode. So it's not that DLSS 4 opened everybody's eyes, but rather that it extended the range of acceptable compromises.

But I think all this talk of native rendering is missing the point. The pristine, "ground truth" image is the 16xSSAA one that uses 16x the pixels to anti-alias the on-screen image. The "native" TAA image is a messy approximation that tries to use the pixel data from previous frames instead of rendering at a higher resolution. What it's trying to do is exactly what DLSS is doing. It just does a way worse job, because it's a manually tuned algorithm that doesn't have any real understanding of what is changing from frame to frame. The one advantage it has is, in the case of native rendering, a higher input resolution. But since, for a given input resolution, DLSS does a much better job, it will always be possible to lower that resolution until parity is reached.

It's fine not to like the compromises introduced by DLSS, but in that case you should also be against TAA, and should be playing your games using SSAA or some other AA technology.
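To make the "TAA is trying to do the same thing" point concrete: the core of a temporal resolve is just blending the current jittered frame into a history buffer and clamping the history to the current frame's neighborhood so stale data gets thrown away. A toy 1D sketch (hypothetical, with no reprojection or jitter):

```python
# Toy sketch of temporal accumulation with neighborhood clamping (the core of TAA).
# Hypothetical, 1D, no reprojection -- just to show where blur and ghosting come from.
import numpy as np

def taa_resolve(current, history, alpha=0.1):
    """Blend the current frame into history; clamp history to the current
    neighborhood so stale data (the source of ghosting) gets rejected."""
    resolved = np.empty_like(current)
    for i in range(len(current)):
        neighborhood = current[max(i - 1, 0):i + 2]
        clamped_history = np.clip(history[i], neighborhood.min(), neighborhood.max())
        resolved[i] = alpha * current[i] + (1.0 - alpha) * clamped_history
    return resolved

current = np.array([0.0, 0.0, 1.0, 1.0, 0.0])   # a bright edge in this frame
history = np.array([1.0, 1.0, 0.0, 0.0, 1.0])   # the edge was elsewhere last frame
print(taa_resolve(current, history))
# Without the clamp, 90% of each pixel would come from stale history -> ghost trails;
# with it, detail gets averaged away -> blur. DLSS swaps these hand-tuned heuristics
# for a learned model, which is why it holds up better as the input resolution drops.
```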
 
It's fine not to like the compromises introduced by DLSS, but in that case you should also be against TAA, and should be playing your games using SSAA or some other AA technology.
I can't stand TAA either. Forza Horizon 5 is a great example since it offers TAA and MSAA. You do get aliasing with 4xMSAA that isn't there with TAA, but TAA smears the entire image to achieve this. I'll take the sharp, resolved MSAA all day over TAA. Even though TAA might look "better" at a glance, race a couple races and it's clear that the 4xMSAA produces a much more stable and well resolved image.

Where the post I replied to really went wrong is using the word "objective".
 


Yes! Now we're getting to the good stuff. "DLSS is better than native rendering".


it is in many games. TAA is also temporal reconstruction, and it's often bad. DLSS is TAA with ML error correction.

also why do you link an unrelated game?


Amazingly, when DLSS4 released, DLSS3 all of a sudden looked terrible and smeary to a lot of people. Grousing occurred when games launched without transformer support, when just the week before DLSS3 looked "better than native". Makes ya wonder, when DLSS5 releases, if the same people will have bad things to say about DLSS4. How do you come to terms with that? "Oh, we just thought DLSS3 was great. Now we really know it wasn't. It's not gonna happen again though. DLSS4 will stand the test of time."

DLSS3 can both look better than native TAA and worse than DLSS4. If you think that's some gotcha, there's something wrong with the part of your brain responsible for logical thought, I guess.

but to break that down for really stupid people in the back real quick:

Thing A being better than Thing B doesn't mean that Thing C can not be better than Thing A... got it? good!


Stay tuned. I'm going to link to this very post when DLSS5 comes out and people are bitching about "x new release" only having DLSS4 (which is totally better than native).

What I really want to know is how you measure the objective superiority of it.

how to measure that? image sharpness, amount of ghosting, amount of artifacts in motion.

DLSS in quality mode is OBJECTIVELY sharper and has less ghosting in every id Tech game I tested.
I have posted comparison shots taken while in motion in Indiana Jones in another thread, and it looked objectively better; anyone looking at both images side by side will tell you the same.
I could do the same with Doom Eternal and Doom TDA and it would have the same result.
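And if you want numbers rather than eyeballs, crude metrics are easy to pull from captures: variance of the Laplacian for sharpness, a frame difference for trailing. A rough sketch with placeholder data (my own ad hoc metrics, not anyone's official methodology):

```python
# Crude sketch of putting numbers on "sharpness" and "ghosting" for captured frames.
# Hypothetical metrics run on fake data -- not an official methodology.
import numpy as np

def laplacian_variance(gray):
    """Higher variance of the Laplacian = more high-frequency detail = sharper."""
    lap = (-4.0 * gray
           + np.roll(gray, 1, axis=0) + np.roll(gray, -1, axis=0)
           + np.roll(gray, 1, axis=1) + np.roll(gray, -1, axis=1))
    return lap.var()

def trail_energy(frame, previous_frame):
    """Mean absolute difference from the prior frame; residual ghost trails inflate this."""
    return np.abs(frame - previous_frame).mean()

# Fake data standing in for real captures: a noisy (detailed) image vs a smeared copy.
rng = np.random.default_rng(0)
sharp_img = rng.random((128, 128))
soft_img  = (sharp_img + np.roll(sharp_img, 1, 0) + np.roll(sharp_img, 1, 1)) / 3.0
print(laplacian_variance(sharp_img) > laplacian_variance(soft_img))   # True: smearing kills detail
print(trail_energy(soft_img, sharp_img))                              # nonzero residual "trail"
```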
 
it is in many games. TAA is also temporal reconstruction, and it's often bad. DLSS is TAA with ML error correction.

also why do you link an unrelated game?
It makes a point about DLSS. I don't think it's superior to native rendering + MSAA. I'll take a little unresolved aliasing over soft temporal stew. Even the wizards at id with their own in-house engine have left a search engine trail of "blurry" and "soft" keywords. That is in a best-case scenario where the developer is technically adept; most games won't have this luxury. Nvidia wants to drive us down this path. Consumers could offer some resistance, but we're out here justifying our purchases instead. They can cut these chips down more and more as long as Digital Foundry keeps making videos touting the superiority of "I can't believe it's not pixels".

 
Uh, one other point... I am guessing that people unwilling to buy anything newer than an 8-year-old GPU probably aren't going to be buying a $70 game.
 
I think PC gamers got spoiled for two generations.

It was incredibly cheap to build a PC that could match a PS3 and PS4. Also those gens lasted long.
With the PS5 we're back to normal.
 
Nvidia wants to drive us down this path.
Developers want to use techniques that are incompatible with MSAA, plus the performance freed up by not using it; that has zero to do with what Nvidia wants.
Nvidia provided a state of the art temporal AA+upscaling solution so that developers wouldn't need to create their own which were rather shite and didn't do much upscaling at all.
Yet somehow it's Nvidia who's to blame again. Cmon.
 
Developers want to use techniques that are incompatible with MSAA, plus the performance freed up by not using it; that has zero to do with what Nvidia wants.
Nvidia provided a state of the art temporal AA+upscaling solution so that developers wouldn't need to create their own which were rather shite and didn't do much upscaling at all.
Yet somehow it's Nvidia who's to blame again. Cmon.

MSAA was already broken with UE3. Then we had "the dark ages," when games only had dogshit techniques like FXAA/MLAA/SMAA that didn't do shit to aliasing. THEN TAA appeared and we finally got rid of aliasing - but the cost was heavy blur to the image.

DLSS4 fixes that.
 
It makes a point about DLSS. I don't think it's superior to native rendering + MSAA. I'll take a little unresolved aliasing over soft temporal stew. Even the wizards at id with their own in-house engine have left a search engine trail of "blurry" and "soft" keywords. That is in a best-case scenario where the developer is technically adept; most games won't have this luxury. Nvidia wants to drive us down this path. Consumers could offer some resistance, but we're out here justifying our purchases instead. They can cut these chips down more and more as long as Digital Foundry keeps making videos touting the superiority of "I can't believe it's not pixels".

2X MSAA will gut your performance by half if used on a deferred renderer. You would take that running at 30fps over DLSS Q running at 70fps?



It has nothing to do with NVIDIA.
 
I think PC gamers got spoiled for two generations.

It was incredibly cheap to build a PC that could match a PS3 and PS4. Also those gens lasted long.
With the PS5 we're back to normal.
You forgot to mention that the PS3 and PS4 had already had a couple of price cuts by this point in their lifecycles, so GPUs had to compete with that.

For the price of a PS5 you can still get a GPU that obliterates it. It's just that the price of entry has been raised across the board, console and PC.
 
He's another one of those "nativists" who refuses to use upscaling methods like DLSS, as if native didn't have its own set of problems.
"Native" and less temporal data reliant algorithms would not suffer from hallucinations, motion trails / ghosting, disocclusion issues, etc… the gap between when these AI and/or temporal reconstruction algorithms get it right and when they get it wrong widens… aka modern UE5 when pushed.
 
"Native" and less temporal data reliant algorithms would not suffer from hallucinations, motion trails / ghosting, disocclusion issues, etc… the gap between when these AI and/or temporal reconstruction algorithms get it right and when they get it wrong widens… aka modern UE5 when pushed.
DLSS doesn't suffer from hallucinations because it is not trying to "guess" what the frame should look like, and ghosting and disocclusion issues are seen with TAA and reflect the problem of whether to keep or discard data from prior frames. DLSS is simply a more intelligent form of TAA.
 
Nah, let devs spend the time they'd waste baking light on stuff that actually matters. At this point, forced RT is long overdue, and most cards, the consoles, and even phones support RT now.

People who bought a 1080 in 2016 can't expect it to run games well when they release almost 10 years later. The only group that is fucked is RDNA1 owners, as they bought it over Turing, but I warned them about that years ago, so no pity.

Doom TDA runs and looks great on my 2060 laptop.
 
2X MSAA will gut your performance by half if used on a deferred renderer. You would take that running at 30fps over DLSS Q running at 70fps?



It has nothing to do with NVIDIA.
Not always. 8xMSAA in Forza Horizon 5 is remarkably lean. Depends on the implementation. Our boy TI has a nice video about it if you're curious.
 
Not always. 8xMSAA in Forza Horizon 5 is remarkably lean. Depends on the implementation. Our boy TI has a nice video about it if you're curious.
Forza Horizon 5 uses a forward renderer, that's my point. Same for Source Engine. Most modern games use a deferred renderer and those get annihilated by MSAA when it comes to performance.
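For a sense of scale, the G-buffer arithmetic alone tells the story: MSAA multiplies every G-buffer attachment by the sample count before the lighting pass even runs. Back-of-the-envelope numbers with an assumed, purely illustrative attachment layout:

```python
# Back-of-the-envelope G-buffer memory at 4K with MSAA on a deferred renderer.
# The attachment layout is an assumption for illustration, not any specific engine's.
width, height = 3840, 2160
bytes_per_pixel = 4 * 4 + 4              # e.g. four RGBA8 G-buffer targets + 32-bit depth
for samples in (1, 2, 4, 8):
    gbuffer_mb = width * height * bytes_per_pixel * samples / (1024 ** 2)
    print(f"{samples}x: ~{gbuffer_mb:,.0f} MB of G-buffer to write, read and light")
# 1x is ~158 MB; 4x is ~633 MB -- and the lighting pass also has to shade per sample
# along edges, which is why forward renderers like FH5 get away with MSAA far more cheaply.
```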
 
Developers want to use techniques that are incompatible with MSAA, plus the performance freed up by not using it; that has zero to do with what Nvidia wants.
Nvidia provided a state of the art temporal AA+upscaling solution so that developers wouldn't need to create their own which were rather shite and didn't do much upscaling at all.
Yet somehow it's Nvidia who's to blame again. Cmon.
DLSS allows Nvidia to continue selling xx50 chips as xx60 Ti. If it were soundly rejected by the market, they'd actually have to make their bins appealing. There are plenty of examples of lean MSAA implementations: Forza Horizon, as mentioned above, and Half-Life: Alyx was also an MSAA banger. Plenty of examples of trashy temporal implementations as well. Oblivion Remastered, anyone? I don't think VR is going to take off, but that's probably a realm where temporal ghosting is unacceptable.
 
DLSS allows Nvidia to continue selling xx50 chips as xx60 Ti. If it were soundly rejected by the market, they'd actually have to make their bins appealing. There are plenty of examples of lean MSAA implementations: Forza Horizon, as mentioned above, and Half-Life: Alyx was also an MSAA banger. Plenty of examples of trashy temporal implementations as well. Oblivion Remastered, anyone? I don't think VR is going to take off, but that's probably a realm where temporal ghosting is unacceptable.
"Plenty" you mean two? Half-Life Alyx is a VR game design to run at 90fps per lens, so by nature it's very lean but also doesn't hold a candle to modern AAA titles graphically. FH5 uses forward rendering. You can only point to a handful of games. 99% of AAA titles don't even have MSAA as an option because it's effectively obsolete.
 
"Plenty" you mean two? Half-Life Alyx is a VR game design to run at 90fps per lens, so by nature it's very lean but also doesn't hold a candle to modern AAA titles graphically. FH5 uses forward rendering. You can only point to a handful of games. 99% of AAA titles don't even have MSAA as an option because it's effectively obsolete.
It's only obsolete for cost-cutting reasons. You just named two games that look incredible and use it effectively. You're telling me a 2D game built on the Alyx engine with Valve quality would lose sales to AC Shadows solely due to graphics? I don't think graphics take sales away from truly great games; they can only give a sales bump to mediocre or bad games. You could certainly build on Alyx and make it look perfectly modern. When Half-Life 3 releases in September with glorious MSAA options, I expect a written apology.
 
It's only obsolete for cost-cutting reasons. You just named two games that look incredible and use it effectively. You're telling me a 2D game built on the Alyx engine with Valve quality would lose sales to AC Shadows solely due to graphics? I don't think graphics take sales away from truly great games; they can only give a sales bump to mediocre or bad games. You could certainly build on Alyx and make it look perfectly modern. When Half-Life 3 releases in September with glorious MSAA options, I expect a written apology.
It's obsolete because it doesn't play nice with deferred renderers. The performance penalty is insane. It's not about cutting cost, it's about the tech being too old and not adapting to new rendering realities and paradigms.
 
It's obsolete because it doesn't play nice with deferred renderers. The performance penalty is insane. It's not about cutting cost, it's about the tech being too old and not adapting to new rendering realities and paradigms.
i feel like goalposts have moved and you guys are playing into his hand
just saying
 
No idea how you're able to separate the two things. They go hand in hand logically.
Because they are entirely different things. Proper RT is not an add-on; it's a way of rendering that is finally becoming possible to use in real time and will replace rasterisation. Many current implementations are just using parts of RT to replace certain parts of the rendering pipeline, but even just replacing GI, as in Doom, Metro, or AC Shadows, makes a big difference for development time and visual quality, as well as for what they can do with world size and destruction. Ray tracing itself has been around for decades; this isn't just some marketing fluff to justify buying new cards, we just finally have the hardware power to run it. I just don't think you actually understand RT at all.
 
Not sure if you read my post, but it said that ray tracing is not new. Looking at perf charts and card prices, the only cards that can do ray tracing are the 4090 and 5090. Value prop just isn't there for my use. If more and more games start running like TDA without upscaling I'll just start playing less. For what I would expect, anyone with a 5080 has a substandard card, and they paid 2x the price of a flagship for it.
Requiring x80/x90-class GPUs is nonsense; even the Series S runs the RT in Doom fine, lol. Basically any modern mid-range card from the last few years can run basic RT fine.
 