DF: Doom: The Dark Ages "Forced RT" Backlash - Is It Actually Justified?

I just recently accepted that they replaced horses with cars, and that there's a better way to tell the time of day than looking at the sun's position (apparently it's called a watch),
and now they're putting RT into video games?
It's a deep state trick to make us believe the earth isn't flat! 🤬
 
I asked ChatGPT to calculate how many RT and non-RT users there are on Steam:
According to the Steam Hardware & Software Survey for April 2025, about 55% of Steam users have graphics cards that support hardware Ray Tracing. This means that approximately 45% of players are using GPUs without Ray Tracing support.
It probably just used this Reddit post from 10 months ago.



It should be higher now.
 
The problem with RT was always that the visual gains were never worth the performance cost. Maybe in the future, when the overall fps drop is smaller. But back when it premiered, or even now? Nope.

I always turn it off (if possible) to get more performance, but forcing people to have it is not a good idea. Again, I'm talking about the current situation, where it can have a significant impact on performance or completely lock out people with older GPUs.

It's even more depressing when you look at Doom 2016 and Eternal. Both games could be played even on very weak specs. Now it's almost the opposite, locking many people out of playing on PC. Same with the VRAM requirement. You have a gaming laptop with a 6 GB 3060? Well, sucks for you.
 
I gave it the Steam survey URL. It even wrote April 2025. So I don't know, maybe something's missing in its code.
I just asked:

"Estimating the Percentage

By aggregating the usage percentages of all GPUs known to support hardware-accelerated ray tracing, we arrive at an approximate total of 55% of Steam users with such capable hardware. This estimation aligns with a detailed analysis conducted by a community member, who compiled a comprehensive list of ray tracing-capable GPUs and their respective shares"

And then it cites that Reddit post. So yeah, don't trust generative A.I.
 
People are just salty that their crusade against RT might end up with them eating crow: if they saw how beneficial it actually is, they'd be forced to recognize it. At least that's the case for some of them.

I'm against RT in games only when it's done as a gimmick, because making it optional and shipping a half-assed "solution" is what gave it the bad reputation it has among gamers. The way Doom implements it is the right way and should be mimicked by everyone.

People are dummies. 4 years ago we had this video from DF about Metro Exodus EE (first game requiring RT) and how it speeds up development:



Look how fucking great this bounce lighting looks, lighting up a single room:



Now imagine doing that to huge levels, how much time can be saved? And fully dynamic lighting will end up looking better anyway (it can also interact with dynamic objects).

Based on some replies in this thread, no wonder Sony devs are stuck with PS4 graphics quality but in 4K/60 (without RT lighting, mesh shaders, virtual shadow maps, Nanite-like tech, etc.).
 
The funniest part is you know it's PC users crying about it. The narrative that console users hold back gaming hasn't been true at all this generation. It's PC users with their shitty rigs who refuse to move on from their Pascal cards and cry foul when a new game doesn't run on their 8 year old GPUs.

Pascal had a good run. It was a legendary generation, but it's a console generation old. Time to move on from it.
 
The funniest part is you know it's PC users crying about it. The narrative that console users hold back gaming hasn't been true at all this generation. It's PC users with their shitty rigs who refuse to move on from their Pascal cards and cry foul when a new game doesn't run on their 8 year old GPUs.

Pascal had a good run. It was a legendary generation, but it's a console generation old. Time to move on from it.

Yup

SlimySnake nailed it with "PC peasant race".

But devs shouldn't hold back. The majority of cards are RT-capable now.

Devs should push the envelope again on PC. This is what PC gaming used to be, not bargain hunting that keeps a 970 alive for over a decade while expecting it to run modern games.
 
I gave it the Steam survey URL. It even wrote April 2025. So I don't know, maybe something's missing in its code.
Just calculated it myself: 64% of the GPUs have RT support. But 9% are marked as "Other" because their numbers are too small to be listed individually. So of the known GPUs listed, 70% have RT support.
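The arithmetic behind the 64% vs. 70% figures is just summing the survey shares of RT-capable models, then renormalizing once the "Other" bucket is excluded. A minimal sketch; the model names, shares, and prefix mapping below are made up for illustration, not real survey rows:

```python
# Toy survey rows: GPU model -> % share. Illustrative numbers only,
# NOT the actual Steam Hardware Survey data.
survey = {
    "NVIDIA GeForce RTX 3060": 4.5,   # hardware RT
    "NVIDIA GeForce GTX 1650": 2.1,   # no hardware RT
    "AMD Radeon RX 6600": 1.2,        # hardware RT
    "Other": 9.0,                     # models too small to list individually
}

# Prefixes of RT-capable product lines (assumed mapping for this sketch).
RT_PREFIXES = ("NVIDIA GeForce RTX", "AMD Radeon RX 6", "AMD Radeon RX 7", "Intel Arc")

rt = sum(share for model, share in survey.items() if model.startswith(RT_PREFIXES))
known = sum(share for model, share in survey.items() if model != "Other")

print(f"RT share of all listed GPUs: {rt:.1f}%")
print(f"RT share of known GPUs only: {rt / known * 100:.1f}%")
```

With the poster's actual totals (64% RT-capable, 9% "Other"), the second figure works out to 64 / 91 ≈ 70.3%, which matches the 70% claim.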
 
The question isn't how many people technically have RT cards. The question is how many people still have technically-RT-enabled cards that fall into the dogshit "low video settings" column, and the answer is "a whole hell of a lot."
 
The question isn't how many people technically have RT cards. The question is how many people still have technically-RT-enabled cards that fall into the dogshit "low video settings" column, and the answer is "a whole hell of a lot."
You get 55-70 fps at 1080p DLSS Quality with a 3060 at the High preset (which adds ray-traced reflections that aren't enabled on base consoles, so you can disable them to stay at 60+ fps all the time).


DLSS 4 Quality in this game looks better than native 1080p.

So it's all fine, really.

Doom Eternal also targeted 1080p 60-80 fps on a 1060 at High settings.



You could argue that the 1060 did it at native 1080p while the 3060 relies on upscaling, but that falls apart when the 1060 has to use blurry TAA at native 1080p while the 3060 gets a sharp, clean image with DLSS 4. So technically the 3060 gets a better graphics/physics/destruction experience in Dark Ages than the 1060 did in Eternal at similar framerates.

Even at the Low preset, Dark Ages looks better than Eternal anyway, thanks to ray-traced GI.
 
Just calculated it myself: 64% of the GPUs have RT support. But 9% are marked as "Other" because their numbers are too small to be listed individually. So of the known GPUs listed, 70% have RT support.
That's far too low. It should be 90%, but NVIDIA cutting Lovelace supply short in favor of Blackwell garbage didn't help.
 
The question isn't how many people technically have RT cards. The question is how many people still have technically-RT-enabled cards that fall into the dogshit "low video settings" column, and the answer is "a whole hell of a lot."
Not bad: looking at the GPUs that offer the same performance/quality as a PS5 in this game, it comes to about 50% of the known GPUs on the Steam survey.
 
That's far too low. It should be 90%, but NVIDIA cutting Lovelace supply short in favor of Blackwell garbage didn't help.

The 9070 XT, which apparently "everyone" is buying over Nvidia on social networks, hasn't even managed to enter the list. I don't get it. Fake supply, fake MSRP, I guess.

Honestly fuck both of them

I hope Intel foundry makes them a good cheap GPU
 
Not bad: looking at the GPUs that offer the same performance/quality as a PS5 in this game, it comes to about 50% of the known GPUs on the Steam survey.
Problems arise when even PS5 users themselves gaslight others into thinking ray tracing is useless or worthless.

You can see a lot of PS5 users complaining about how Doom Eternal had higher resolutions at 120 fps and how Dark Ages is blurry or doesn't have a 120 fps mode, and so on.

I've seen countless people saying AC Shadows sucks, running at 1440p 30 fps for the sake of ray tracing, and doesn't look better than this or that first-party game running at 4K 60 fps with crisp visuals.

I'm quite sure the majority of this backlash is actually coming from owners of ray-tracing-capable hardware. They just want their last-gen 4K 120 fps experience in this new game lol
 
Who cares? Play video games and have fun. This is just the usual daily reminder of why I maintain that gamers have ruined gaming. Imagine caring what people think about lighting and reflections in games that already look amazing as is. This is the segment of the community at large that is embarrassing. Bring on the hate.


Same thing happens to just about anything. It starts out as some really cool shit then someone goes and builds a band wagon for it, and a bunch of goofy mutha fucks hop on it. It's like going from Public Enemy to Lil Pump, or from Black Sabbath to that trash I hear passing by Hot Topic at the Florida Mall.
 
It was not the same situation, in the sense that back then they were trying to improve graphics and do something that wasn't possible in any way before, not trying to make devs' lives easier.

The discourse here is not "we can't do this with this type of lighting"; it's "we can do it, but it will take more time," and then "you need to accept this because that's how things have always been."

Except it wasn't.

Well, it's also not quite the same, because back then it was a 2-year-old card that couldn't play a rapidly growing majority of new games anymore, while these days it's 8-year-old cards that can't play a small minority of new games.

So it was different back then, but not in a positive sense.
 
I would argue that in 2006/2007 you could have made a similarly good looking game with SM 2.0 as you could have with SM 3.0. It just made developers' lives easier by enabling more complex shaders.
The only thing that can make devs' lives easier is literally trying to simulate reality, since then they don't need to fake everything; that's why they're pushing for things like RT.

SM 3 did no such thing; it only increased the complexity of games.
 
I place more blame on the hardware manufacturers, Nvidia and AMD.

A person who bought a GTX 1060 before, had the power of a GTX 980, the previous high-end. And it still had more VRAM. The same thing in the 3000 generation, with the RTX 3060 being very generous in VRAM, although this time it was only comparable to the 2070.

Now that RT games are starting to appear in earnest, the current 4060 and 5060 are not very good at handling these games, they are little better than the previous generations and even have less VRAM. And the prices are also higher, because as mentioned before, the cost of living has increased.

It seems that people do not understand the extent of the damage that the pandemic has done to the economy and people's lives. They do not have money to spare for major hardware upgrades.

One thing I notice about Alex Battaglia, partly because he is a member of the Beyond3D forum, is that he and the users there don't play games on a daily basis. They like technology and studying the subject. That's why they seem disconnected from the real world. They say that a 5060 with 8 GB is enough, just lower the quality, but if they played with these GPUs they would see that it's not quite that simple and the experience isn't the best. And as I said above, people used to have better experiences with their mid-range cards.

Nvidia doesn't like this, because people have been spending years with the same GPU. So they've adopted Apple's way of segmenting the market, where you find flaws in the lower tiers until things only get fixed in the more expensive products. It's a huge scam.
 
The only thing that can make devs' lives easier is literally trying to simulate reality, since then they don't need to fake everything; that's why they're pushing for things like RT.

SM 3 did no such thing; it only increased the complexity of games.
It's a given that games were and are increasing in complexity since developers are constantly trying to improve the fidelity of the simulation.

But for a given performance and graphics target, anything that saves optimization time, makes developers lives easier. So if you don't need to fit a shader down into 96 instructions (SM 2.0), that's going to make your life easier.
 
Far less?
Have you seen the Steam survey?
Like 90% of PCs on Steam are DXR-capable.
The 10% that aren't likely wouldn't meet min spec anyway, or couldn't afford a $70 game in the first place, since a $70 game would be halfway to a DXR GPU for them.

RTX 3060s easily run this game above 60 fps using Ultra Nightmare settings... can't imagine how fast it runs at Medium or High settings.



And yes, iterating on baked lighting adds an incredible amount of time, data, and compression time to a game's development, especially for large games.

If you are making a static corridor shooter you could easily get away with baking, but a dynamic, large open world that also has indoor areas or large swaths of space adds a lot more work/time to the game.
The level design could be finished, gameplay systems on lock, and now you're just waiting for lighting artists to finish what they're doing... oh, the baked lighting makes an area not much fun to play because it's too dark? Well, iterate again... and bake again, rinse and repeat until we get there.

Or use real-time GI: lighting artists, environment artists, and texture artists can work in the same repo without waiting for each other to upload the latest build.


Do you guys also complain about DX9.0c features being used in video games? Would you like devs to go back to pre-DX9.0c techniques so more people can play?


You guys are fighting for a team that doesn't actually exist... you're complaining about a nonexistent problem simply because complaining is what gamers do these days.

Forward-looking techniques in PC gaming have been a thing for as long as PC gaming has been a thing.
People who get left behind don't actually cry foul; they upgrade their machines or buy a Series S.

There is no actual anti-forced-ray-tracing team... you aren't on that team, no one is on that team... why are you fighting for it?
First, there are a lot of other things at play in making a game besides graphics, and that will never change.

There are literally 2D games taking years to make.

If you save 5 years of development on a 15-year cycle, it's still a 10-year cycle, and still a lot of money.

The comparisons with DX9 are absurd, not only because people could actually see the difference back then, but because Moore's law was still in effect.

And that leads to the last point: what are devs going to do now that Moore's law is done? Will they keep piling on techniques that require a lot of GPU power until they're developing only for the 80 or 90 series GPUs? We already don't have the 50 series anymore, and the 60 series costs far more than it used to.

We're literally in the worst of all worlds now: the mechanism that made computers affordable is gone, and devs are pushing heavier techniques without any means of making hardware less expensive.
 
Well, it's also not quite the same, because back then it was a 2-year-old card that couldn't play a rapidly growing majority of new games anymore, while these days it's 8-year-old cards that can't play a small minority of new games.

So it was different back then, but not in a positive sense.
Moore's law is done; those were different times. People just can't understand why they have to pay double for a card to play games with very similar levels of graphical fidelity, and they're right. It's not the people who consume games who need to adapt to this new world, it's the people who make them.
 
Based on some replies in this thread, no wonder Sony devs are stuck with PS4 graphics quality but in 4K/60 (without RT lighting, mesh shaders, virtual shadow maps, Nanite-like tech, etc.).
It's funny, isn't it? We have weekly threads on how graphics have stagnated, how games barely look any better than they did last gen, and then you have the same people praising Sony games that offer native 4K or 1440p 60 fps performance modes without any next-gen features. They look nice, but they're cross-gen and people are OK with it. Until, of course, it's time to bitch about how nothing looks next gen.

People are just dumb. They don't know what they want. They don't understand the tech. They don't understand that consoles have limited power and can't do everything at high resolutions and 60 fps. But they still want it all.

I wish Doom looked better than it does. I think not investing in a Nanite-like solution hurt them here, especially since the game world is massive and they needed better asset quality. They ended up going with low-quality assets with minimal pop-in, but it doesn't look as nice and makes people think it's a downgrade from Eternal.

We should be having a discussion about id Software not using mesh shaders, not chastising them for actually using a next-gen feature like RT. Our priorities are all fucked up.
 
It's a given that games were and are increasing in complexity since developers are constantly trying to improve the fidelity of the simulation.

But for a given performance and graphics target, anything that saves optimization time, makes developers lives easier. So if you don't need to fit a shader down into 96 instructions (SM 2.0), that's going to make your life easier.
No, RT does not equal an increase in the complexity of games.

It equals an increase in the complexity of processing.

Those are two different things.

But of course, RT is only part of the problem; there are other things that do increase complexity, like the quantity and quality of models, draw distance, and so on.
 
I place more blame on the hardware manufacturers, Nvidia and AMD.

A person who bought a GTX 1060 before, had the power of a GTX 980, the previous high-end. And it still had more VRAM. The same thing in the 3000 generation, with the RTX 3060 being very generous in VRAM, although this time it was only comparable to the 2070.

Now that RT games are starting to appear in earnest, the current 4060 and 5060 are not very good at handling these games, they are little better than the previous generations and even have less VRAM. And the prices are also higher, because as mentioned before, the cost of living has increased.

It seems that people do not understand the extent of the damage that the pandemic has done to the economy and people's lives. They do not have money to spare for major hardware upgrades.

One thing I notice about Alex Battaglia, partly because he is a member of the Beyond3D forum, is that he and the users there don't play games on a daily basis. They like technology and studying the subject. That's why they seem disconnected from the real world. They say that a 5060 with 8 GB is enough, just lower the quality, but if they played with these GPUs they would see that it's not quite that simple and the experience isn't the best. And as I said above, people used to have better experiences with their mid-range cards.

Nvidia doesn't like this, because people have been spending years with the same GPU. So they've adopted Apple's way of segmenting the market, where you find flaws in the lower tiers until things only get fixed in the more expensive products. It's a huge scam.
This is not the hardware manufacturers; this is Moore's law ending.

If there's a physical barrier stopping prices from decreasing, it's the devs who need to understand this.

The thing is, they don't care. They live in their own reality and only start to care when their jobs are at risk.
 
I place more blame on the hardware manufacturers, Nvidia and AMD.

A person who bought a GTX 1060 before, had the power of a GTX 980, the previous high-end. And it still had more VRAM. The same thing in the 3000 generation, with the RTX 3060 being very generous in VRAM, although this time it was only comparable to the 2070.

Now that RT games are starting to appear in earnest, the current 4060 and 5060 are not very good at handling these games, they are little better than the previous generations and even have less VRAM. And the prices are also higher, because as mentioned before, the cost of living has increased.

It seems that people do not understand the extent of the damage that the pandemic has done to the economy and people's lives. They do not have money to spare for major hardware upgrades.

One thing I notice about Alex Battaglia, partly because he is a member of the Beyond3D forum, is that he and the users there don't play games on a daily basis. They like technology and studying the subject. That's why they seem disconnected from the real world. They say that a 5060 with 8 GB is enough, just lower the quality, but if they played with these GPUs they would see that it's not quite that simple and the experience isn't the best. And as I said above, people used to have better experiences with their mid-range cards.

Nvidia doesn't like this, because people have been spending years with the same GPU. So they've adopted Apple's way of segmenting the market, where you find flaws in the lower tiers until things only get fixed in the more expensive products. It's a huge scam.
Yes, they are both to blame, but if you care about pixels so much, then invest in a better card. They were selling 4070s and 7800 XTs for $500 last year. That would give you 2x the performance of a PS5, the same as the 1060 did.

We had $300 consoles in 2016, the year the 1060 came out for $300. We now have $550-600 consoles in 2025. Everything has gone up. Should we blame MS and Sony and not buy these consoles? And stick with our PS4s? Of course not. I just bought a $699 PS5 Pro, despite hating it, because if I want to play DS2 and GoT at their best, I have to spend that extra money. I traded in my PS5 to cover most of the cost, but it still cost me an extra $400 after taxes.

PC gamers need to stop going for the 60 series cards and upgrade to the 70 series cards. Yes, they are more expensive, but they offer more bang for the buck. Things have become more expensive everywhere. The GPU is not something you should skimp on if you're a gamer who doesn't want to play at 1080p 60 fps medium settings.
 
The problem is the "forced" part. All id had to do was give the user an option to turn it on/off, just like all other games.

And the question is not about us having had RT hardware for 7 years already. The problem is that RT causes huge performance drops.
For example, Doom Eternal on a 4080 can do 175 fps at native 4K.
But in Doom: The Dark Ages, the 4080 can only do 51 fps.
Losing roughly 3.4x the performance for marginal gains in image quality is not a good trade-off.
But if the option were there to turn it on/off, there would be no problem, as each person could choose what they wanted.
And this is the crux of the matter: choice.

[benchmark chart: Doom Eternal, 3840×2160]

[benchmark chart: performance, 3840×2160]
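Checking the arithmetic on the fps figures quoted above, the ratio and the equivalent percentage drop work out like this:

```python
# fps figures quoted above for an RTX 4080 at native 4K.
eternal_fps = 175    # Doom Eternal
dark_ages_fps = 51   # Doom: The Dark Ages

ratio = eternal_fps / dark_ages_fps
fps_loss_pct = (1 - dark_ages_fps / eternal_fps) * 100

print(f"~{ratio:.1f}x the frame rate, i.e. a {fps_loss_pct:.0f}% fps drop")
```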
This isn't a game where RT is an optional extra that enables additional effects or higher quality; it's now the core lighting system. You can't just turn it off, because it's the only lighting system in id Tech 8. Devs would need to spend a huge amount of time manually placing and tweaking lights, which would then look bad with the destruction and dynamic elements in this game; it would take far longer to make the game. RT-only is the obvious future, and many other developers are going this way too.
 
Unlike Control, which "forgot" how to rasterize cube maps/SSR and shipped completely blank textures so they could sell you on RT.

That was a really lazy way of going about it.

Insomniac and others handle it better by still rasterizing for those who choose not to use RT, instead of low-effort junk that looks like 20-year-old placeholders.
Insomniac games are now RT-only; they dropped fallback reflections in Spider-Man 2 and it's the same for future projects.
 
They are right. Back in the day, PC gaming was for enthusiasts. He brought up Half-Life 2 and Crysis, and those made us go, "OK, we need to upgrade or build a PC." Now the PC space is filled with cheap bastards who buy shitty 8 GB GPUs while refusing to upgrade their CPUs, ganging up on guys like Todd Howard who dare suggest that maybe you should move on from your 6-year-old CPU.

PC Master Race was annoying, but this new PC Peasant Race is the worst. Ragging on devs who are actually embracing tech that's 7 years old, so not even new anymore. Shitting on devs who try to use the very tech in the GPUs they paid for. They aren't just cheap, they're not gamers. They are posers who want to keep gaming stuck in the 50s. I look at the Steam top ten and am blown away by how many shitty dated fucking games are on there. They are barely better than mobile gamers. Go jerk off to Genshin Impact and leave hardcore gaming to hardcore gamers, thank you.
Hey, the thing that ruined PC gaming was the 1080 Ti and Titan Xp. High-end parts never got this much traction, to the point that even RT-only titles like Alan Wake 2 decided not to annoy those users and went back and scaled features down to make it work.

AMD 6000 series / Nvidia 3000 series should be everyone's minimum.
 
I asked ChatGPT to calculate how many RT and non-RT users there are on Steam:
According to the Steam Hardware & Software Survey for April 2025, about 55% of Steam users have graphics cards that support hardware Ray Tracing. This means that approximately 45% of players are using GPUs without Ray Tracing support.

Your prompt was probably wrong, or ChatGPT is misreading the data.

It's closer to 30%... but that's including a bunch of integrated Radeon graphics, Intel Iris, Intel UHD, "Intel Genuine," and GT 1030s: parts basically designed only to display stuff, not actually game with. Take out the RT and the game still wouldn't run on them.

If you count actual graphics cards, the number drops closer to 20%. And a bunch of those are 3 GB and 6 GB GPUs with so few cores they probably wouldn't be able to load this game's menu even if it didn't have ray tracing.

So in reality, Doom: The Dark Ages has locked out the 20 or so percent of people still running GTX 900, 10, and 16 series and RX 500 series cards.

Don't pity them; the computer was probably a hand-me-down and they don't even know there's a discrete GPU in their system.

I just counted the GPUs that don't support DXR; it's much faster than trying to count the GPUs that do.
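The poster's shortcut, counting the GPUs that lack DXR support and then filtering out display-only parts, can be sketched the same way. The models and shares below are made up for illustration, not real survey rows:

```python
# Toy rows for models WITHOUT hardware DXR. Shares are made up for illustration.
non_dxr = {
    "NVIDIA GeForce GTX 1060": 2.0,    # real gaming card, no RT
    "NVIDIA GeForce GTX 1650": 2.2,    # real gaming card, no RT
    "Intel(R) UHD Graphics": 1.5,      # display-only iGPU
    "AMD Radeon(TM) Graphics": 1.0,    # display-only iGPU
    "NVIDIA GeForce GT 1030": 0.3,     # display adapter, not a gaming card
}

# Parts that couldn't run the game even without RT (the poster's argument).
# Note the trailing space in "GT " so the prefix doesn't also match "GTX".
DISPLAY_ONLY = ("Intel(R) UHD", "AMD Radeon(TM) Graphics", "NVIDIA GeForce GT ")

total = sum(non_dxr.values())
gaming_only = sum(s for m, s in non_dxr.items() if not m.startswith(DISPLAY_ONLY))

print(f"All non-DXR parts:           {total:.1f}%")
print(f"Actual non-DXR gaming cards: {gaming_only:.1f}%")
```

This is the distinction behind the poster's "closer to 30%" vs. "closer to 20%" figures: the gap is made up of parts that couldn't run the game regardless of ray tracing.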
 
This isn't a game where RT is an additional extra to enable extra effects or higher quality, it's now the core lighting system, you can't just turn it off as it's that's the only lighting system in ID Tech 8. Devs would need to spend a huge amount of time manually placing and tweaking lights which would then look crap with the destruction and dynamic elements in this game, it would take far longer to make the game. RT only is the obvious future, many other developers are going this way too.
Yep. Nearly all the major devs have switched to RTGI.

- Rockstar - GTA 6
- Ubisoft - AC Shadows, Far Cry 7
- Massive - Star Wars Outlaws / Avatar
- Respawn - Star Wars Jedi: Survivor
- Capcom - RE Village and Dragon's Dogma 2
- Konami - Professional Baseball Spirits, MGS3 Snake Eater, Silent Hill f, Silent Hill 2 Remake (software real-time GI)
- Bethesda - Starfield (software-based real-time GI)
- id Software - Doom
- MachineGames - Indy
- Remedy - Alan Wake 2 (PC only)
- EA Motive - Dead Space Remake (software-based)
- Warhorse - KCD2 (software-based)
- Every single UE5 game - Software Lumen
- Fortnite - Hardware Lumen
- Insomniac - Wolverine - according to leaks. Could be software-based, but it's fully real-time GI.
- NCAA Football - first Frostbite game to use RTGI
- CD Projekt - Cyberpunk (PC only), Witcher 4 everywhere
- Forza Motorsport - software-based; hardware RTGI on PC
- Metro Exodus - first RTGI game on consoles


It seems most if not all of the industry has embraced RTGI, be it software-based or hardware-based. It's very obviously the present, let alone the future, of games development.

The only ones being stubborn are Sony studios for some reason, but Insomniac is moving to real-time GI with Wolverine, and while real-time GI isn't necessary to push great visuals, it has definitely allowed smaller devs like Bloober, Sandfall, and Chinese and South Korean developers to create some massive games without massive teams or budgets.
 
You can play it without RT, and it uses the same rasterized cube map reflections it always has.
Strange, must be a PC-only option for older cards; the PS5 version only has RT for reflections. Wolverine will have RTGI, so more RT is definitely the direction they're pushing in.
 
I dislike forced RT because it should always be an option. While I have the hardware for it, not everybody does.

PC gaming was always known for customizing settings to your liking. Don't force shit on anybody.
If you followed that line of reasoning, we would never see any technical advancement.
 
Strange, must be a PC-only option for older cards; the PS5 version only has RT for reflections. Wolverine will have RTGI, so more RT is definitely the direction they're pushing in.
Yes. On PC it can be disabled, which is what the entire "backlash" in this thread about a PC game amounts to.
 
Moore's law is done; those were different times. People just can't understand why they have to pay double for a card to play games with very similar levels of graphical fidelity, and they're right. It's not the people who consume games who need to adapt to this new world, it's the people who make them.

We are talking about 8-year-old GPUs here... you can get a used 160€ GPU that will outperform them and will probably be in better shape as well.

If you have a non-RT Nvidia card, the best you could have is a 1080 Ti. Buy a used 2070 and you'll have a GPU that can play anything modern.
But if you're someone who bought a 1080 Ti, you've probably upgraded a long time ago.

So we're probably mainly talking about people with a 1070 or below here, at which point even a 2060 Super would be an upgrade, and that can be had for 140€ used.

I don't think a 140€ upgrade after 8 years, to be able to play all modern games, is bad.
 
This is a different question. With a franchise like Doom you are not really aiming only at generic "full-price game a month" AAA enjoyers. You might be aiming at people whose main gaming years were the 90s and who don't really give a shit anymore because they have a life. You might be aiming at some gym bro who was stoked by Doom 2016 and otherwise only plays EA FC or something. The "AAA conversation" is supposed to reach as wide a market as possible, because they need to make big numbers.

Which cards on the Steam survey do you think would really benefit from this game NOT having RT, whose owners would actually have bought Dark Ages and played it?

Because that 1650, 1050, or 1060 guy was never going to be able to play this game, with or without RT.

The 1080 guys make up 0.something percent.
And don't even talk to me about the 970 guys.

So those lost sales would be a rounding error.
 
No, RT does not equal an increase in the complexity of games.

It equals an increase in the complexity of processing.

Those are two different things.

But of course, RT is only part of the problem; there are other things that do increase complexity, like the quantity and quality of models, draw distance, and so on.
I never said RT made games more complex. I was making four points:

1.) The fidelity (or complexity) of games has always been increasing, driven by the desire to create ever more realistic worlds
2.) Achieving the desired level of fidelity often requires "optimizing" for the technology the game is running on, and its inherent limitations. And that this is part of the "work" developers perform.
3.) Removing technological limitations can save optimization time, making developers' lives easier.
4.) Shader Model 3 removed limitations on shader instructions, and so saved optimization time, as per 3.)
 
No, RT does not equal an increase in the complexity of games.

It equals an increase in the complexity of processing.

Those are two different things.

But of course, RT is only part of the problem; there are other things that do increase complexity, like the quantity and quality of models, draw distance, and so on.
RT simplifies/accelerates other processes, allowing more resources to be funneled elsewhere.
 