
Nearly 7 years later, Ray Tracing still isn't worth the performance hit, nor does it enhance the experience much.

Man, come on, I have a 4080 and even that isn't capable of decent RT with path tracing in the games that matter.
But it's capable of great RTGI, which is already a huge step up from baked lighting.

Don't be brainwashed into thinking you have to have PT to get an amazing lighting solution... you're 70% of the way there with a full suite of regular old RT.
 
I used to believe ray tracing was a gimmick until CD Projekt sent out the path tracing update. I was in my in-game apartment, noticing a few of the lights from the equipment spilling out nicely, but it felt unimpressive given the hype.

Then I opened the window blinds and my jaw dropped. Outside of real life, I'd never experienced anything like opening the curtains to the morning sun, the way it cast light onto my face and across the room; it was surreal. From that point on, I've measured any game's RT against that standard, and I cannot play Cyberpunk without it.
 

BbMajor7th

Member
For me, The Witcher 3 is an appreciable upgrade and now runs very solidly on PS5 - it's the only (console) upgrade that seemed worth it overall and makes for an excellent replay.
 

bundylove

Member
RT in VR games is next-gen to me.

In regular games... I don't think it's worth it.

But out of all the RT features, only reflections and GI make a visual impact, to me.

But I would trade everything RT for more physics: any type of fluid simulation, smoke simulation, etc.

Also, give me more gameplay mechanics through physics.

Guys, I don't know what the hell happened over the past 20 years. If you played Red Faction, you would know what I mean. I could get into any room by blowing through the walls, or blow a hole in the ground to drop down a floor. Even the enemies you killed went cold in real time, meaning with the thermal vision gun you could watch the bodies cool. I have never seen anything like it since.

 

Ronin_7

Member
It'll take another decade before ray tracing is deployed across every game, but many quantum leaps are still inbound.
 

Mattyp

Not the YouTuber
I really wish everyone could get and play on a 4090. There are times when I'm genuinely blown away, and other times when the performance hit is so bad I'd rather not use it. Indiana Jones and Cyberpunk are examples of must-plays with everything turned on. Some other games are a solid NO.

This. It's 100% worth it, and a generational jump forward in Indy and Cyber.

If you have the hardware, it's more than worth enabling everything and watching the light shine down.
 
Man, come on, I have a 4080 and even that isn't capable of decent RT with path tracing in the games that matter.
Nonsense...
A 4080 gets around 60 fps in CP2077 path tracing at 1440p with DLSS/ReSTIR, without having to resort to FG.
About the same goes for Indy with PT, Wukong, AW2, etc.
And those are the heaviest RT/PT workloads currently out there...
 

Jesb

Member
Lately I'm finding the best option for playing with ray tracing is to just drop the resolution to 1440p. I'm not sure what the best solution is, since how well ray tracing is implemented varies from game to game, but at 1440p the majority of games will run flawlessly, while at 4K certain games can be a hot mess.
 

winjer

Gold Member
A 4070 can enable Ultra RT at DLSS Quality in Cyberpunk 2077. Close to 100fps with frame generation enabled. How is that not fine?

That is terrible. It means the game is running at a base frame rate of 40-50 fps. Even with Reflex, that is too much input lag.
I would rather turn off RT and play at a real 100 fps, with only DLSS Quality enabled, and also turn on Reflex, for a really smooth frame rate and very low input lag.
CP2077 already looks very good without RT.
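To spell out the arithmetic behind that base-rate claim, here's a rough sketch (it assumes 2x frame generation and ignores FG's own processing overhead, so real input lag is somewhat worse than this):

```python
# With 2x frame generation, only every other displayed frame is actually
# rendered, so the game simulates and samples input at the base rate.

def base_framerate(displayed_fps: float, fg_factor: int = 2) -> float:
    """Frames per second actually rendered beneath an FG'd output."""
    return displayed_fps / fg_factor

def frame_interval_ms(fps: float) -> float:
    """One frame interval in milliseconds."""
    return 1000.0 / fps

for displayed in (80, 100):
    base = base_framerate(displayed)
    print(f"{displayed} fps displayed -> {base:.0f} fps rendered, "
          f"~{frame_interval_ms(base):.0f} ms per real frame")
```

At 100 fps displayed, the game only samples input 50 times a second (20 ms intervals), which is where the 40-50 fps figure above comes from.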
 
RT can be quite transformative if used correctly.


I don't think anyone critiquing RT or PT is saying it doesn't look good. It absolutely does. What we're saying is that the performance and resolution hits required to get there are not worth the tradeoff. If I have to choose between RT/PT at 900p upscaled to 4K running at 30 fps vs. native 4K at 60 fps, I'm going native 4K 60 every time. Visual clarity and response time are far more important to me while gaming than pretty lighting.
 

Zathalus

Member
That is terrible. It means the game is running at a base frame rate of 40-50 fps. Even with Reflex, that is too much input lag.
I would rather turn off RT and play at a real 100 fps, with only DLSS Quality enabled, and also turn on Reflex, for a really smooth frame rate and very low input lag.
CP2077 already looks very good without RT.
If you don't like FG then just disable it and run at a mostly locked 60 fps. Personally, I think the visual gains of Ultra RT in Cyberpunk are more than worth going from ~90 to ~60 fps.
 

winjer

Gold Member
If you don't like FG then just disable it and run at a mostly locked 60 fps. Personally, I think the visual gains of Ultra RT in Cyberpunk are more than worth going from ~90 to ~60 fps.

That is what I said. Turn off FG and RT. Turn on Reflex and DLSS Quality. And enjoy a high frame rate and very low input lag.
 

Zathalus

Member
That is what I said. Turn off FG and RT. Turn on Reflex and DLSS Quality. And enjoy a high frame rate and very low input lag.
Yes, and I'd rather enjoy the Ultra RT. 60 fps with Reflex is more than enough for a single-player game.
 

Axelon

Neo Member
Lately I'm finding the best option for playing with ray tracing is to just drop the resolution to 1440p. I'm not sure what the best solution is, since how well ray tracing is implemented varies from game to game, but at 1440p the majority of games will run flawlessly, while at 4K certain games can be a hot mess.
I noticed that for some reason, on my 3070 Ti, 4K DLSS is much worse than just doing 1440p DLSS, even with them at opposite quality modes (1440p Quality vs. 4K Ultra Performance). A VRAM issue, I'm guessing? I'm not sure, as 4K Ultra Performance renders at a lower resolution than 1440p Quality.

(Going off of Reddit's list of resolutions from Control.)
4K Ultra Performance = 1280x720
1440p Quality = 1707x960
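Those numbers check out if you plug in the commonly cited per-axis DLSS scale factors (Quality at roughly 2/3 of output resolution, Ultra Performance at 1/3); a quick sketch, with the helper function just for illustration:

```python
# Commonly cited per-axis DLSS render-scale factors (approximate).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution before upscaling to (out_w, out_h)."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "Ultra Performance"))  # (1280, 720)
print(internal_res(2560, 1440, "Quality"))            # (1707, 960)
```

So if 4K Ultra Performance still runs worse than 1440p Quality despite the lower internal resolution, the extra cost likely lives in the upscale pass itself and in the 4K output buffers, which also points at VRAM.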

1440p looks great on my 48" LG OLED anyway, with only a very slight sharpness reduction from where I sit (about 6.5 feet). I'm waiting for the 6xxx series, so it's something I'll have to live with until then.
 

DirtInUrEye

Member
Man, come on, I have a 4080 and even that isn't capable of decent RT with path tracing in the games that matter.

As someone with a card the next step down from yours (a Ti Super), I find this comment puzzling. With a couple of smart adjustments to raster settings, I've played through the vast majority of Indiana Jones with the full suite of path-traced effects enabled at 60 fps (locked for consistency), with DLSS Performance upscaled to 2160p and without frame gen. I've noticed most biomes will run at Balanced, but I can't be bothered dipping in and out of settings. Only in the jungle zones do I completely disable the path-traced reflections on materials, as that setting is a killer while easily being the least impactful RT feature in the game.

I'm guessing 60 fps isn't deemed a high enough performance target for your preferences. I get it, but I don't mind it at all in slow-paced titles myself. Plus, Indy has a really nice post-processing pipeline anyway.
 

darrylgorn

Member
None of us is playing Minecraft, and of those who do play Minecraft, only a small minority play with RTX. You failed. Next (y)

Anyway, OP, you are correct.


Lens flare can stay. I would also consider volumetric clouds and soft shadows with modern cards.

Everything else is trayash.
 

MacReady13

Member
I sometimes wonder how much better gaming would've been if ray tracing was around when I was younger. I mean, let's face it: Super Mario World is a brilliant game, but imagine ray tracing as Mario flies through the underground caverns? The game would've been elevated to astronomical levels.

And Super Metroid? Imagine ray tracing in THAT game?! It's a classic now, but ray tracing would've turned it into the greatest game of all time! Ray tracing elevates gameplay. It enhances control methods. It's EASILY worth spending every cent to have ray tracing in every game available. I'd rather play a game at 15 frames a second to have some real-time reflections, as that is what makes me more immersed in a game.
 
Oh god, there is a lot of nonsense to unpack here.
  • If you have trouble finding examples where the addition of RT has dramatically improved a game's visuals, then you are not familiar enough with the topic
  • Yes, RT is welcomed by marketing departments because it allows for easier, visual marketing. That doesn't mean the tech is less relevant
  • It's absolute nonsense that HL2 improved in-game physics to a point that was never surpassed. No idea where that comes from. The difference is that HL2 put physics front and center, because that was the innovation at the time. For later titles, the level of physics simulation HL2 had simply became the minimum every player expects, and is therefore integrated; it's just not put front and center. And HL2's physics weren't even that great: you notice that once a grenade launches a Combine soldier on a 100-foot ragdoll toss. That looked so silly that a lot of game devs moved away from HL2's overly simplistic ragdoll model. The real problem with in-game physics, however, is that it doesn't sync well over the wire in multiplayer games (see the sketch below). And since we mostly play multiplayer games these days, that topic is dead
  • Actually, RT does allow for interesting gameplay elements; it's just that devs can't pursue them, since the base they target owns a mix of RT and non-RT hardware. This argument goes back as far as id Tech 4 and Doom 3's multiplayer, where people said the real-time stencil shadows could make for interesting multiplayer moments, with a player's shadow giving away his position, etc.
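On that networking point, a back-of-the-envelope sketch (all byte sizes and the scenario are assumptions for illustration): replicating free-moving debris means shipping each body's state to every player every tick, and the cost grows linearly with object count.

```python
# Rough cost of replicating rigid-body state in a multiplayer game.
# Assumed quantized sizes: position 6 bytes, rotation 4, velocities 8, id 2.
BYTES_PER_OBJECT = 6 + 4 + 8 + 2  # 20 bytes per body per tick

def replication_mbps(objects: int, tick_rate_hz: int, players: int) -> float:
    """Server upstream needed to sync `objects` rigid bodies to all players."""
    bytes_per_sec = BYTES_PER_OBJECT * objects * tick_rate_hz * players
    return bytes_per_sec * 8 / 1_000_000  # megabits per second

# A Red Faction-style wall collapse: 500 loose chunks, 30 Hz, 16 players.
print(f"{replication_mbps(500, 30, 16):.1f} Mbps")  # ~38.4 Mbps upstream
```

The alternative, deterministic lockstep, avoids the bandwidth entirely but requires bit-identical simulation on every machine, which floating-point physics engines rarely guarantee across hardware; hence physics tends to stay cosmetic in multiplayer titles.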

Now, is ray tracing nicer? Yes, in some cases, but even then not by a whole lot. Certainly not to the point where it's worth spending an extra $1,000 or more just to be able to run it well, or playing with a massive performance penalty otherwise.

I feel like ray tracing is a lazy add-in to help sell video cards and drive marketing initiatives. I wish more effort were put into developing vibrant worlds that you can interact with: less focus on upscaling and ray tracing, more focus on physics and how gameplay changes with the environment.

Two decades later, Half-Life 2 is still one of the best examples of physics. I expected way more evolution on this front than better shadows and contrast effects.

And yes, gameplay innovation over graphics innovation.
 
Neither Lumen nor SVOGI relies on pre-computed lighting data. Both are a form of voxel-based ray-traced lighting, although Lumen mixes in some screen-space techniques. Lumen can use pre-calculated data for static parts of the scene to speed up rendering, but it is not a requirement.
Ah, thanks for the clarification. I thought using pre-computed data, e.g. for indirect lighting, was a requirement.
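For intuition, here's a toy sketch of the voxel idea (this is not Lumen's or SVOGI's actual implementation; the grid size, scene, and sample counts are made up): light is injected into a coarse 3D grid, and indirect lighting is gathered at runtime by marching rays through the voxels instead of reading a baked lightmap.

```python
import numpy as np

# Toy voxel GI: radiance lives in a coarse 3D grid; indirect light at a
# point is gathered by marching rays through the grid at runtime,
# so no pre-baked lighting data is required.
N = 32
radiance = np.zeros((N, N, N))
radiance[16, 28, 16] = 50.0  # one bright emissive voxel near the "ceiling"

def march(origin, direction, steps=64, step_size=0.5):
    """Accumulate radiance along a single ray through the voxel grid."""
    pos = np.array(origin, dtype=float)
    gathered = 0.0
    for _ in range(steps):
        pos += direction * step_size
        i, j, k = np.floor(pos).astype(int)
        if not (0 <= i < N and 0 <= j < N and 0 <= k < N):
            break
        gathered += radiance[i, j, k]
    return gathered

# Indirect light at a floor point: average rays over the upward hemisphere.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(128, 3))
dirs[:, 1] = np.abs(dirs[:, 1])  # keep directions facing up (+y)
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
gi = np.mean([march((16.0, 1.0, 16.0), d) for d in dirs])
print(f"indirect light gathered at floor point: {gi:.2f}")
```

Move the emissive voxel and the gathered value changes on the next frame; that responsiveness to dynamic lights is exactly what baked solutions can't offer.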
 
RT and PT are awesome for screenshots, but the trade-offs and costs are not worth the investment, imho. If video games were predominantly enterprise-grade products, then sure, I'd be all in favour of it. But gaming is fundamentally a B2C market; the vast majority of buyers couldn't give a rat's ass about shadows and reflections. They'd prefer that the game just plays well, hence why most prefer FPS over fidelity. Though of course we'd all choose the RT/PT options if FPS wasn't the tradeoff.
 
RT in VR games is next-gen to me.

In regular games... I don't think it's worth it.

But out of all the RT features, only reflections and GI make a visual impact, to me.

But I would trade everything RT for more physics: any type of fluid simulation, smoke simulation, etc.

Also, give me more gameplay mechanics through physics.

Guys, I don't know what the hell happened over the past 20 years. If you played Red Faction, you would know what I mean. I could get into any room by blowing through the walls, or blow a hole in the ground to drop down a floor. Even the enemies you killed went cold in real time, meaning with the thermal vision gun you could watch the bodies cool. I have never seen anything like it since.



Red Faction was amazing.

I agree with you. What the hell happened in the past 20 years?
 

simpatico

Member
Fully agree. I'm playing Uncharted 4 right now on Steam, and there are absolutely moments when it reaches UE5 lighting quality. Here's the kicker: there is no temporal soup, and I'm running it on a GTX 1080 at 80 fps. Sharp as can be. Compare that to a Silent Hill 2 or an Immortals of Aveum, where you need 4x the flops for a 0.2x gain. You could even argue there is no gain because of the temporal diarrhea everywhere.

The argument that it's "worse" is flimsy and subjective; I'll give you that. But the cost of going from Uncharted 4 to Immortals of Aveum is absolutely the worst return on investment in gaming hardware history. That cannot be denied.
 

manfestival

Member
The only game that really makes me care about RT is Cyberpunk, but I haven't played it in a minute. Looking forward to trying out Phantom Liberty at some point. I do run every game with ray tracing, even the competitive ones. It never feels like it matters, even with a 4090. The only thing about the 4090 is that, out of all the GPUs I have used, it is the first and only one where ray tracing is like a 90% smooth experience. Every other GPU has had some or consistent FPS drops, "chugging." I still find this to be mostly a gimmick, even with people constantly shilling for Nvidia because of MUH RAY TRACING cores.
 

CrustyBritches

Gold Member
Fully agree. I'm playing Uncharted 4 right now on Steam, and there are absolutely moments when it reaches UE5 lighting quality. Here's the kicker: there is no temporal soup, and I'm running it on a GTX 1080 at 80 fps. Sharp as can be. Compare that to a Silent Hill 2 or an Immortals of Aveum, where you need 4x the flops for a 0.2x gain. You could even argue there is no gain because of the temporal diarrhea everywhere.

The argument that it's "worse" is flimsy and subjective; I'll give you that. But the cost of going from Uncharted 4 to Immortals of Aveum is absolutely the worst return on investment in gaming hardware history. That cannot be denied.
Isn't this an apples to oranges to mangoes comparison?
 
The problem with RT is that it tanks performance. Engineers and software developers need to implement RT in a way that no longer tanks performance (measured in pixels and frames per second), similar to how, back in the old days, polygon count was the performance limitation. Yes, we have DLSS, FSR, and XeSS (Intel's version), so we will see what the latest versions have to offer. We are getting there...
 
Minecraft is the prime example where RT makes a massive difference, because the game was built with a simplistic voxel-based lighting model in mind, so that's an isolated case. As for other games, it's hit or miss, and it often eats performance like crazy to give you shinier-looking reflections in places where none should be; nothing like making a hardwood floor look like it's covered in jello, or creating artificial shiny water puddles in areas where there's a ceiling, amirite?

They then try to push an image-reconstruction AI down your throat so you can get that performance back, but you have to deal with the fact that what you're getting is that AI's approximation of what the image is supposed to look like at the target resolution, not what it actually looks like at native resolution. They can take their DLSS and PSSR and ram it right up their fucking arse, and I don't give a fuck if they want it whole or diced.

They also try to give you the illusion of higher framerates with a "get fake frames for free, not so free" type of thing, in order to justify their ridiculous GPU prices. Sorry, DLSS aficionados, but having something at native 4K vs. having an AI's approximation of what 4K is supposed to look like are two very different things, with their own very distinct connotations. If you don't believe me, go read Nvidia's whitepaper on DLSS; you'd be surprised at how it's actually described.

Nvidia and AMD have been promising native 4K at 60 fps since 20 fucking 16; where is it? Once you get to that point, you can actually go after ray tracing, which in its current, immature state is noisy in most games and gives reflective properties to areas and objects where it makes no sense (think chalkboards in Hogwarts Legacy). Try and solve those issues first, instead of jamming DLSS down people's throats to hide the lack of optimisation in games.

Now, I know this will make most DLSS diehards rain "LOL" reactions down upon this post, but think about it. Is it really this post that you're loling about, or the imperfections in DLSS, AI reconstruction, and ray tracing noise? You tell me.
 

CrustyBritches

Gold Member
Minecraft is the prime example where RT makes a massive difference, because the game was built with a simplistic voxel-based lighting model in mind, so that's an isolated case. As for other games, it's hit or miss, and it often eats performance like crazy to give you shinier-looking reflections in places where none should be; nothing like making a hardwood floor look like it's covered in jello, or creating artificial shiny water puddles in areas where there's a ceiling, amirite?

They then try to push an image-reconstruction AI down your throat so you can get that performance back, but you have to deal with the fact that what you're getting is that AI's approximation of what the image is supposed to look like at the target resolution, not what it actually looks like at native resolution. They can take their DLSS and PSSR and ram it right up their fucking arse, and I don't give a fuck if they want it whole or diced.

They also try to give you the illusion of higher framerates with a "get fake frames for free, not so free" type of thing, in order to justify their ridiculous GPU prices. Sorry, DLSS aficionados, but having something at native 4K vs. having an AI's approximation of what 4K is supposed to look like are two very different things, with their own very distinct connotations. If you don't believe me, go read Nvidia's whitepaper on DLSS; you'd be surprised at how it's actually described.

Nvidia and AMD have been promising native 4K at 60 fps since 20 fucking 16; where is it? Once you get to that point, you can actually go after ray tracing, which in its current, immature state is noisy in most games and gives reflective properties to areas and objects where it makes no sense (think chalkboards in Hogwarts Legacy). Try and solve those issues first, instead of jamming DLSS down people's throats to hide the lack of optimisation in games.

Now, I know this will make most DLSS diehards rain "LOL" reactions down upon this post, but think about it. Is it really this post that you're loling about, or the imperfections in DLSS, AI reconstruction, and ray tracing noise? You tell me.
Are you a console gamer who can't have it, or a PC gamer who won't turn it off?
 

Madflavor

Member
The more we've been inching along with absolute baby steps towards better visual fidelity these past 10 years, the more I've begun to realize how fruitless a pursuit it's all been. I mean, yes, it looks great, and there's definitely an audience for it, but the cost has been that games take MUCH longer to develop these days. And at the end of the day, most games I've been drawn to have prioritized style over realism.
 

DoubleClutch

Gold Member
The more we've been inching along with absolute baby steps towards better visual fidelity these past 10 years, the more I've begun to realize how fruitless a pursuit it's all been. I mean, yes, it looks great, and there's definitely an audience for it, but the cost has been that games take MUCH longer to develop these days. And at the end of the day, most games I've been drawn to have prioritized style over realism.

I’ll take COD4 online multiplayer in its heyday over any modern game.
 
Are you a console gamer who can't have it, or a PC gamer who won't turn it off?
I was a console gamer, 15 years ago. Now I'm on PC and heavily favour playing games at native resolution. Sorry, but dependence on AI is not my thing, especially AI that generates frames and tries to approximate how something looks.
 

FireFly

Member
They then try to push an image-reconstruction AI down your throat so you can get that performance back, but you have to deal with the fact that what you're getting is that AI's approximation of what the image is supposed to look like at the target resolution, not what it actually looks like at native resolution. They can take their DLSS and PSSR and ram it right up their fucking arse, and I don't give a fuck if they want it whole or diced.
The AI isn't trying to approximate how the image is supposed to look. That's what spatial AI upscalers like DLSS 1 did, and it's the method used for the PS5 Pro's legacy "Enhance Image Quality" option (not PSSR) and Auto SR in Windows 11.

That method of upscaling generates hallucinated detail, since the algorithm is essentially "guessing" what the missing information should be. We saw this with DLSS 1, and you can see it in Auto SR in the "painterly look" it can create (though it seems Auto SR delivers much better results).

Nvidia purposefully wanted to avoid this with DLSS 2, which uses AI not to guess how the image should look, but to decide which information captured from past frames is still valid and can be used to infer the colour of a given pixel. It's essentially a better form of TAA that can capture more detail. And since games apply TAA at their "native" resolution anyway, DLSS at native will provide much better image quality. As you lower the input resolution, the difference between DLSS and native TAA diminishes, with the general consensus being that DLSS Quality (66% resolution scale) looks similar to the native TAA output.
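To make the "better form of TAA" point concrete, here's a toy single-channel sketch of the temporal loop (this is generic TAA-style history validation, not Nvidia's network; the hard neighborhood clamp stands in for what DLSS 2's AI learns to do far better): reproject the accumulated history with motion vectors, reject what no longer matches the current frame, and blend in the new sample.

```python
import numpy as np

def temporal_accumulate(history, current, motion_px, alpha=0.1):
    """One TAA-style update on a grayscale frame: reproject history using
    per-pixel motion vectors, clamp it to the current 3x3 neighborhood
    (history validation), then blend in the new jittered sample."""
    h, w = current.shape
    # Reproject: fetch each pixel's history from where it was last frame.
    ys, xs = np.indices((h, w))
    src_y = np.clip(ys - motion_px[..., 0], 0, h - 1)
    src_x = np.clip(xs - motion_px[..., 1], 0, w - 1)
    reprojected = history[src_y, src_x]
    # Validate: clamp history to the local min/max of the current frame,
    # discarding stale data from disocclusions or lighting changes.
    padded = np.pad(current, 1, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (3, 3))
    lo, hi = windows.min(axis=(2, 3)), windows.max(axis=(2, 3))
    validated = np.clip(reprojected, lo, hi)
    # Blend: mostly validated history, a little of the new sample.
    return (1 - alpha) * validated + alpha * current
```

Run this every frame while the renderer jitters its sample positions sub-pixel, and the accumulated image resolves more detail than any single frame contains; DLSS applies the same principle with learned heuristics in place of the hard clamp, which is why it can keep more valid history than plain TAA.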
 