
Nearly 7 years later, Ray Tracing still isn't worth the performance hit, nor does it enhance the experience much.

Trunx81

Member
I haven't seen RT in action yet, apart from YT videos, and I just don't see the appeal. What I really miss, though, are environments that you can destroy. Red Alert was an early take on that. But imagine GTA6 with the possibility to destroy entire blocks, and when you come back later they've rebuilt them.
 

mrqs

Member
Several games in 2025 won't be playable without ray tracing. Even more in 2026. By 2027 it'll be just about impossible to play most games without it.
 

amigastar

Gold Member
I haven't seen RT in action yet, apart from YT videos, and I just don't see the appeal. What I really miss, though, are environments that you can destroy. Red Alert was an early take on that. But imagine GTA6 with the possibility to destroy entire blocks, and when you come back later they've rebuilt them.
That's a rather naive wish. It's comparable to the old dream of Zelda keeping the arrow you shot at the beginning of the game stuck in place until the end.
 

LordOcidax

Member
It's easier to wow gamers with uber lighting and shadows than to make good gameplay, smart enemy AI, or physics.

Just imagine how awesome it would be for an open-world RPG or action game to have crazy physics like an old BF game, with crumbling buildings, castle walls getting destroyed, or cave-ins. Yet it seems the only games that have that are tech-demo indie games or the odd shooter.
We already have some of that, and it's considered one of the best games ever, on shitty hardware… now imagine that at a bigger scale on more powerful hardware.
 

NomenNescio

Member
I think the biggest benefit ray tracing could bring is shorter development time, because instead of wasting time baking lighting in every scene they can just ray-trace the F out of everything, and it would be even more accurate and technically impressive.
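To make the dev-time argument concrete, here's a rough sketch of the tradeoff (my own illustration; `trace_irradiance` and the ray counts are made-up stand-ins, not any engine's pipeline). Baking traces lots of rays once per scene offline and the game only does a lookup; real-time RT skips the bake and re-traces every frame, so lighting edits show up instantly.

```python
import random

# Sketch of baked vs. real-time lighting (illustrative stand-ins only).

def trace_irradiance(texel, rays):
    # Stand-in for casting `rays` rays from a surface point and averaging
    # what they hit; more rays means a cleaner estimate.
    return sum(random.random() for _ in range(rays)) / rays

# Offline bake: expensive, and rerun after every lighting change.
def bake_lightmap(texels):
    return {t: trace_irradiance(t, rays=10_000) for t in texels}

# Runtime with baked lighting: just a cheap table lookup.
def shade_baked(texel, lightmap):
    return lightmap[texel]

# Runtime with real-time RT: no bake step; fewer, noisier rays per
# frame, but lighting changes show up immediately.
def shade_rt(texel):
    return trace_irradiance(texel, rays=2)

lightmap = bake_lightmap(range(4))
print(shade_baked(0, lightmap), shade_rt(0))
```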
 

Spukc

always chasing the next thrill
None of us is playing Minecraft, nor is anyone who plays Minecraft playing with RTX, just a small minority. You failed. Next (y)

Anyway, OP you are correct.

[image: list of commonly hated post-processing effects]
2004 called, they want their boring complaints back.
 

LordOcidax

Member
I think the biggest benefit ray tracing could bring is shorter development time, because instead of wasting time baking lighting in every scene they can just ray-trace the F out of everything, and it would be even more accurate and technically impressive.
Yes, but we don't have the horsepower yet, especially on consoles.
 

Filben

Member
Control, Alan Wake 2, CP2077, and Indy with PT really did it for me. I see so many flaws in rasterized graphics when going back.

Especially those ugly SSRs in motion really hurt. Without indirect lighting you get so much "floating" geometry and so many unnatural shadows.

Lucky you if you don't notice. I enjoy a (good) game even more when it has ray tracing.
 

Zathalus

Member
Of course it is worth it, provided you have capable hardware and a game with a good implementation. Cyberpunk, Black Myth, Witcher 3, Indiana, Alan Wake, Spider-Man, Metro Exodus, Control, DD2, and Dying Light 2 are all examples where the implemented RT really improves the overall visual package.
 

LordOcidax

Member
Of course it is worth it, provided you have capable hardware and a game with a good implementation. Cyberpunk, Black Myth, Witcher 3, Indiana, Alan Wake, Spider-Man, Metro Exodus, Control, DD2, and Dying Light 2 are all examples where the implemented RT really improves the overall visual package.
The issue here is that the majority of people only have the hardware to play Quake 2 and Minecraft with full path tracing… And forget about consoles; the result is a blurry mess just to achieve "acceptable" performance. "Fake" lighting is still the better solution for the moment. With old games like Uncharted 3 or The Last of Us 2 on PS5, the end result looks better and cleaner than 90% of RT games, and with better performance.
 
Ray tracing itself is the evolution, and assuming performance continues to scale into the future, it WILL be the standard. As of today, however, it is frighteningly busted and hacky. Things like temporal denoisers, sample counts that are a fraction of what they should be, and the very concept of a BVH structure with reduced geometry and shading complexity mean that we are no closer to realizing true ray-traced beauty than we were a decade ago. It's still the holy grail that's decades away.
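For context, the BVH mentioned here is the bounding volume hierarchy that real-time RT traces rays against; engines typically build it from a simplified copy of the scene, which is where the "reduced geometry" complaint comes from. A minimal traversal sketch (my own illustration, not any engine's actual code):

```python
# Minimal BVH traversal sketch (illustrative): rays are tested against
# nested axis-aligned boxes and only descend into boxes they hit.
# Engines build the hierarchy from simplified geometry, which is the
# fidelity gap mentioned above.

class Node:
    def __init__(self, lo, hi, children=(), tris=()):
        self.lo, self.hi = lo, hi          # AABB min/max corners (x, y, z)
        self.children, self.tris = children, tris

def hits_box(origin, inv_dir, lo, hi):
    # Standard slab test against an axis-aligned bounding box.
    tmin, tmax = 0.0, float("inf")
    for o, i, l, h in zip(origin, inv_dir, lo, hi):
        t1, t2 = (l - o) * i, (h - o) * i
        tmin, tmax = max(tmin, min(t1, t2)), min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(node, origin, inv_dir, out):
    if not hits_box(origin, inv_dir, node.lo, node.hi):
        return                             # prune: skip the whole subtree
    out.extend(node.tris)                  # leaf: collect candidate triangles
    for child in node.children:
        traverse(child, origin, inv_dir, out)

leaf = Node((0, 0, 5), (1, 1, 6), tris=["tri0"])
root = Node((-5, -5, 0), (5, 5, 10), children=(leaf,))
hits = []
traverse(root, (0.5, 0.5, -1.0), (1e9, 1e9, 1.0), hits)  # ray along +z
print(hits)  # ['tri0']; only boxes on the ray's path get visited
```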
 

clarky

Gold Member
Firstly, this is a testament to how good devs are at faking lighting and shadows using raster.

Now, is Ray Tracing nicer? Yes, in some cases, but even then not by a whole lot. Certainly not to the point where it's worth spending an extra $1,000 or more just to be able to run it well, or playing with a massive performance penalty in turn.

I've been far more impressed by ultra-high-refresh displays (think 480Hz) that handle motion really well.

I feel like Ray Tracing is a lazy add-on to help sell video cards and drive marketing initiatives. I wish more effort were put into developing vibrant worlds that you can interact with. Less focus on upscaling and Ray Tracing and more focus on physics and how gameplay changes with the environment, such as pouring gasoline to increase flammability, or breaking something to combine a piece of it with another, stuff like that. It's like these aspects of gaming have gotten much dumber over the years, generally speaking. Look at Star Wars Outlaws for example. And that's a high-budget production no less.

Two decades later, Half-Life 2 is still one of the best examples of physics. I expected way more evolution on this front than better shadows and contrasting effects.

Don't get me wrong, Ray Tracing is welcome, it just hasn't fundamentally changed any game from not fun to fun, or from fun to suddenly very fun.

EDIT: Another special thanks to clarky for being the best donor a guy could ever have. Thank you for another gold, my brother.
I'm very drunk again, but I find your OPs honest and actually quite discussion-worthy. We need more like you round these parts. You wear your heart on your sleeve; that's commendable.

Have a good one x.
 

Zathalus

Member
The issue here is that the majority of people only have the hardware to play Quake 2 and Minecraft with full path tracing… And forget about consoles; the result is a blurry mess just to achieve "acceptable" performance. "Fake" lighting is still the better solution for the moment. With old games like Uncharted 3 or The Last of Us 2 on PS5, the end result looks better and cleaner than 90% of RT games, and with better performance.
True, which is why I said capable hardware. 4K is extremely demanding, but a 4070 Super is good enough for 1440p, and for 1080p a 4060 Ti would suffice. Consoles (other than the Pro) are mostly a lost cause.
 

StreetsofBeige

Gold Member
Yeah sure, we're hearing that on a daily basis. It'd be nice to have some numbers, too. Do we know how much of their profit is attributed to dGPUs?

[image: Nvidia revenue breakdown chart]


More up-to-date numbers, with profit figures. Page 21.

 
Just look at their quarterly reports. AI dwarfs gaming by a huge margin.
Even Jensen now says that Nvidia is no longer a graphics company, because their main focus is AI.
So 4-5 fold, Jesus. Will it be a good time to buy after the inevitable slowdown of the AI craze begins? They seem to be overrated rn.
 

SF Kosmo

Al Jazeera Special Reporter
Good implementations of RT, like the ones found in Cyberpunk, Indiana Jones, and Alan Wake II, offer clear and tangible benefits to visual quality, but they're also some of the heaviest out there.

Can you save a lot of compute by turning these features off? Certainly. But what are you going to do with it? At some point resolution and frame rate offer diminishing returns as well. At least RT is a new frontier to conquer.

Whether or not that leap is enough to justify a pricey upgrade is a matter of personal preference, but the tech is going to continue to advance either way. Next-gen consoles are going to heavily exploit RT and AI as a key differentiator.
 

SF Kosmo

Al Jazeera Special Reporter
Wtf, even with a 4090 RT tanks your fps?
If you're trying to run it at native 4K like an ass, sure. RT load scales up proportionally to pixel count, so it's advisable to leverage DLSS. That has image quality implications, though probably pretty minor/hard to tell at 4K.
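To put rough numbers on that scaling, here's a back-of-the-envelope sketch (the one-primary-ray-per-pixel assumption and the DLSS Quality internal resolution are illustrative, not measured figures):

```python
# Rough sketch: primary-ray count scales linearly with pixel count, so
# rendering at a lower internal resolution and upscaling (e.g. DLSS
# "Quality" mode, roughly 2/3 scale per axis) cuts the RT workload
# accordingly. Illustrative numbers, not benchmarks.

def rays_per_frame(width, height, samples_per_pixel=1):
    return width * height * samples_per_pixel

native_4k = rays_per_frame(3840, 2160)
dlss_quality = rays_per_frame(2560, 1440)  # assumed 4K "Quality" internal res

print(f"native 4K:     {native_4k:,} rays/frame")
print(f"DLSS internal: {dlss_quality:,} rays/frame")
print(f"RT cost ratio: {native_4k / dlss_quality:.2f}x")  # ~2.25x
```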

People who buy $1500 graphics cards are often not very good at figuring out how to balance settings. They buy these cards precisely because they want to put everything on max and not think about that.
 

StreetsofBeige

Gold Member
So 4-5 fold, Jesus. Will it be a good time to buy after the inevitable slowdown of the AI craze begins? They seem to be overrated rn.
Nvidia is going full MS. There's a consumer product side to them, but the real money is in the corporate side. And part of promoting the corporate side is having a customer-facing consumer side, even if the products don't align.

It's like asking why SAP (a company that makes stodgy corporate ERP programs) does TV ads. It shouldn't have anything to do with watching a basketball game. But it happens.
 

Parazels

Member
Devs seem to have focused way too hard on something hardly noticeable to most players.
1) That's because we have gotten used to fake lighting that mimics RT.
2) That's because RT is often limited and low-quality in modern games. But it looks fantastic when 100% applied (Quake 2 RTX).
 

Madflavor

Member
If you're trying to run it at native 4K like an ass, sure. RT load scales up proportionally to pixel count, so it's advisable to leverage DLSS. That has image quality implications, though probably pretty minor/hard to tell at 4K.

People who buy $1500 graphics cards are often not very good at figuring out how to balance settings. They buy these cards precisely because they want to put everything on max and not think about that.

OK, 'cause I have 1440p and I was able to run Cyberpunk 2077 with RT maxed out on high settings, averaging 45 fps. So I was confused how someone with a 4090 is taking a big hit to their performance.
 
Ray tracing itself is the evolution, and assuming performance continues to scale into the future, it WILL be the standard. As of today, however, it is frighteningly busted and hacky. Things like temporal denoisers, sample counts that are a fraction of what they should be, and the very concept of a BVH structure with reduced geometry and shading complexity mean that we are no closer to realizing true ray-traced beauty than we were a decade ago. It's still the holy grail that's decades away.
Decades? That would be pathetic if the industry took that long... though I could see it happening on consoles. I bought the PS5 Pro, a console released 4 years after this gen started and priced $300 higher than the base PS5, only to realize how pathetically underpowered it still is.

My big wish was that games with a good RTGI solution on PC (Alan Wake 2, Cyberpunk, Dying Light 2, Control, SH2, Black Myth) would be able to achieve something like what a 3080 could provide... none of these games have achieved anything like that on the Pro, even with its supposed 3x RT performance boost and AI upscaling.

Cyberpunk didn't even get a patch, and that is THE poster child and litmus test for RT. I would have settled for RTGI and reflections at 30 fps, but nope, and DF has speculated that the reason CD Projekt didn't make a patch is that they couldn't achieve significant gains on the fucking "Pro".

Console makers are not really interested in pushing ray tracing or graphics. Cerny is a bit of a snake oil salesman at this point imo. Forget path tracing; they can't even get RTGI running in any demanding game.
 

SF Kosmo

Al Jazeera Special Reporter
Decades? That would be pathetic if the industry took that long

I think what he means is that it will take that long for real-time games to be doing everything we see in pre-rendered films that take hours to render a frame.

And that might be true to an extent. The RT we use in games is sparse and noisy and usually limited to one or two bounces; there are a lot of cut corners. But when you look at the rate it's accelerating, and the ways AI is multiplying those gains, it's easy to imagine we'll get something that LOOKS pretty close, even if it's not actually doing the same amount of raw computation.
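A toy illustration of the "sparse and noisy" point (my own sketch; the integrand is made up, not a real renderer): at the 1-4 samples per pixel real-time RT can afford, per-pixel estimates scatter wildly, which is exactly what temporal and AI denoisers exist to smooth over.

```python
import random

# Toy Monte Carlo sketch: estimate a pixel's lighting by averaging
# random samples. Real path tracers integrate over ray directions;
# this made-up integrand just stands in for "light from direction x".

def sample_lighting():
    x = random.random()
    return 3.0 * x * x          # true mean of 3x^2 over [0, 1] is 1.0

def estimate(spp):
    return sum(sample_lighting() for _ in range(spp)) / spp

random.seed(42)
for spp in (1, 4, 1024):        # samples per pixel
    print(spp, ["%.2f" % estimate(spp) for _ in range(5)])
# At the 1-4 spp real-time RT can afford, per-pixel estimates scatter
# widely around 1.0; that scatter is the noise denoisers smooth out.
```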
 

rofif

Can’t Git Gud
None of us is playing Minecraft, nor is anyone who plays Minecraft playing with RTX, just a small minority. You failed. Next (y)

Anyway, OP you are correct.

[image: list of commonly hated post-processing effects]
Forgot film grain.
Almost all of those are good. I don't like lens flare, personally.

CA - without it, some games look too raw, too sharp, too computer-graphics-like. Like RE4 Remake with and without it.
Motion blur - essential. 120/60 and especially 30fps is not enough data for our brains. If a 60fps frame is on screen for 16ms, that frame needs to capture that 16ms of movement... just to cheat our brains and make things appear smoother.
Depth of field - great effect.
Film grain - in a rare few games it helps with gradient banding (it acts as dithering). I disable it most of the time.

Who dislikes depth of field here? Monsters.


RT? Yeah, I don't care about it too much. It's good, but raster techniques can be good too.
 

64bitmodels

Reverse groomer.
Nobody would be complaining about ray tracing if graphics cards and console hardware were accelerating at the rate they did 10 years ago; we'd have $200 GPUs with 4070 ray tracing performance, and overall life would be good.

I've come to the realization that Nvidia and their greed with their hardware (and equally, AMD's incompetence at adapting to the times) are far more to blame than any lighting solution. As the years pass it's gonna become a necessity, and we're gonna be using it in games whether we like it or not; like hardware 3D, pixel shaders, or global illumination, it's gonna become a requirement. This normal advancement would be accepted far more quickly if the only cards capable of pulling it off in a decent fashion weren't $600+. Because of this price-to-performance deficit, people would rather just not bother.
 

Soltype

Member
We need to go back to the two-card setup, like how it was with PhysX. I hate the fact that they try to put everything into one card.
 

64bitmodels

Reverse groomer.
Motion blur - essential. 120/60 and especially 30fps is not enough data for our brains.
Yeah, no. Motion blur still leaves you at the target framerate. You can't get more frames by adding motion blur.

Moreover... all of these framerates are enough, and playable. It's not until 15fps that you can start saying "there's not enough data for our brains". Hence why older Super FX games are virtually unplayable.
 

Bridges

Member
I remember playing Control Ultimate Edition on Series X and enabling the ray tracing mode for one encounter. I thought "wow, that looks really good" and then immediately shut it off because the game became a PowerPoint slideshow. I don't think I've played anything with RT since then, or if I have, I didn't notice.

Honestly, it seems pretty cool, but if it makes the game run like shit, then why bother?
 

64bitmodels

Reverse groomer.
Genuinely, though. Imagine if the first 3D-accelerated GPUs, like the Voodoo 2 or Riva TNT, were 900, 1,000, or even 1,500 dollars.

People here would be saying that hardware rendering is overrated and software works just fine, that hardware brings no noticeable improvement, that the filtered textures and their ugly smearing make the whole deal not worth it.
It's not ray tracing. It's the prices of the cards that can do it.
 
Yeah, no. Motion blur still leaves you at the target framerate. You can't get more frames by adding motion blur.

Moreover... all of these framerates are enough, and playable. It's not until 15fps that you can start saying "there's not enough data for our brains". Hence why older Super FX games are virtually unplayable.
He's talking more about 30 fps games, and motion blur is essential for those, especially on an OLED where the pixel response is near-instantaneous. Without motion blur it'll look like a slideshow.
 

rofif

Can’t Git Gud
Yeah, no. Motion blur still leaves you at the target framerate. You can't get more frames by adding motion blur.

Moreover... all of these framerates are enough, and playable. It's not until 15fps that you can start saying "there's not enough data for our brains". Hence why older Super FX games are virtually unplayable.
Of course you don't get more frames.
It's all about cheating your brain. Still images frozen in time, with no movement in them, are not the best source for creating motion by adding them together... unless you go to something like 240fps, and that's from my own experience.
So we make a frame look like it's conveying motion, just like a camera does. When you record at 30fps, you need a 1/60th shutter speed. If you go too fast, stuff starts feeling jerky; flowing water turns into droplets, etc.
Compare GTA helicopter blades, especially GTA 4 vs 5, because that is a form of motion blur.
4 has no motion blur on the spinning blades and it looks unnatural.
5 removes the 3D blades and replaces them with a 2D blurred animation, and it looks way better. More like real motion.
Same goes for everything. If you had 240 frames to fill that time, you wouldn't need motion blur at all.
Play Doom 2016 on a 240-480Hz screen and then on a 60Hz screen; I experienced this myself in person on my own screens.
If you disable motion blur at 240fps and enable it at 60fps... they look close to identical. That many frames are enough that you don't have to cheat your brain by blurring 60 of them.
It's how our brains and eyes work. Of course I'm not talking about a cheap effect like GTA 3's camera blur, lol. We are way past that.
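The camera analogy maps directly onto how games fake it. A toy sketch of accumulation motion blur (my own illustration; the constants are arbitrary): average an object's position across sub-frame samples inside the shutter window, the way a 1/60s shutter smears motion in a 30fps recording.

```python
# Toy accumulation motion blur (illustrative constants): average an
# object's position over sub-frame samples inside the shutter window,
# like a camera's 1/60s shutter smearing motion in a 30fps recording.

FRAME_RATE = 30.0
SHUTTER = 1.0 / 60.0        # shutter open for half of each frame

def position(t):
    # Object moving at a constant 500 units/second; stands in for any
    # animated value (helicopter blade angle, camera pan, etc.).
    return 500.0 * t

def blurred_position(frame, subsamples=8):
    t_open = frame / FRAME_RATE
    # Average positions sampled across the shutter-open interval.
    return sum(position(t_open + SHUTTER * i / (subsamples - 1))
               for i in range(subsamples)) / subsamples

print(position(10 / FRAME_RATE))   # sharp, instantaneous position
print(blurred_position(10))        # smeared toward mid-exposure
```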

 
I remember playing Control Ultimate Edition on Series X and enabling the ray tracing mode for one encounter. I thought "wow, that looks really good" and then immediately shut it off because the game became a PowerPoint slideshow. I don't think I've played anything with RT since then, or if I have, I didn't notice.

Honestly, it seems pretty cool, but if it makes the game run like shit, then why bother?
Remedy really needs to fucking patch that game for the PS5 Pro so we can play with RT at 60 fps... the console should at least be able to do it with RT reflections only!

That is one game that really does look amazing with RTGI on PC.

I just don't know how people can say RT isn't a massive, important improvement after we've seen so many games on PC implementing it well. This thread has shown me that even on an enthusiast gaming forum, people are either not savvy about graphics or really just don't care much about them.

All I can say is careful what you wish for. If these publishers could get away without spending so much money on graphics while still raking in the same profits, we could say goodbye to better graphics. We've already seen a lessening of graphical standards this gen from the likes of Sony, who is more than happy to prioritize 60 fps in cross-gen games instead of ray tracing.

I personally hope they keep chasing visual leaps until we hit that holy grail of full ray tracing/path tracing in games with dense geometry at 60 fps on console. Even if that means we don't get advanced physics, I'm OK with that. Games like Control strike a good enough balance that I don't need physics beyond that for now. That's not to say I don't want great physics and destruction in certain games... Battlefield, for instance.
 

rm082e

Member
In order to get RT that is noticeably better than raster, you have to take a big hit to hardware performance. Anytime I've tried the "low" RT settings, I can't tell the difference, but the frame rate gets worse. Full path tracing is certainly not worth the performance hit if you're on a lower-tier or older card. On my 3080, if I turn it on in Cyberpunk, I go from 80+ fps to 20+ fps (QHD). Maybe if I had a 4090 and could maintain 60+ with RT on I'd appreciate it, but it's a big nope with my current hardware.

I also think RT reflections look over the top and dumb in almost every case. They tend to give a mirror finish to surfaces that shouldn't have one, like the floor of the dining hall in Hogwarts. It's a real uncanny-valley effect.
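That mirror-finish look usually comes from implementations tracing a single perfect specular reflection and skipping the roughness scatter, which costs many rays. A 2D toy sketch of the difference (my own illustration; the cone-jitter model is a crude stand-in for a real GGX-style BRDF):

```python
import math, random

# 2D toy: a perfect mirror reflects along one exact direction; a rough
# surface should scatter reflection rays in a cone, blurring the result.
# Skipping the scatter (it needs many rays per pixel) is what leaves
# every RT floor looking like polished glass.

def reflect(d, n):
    # r = d - 2(d.n)n for direction d and unit normal n.
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

def rough_reflect(d, n, roughness):
    rx, ry = reflect(d, n)
    angle = math.atan2(ry, rx)
    # Jitter the reflected direction; wider cone for rougher surfaces.
    angle += random.uniform(-1, 1) * roughness * math.pi / 4
    return (math.cos(angle), math.sin(angle))

random.seed(1)
view = (0.707, -0.707)      # ray hitting a horizontal floor at 45 degrees
normal = (0.0, 1.0)
print(reflect(view, normal))                                 # one fixed dir
print([rough_reflect(view, normal, 0.3) for _ in range(3)])  # scattered
```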
 