CrustyBritches
I don't believe this.
That's the problem: before, we had better baked lighting than we do now, especially in new games.
But it's capable of great RTGI, which is already a huge step up from baked lighting.
Man, come on, I have a 4080 and even that is not capable of decent RT, with path tracing, in the games that matter.
Turn it off?
Check this out. From the Lords of the Fallen devs talking about Lumen.
Not necessarily. When we say "baked lighting" in the context of baked versus RT, we're using it as a catch-all term for all raster-based lighting techniques that use pre-computed lighting data: light maps, pre-computed light probes, SVOGI, etc.
I really wish everyone could get and play on a 4090. There are times when I'm literally blown away, and other times when the performance hit is so bad I'd rather not use it. Indiana Jones and Cyberpunk are examples of games you must play with everything turned on and just enjoy. Some other games are a solid NO.
Nonsense...
Man, come on, I have a 4080 and even that is not capable of decent RT, with path tracing, in the games that matter.
A 4070 can enable Ultra RT at DLSS Quality in Cyberpunk 2077. Close to 100fps with frame generation enabled. How is that not fine?
Man, come on, I have a 4080 and even that is not capable of decent RT, with path tracing, in the games that matter.
RT can be quite transformative if used correctly.
If you don't like FG then just disable it and run at a mostly locked 60fps. Personally I think the visual gains from Ultra RT in Cyberpunk are more than worth going from ~90 to ~60 fps.
That is terrible. It means the game is running at a base frame rate of 40-50 fps. Even with Reflex, that is too much input lag.
I would rather turn off RT and play at a real 100 fps, with only DLSS Quality enabled. And also turn on Reflex, to get a really smooth frame rate and very low input lag.
CP 2077 already looks very good without RT.
Yes, and I'd rather enjoy the Ultra RT. 60fps and Reflex is more than enough for a single-player game.
That is what I said. Turn off FG and RT. Turn on Reflex and DLSS Quality. And enjoy a high frame rate and very low input lag.
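For context on the disagreement above, the numbers follow from simple frame-time arithmetic: with 2x frame generation roughly every other displayed frame is interpolated, so ~100 fps on the counter implies a rendered base of ~50 fps, and input lag tracks the base rate plus the frame FG holds back. A rough sketch, with all figures illustrative rather than measured:

```python
# Rough frame-generation arithmetic; all numbers are illustrative, not measurements.

def base_fps(displayed_fps: float, fg_factor: int = 2) -> float:
    """With NxFG, only 1 in every N displayed frames is actually rendered."""
    return displayed_fps / fg_factor

def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

displayed = 100.0              # what the fps counter shows with FG on
base = base_fps(displayed)     # ~50 fps actually rendered by the game

# FG has to hold back one rendered frame so it can interpolate between two of
# them, so responsiveness behaves roughly like the base frame time plus one
# extra base frame, which is why "100 fps" with FG can feel like 40-50 fps.
rough_latency_floor = 2 * frametime_ms(base)

print(f"rendered base: {base:.0f} fps ({frametime_ms(base):.1f} ms per real frame)")
print(f"rough latency floor: {rough_latency_floor:.0f} ms "
      f"(a native 100 fps would be ~{frametime_ms(displayed):.0f} ms)")
```

Whether roughly 40 ms is "too much input lag" or "more than enough for a single-player game" is exactly the judgement call the two posters land on differently.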
I noticed for some reason that on my 3070 Ti, 4K DLSS is much worse than just doing 1440p DLSS, even with them at opposite quality modes (1440p Quality vs 4K Ultra Performance). VRAM issue, I'm guessing? Not sure, as 4K Ultra Performance is a lower internal resolution than 1440p Quality.
Lately I'm finding the best option for playing with ray tracing is to just drop the res to 1440p. Not sure what the best solution is, since it varies from game to game how well ray tracing is implemented. But at 1440p the majority of games will run flawlessly, while at 4K it can be a hot mess in certain games.
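The internal-resolution comparison is easy to sanity-check with the per-axis scale factors commonly cited for the DLSS presets (Quality ≈ 0.667x, Balanced ≈ 0.58x, Performance = 0.5x, Ultra Performance ≈ 0.333x); treat the exact ratios as approximations:

```python
# Approximate internal render resolutions for the common DLSS presets
# (per-axis scale factors as commonly cited; treat them as approximations).

PRESET_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    scale = PRESET_SCALE[preset]
    return round(out_w * scale), round(out_h * scale)

print(internal_res(3840, 2160, "Ultra Performance"))  # -> (1280, 720)
print(internal_res(2560, 1440, "Quality"))            # -> (1707, 960)
```

So 4K Ultra Performance really does render fewer pixels than 1440p Quality, but it still has to reconstruct and store full 4K output and history buffers and bridge a 3x-per-axis ratio instead of 1.5x, so on an 8 GB card both VRAM pressure and the harsher upscale ratio are plausible reasons it ends up looking worse.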
You mean regular RT, right? Not path tracing.
A 4070 can enable Ultra RT at DLSS Quality in Cyberpunk 2077. Close to 100fps with frame generation enabled. How is that not fine?
Not path tracing. You need at least a 4070 for 1440p, iirc.
You mean regular RT, right? Not path tracing.
None of us is playing Minecraft, nor is anyone that plays Minecraft playing with RTX; just a small minority. You failed. Next.
Anyway, OP you are correct.
Now, is Ray Tracing nicer? Yes, in some cases, but even then not by a whole lot. Certainly not to the point where it's worth spending an extra $1,000 or more just to be able to run it well, or playing with a massive performance penalty in turn.
I feel like Ray Tracing is a lazy add-in to help sell video cards and drive marketing initiatives. I wish more effort were put into developing vibrant worlds that you can interact with: less focus on upscaling and Ray Tracing, more focus on physics and how gameplay changes with the environment.
Two decades later, Half-Life 2 is still one of the best examples of physics. I expected way more evolution on this front than better shadows and contrasting effects.
Ah, thanks for the clarification. I thought using pre-computed data, e.g. for indirect lighting, was a requirement.
Neither Lumen nor SVOGI relies on pre-computed lighting data. Both are a form of voxel-based ray traced lighting, although Lumen mixes in some screen-space techniques. Lumen can use pre-calculated data for static parts of the scene to speed up rendering, but it is not a requirement.
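To make the distinction in the last two posts concrete: "baked" means the indirect lighting was computed offline and is only looked up at runtime, while Lumen/SVOGI-style GI recomputes it every frame against a live representation of the scene. A deliberately toy sketch of when that work happens (a one-dimensional world with a single bounce; nothing here resembles a real engine):

```python
# Toy illustration of pre-computed vs runtime indirect lighting.
# A 1D world of cells, one bounce of indirect light. The only point is *when*
# the indirect term gets computed; no real engine works like this.

def one_bounce(direct, albedo):
    """Indirect light in a cell = average of what its neighbours reflect."""
    n = len(direct)
    result = []
    for i in range(n):
        contributions = [direct[j] * albedo[j] for j in (i - 1, i + 1) if 0 <= j < n]
        result.append(sum(contributions) / len(contributions))
    return result

albedo = [0.8, 0.5, 0.9, 0.7]
direct_at_bake_time = [1.0, 0.0, 0.0, 2.0]

# "Baked": compute the indirect term once, store it (the lightmap), reuse forever.
lightmap = one_bounce(direct_at_bake_time, albedo)

# "Runtime GI": recompute the indirect term from the *current* lighting every
# frame, so it stays correct when the lights move.
for frame, direct_now in enumerate([[1.0, 0.0, 0.0, 2.0], [0.0, 2.0, 0.0, 0.0]]):
    runtime_gi = one_bounce(direct_now, albedo)
    print(f"frame {frame}: baked={lightmap} runtime={runtime_gi}")
```

The baked result is essentially free at runtime but only stays correct while the lighting matches what was baked; the runtime version keeps up with moving lights and geometry at a per-frame cost, which is the trade-off the thread keeps circling back to.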
Does your final resolve look like this!?!?
None of us is playing Minecraft, nor is anyone that plays Minecraft playing with RTX; just a small minority. You failed. Next.
Anyway, OP you are correct.
RT in VR games is next gen to me.
In regular games... I don't think it's worth it.
But out of all the RT features, only reflections and GI make a visual impact, to me.
But I would trade everything RT for more physics, like any type of fluid or smoke simulation, etc.
Also, give me more gameplay mechanics through physics.
Guys, I don't know what the hell happened over the past 20 years. If you played Red Faction you would know what I mean. I could get into any room through the walls by blowing through stuff, or blow a hole in the ground to go one floor below. Even the enemies you killed went cold in real time, meaning with the thermal vision gun you could see the bodies going cold. I have never seen anything like it since.
Isn't this an apples to oranges to mangoes comparison?
Fully agree. I'm playing Uncharted 4 right now on Steam and there are absolutely moments when it reaches UE5 lighting quality. Here's the kicker: there is no temporal soup and I'm running it on a GTX 1080 at 80fps. Sharp as can be. Compare that to a Silent Hill 2 or Immortals of Aveum, where you need 4x the flops for a 0.2x gain. You could even argue there is no gain because of the temporal diarrhea everywhere.
The argument that it's "worse" is flimsy and subjective. I'll give you that. But the cost of going from Uncharted 4 to Immortals of Aveum is absolutely the worst return on investment in gaming hardware history. That cannot be denied.
Are you a console gamer who can't have it, or a PC gamer who won't turn it off?
Minecraft is the prime example where RT makes a massive difference, because the game was built with a simplistic voxel-based lighting model in mind, so that's an isolated case. As for other games, it's hit or miss, and it often eats performance like crazy to give you shinier-looking reflections in places where none should have been. Nothing like making a hardwood floor look like it's covered in jello, or creating artificial shiny water puddles in areas where there's a ceiling, amirite?
They then try to push an image reconstruction AI down your throat so you can get that performance back, but you have to deal with the fact that what you're getting is that AI's approximation of what the image is supposed to look like at the target resolution, not what it actually looks like at native resolution. They can take their DLSS and PSSR and ram it right up their fucking arse, and I don't give a fuck if they want it whole or diced.
They also try to give you the illusion of higher framerates with a "get fake frames for free, not so free" type of thing in order to justify their ridiculous GPU prices. Sorry, DLSS aficionados, but having something at native 4K vs having an AI's approximation of what 4K is supposed to look like are two very different things with their own very distinct connotations. If you don't believe me, go read Nvidia's whitepaper on DLSS; you'd be surprised at how it's actually described.
Nvidia and AMD have been promising native 4K at 60 fps since 20 fucking 16; where is it? Once you get to that point, you can actually go after ray tracing, which in its current, immature state is noisy in most games and gives reflective properties to areas and objects where it doesn't make sense (think chalkboards in Hogwarts Legacy). Try and solve those issues first, instead of jamming DLSS down people's throats to hide the lack of optimisation in games.
Now, I know this will make most DLSS diehards rain "LOL" reactions down upon this post, but think about it. Is it really this post that you're LOLing about, or the imperfections in DLSS, AI reconstruction and ray tracing noise? You tell me.
The more we've been inching along with absolute baby steps towards better visual fidelity these past 10 years, the more I've begun to realize how fruitless a pursuit it's all been. I mean, yes, it looks great, and there's definitely an audience for it, but the cost has been that games take MUCH longer to develop these days. And at the end of the day, most games I've been drawn to have prioritized style over realism.
I was a console gamer, 15 years ago. Now I'm on PC and heavily favour playing games at native resolution. Sorry, but subservience to AI is not my thing, especially one that generates frames and tries to approximate how something looks.
Are you a console gamer who can't have it, or a PC gamer who won't turn it off?
The AI isn't trying to approximate how the image is supposed to look. That's what spatial AI upscalers like DLSS 1 did, and it's the method used for the PS5 Pro's legacy "Enhance Image Quality" option (not PSSR) and Auto SR in Windows 11.
They then try to push an image reconstruction AI down your throat so you can get that performance back, but you have to deal with the fact that what you're getting is that AI's approximation of what the image is supposed to look like at the target resolution, not what it actually looks like at native resolution. They can take their DLSS and PSSR and ram it right up their fucking arse, and I don't give a fuck if they want it whole or diced.
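The contrast being drawn here, a spatial upscaler that guesses missing detail from a single frame versus a temporal one that accumulates real, jittered samples across frames, can be shown with a toy example. Everything below (the true_signal function, the resolutions, the jitter schedule) is made up for illustration; real DLSS/PSSR are learned models driven by motion vectors and history buffers, not this:

```python
# Toy contrast: single-frame ("spatial") upscaling vs temporal accumulation of
# jittered low-res samples. Purely illustrative; not how DLSS/PSSR actually work.

HIGH_RES = 8   # target pixel count
LOW_RES = 4    # pixels actually rendered per frame

def true_signal(x: float) -> float:
    """Stand-in for the detail a native-resolution render would capture."""
    return (x * 7) % 1.0

def render_low_res(jitter: float) -> list[float]:
    """One low-res frame, sampled with a sub-pixel jitter offset."""
    return [true_signal((i + jitter) / LOW_RES) for i in range(LOW_RES)]

def spatial_upscale(frame: list[float]) -> list[float]:
    """Single-frame upscale: every in-between value is an interpolated guess."""
    out = []
    for a, b in zip(frame, frame[1:] + frame[-1:]):
        out += [a, (a + b) / 2]
    return out[:HIGH_RES]

def temporal_accumulate(jittered_frames) -> list[float]:
    """Scatter each jittered sample to its true high-res position: over a few
    frames the buffer fills with real rendered samples instead of guesses."""
    buf = [0.0] * HIGH_RES
    for jitter, frame in jittered_frames:
        for i, value in enumerate(frame):
            buf[int((i + jitter) / LOW_RES * HIGH_RES) % HIGH_RES] = value
    return buf

frames = [(j, render_low_res(j)) for j in (0.0, 0.5)]

print("native:  ", [round(true_signal(i / HIGH_RES), 2) for i in range(HIGH_RES)])
print("spatial: ", [round(v, 2) for v in spatial_upscale(frames[0][1])])
print("temporal:", [round(v, 2) for v in temporal_accumulate(frames)])
```

With just two jittered frames the accumulation buffer ends up holding exactly the samples a native render would produce, while the single-frame upscale has to invent the in-between values and misses the detail; that seems to be the "approximation versus reconstruction from real samples" point the reply is making.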