- Every Nvidia generation keeps doubling the ray/triangle intersection rate:
- Turing: 1x ray/triangle intersection rate
- Ampere: 2x ray/triangle intersection rate
- Ada: 4x ray/triangle intersection rate
- Path tracing's randomness is heavy on SIMD (single instruction, multiple data) hardware (see the toy sketch after this list)
- The L2 cache increases help keep the cores utilized:
- RTX 4090: 72 MB L2
- RTX 3090: 6 MB L2
- RX 6900 XT: 4 MB L2
- RX 7900 XTX: 6 MB L2
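To make the SIMD point concrete, here's a toy Python sketch (my own illustration with made-up numbers, not from DF or Nvidia): on SIMT hardware a warp of lanes executes in lockstep, so when randomly bounced rays hit different materials, the warp has to run each material's shader serially with the other lanes masked off.

```python
import numpy as np

rng = np.random.default_rng(42)
WARP = 32  # lanes that execute in lockstep on SIMT hardware

def warp_utilization(material_ids: np.ndarray) -> float:
    """A warp runs the shader for each DISTINCT material present in it,
    one after another, masking the lanes that hit something else, so
    utilization is roughly 1 / (distinct materials per warp)."""
    return 1.0 / len(np.unique(material_ids))

# Coherent primary rays: neighbouring pixels usually land on the
# same surface, so the whole warp agrees on which shader to run.
coherent = np.zeros(WARP, dtype=int)

# Path-traced bounce rays: each lane scattered in a random direction,
# so a single warp can touch many different materials at once.
incoherent = rng.integers(0, 16, size=WARP)

print(f"coherent warp utilization:   {warp_utilization(coherent):.2f}")
print(f"incoherent warp utilization: {warp_utilization(incoherent):.2f}")
```

Bigger caches help with the other half of the same problem: incoherent rays also produce incoherent memory accesses, which is presumably why the L2 jump comes up in the piece.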
One of my older posts:
This confuses a whole bunch of people on the internet because there's a mismatch between papers, offline rendering, and games. Even an ex-DICE engine engineer who has a series on YouTube was getting a headache from all the different interpretations.
TL;DR for gaming: path tracing is physically accurate ray bounces, while ray tracing always involves a hack/approximation. In RTXGI/DDGI games (ray tracing), the bounce is more often than not a random direction based on the material, which then goes directly to a probe.
Path tracing is really the rays-going-out-of-the-camera-and-hitting-objects kind of setup. All DDGI/RTXGI solutions have been probe-grid-based (à la rasterization). It's good, and it's a step up from plain rasterization, but it also has limits, is still more complex to set up, and can have the same light-leak problems as rasterization if devs aren't checking for them everywhere (there's a toy sketch of both approaches below).
As seen here:
So it's only valid for highly static scenes, and the probe volumes still get touched up by artists to correct them.
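Here's a deliberately tiny Python sketch of that contrast, assuming a made-up one-plane scene and a stubbed probe cache (my own toy, not how any shipping engine is written): the path tracer follows each pixel's actual random bounce chain, while the DDGI/RTXGI style shades the first hit by looking up a precomputed probe grid.

```python
import random

def trace(origin, direction):
    """Hypothetical scene intersection: a single diffuse ground plane
    at y=0; returns (hit_point, normal, albedo, emission) or None."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy < -1e-6:                       # ray heads toward the ground plane
        t = -oy / dy
        hit = (ox + t * dx, 0.0, oz + t * dz)
        return hit, (0.0, 1.0, 0.0), 0.5, 0.0
    return None                          # ray escapes: caller treats as sky

def random_hemisphere(normal):
    """Random bounce direction on the normal's hemisphere
    (cosine weighting omitted for brevity)."""
    while True:
        d = tuple(random.uniform(-1, 1) for _ in range(3))
        if sum(c * c for c in d) <= 1:
            break
    dot = sum(c * n for c, n in zip(d, normal))
    return d if dot > 0 else tuple(-c for c in d)

SKY = 1.0  # radiance when a ray escapes the scene

def path_trace(origin, direction, bounces=4):
    """Path tracing: follow THIS ray's actual random bounce chain
    (bounce cap, no next-event estimation, for brevity)."""
    throughput, radiance = 1.0, 0.0
    for _ in range(bounces):
        hit = trace(origin, direction)
        if hit is None:
            return radiance + throughput * SKY
        point, normal, albedo, emission = hit
        radiance += throughput * emission
        throughput *= albedo
        origin, direction = point, random_hemisphere(normal)
    return radiance

# --- DDGI/RTXGI-style: indirect light lives in a sparse probe grid ---
PROBE_SPACING = 4.0
probe_cache = {}   # probe grid index -> precomputed irradiance

def probe_irradiance(point):
    """Probe-grid GI: don't follow the bounce, just look up nearby probes
    (nearest-neighbour here; real DDGI blends 8 probes trilinearly with
    visibility weights to fight the light leaks discussed above)."""
    key = tuple(round(c / PROBE_SPACING) for c in point)
    # Probes are themselves filled by rays traced FROM each probe,
    # amortized over many frames; a constant stands in for that here.
    return probe_cache.setdefault(key, 0.7)

def rtxgi_shade(origin, direction):
    hit = trace(origin, direction)
    if hit is None:
        return SKY
    point, normal, albedo, emission = hit
    return emission + albedo * probe_irradiance(point)

cam, d = (0.0, 2.0, 0.0), (0.3, -0.5, 0.8)
pt = sum(path_trace(cam, d) for _ in range(256)) / 256   # noisy, converges
print("path traced:", round(pt, 3), " probe GI:", round(rtxgi_shade(cam, d), 3))
```

The probe version is cheap and stable, but everything past the first hit lives in the grid, which is exactly where the staleness and light-leak issues come from.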
Path tracing is..
so you're questioning an Epic engineer who worked on Unreal 5...
he's an Epic engineer who worked on Unreal 5, and if he's skeptical I think you should be too... instead of jumping on the hype train like some occultists... the renderer is obviously still using a raster pipeline... the primary visibility isn't path traced...
Wonder if the tech preview will be updated for something like this...

This is wrong.
Lighting is one of the things that can be path traced, but so can shaders, and that is lacking in CP Overdrive mode.
With path-traced shaders you could have realistic refractions, subsurface scattering, caustics, realistic fur deep-shadow maps, realistic absorption and scattering in glossy materials, dispersion, etc.
We're still a long way from full path tracing, but getting closer by the year.
1440p with DLSS Performance will let you play on max with 35+ fps even at load.

My 3080 Ti barely hit 30 fps even with DLSS on Performance.
Yeah, because posting memes outsmarts truth. Just because everybody calls ChatGPT, and DLSS before it, "AI" doesn't mean the term represents true AI; it's not Artificial Intelligence incarnate. No paper cites this as path tracing; in fact, no research paper ever talks in absolutist terms. Research papers themselves put the greater emphasis on how all this is smart hackery and shouldn't be taken as a full solution. Every paper cites it as the next step, which is what it is. For example: the caustics/refraction effect is easier to spot on transparent subjects, but that doesn't mean that's the only place where refraction of light is an issue.

So you're questioning Nvidia and the University of Utah?
Generalized Resampled Importance Sampling: Foundations of ReSTIR | Research
As scenes become ever more complex and real-time applications embrace ray tracing, path sampling algorithms that maximize quality at low sample counts become vital.
research.nvidia.com
Even scholars outside of Nvidia are calling it path tracing
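For anyone wondering what "resampled importance sampling" actually is, here's a minimal Python sketch of basic RIS on a toy 1D integrand, as I understand it from the paper linked above (the function names and constants are mine, none of this is Nvidia's code): draw M candidates from an easy source distribution, reweight them toward a better target you can only evaluate, keep one, and correct the estimate so it stays unbiased.

```python
import math
import random

def f(x):
    """Integrand we actually want: a sharp 'light' peak plus ambient."""
    return math.exp(-200.0 * (x - 0.7) ** 2) + 0.05

def p_hat(x):
    """Cheap unnormalized target distribution: a blurry guess at f.
    RIS only needs it to be evaluable, NOT normalized or sampleable."""
    return math.exp(-50.0 * (x - 0.7) ** 2) + 0.05

def ris_estimate(M=32):
    # 1) draw M candidates from an easy source pdf (uniform on [0,1], p=1)
    xs = [random.random() for _ in range(M)]
    # 2) resampling weights w_i = p_hat(x_i) / p(x_i)
    ws = [p_hat(x) for x in xs]
    # 3) keep one candidate, picked proportionally to its weight
    y = random.choices(xs, weights=ws, k=1)[0]
    # 4) unbiased estimate of the integral of f:
    #    f(y)/p_hat(y) * (1/M) * sum(w_i)
    return f(y) / p_hat(y) * (sum(ws) / M)

N = 20000
print("RIS estimate:", sum(ris_estimate() for _ in range(N)) / N)
# Brute-force quadrature reference, for comparison:
print("reference   :", sum(f(i / 1e5) for i in range(100000)) / 1e5)
```

ReSTIR then gets its real power from reusing those picked samples across neighbouring pixels and frames via reservoirs, which seems to be the "cache" everyone in this thread is arguing about.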
The guy you quoted is asking questions about how it's done. Some people already corrected him in that very thread. He doesn't have all the answers; he's asking questions. That's it.
You're extrapolating something that isn't there.
Fucking hell, only Kingyala would:
1) Say Lumen is not RT like a dozen times in that other thread when all of Epic's documentation says it is (go ask Andrew!), and
2) Pick up on the first sign of a question about what ReSTIR is doing and what the solution is, and jump on the bandwagon that it's not path tracing.
"If" Cyberpunk is actually tracing primary rays. If. One page before, he says: "I'm not sure if Cyberpunk is actually tracing primary rays even on "path tracing" mode... it's kind of unclear between some of the tech press around it vs. the SIGGRAPH presentation"
Renderer is "obviously" still using a raster pipeline, that's your conclusion.
If you're not excited because something isn't scientifically perfect yet, then when are you actually excited? Seems like a sad way to go through life if you ask me; you're missing out on lots of excitement. I'd say it's better to stay in the present and enjoy life as it is, smoke and mirrors and all, than be stuck waiting for something better. Cyberpunk with path tracing is as good as things get for proper full games right now, and it will take a while until something tops it.
Caustics are everywhere, even on floors, depending on the materials; that alone is no small limitation. Then of course you have the bare-minimum shadow rays, and on top of that the denoiser solutions are highly imperfect. I don't hate ray tracing. I actually like it more than Alex Battaglia does, but so far the hype train is more about marketing and less about improving games or game developers' lives. And that needs to be called out often, because doing experimental stuff at a personal level is great, but doing it at an industry level is simply being a dickhead. And Nvidia needs money to please investors and it needs GPUs to sell.

This here is not the holy grail of gaming; at best it's a good implementation of what's out there from all of Nvidia's research. Every video by DF on DLSS 3 and RT is what Nvidia wants. It's not full path tracing, because if it was we wouldn't be doing more research on it, would we? You're like "oh, the universe came out of the Big Bang and that's it, now we know", meanwhile every research paper will say it's the best explanation we have so far, that the model is severely limited, and that further research is needed. DF is like Space.com, which says "How scientists proved the universe came out of the Big Bang", and you're like "oh my god, my mind is blown". That's marketing and hyperbole, not real spice. This "please be excited" is ruining gaming, and that's my point.
It does, of course. What's missing right now is proper translucency and refraction; in fact there are some glass materials that glow white. Maybe they will add it at a later time...

Wait, can someone explain to me if Overdrive mode handles emissive materials or not? I was under the impression it does, and you see that from billboards and neon signs. That guy's comment claims it doesn't? What's going on?
Digital Foundry's piece about the caches isn't entirely accurate.

What are the AMD RT generation equivalents to the cache sizes quoted, out of curiosity?
No game is "fully path traced". Not yet anyway.yeah but Minecraft RTX has none of that either afaik even tho it's "fully pathtraced", simply due to the fact that the graphics are so simple
No game is "fully path traced". Not yet anyway.
Digital Foundry's piece about the caches isn't entirely accurate.
And imo, RT performance depends on devs' willingness to optimize for each vendor's hardware.
This is cherry-picking (then again, so is every other test out there that selects specific games vs AMD), since the consoles have been putting out RT games. The fact that AMD hardware can perform this well against Nvidia in any case says a lot about the potential of AMD's RT implementation.
And for a somewhat more accurate comparison of 7900 XTX vs 4080.
AMD would have fared better if they had gone with 4nm as well.
"4nm chips run into the power-consumption problem again: how do advanced-process chips crack the 'curse' of leakage current?"
However, this does not mean that the 4nm process is equivalent to 5nm. Although 4nm is not a full generational leap over 5nm, it is an evolution within the same generation. TSMC has promised that its latest 4nm process will improve performance by 11% and energy efficiency by 22% compared to 5nm.
what exactly is the difference between path tracing and ray tracing? It's still tracing of rays, right?
'cyberpunk'
With lighting, everything is a matter of accuracy. From ray tracing, which is not physically accurate, to physically accurate path tracing, there's a world of difference in computational requirements between them (the equations below spell out what's actually being computed).
Look back at the DF analysis of Metro Exodus EE. For its time it was really good, but it still had the inherent problems of a screen-space-reflection-style effect, where reflections revert to rasterization as you angle the world's geometry a certain way relative to the camera.
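For reference, the "physically accurate" target being argued about is the rendering equation, and a path tracer is just a Monte Carlo estimator of it (standard textbook form, nothing specific to Cyberpunk):

```latex
% Rendering equation (Kajiya 1986): outgoing radiance at point x
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\,
    (\omega_i \cdot n)\, \mathrm{d}\omega_i

% Path tracing estimates the integral with N random bounce
% directions \omega_k drawn from a chosen pdf p:
L_o \approx L_e + \frac{1}{N} \sum_{k=1}^{N}
  \frac{f_r(x, \omega_k, \omega_o)\, L_i(x, \omega_k)\, (\omega_k \cdot n)}
       {p(\omega_k)}
```

Every "ray tracing" technique in games replaces some part of that integral with something cheaper (probes, screen-space data, a denoiser); the argument in this thread is really about how much replacing you can do and still call it path tracing.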
But consoles always seem to push graphics before PC games do tho? Horizon Forbidden West and Demon's Souls pushed the paradigm to what it is before any PC game did. Now we have Unrecord, that finally brings PC to those heights.

Great to see PC advancing gaming tech further, hopefully next-gen consoles in 2030 will be able to pull this off.
Also hoping path tracing comes to a lot more games now that I have a 40 series GPU
This. I'm not entirely convinced that what Cyberpunk is doing is all that much more impressive than Metro's RT GI. Considering Metro runs at 60fps on consoles, I can't help but look at Cyberpunk with disgust at the insane performance cost of its path tracing, despite the fact that Metro's RT GI defeats the pitfalls of rastered lighting just like Cyberpunk does.

excuse me?
What about Metro Exodus Enhanced Edition?
Full RT. Even 1800p60 on consoles.
But consoles always seem to push graphics before PC games do tho? Horizon Forbidden West and Demon's Souls pushed the paradigm to what it is before any PC game did. Now we have Unrecord, that finally brings PC to those heights.
Then what games have pushed graphics forward?

this comment is giving me a stroke... wtf?
Unrecord doesn't even have ambient occlusion...
there's literally nothing about it that looks in any way special. that thing in its current iteration could run on an Xbox One
as for Horizon and Demon's Souls, neither of them pushed graphics technology forward in any way.
CP2077

Then what games have pushed graphics forward?
No, not always.

But consoles always seem to push graphics before PC games do tho? Horizon Forbidden West and Demon's Souls pushed the paradigm to what it is before any PC game did. Now we have Unrecord, that finally brings PC to those heights.
With ease... consoles tend to launch with GPUs that would sit in the high-end bracket from 2 years prior to their launch. So if the PS6/XS2 releases in 2027, it would have a GPU equivalent to a high-end 2025 desktop GPU. If they release in 2030, then they have 2028 high-end hardware, which would be in line with 2030 low-end PC hardware.

Great to see PC advancing gaming tech further, hopefully next-gen consoles in 2030 will be able to pull this off.
Also hoping path tracing comes to a lot more games now that I have a 40 series GPU
Consoles aren't monolithic entities. To state that they "push" graphics forward in broad strokes is false. Developers are responsible for graphical advancements, not the machines. They're just tools.

But consoles always seem to push graphics before PC games do tho? Horizon Forbidden West and Demon's Souls pushed the paradigm to what it is before any PC game did. Now we have Unrecord, that finally brings PC to those heights.
That title belongs to MS Flight Simulator; it's technically more advanced and has a far more advanced simulation in its clouds. Also, the clouds stay static in Horizon Forbidden West.

On the flipside, Sony also has a number of studios that push the envelope. Horizon Forbidden West probably has the best sky rendering and cloud simulation system out there.
Oh, yeah, I had forgotten about that one. It definitely takes it by a wide margin. Still, Forbidden West is very impressive when it comes to its skybox and clouds, and as a total package as well.

That title belongs to MS Flight Simulator; it's technically more advanced and has a far more advanced simulation in its clouds. Also, the clouds stay static in Horizon Forbidden West.
100% agree.

Oh, yeah, I had forgotten about that one. It definitely takes it by a wide margin. Still, Forbidden West is very impressive when it comes to its skybox and clouds, and as a total package as well.
The point I was making was that devs are responsible for technological advancements, not the platforms.
Devs put out the most accurate lighting tech ever seen in a AAA game, thought impossible in real time just ~2 years earlier, targeting the highest-end hardware available, and they even explicitly call it a "tech preview".

What should be celebrated is a smart balance between performance and graphical innovation, which is exactly what Metro achieved. This whole Cyberpunk path-traced thing is fucking stupid.
Forbidden West and Demon's Souls look better than CP2077. It's not even close. Though the ray tracing is now better in Overdrive. Crysis 3? Really? The Order: 1886 was regarded as the benchmark last gen.

CP2077
Crysis
Doom 3
Crysis 3
I'm saying I don't see paradigm shifts in graphics unless a new console generation comes up, though I'll grant Crysis as an exception. And the reason is that devs largely develop with the lowest common denominator in mind, which means even if every PC gamer suddenly had an RTX 4090, devs would still need to cater to console specs. I didn't talk about physics.

Consoles aren't monolithic entities. To state that they "push" graphics forward in broad strokes is false. Developers are responsible for graphical advancements, not the machines. They're just tools.
For instance, Crytek, who were widely lauded as the premier technical developer, started off as a PC-only developer but eventually branched out to include consoles. They still kept pushing rendering tech all the way up until Ryse.
Likewise, Remedy had been developing games on consoles for years, but Control pushed techniques forward only on the PC side.
Teardown, that little indie game, has some of the best destruction physics in the industry. Hell, even BOTW's physics engine dunks on most AAA games, and that includes PC games as well.
On the flipside, Sony has also a number of studios that push the envelope. Horizon Forbidden West probably has the best sky rendering and cloud simulation system out there.
Advancements get made everywhere by competent devs whether they are on consoles or PC. Going "only consoles push graphics" is utterly false and nonsensical.
I am having trouble finding benchmarks that show a big gen-on-gen difference in RT performance per teraflop. The 3080 is equivalent to the 4070 and I'm not seeing better RT performance. Does anyone have any other path tracing benchmarks for the 4070 Ti and 3080 Ti?
The ones I found for the 3080 and 4070 showed better performance on the 3080 in nearly all games, even Cyberpunk. And that's the 10 GB model. The 12 GB was 3 fps better in every game.
You really think that full path tracing is where it stops? That's archaic thinking; you're assuming games basically only have to catch up to offline renderers, brute-force it, and be done with it. That isn't going to work for real time.
They went beyond the usual offline-renderer path tracing solution through research, precisely because it wouldn't otherwise be possible right now in a game as heavy as Cyberpunk 2077.
Is it perfect? No. A reminder that it's a tech preview. But it's years ahead of what we're supposed to see in complex AAA games. The upcoming research speeds that up even more, is even less noisy, and completely bitch-slaps the Quake path tracing and the UE5 path tracing solution. But according to that UE5 dev, using a cache is a big no-no for calling it path tracing... he thinks... based on ifs about what it may or may not be doing...
Which, I mean, whatever you call it, it exceeds the older real-time path tracing solutions by orders of magnitude, so... ok.
Uh, I won’t even go into that drivel about AI/Space or something, take your pills.
You make us out to be oversimplifying the tech and thinking we're there, with nothing else to do (who says that? Who? Not even Alex from DF says anything of the sort).
Meanwhile, I would say you're basically the equivalent of a guy who, when we first go to Mars sometime in our lifetime, will go «why not Proxima Centauri? Going to Mars is not really full space travel».
I don't see anything wrong with DF being hyped about new graphics tech. It would be awful if they were jaded like Giant Bomb covering Wii games or whatever. Excitement is good. They can still criticise a game that comes out in a broken state, and they do that all the time.

i actually enjoy things for what they are. I enjoy what we have in the present, which is why I take it as a responsibility to make things better for the future with the necessary criticism, so that things can improve even more. I didn't have a problem with this update. I simply had a problem with the manner in which DF approached the coverage. They play moderate, stoic, non-exaggerating reviewers for poor, shoddy game launches, but when it comes to tech-development features everything is in hype mode.
You son of a, how dare you bring logic to his emotional argument?? He's angry at things because of stuff!! Technology? Scientific papers? Numbers?? F*ck off with that noise! Let's rage at Nvidia... because!!!

Where did DF touch you?
They've called out shitty ports time and time again, which is actually mostly the OPPOSITE of 99% of the fucking glowing reviews from the mainstream media, who score shit 10/10 and then you realize it's an unplayable mess.
Cyberpunk 2077 is literally something that shouldn't exist just 4 years after the likes of the Quake 2 RTX & Minecraft RTX demos. Hell, Portal RTX not long ago was bringing most rigs to their knees. 3 videos for this mind-bending tech achievement isn't even enough. The entire industry around path tracing technology was shaken by their ReSTIR papers, not just "GPU" tech; universities & scholars all reference Nvidia's ReSTIR papers nowadays when trying to find the next improvement.
Nope, RT has always been a "what could be" but completely impractical currently. Fascinating tech, but a waste of resources for anything other than a 4090 and beyond.

Does it matter if it runs at 3fps on anything but an RTX 4090?
most people use that term to describe games that are mostly or entirely rendered through rays traced into the environment, not ones that literally simulate every single thing a photon does IRL.
which isn't even really possible, and you could push that goalpost back every time a new milestone is reached.
basically every pixel on your screen is the color it is due to a ray being traced against a polygon; that's IMO the only definition of path tracing that realistically makes sense without arbitrarily adding requirements for what needs to be simulated and what doesn't.
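And that ray-against-polygon test is exactly the operation the Turing/Ampere/Ada intersection rates at the top of the thread are about. Here's a plain-Python sketch of the classic Möller–Trumbore ray/triangle intersection (the textbook algorithm; the fixed-function hardware obviously doesn't look like this):

```python
def ray_triangle(orig, dirn, v0, v1, v2, eps=1e-9):
    """Möller–Trumbore: returns distance t along the ray to triangle
    (v0, v1, v2), or None on a miss. Vectors are 3-tuples."""
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0])

    e1, e2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(dirn, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:                 # ray parallel to triangle plane
        return None
    inv = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv          # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(dirn, qvec) * inv          # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv            # hit distance along the ray
    return t if t > eps else None

# A ray straight down the -z axis through a unit triangle at z=0:
print(ray_triangle((0.2, 0.2, 1.0), (0.0, 0.0, -1.0),
                   (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)))
```

BVH traversal plus millions of these tests per frame is what the dedicated RT units batch up; the per-generation doubling in the list at the top is the throughput of exactly this operation.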