
AMD RDNA 4 GPUs To Incorporate Brand New Ray Tracing Engine, Vastly Different Than RDNA 3

The video quality is really poor, so it doesn't help, but it looks super "dirty" in motion to me.
Everything I've heard is that all the AMD frame gen techniques look like shit, but you can force them on any game, unlike DLSS frame gen.

The question is why you would do this, because destroying image quality for fake frame-rate doubling seems like a bad trade.
 

KeplerL2

Neo Member
The chiplet strategy was in reference to scalability, as AMD were struggling to scale the chiplet design up to higher-end SKUs; also, according to MLID, this is one of the reasons why AMD cancelled the RDNA 4 high end (pretty sure he leaked PCB design shots for that).

The fact that AMD are not even using chiplets on RDNA 4 is proof of this. It's probably why Nvidia haven't gone with such a design for their GPUs either.
Navi4C was cancelled in February 2023; no one outside of AMD has ever seen or touched it to know if the perf scaled up or not.
 
I want a mini PC and a laptop/tablet with stylus built around an AMD RDNA 4.0-based APU with PCIe Gen 5.0.

Speaking of PCIe, they have already announced PCIe Gen 7.0 and we are still stuck on 4.0.

 

DonkeyPunchJr

World’s Biggest Weeb
I want a mini PC and a laptop/tablet with stylus built around an AMD RDNA 4.0-based APU with PCIe Gen 5.0.

Speaking of PCIe, they have already announced PCIe Gen 7.0 and we are still stuck on 4.0.


[image]

I want to have sexual relations with my motherboard like this guy is doing.
 

FireFly

Member
Everything I've heard is that all the AMD frame gen techniques look like shit, but you can force them on any game, unlike DLSS frame gen.

The question is why you would do this, because destroying image quality for fake frame-rate doubling seems like a bad trade.
AFMF works on any game, but FSR 3 requires the developer to expose motion vectors, as with DLSS 3, and consequently produces a better result.
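Rough sketch of why those exposed motion vectors matter, in throwaway Python (illustrative only; none of these function names are real AFMF/FSR 3 APIs, just the idea): a driver-level approach only sees two finished frames, while an engine-integrated one can warp the previous frame along per-pixel motion.

```python
import numpy as np

def naive_blend(prev_frame, next_frame):
    """Driver-level approach in spirit (no game data available): estimate the
    in-between image from the two finished frames alone. A plain 50/50 blend
    is the crudest stand-in and ghosts on motion."""
    return 0.5 * prev_frame + 0.5 * next_frame

def reproject_with_motion_vectors(prev_frame, motion_vectors):
    """Engine-integrated approach in spirit: the game exposes per-pixel motion
    vectors (how far each surface moved since the previous frame), so the
    previous frame can be warped to line up with the current one.
    Pure gather: out[y, x] = prev[y - mv_y, x - mv_x]."""
    h, w, _ = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.round(xs - motion_vectors[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys - motion_vectors[..., 1]).astype(int), 0, h - 1)
    return prev_frame[src_y, src_x]

# Toy scene: an 8x8 bright square moves 8 px to the right between frames.
h, w = 64, 64
prev_frame = np.zeros((h, w, 3), dtype=np.float32)
prev_frame[28:36, 20:28] = 1.0
next_frame = np.zeros_like(prev_frame)
next_frame[28:36, 28:36] = 1.0

# Motion vectors as an engine would expose them: stored at the current frame's
# pixels, giving each surface's movement since the previous frame.
mv = np.zeros((h, w, 2), dtype=np.float32)
mv[28:36, 28:36, 0] = 8.0

blended = naive_blend(prev_frame, next_frame)           # two half-bright squares -> ghosting
warped = reproject_with_motion_vectors(prev_frame, mv)  # the square lands where it is now
# (The strip the square uncovered is a disocclusion; real frame generation
#  needs extra heuristics there, which is where artifacts tend to show up.)
```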
 

llien

Member
Turing 1x ray/triangle intersection rate
Ampere 2x ray/triangle intersection rate
Ada 4x ray/triangle intersection rate
You seem to be quite knowledgeable about RT.

With the 4000 series we are 3 gens into "hardwahr RT", right?

If progress is so cool and stuff, how come turning RT on drops FPS even on a 4090?

When are we going to get "enough RT cores", please?

 

kevboard

Member
Couldn't care less about RT.

Performance isn't quite there on NV's side either. I don't see the benefit in 80% of games. I'd rather they focus on raw performance at 1440p-4K. That's where gamers are headed.

Every UE5 game these days uses Lumen, which is ray tracing.
In many games you can't even turn it off anymore, only switch to software Lumen instead of hardware Lumen.
 

llien

Member
This is not going to close their gap obviously, but at least they won't be 2 or 3 generations behind Nvidia
The 7000 series AMD beats the 3090 Ti.
How many "generations" of a gap is that?

[benchmark chart]


 

Buggy Loop

Member
You seem to be quite knowledgeable about RT.

With the 4000 series we are 3 gens into "hardwahr RT", right?

If progress is so cool and stuff, how come turning RT on drops FPS even on a 4090?

When are we going to get "enough RT cores", please?


When AI fully replaces how a game's lighting should look to replicate real life, and not until then.

2019 path tracing
[Image: Quake II RTX screenshot]


2023 path tracing
[Image: Cyberpunk 2077 screenshot]



 

Bry0

Member
Navi4C was cancelled in February 2023; no one outside of AMD has ever seen or touched it to know if the perf scaled up or not.
The claim was made by Steve from HUB, based on someone he "talked to at CES". But he also said he was "basically making this up", summarizing what he thought he understood from the conversation. 😂

All that context always gets lost beyond the actual podcast as usual, as I'm sure you understand and have experienced yourself.
 
Last edited:

llien

Member
That's not path tracing.
Possibly not. Although this is how they word what it is:

"Lumen takes any given scene and renders a very low-resolution model of it. Light behavior in this low-res model is then recorded, and a rough lightmap is created. This lightmap is then used to trace the path taken by every ray in the scene."
Sauce
 

Buggy Loop

Member
Possibly not. Although this is how they word what it is:

"Lumen takes any given scene and renders a very low-resolution model of it. Light behavior in this low-res model is then recorded, and a rough lightmap is created. This lightmap is then used to trace the path taken by every ray in the scene."
Sauce

Every "ray" has a path otherwise where is the ray going? You would not be in control of your scene.

Big difference between ray tracing just checks if its hits one thing. It hits? You test it. You get result. Path tracing as you hit something, you want to gather more data, so there's a random factor to send back multiple rays, some will bounce in a certain direction, some rays in another, etc. For a single pixel you gather much more data on how the lighting reacts with that hit and you get a more complete picture of how light should behave.

Lumen is a downgrade of ray tracing because it uses inaccurate signed distance fields rather than geometry. Reflections act a bit like screen space where they have this swimming/disappearing depending on camera angle, on top of some light bleeding sometimes.

There's always a tradeoff. Lumen is nice to have and a step-up from raster, but everything has a tradeoff.
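To put that hit-test vs. bounce-and-gather difference in code form, here's a toy Python sketch (purely illustrative, nothing like a real renderer): the ray-traced version fires one shadow ray and takes the yes/no answer, while the path-traced version keeps bouncing in random directions and averages many samples per pixel, which is where the extra cost comes from.

```python
import random

# Toy sketch only -- no real engine traces like this. It just mirrors the
# distinction above: one yes/no shadow-ray test vs. bouncing randomly and
# averaging many samples.

LIGHT_DIR = (0.0, 1.0, 0.0)   # light straight overhead
FLOOR_ALBEDO = 0.5

def occluded(point, direction):
    """Stand-in for a BVH ray/triangle intersection test: is anything between
    'point' and the light? Here a fake blocker shadows everything at x > 0."""
    return point[0] > 0.0

def ray_traced_shadow(point):
    """Hybrid-RT style: fire ONE shadow ray, take the yes/no answer, shade.
    Cheap, but it tells you nothing about light bounced in from elsewhere."""
    return 0.0 if occluded(point, LIGHT_DIR) else FLOOR_ALBEDO

def path_traced(point, depth=0, max_depth=3):
    """Path-tracing style: at every hit, pick a RANDOM bounce direction and
    keep going, so indirect light gets gathered too. One sample is noisy;
    you average many of them per pixel, which is why it costs so much."""
    if depth >= max_depth:
        return 0.0
    direct = 0.0 if occluded(point, LIGHT_DIR) else FLOOR_ALBEDO
    # Random bounce: pretend the ray lands somewhere else on the floor.
    bounce_point = (random.uniform(-1.0, 1.0), 0.0, random.uniform(-1.0, 1.0))
    indirect = FLOOR_ALBEDO * path_traced(bounce_point, depth + 1, max_depth)
    return direct + indirect

shadowed_point = (0.5, 0.0, 0.0)   # sits under the fake blocker
one_ray = ray_traced_shadow(shadowed_point)                       # exactly 0.0
many_rays = sum(path_traced(shadowed_point) for _ in range(256)) / 256
# many_rays comes out > 0 (~0.19): the point is in shadow but still picks up
# light bounced off the unshadowed half of the floor -- data a single
# shadow-ray test never gathers.
```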
 
Last edited:

FireFly

Member
Epic switched to hardware RT reflections and shadows in their latest MegaLights demo anyway, and they say the hardware RT path is now performant enough to run at 60 FPS on consoles.
 

ap_puff

Member
Epic switched to hardware RT reflections and shadows in their latest MegaLights demo anyway, and they say the hardware RT path is now performant enough to run at 60 FPS on consoles.
I'll believe that when they can get their default lighting to run at 60 fps on consoles 😅
 

FireFly

Member
I'll believe that when they can get their default lighting to run at 60 fps on consoles 😅
They can get it to run at 60 FPS (ok, maybe not always locked), just at low resolutions. I don't think that will change, but if you have a PS5 Pro it will look a lot nicer.
 

kevboard

Member

vkbest

Member
I'll believe that when they can get their default lighting to run at 60 fps on consoles 😅
The problem with Unreal Engine on consoles and even PC is that the games releasing currently are using older builds. I think Wukong is on 5.0, while they are releasing 5.5 with a ton of optimizations.
 

Puscifer

Member
Who cares, they already stated they were not going to do high-end cards anymore, so a mid-tier card with half-assed ray tracing isn't worth discussing. Now if they want to play with Nvidia again that's a different story, but they publicly stated they were out of that game, so I'm out of the AMD GPU scene.
They meant flagship, 4090-tier/7900 XTX stuff.
 

Crayon

Member
The 7000 series AMD beats the 3090 Ti.
How many "generations" of a gap is that?

[benchmark chart]



I've noticed that but nobody seems to stop and check the facts. One of those things that gets repeated enough. It does depend on the game, though. I first noticed because when RDNA 3 was revealed with the 7900XTX, the lauded 3090 raytracing performance became dogshit overnight.
 

SolidQ

Gold Member
He's not the only one who's heard RDNA4 sucks.
Where did you hear that RDNA4 sucks? Low/mid is fine.

I first noticed because when RDNA 3 was revealed with the 7900XTX, the lauded 3090 raytracing performance became dogshit overnight.
Yeah, still going the same. I remember when people praised the 3090's RT and then suddenly, after the 7900 XTX released, people were saying it sucks, etc.
It's just "double standards".
 
Last edited:

Buggy Loop

Member
I've noticed that but nobody seems to stop and check the facts. One of those things that gets repeated enough. It does depend on the game, though. I first noticed because when RDNA 3 was revealed with the 7900XTX, the lauded 3090 raytracing performance became dogshit overnight.

I mean, "relative RT" is of course all over the place, it can go from a useless "there's RT in this?" game like Dirt 5, to Cyberpunk overdrive, depends how the reviewer stacks the games.

hybrid RT, where you better damn hope that the generational difference in rasterization would push the card ahead of one a gen behind, which they do with RT off, but ends up performing as good as a gen behind, it means RDNA 3 RT blocks perform worse on a CU-clock basis than even Turing. Manages to drag down a 27% lead they had in rasterization.

[benchmark chart]


Then if we say bye-bye to the hybrid pipeline and go path tracing... we can see up to 3 gens behind: Ada → Ampere → Turing.

[benchmark chart]


Putting our heads in the sand and saying that AMD has no catching up to do is not helping AMD. I would hope their new RT block catches up at least 3 gens for path tracing. I'm not even putting Blackwell or the flagships into the equation since they're going mid-range. I really hope their new RT pipeline performs really well.
 

ap_puff

Member
I mean, "relative RT" is of course all over the place, it can go from a useless "there's RT in this?" game like Dirt 5, to Cyberpunk overdrive, depends how the reviewer stacks the games.

hybrid RT, where you better damn hope that the generational difference in rasterization would push the card ahead of one a gen behind, which they do with RT off, but ends up performing as good as a gen behind, it means RDNA 3 RT blocks perform worse on a CU-clock basis than even Turing. Manages to drag down a 27% lead they had in rasterization.

RT_1-p.webp


Then if we go bye bye hybrid pipeline and go path tracing...we can see up to 3 gens behind, Ada → Ampere → Turing

DPdzytqBz3wQ9CWeWdknEZ-1024-80.png.webp


Putting our heads in the sand and say that AMD has no catch up to do is not helping AMD. I would hope their new RT block at least catches 3 gens up for path tracing. I'm not even putting Blackwell nor the flagships into the equation since they're going mid-range. I really hope their new RT pipeline performs really well.
Cyberpunk pathtracing isn't the best example as it's coded specifically for nvidia hardware, being a heavily nvidia-sponsored title. But the general gist remains correct, AMD's pure RT pipeline really sucks comparatively and they haven't put nearly enough resources into improving it gen-on-gen, even the rumors out for rdna4 don't seem like they're going to catch them up to nvidia all the way. And if AMD fans want their GPUs to succeed they need AMD to know that this kind of performance is important to them. Turing gen you could argue that RT performance was meaningless, Ampere you could argue that PT performance just wasn't there to be mass adopted. But the excuses have to stop since more and more games are going to come out with RT/PT as an option.
 

Gaiff

SBI’s Resident Gaslighter
I've noticed that but nobody seems to stop and check the facts. One of those things that gets repeated enough. It does depend on the game, though. I first noticed because when RDNA 3 was revealed with the 7900XTX, the lauded 3090 raytracing performance became dogshit overnight.

where did you hear about RDNA4 sucks? low/mid is fine


Yeah, still going same. I remember when people praise 3090 RT and then sudennly after release 7900XTX people saying sucks etc.
It's just "double standards
Because averaging is insanely misleading and doesn't take the distribution into account. The 7900 XTX has a sizeable advantage in rasterization, in the order of 15%. For it to end up neck-and-neck with the 3090 Ti on average means its 15% lead gets completely wiped by a random sample of like 5 games. However, the performance impact of ray tracing varies enormously on a per-game basis, as does the implementation, so just taking an average of a small sample of games is incredibly flawed. And you two are also incorrect in stating that the 3090 and 3090 Ti were said to suck following the release of the 7900 XTX. They still obliterate it when you go hard with the ray tracing.

[benchmark chart: RT performance, 2560x1440]

In Alan Wake 2, the 3090 Ti outperforms it by a whopping 19% with hardware ray tracing turned on. This is despite the 7900 XTX being 18% faster without it.

In Black Myth Wukong, the 3090 Ti leads it by 46% at 1080p, max settings. Using Techspot because it doesn't seem like Techpowerup tested AMD cards.

[benchmark chart: Black Myth Wukong, 1080p]

In Rift Apart, it loses by 21%.

[benchmark chart: Ratchet & Clank RT, 2560x1440]

In Cyberpunk with path tracing, the 3090 Ti is 2.57x faster. However, I think AMD got an update that boosts the performance of their cards significantly, though nowhere near enough to cover that gap.

[benchmark chart: path tracing performance, 1920x1080]

In Guardians of the Galaxy at 4K Ultra, the 3090 Ti is 22% faster.

[benchmark chart]

Woah, would you look at this? The 3090 Ti is 73% faster in ray tracing according to my sample of 5 games. But of course, the eagle-eyed among you quickly picked up that this was my deliberate attempt at making the 7900 XTX look bad. In addition, removing Cyberpunk, a clear outlier, from the results would probably drop the average down to the 20s. The point I'm making is that the average of 8 games does not represent the real world. You want to use ray tracing where it makes a difference and shit like Far Cry 6 or REVIII ain't that, explaining in part why they're so forgiving on performance (and why they came out tied). Expect to see the 3090 Ti outperform the 7900 XTX by 15% or more in games where ray tracing is worth turning on despite losing by this exact amount when it's off. The 3090 Ti's RT performance doesn't suddenly suck because it's equal to the 7900 XTX (it isn't, it's faster), but AMD is at least one generation behind, and in games where ray tracing REALLY hammers the GPUs, AMD cards just crumble.

tl;dr don't let averages fool you. Look at the distribution and how ray tracing looks in the games.
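Putting quick numbers on the averages-vs-distribution point, using only the per-game leads cited above (illustrative arithmetic, not new benchmark data):

```python
# 3090 Ti leads over the 7900 XTX cited in the post above, in percent.
leads = {
    "Alan Wake 2 (RT)": 19,
    "Black Myth: Wukong": 46,
    "Ratchet & Clank: Rift Apart": 21,
    "Cyberpunk 2077 (path tracing)": 157,   # 2.57x faster
    "Guardians of the Galaxy (4K Ultra)": 22,
}

values = sorted(leads.values())
mean = sum(values) / len(values)
median = values[len(values) // 2]
mean_without_outlier = sum(v for v in values if v != max(values)) / (len(values) - 1)

print(f"mean lead:              {mean:.0f}%")                  # 53%
print(f"median lead:            {median}%")                    # 22%
print(f"mean without Cyberpunk: {mean_without_outlier:.0f}%")  # 27%
# One path-tracing outlier roughly doubles the average while the median barely
# moves -- which is why a single multi-game average can make two cards look
# "equal" or one look far faster depending on the game mix.
```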
 

Moochi

Member
Path tracing crushes Lumen in actual fidelity. Cyberpunk 2077 is like a game from the future with path tracing, but only sorta nice with Lumen.
 

Crayon

Member
I mean, "relative RT" is of course all over the place, it can go from a useless "there's RT in this?" game like Dirt 5, to Cyberpunk overdrive, depends how the reviewer stacks the games.

hybrid RT, where you better damn hope that the generational difference in rasterization would push the card ahead of one a gen behind, which they do with RT off, but ends up performing as good as a gen behind, it means RDNA 3 RT blocks perform worse on a CU-clock basis than even Turing. Manages to drag down a 27% lead they had in rasterization.

RT_1-p.webp


Then if we go bye bye hybrid pipeline and go path tracing...we can see up to 3 gens behind, Ada → Ampere → Turing

DPdzytqBz3wQ9CWeWdknEZ-1024-80.png.webp


Putting our heads in the sand and say that AMD has no catch up to do is not helping AMD. I would hope their new RT block at least catches 3 gens up for path tracing. I'm not even putting Blackwell nor the flagships into the equation since they're going mid-range. I really hope their new RT pipeline performs really well.

I always forget that ray tracing actually means PT mode in Cyberpunk, depending on who I'm talking to. I'll stick with the multi-game averages.
 

Gaiff

SBI’s Resident Gaslighter
Different games, different results.

Yes, my point, so claiming that the 3090 Ti and 7900 XTX are equal is very misleading. At best for the 7900 XTX they're close, and at worst the 3090 Ti destroys it.
 

Buggy Loop

Member
Cyberpunk pathtracing isn't the best example as it's coded specifically for nvidia hardware, being a heavily nvidia-sponsored title. But the general gist remains correct, AMD's pure RT pipeline really sucks comparatively and they haven't put nearly enough resources into improving it gen-on-gen, even the rumors out for rdna4 don't seem like they're going to catch them up to nvidia all the way. And if AMD fans want their GPUs to succeed they need AMD to know that this kind of performance is important to them. Turing gen you could argue that RT performance was meaningless, Ampere you could argue that PT performance just wasn't there to be mass adopted. But the excuses have to stop since more and more games are going to come out with RT/PT as an option.

Haha no

Cyberpunk is one of the few games, you can count them on one hand, that supported inline ray tracing (DXR 1.1), which AMD heavily favors. It's one of the games that leans heavily on compute shaders rather than dynamic shader dispatch, making it perfect for inline RT. Ray-tracing-wise it performs very well, some of the best "heavy" RT performance you can have on AMD.

Path tracing is just not AMD's forte on that architecture. AMD likes scheduled, well-organized compute/RT/upscaler workloads that don't choke the pipeline. Path tracing is too chaotic.
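For anyone wondering what "inline" vs. dynamic shaders means here: with DXR 1.0-style DispatchRays, each hit is handed off through a shader table to separately compiled hit/miss shaders, while DXR 1.1 inline ray tracing (RayQuery) lets an ordinary compute shader trace and handle the result in its own straight-line code. A conceptual Python sketch of the two control-flow shapes (nothing here is real D3D12/DXR code):

```python
# Conceptual sketch only: it models the CONTROL-FLOW difference between the
# two DXR programming models, not real D3D12 code or real shading.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Hit:
    material: str
    n_dot_l: float

def trace(ray: float) -> Optional[Hit]:
    """Stand-in for hardware BVH traversal: positive rays hit metal,
    negative ones hit cloth, zero misses."""
    if ray > 0:
        return Hit("metal", 0.8)
    if ray < 0:
        return Hit("cloth", 0.3)
    return None

# DXR 1.0-style dynamic dispatch: which shader runs depends on what each ray
# hits, so neighbouring rays can branch into different, divergent shaders.
SHADER_TABLE = {
    "metal": lambda hit: 0.9 * hit.n_dot_l,
    "cloth": lambda hit: 0.4 * hit.n_dot_l,
}

def dispatch_rays(rays):
    return [0.0 if (h := trace(r)) is None else SHADER_TABLE[h.material](h)
            for r in rays]

# DXR 1.1-style inline ray tracing: the compute shader traces and handles the
# result in its own straight-line code (think RayQuery / TraceRayInline) --
# one predictable path, which suits compute-heavy engines and the scheduled
# workloads described above. Path tracing breaks that neat pattern because
# every bounce spawns new, incoherent rays mid-shader.
def compute_shader_inline(rays):
    return [0.0 if (h := trace(r)) is None else 0.6 * h.n_dot_l for r in rays]

print(dispatch_rays([1.0, -1.0, 0.0]))          # ≈ [0.72, 0.12, 0.0]
print(compute_shader_inline([1.0, -1.0, 0.0]))  # ≈ [0.48, 0.18, 0.0]
```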

That's already old info with 8 fps. Even at 1080p it's 30+ fps.

Anyway, that's with the RT/PT mod.


With frame gen? A 2080 Ti can do that too.

But seeing you guys defend it, why would AMD ever need to change the ray tracing engine? It's perfect already!
 

Moochi

Member
All of this is basically a moot argument. The diffuser overlay engines are going to crush everything, forever.
 

Crayon

Member
Because averaging is insanely misleading and doesn't take the distribution into account. [...] tl;dr don't let averages fool you. Look at the distribution and how ray tracing looks in the games.

You are being specific and detailed, which is good. But can you effectively boil all that down to "2-3 gens behind"?

And I think the reasoning for cherry-picking games here is actually sound enough, but still, throwing out the majority of games with light RT workloads is misleading in its own way.
 

Gaiff

SBI’s Resident Gaslighter
It destroys it if it's 2-3x, but not if it's just 3-4 fps.


All companies are going to change RT.
Don't know how reliable that benchmark is, and it's RT, not PT, where the difference is much larger.

You are being specific and detailed, which is good. But can you effectively boil all that down to "2-3 gens behind"?
No, I wouldn't say 2-3 gens behind in general. In extreme cases, the difference can be enormous, but these are outliers.
And I think the reasoning for cherry-picking games here is actually sound enough, but still, throwing out the majority of games with light RT workloads is misleading in its own way.
Definitely not throwing them out. They count, but most of us don't actually care about turning RT on and not seeing a difference. I'd rather have a game like Cyberpunk, where the ray tracing massacres performance but there is a transformative difference, over a game like Far Cry 6 that has relatively good performance but looks the same.
 

SolidQ

Gold Member
cherry-picking games here is actually sound enough, but still, throwing out the majority of games with light RT workloads is misleading in its own way.
Cherry-picking always gives irrelevant results; the average is best. Games are going to RT when the PS6 is coming; for now it's a beta test for the future.

not PT, where the difference is much larger.
Still doesn't matter, because it's Cyberpunk/Alan Wake only. When more games come with PT, the 3090/4090 will be super slow for it.
 
Last edited:

Crayon

Member
Cherry-picking always gives irrelevant results; the average is best. Games are going to RT when the PS6 is coming; for now it's a beta test for the future.

I'm in analytics, so cherry-picking is technically massive fraud lol.

What I meant was more like what Gaiff said: weighting. I do get the idea of weighting the really heavy implementations more. There may be fewer of them, but they expose the differences more clearly and they show what's around the corner. Still, once you start doing things like that, it's good to include the multiple ways it can be tallied up. To circle back around, saying it's 2-3 gens back is just not very helpful and as misleading as anything else.
 

rm082e

Member
Where did you hear that RDNA4 sucks? Low/mid is fine.

Steve from Hardware Unboxed talks about it in the video that was posted above. He spoke with someone at Computex who had seen the early testing on it and was bummed out:



The "sucks" part is my summary, not his words. I say "sucks" because AMD has consistently priced their cards roughly in line with Nvidia when compared on raw FPS-per-dollar. But they're consistently behind in other ways, which is they have such low market share. 10-12 years ago, their cards were hair dryers or underpowered. Unless you were a Battlefield super fan, it wasn't worth it. Once they finally got the heat and cooling issues straightened out, Nvidia came out with the 2000 series with the RT cores. Sure, RT was complete vaporware, but they quickly figured out how to offload antialiasing in a way that was really impressive.

As Steve and Tom discuss in the video, it seems like AMD needs to be ahead on both price and performance for a couple of generations to really move the needle on market share. They managed to do it on CPUs, but it seems much less likely for their GPU business.
 