
AMD RDNA 4 GPUs To Incorporate Brand New Ray Tracing Engine, Vastly Different Than RDNA 3

Because averaging is insanely misleading and doesn't take the distribution into account. The 7900 XTX has a sizeable advantage in rasterization, on the order of 15%. For it to end up neck-and-neck with the 3090 Ti on average means that 15% lead gets completely wiped by a random sample of like 5 games. The performance impact of ray tracing varies enormously on a per-game basis, as does the implementation, so just taking an average of a small sample of games is incredibly flawed. And you two are also incorrect in saying the 3090 and 3090 Ti were declared to suck following the release of the 7900 XTX. They still obliterate it when you go hard with the ray tracing.

[Image: performance-rt-2560-1440.png]

In Alan Wake 2, the 3090 Ti outperforms it by a whopping 19% with hardware ray tracing turned on. This is despite the 7900 XTX being 18% faster without it.

In Black Myth Wukong, the 3090 Ti leads it by 46% at 1080p, max settings. Using Techspot because it doesn't seem like Techpowerup tested AMD cards.

[Image: Cine-VH-1080p-p.webp]

In Ratchet & Clank: Rift Apart, the 7900 XTX loses by 21%.

[Image: rt-ratchet-and-clank-2560-1440.png]

In Cyberpunk with path tracing, the 3090 Ti is 2.57x faster. However, I think AMD got an update that boosted the performance of their cards significantly, though nowhere near enough to close that gap.

[Image: performance-pt-1920-1080.png]

In Guardians of the Galaxy at 4K Ultra, the 3090 Ti is 22% faster.

[Image: RT_3-p.webp]

Woah, would you look at this? The 3090 Ti is 73% faster in ray tracing according to my sample of 5 games. But of course, the eagle-eyed among you quickly picked up that this was my deliberate attempt at making the 7900 XTX look bad. In addition, removing Cyberpunk, a clear outlier, from the results would probably drop the average down to the 20s. The point I'm making is that the average of 8 games does not represent the real world. You want to use ray tracing where it makes a difference and shit like Far Cry 6 or REVIII ain't that, explaining in part why they're so forgiving on performance (and why they came out tied). Expect to see the 3090 Ti outperform the 7900 XTX by 15% or more in games where ray tracing is worth turning on despite losing by this exact amount when it's off. The 3090 Ti's RT performance doesn't suddenly suck because it's equal to the 7900 XTX (it isn't, it's faster), but AMD is at least one generation behind, and in games where ray tracing REALLY hammers the GPUs, AMD cards just crumble.
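For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch in Python using the per-game leads quoted above (with the Cyberpunk 2.57x result read as a +257% lead, which is where the 73% headline comes from). It's only an illustration of how one outlier drags the mean around, not a reproduction of TechPowerUp's methodology:

[CODE]
from statistics import mean, median

# Per-game 3090 Ti leads over the 7900 XTX quoted above, in percent.
# The Cyberpunk path-tracing result is read as a +257% lead here.
leads = {
    "Alan Wake 2": 19,
    "Black Myth: Wukong": 46,
    "Ratchet & Clank: Rift Apart": 21,
    "Cyberpunk 2077 (path tracing)": 257,
    "Guardians of the Galaxy": 22,
}

with_outlier = mean(leads.values())                                           # 73%
without_outlier = mean(v for g, v in leads.items() if "Cyberpunk" not in g)   # 27%
mid = median(leads.values())                                                  # 22%

print(f"mean with Cyberpunk:    {with_outlier:.0f}%")
print(f"mean without Cyberpunk: {without_outlier:.0f}%")
print(f"median:                 {mid:.0f}%")
[/CODE]

The median barely moves whether the outlier is in or out, which is the whole "look at the distribution, not the average" point.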

tl;dr don't let averages fool you. Look at the distribution and how ray tracing looks in the games.
You wrote way too much to say, simply, that AMD's approach to RT copes okay with the basic RT reflections/shadows seen in console-level RT, but completely falls over and dies when forced to do the really advanced path tracing that the highest-end PC games are doing (Cyberpunk 2077, Portal RTX, Star Wars Outlaws, Black Myth Wukong).
 

Gaiff

SBI’s Resident Gaslighter
Cherry-picking always gives irrelevant results; the average is best. Games will go all-in on RT when the PS6 is coming; for now it's a beta test for the future.
No, that's completely false, especially in this case where results can vary massively, from 1% to over 50%. Anyone with more than a passing understanding of statistics will tell you so. You do NOT want an average. You want to look at the distribution because the average is completely skewed by outliers. Furthermore, the average from Techpowerup was a sample of 8 games. If you want a relevant average, you'd need a sample of 30+ games.
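To put a rough number on why 8 games isn't enough, here's a purely synthetic sketch (the 15-point baseline and the single +50-point heavy-RT outlier are made-up illustration values, not benchmark data): the same outlier shifts an 8-game mean far more than a 30-game one.

[CODE]
from statistics import mean

# Synthetic illustration only: every "ordinary" game shows a 15% gap,
# and one heavy-RT title adds 50 points on top of that.
baseline = 15
outlier = baseline + 50

small_suite = [baseline] * 7 + [outlier]    # 8-game test suite
large_suite = [baseline] * 29 + [outlier]   # 30-game test suite

print(f"8-game mean:  {mean(small_suite):.1f}%")   # 21.2% -- outlier dominates
print(f"30-game mean: {mean(large_suite):.1f}%")   # 16.7% -- outlier mostly diluted
[/CODE]

Same underlying numbers, very different headline average.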
Still doesn't matter, because it's only Cyberpunk/Alan Wake. When more games come out with PT, the 3090/4090 will be super slow for it.
No, the 4090 is perfectly fine. I play Cyberpunk with path tracing and DLSS Quality at 3440x1440 and it runs well. Throw in Frame Generation and Reflex+ Boost, and you have an amazing experience.
 

SolidQ

Member
He spoke with someone at Computex who had seen the early testing on it and was bummed out:
Maybe because they were sure it was about 4090 level? It was always minimum 7900 XT, max around 4080, but not 7900 XTX.

massive fraud lol.
Yeah, the last time I was buying for a specific game it was Crysis 1, but now I only look at average performance, and I won't look at RT performance until a game requires RT, and that's the far future. UE5 has nice graphics and that's fine.

it seems like AMD needs to be ahead on both price and performance for a couple of generations
They have. Nvidia has zero cards up to $500 now, but users always take the worse cards, so that won't change. Only Nvidia is going to help with this, like the 5060/5070 having only 8/12 GB, which is why they're forcing me to buy RDNA 4.

No, the 4090 is perfectly fine. I play Cyberpunk
And again Cyberpunk; I'm talking about future games.

that's completely false, especially in this case where results can vary massively, from 1% to over 50%.
It never was false. You're mistaken here. Why would you buy the worse card, when it gets lower results in 99% of games?
 

rm082e

Member
Maybe because they were sure it was about 4090 level? It was always minimum 7900 XT, max around 4080, but not 7900 XTX.

We've seen how well that worked out for them this generation. I don't expect any better performance for RDNA4 in the midrange tier next year. I would be happy to be surprised, but there hasn't been a single card in over a decade where they've really topped Nvidia.
 

SolidQ

Member
but there hasn't been a single card in over a decade where they've really topped Nvidia.
Because of budget. They're a CPU + GPU company; NV is GPU only. They only started getting a budget since the Ryzen release, so they need to split it between data center/gaming/servers, etc. Anyway, UDNA is the right way for them: save money and make a strong uArch.

Well I wish them huge gains

And inspirations from PSSR
We don't have a PSSR vs DLSS comparison yet, but all upscalers still need a lot of work; for now, upscalers are still bad. As for huge gains, we don't know what NV/AMD are planning in the far future.
 

Gaiff

SBI’s Resident Gaslighter
It never was false. You're mistaken here. Why would you buy the worse card, when it gets lower results in 99% of games?
We were talking about ray tracing, weren’t we? The 3090 Ti certainly doesn’t get worse results 99% of the time. In fact, it’s much more common for it to win, and often comfortably.
 

SolidQ

Member
We were talking about ray tracing, weren’t we?
Yep, but that's a different story. Different games, different results, but RT is bad on every card that isn't a 4090.

The 3090 Ti certainly doesn’t get worse results 99% of the time.
It's about pure performance, but overall both are bad for heavy RT.
 

Gaiff

SBI’s Resident Gaslighter
Yep, but that's a different story. Different games, different results, but RT is bad on every card that isn't a 4090.
Not true. The 4080 is also very good. Hell, even the 7900 XTX can be pretty good a lot of the time. It just loses more performance than I’d like.
 