RDNA3 rumor discussion



According to MLID

7900xtx will trade blows with the 4090, but most likely not beat it. The reference model will not push the card dramatically, but there is headroom in both power and size for AIB partners. AMD is going for reasonable pricing, power and cooling for the reference model.
7900xtx ~ 350W <- Reference model. Could go higher for AIB.
7900xt ~ 300W <- Same as above
Late Nov / Early Dec launch. We have heard nothing about N32; I wonder whether they'll even mention the 7800, but I guess they won't, since it's a different chip.
They're leaving room for a 7950 in the future, probably to deal with the 4090ti.
Prices are not known, but the 7900xtx won't be cheap. The 7900xt should offer much, much better value than the 4080 16 GB though.
 
Based on the leaked 3.3 GHz clock speed with 12,288 stream processors, it would be about 81 TFLOPS of compute.
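The arithmetic behind that figure is just stream processors × 2 FP32 ops per clock (one fused multiply-add) × clock speed; a minimal sketch, assuming the leaked numbers above:

```python
# Theoretical peak FP32 throughput from leaked specs.
# Assumes 2 FP32 ops per stream processor per clock (one FMA), the
# usual convention for quoting GPU TFLOPS.
def peak_fp32_tflops(stream_processors: int, clock_ghz: float,
                     ops_per_clock: int = 2) -> float:
    # ops/s = SPs * ops_per_clock * clock; divide by 1000 since
    # SPs * GHz already yields GFLOPS.
    return stream_processors * ops_per_clock * clock_ghz / 1000.0

print(peak_fp32_tflops(12_288, 3.3))  # ~81.1 TFLOPS
```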

Raytracing is a compute-heavy path paired with BVH acceleration hardware (essentially a search-engine accelerator).

The RX 6950 XT has about 24 TFLOPS of compute with 2 of 3 BVH acceleration hardware features; the result is about half of the RTX 3090 FE (>38 TFLOPS FP32 compute with 3 of 3 BVH acceleration hardware features).

For NAVI 31, AMD is throwing a lot of compute FLOPS at the raytracing problem.

Real-time raytracing is a compute TFLOPS and search engine problem.
RT is a struggle but I hope AMD's focus is to save the PC gaming industry by having a lower priced card that can deliver just the right performance for what is actually needed for the CPUs and screens people have. It's cool to see 300fps on a benchmark graph but, yeah, maybe not needed lol
 


According to MLID

The world sees reference RX7900 with 2x150W 8pins. (300W)
Insiders tell him it's a ~350W card (75W from the PCIe slot).

The world sees reference RX7000 being slightly bigger than 6900XT.
Insiders tell him the RX7000s are slightly bigger than 6900XT.

AMD says they plan on launching soon after announcing in Nov.
Insiders tell him launch in Nov or Dec.

The RTX 4080 16GB isn't seen as good value.
Insiders tell him the AMD card should offer better value.

How would we live without this guy's leaks and insider knowledge.
Fuck dat ninja.
 
The world sees reference RX7900 with 2x150W 8pins. (300W)
Insiders tell him it's a ~350W card (75W from the PCIe slot).

The world sees reference RX7000 being slightly bigger than 6900XT.
Insiders tell him the RX7000s are slightly bigger than 6900XT.

AMD says they plan on launching soon after announcing in Nov.
Insiders tell him launch in Nov or Dec.

The RTX 4080 16GB isn't seen as good value.
Insiders tell him the AMD card should offer better value.

How would we live without this guy's leaks and insider knowledge.
Fuck dat ninja.
Yeah, there really isn't anything new there. We know the specs and yet we're all still guessing about price and performance and will be until tomorrow.
 
I miss the days of full or partially full product stack releases. These high-end models are more than I'm willing to spend anymore, so it's wait, wait and wait some more.
 
I miss the days of full or partially full product stack releases. These high-end models are more than I'm willing to spend anymore, so it's wait, wait and wait some more.
I'm toying with the idea of just getting a 6700XT or something and calling it quits. Probably waiting like another 6 months for low/mid-range RDNA3 anyway.
 
AMD is keeping an iron grip on these products; no leaks one day before the press conference has to be a world record.
 
I'm toying with the idea of just getting a 6700XT or something and calling it quits. Probably waiting like another 6 months for low/mid-range RDNA3 anyway.
That's what, $375 right now? I've thought about it as well but, shit, I've skipped 2 gens already. May as well wait I guess...
 
That's what, $375 right now? I've thought about it as well but, shit, I've skipped 2 gens already. May as well wait I guess...
Out of interest how much are you looking to spend? I have a GTX 1650 that is still chugging along. It runs most stuff I want to play at 1080p medium settings 60fps or higher (obviously not the much newer stuff).

I could probably get just under $100 for the 1650 now anyway (they hold their value well despite there being a lot of better options), so considering that I have an AMD APU, I could feasibly survive during the interval between selling my 1650 and getting a new (or rather used) GPU delivered.
 
Out of interest how much are you looking to spend? I have a GTX 1650 that is still chugging along. It runs most stuff I want to play at 1080p medium settings 60fps or higher (obviously not the much newer stuff).

I could probably get just under $100 for the 1650 now anyway (they hold their value well despite there being a lot of better options), so considering that I have an AMD APU, I could feasibly survive during the interval between selling my 1650 and getting a new (or rather used) GPU delivered.
I own a GTX 1080; I think I paid $500 for it the year after it was released. I'm looking at around $500-$700 for 1440p; preferably I'd like to get to 144Hz on high settings. I don't mind optimizing settings, Ultra is usually a waste of resources. Also, I'd like to see what this ray tracing is all about. I think the only RT game I own is CP2077.
 

Smells like a FSR 2.0 title!

More Callisto Protocol....IN.

I hope they show off FSR3 or whatever it's going to be called.

Hopefully whatever the fuck Wave Matrix Multiply-Accumulate is, it actually brings ML technology so they have a DLSS competitor.
Kepler seemed excited about it.
 
Don't you understand the hype cycle? The thing has to be revealed with a slight flaw and then youtube will explode with how that flaw makes AMD DEAD.

Reviews: "It's just as fast and $300 cheaper and the rt is finally competitive and FSR2 is a good answer to DLSS. Unfortunately, the RX7900XTX only weighs 4 pounds so we have to give the win to nvidia."
 
So it seems like MLiD is now backtracking on some of the performance claims for RDNA 3, as have several other "leakers" in the past few weeks.

I remember at one point RedGamingTech stated that Nvidia's Lovelace would struggle in RT against RDNA 3, but it's likely the other way around. Now we have MLiD stating that RDNA 3's highest variant, the XTX (not even XT) will struggle to compete in raster against the 4090. This is looking a little bleak for AMD.

However, I think the best way for AMD to salvage this (if the leaks are true) is to be very competitive on price against the already overpriced Lovelace. I haven't been following the PC market long enough, but judging by replies from other posters, AMD have historically done a shit job of taking advantage of such market conditions.
 
So it seems like MLiD is now backtracking on some of the performance claims for RDNA 3, as have several other "leakers" in the past few weeks.

I remember at one point RedGamingTech stated that Nvidia's Lovelace would struggle in RT against RDNA 3, but it's likely the other way around. Now we have MLiD stating that RDNA 3's highest variant, the XTX (not even XT) will struggle to compete in raster against the 4090. This is looking a little bleak for AMD.

However, I think the best way for AMD to salvage this (if the leaks are true) is to be very competitive on price against the already overpriced Lovelace. I haven't been following the PC market long enough, but judging by replies from other posters, AMD have historically done a shit job of taking advantage of such market conditions.
Those are just claims; it could be disinformation. Like others have pointed out, based on all the calculations the 7900 is twice the performance of the 6900, and twice the performance of the 6900 is already more powerful than a 4090 on paper. Then there is the incredible speed of the cards: the 7900 runs around 3.5 GHz, and some say it's even more overclockable. I am certain they will at least be equal in raster performance, or that AMD will win in raster. In RT, Nvidia will be ahead, but not by that much I think. AMD will win on price, performance, features and wattage; the overall winner will be AMD in every class, more so in the entry and middle class, and a clear winner in the mobile class too.
 
So it seems like MLiD is now backtracking on some of the performance claims for RDNA 3, as have several other "leakers" in the past few weeks.

I remember at one point RedGamingTech stated that Nvidia's Lovelace would struggle in RT against RDNA 3, but it's likely the other way around. Now we have MLiD stating that RDNA 3's highest variant, the XTX (not even XT) will struggle to compete in raster against the 4090. This is looking a little bleak for AMD.

However, I think the best way for AMD to salvage this (if the leaks are true) is to be very competitive on price against the already overpriced Lovelace. I haven't been following the PC market long enough, but judging by replies from other posters, AMD have historically done a shit job of taking advantage of such market conditions.
That is not what he's saying. He's saying that the reference model is somewhat restrained compared to the AIB partners, by being limited to 350W. They don't want to sell a massive and expensive card like Nvidia. But AIB partners will be able to push the card more, and it should be able to trade blows with the 4090 by using the same coolers they built for Nvidia.

I don't remember him saying the card will beat Nvidia like many others, only that their goal was to double RDNA2 performance and that they apparently did it.

The truth is, no one really knows what the real performance will be and even tomorrow we'll only hear about the most optimistic scenarios. Some people out there believe the 7900xtx will be 10-20% faster than the 4090 at a cheaper price. That is what I call delusional.
 
Those are just claims; it could be disinformation. Like others have pointed out, based on all the calculations the 7900 is twice the performance of the 6900, and twice the performance of the 6900 is already more powerful than a 4090 on paper. Then there is the incredible speed of the cards: the 7900 runs around 3.5 GHz, and some say it's even more overclockable. I am certain they will at least be equal in raster performance, or that AMD will win in raster. In RT, Nvidia will be ahead, but not by that much I think. AMD will win on price, performance, features and wattage; the overall winner will be AMD in every class, more so in the entry and middle class, and a clear winner in the mobile class too.

I hope so but I'm trying to keep my expectations in check.

I was never expecting AMD to compete with Nvidia in ray-tracing, but I know the more reliable leakers like Kepler have said the RT improvement from RDNA 2 to 3 is more than 2x, which is interesting. I won't be basing my purchase on RT performance because it's still at a poor adoption rate from developers, and most (not all) RT implementations are mediocre despite what Nvidia marketing has people thinking. I think we're still years away from RT being a standard, and by RT I mean full multi-bounce global illumination and not other gimmicky features like RT shadows.

That being said, if AMD can offer better value on performance per dollar then I'm all in.
 
I hope so but I'm trying to keep my expectations in check.

I was never expecting AMD to compete with Nvidia in ray-tracing, but I know the more reliable leakers like Kepler have said the RT improvement from RDNA 2 to 3 is more than 2x, which is interesting. I won't be basing my purchase on RT performance because it's still at a poor adoption rate from developers, and most (not all) RT implementations are mediocre despite what Nvidia marketing has people thinking. I think we're still years away from RT being a standard, and by RT I mean full multi-bounce global illumination and not other gimmicky features like RT shadows.

That being said, if AMD can offer better value on performance per dollar then I'm all in.

I'm really interested in RT but you're at least mostly right. If I need a 7600XT to make a worthwhile upgrade, but I have to pay $250 more for a 7800XT to turn on raytracing and get the same framerate... it's just a dumb use of money. It would be a lot cheaper and less time-consuming to just watch YouTube videos of high-end cards running games with RT on/off and try to spot the difference, which is mostly what I would be doing if I had the card anyhow. I'm planning to get through Starfield with a 6600XT and FSR2, and then I'll have a long time to figure out the upgrade. I'm interested in a 7600XT with a Black Friday deal or a price drop 12-18 months from now.
 
I'm really interested in RT but you're at least mostly right. If I need a 7600XT to make a worthwhile upgrade, but I have to pay $250 more for a 7800XT to turn on raytracing and get the same framerate... it's just a dumb use of money. It would be a lot cheaper and less time-consuming to just watch YouTube videos of high-end cards running games with RT on/off and try to spot the difference, which is mostly what I would be doing if I had the card anyhow. I'm planning to get through Starfield with a 6600XT and FSR2, and then I'll have a long time to figure out the upgrade. I'm interested in a 7600XT with a Black Friday deal or a price drop 12-18 months from now.
I wouldn't say that's a dumb use of money for the same framerate with RT. RT makes a big difference in lots of games, and you have to consider that all the big games from Sony, Capcom, Square and everyone else will have amazing RT. Resi 4 comes out in a few months and will look amazing, especially with RT, and Spider-Man 2 and Wolverine are coming.
TLOU Factions is coming too and will look amazing with RT. You will get way more enjoyment, immersion and fun from those games with RT. You should wait and see which cards offer what kind of RT, then make your decision. You can always lower the resolution or effects to play with RT at higher framerates. A 7700, and especially a 7800, will be a pretty strong card, so you will be good.
 
I wouldn't say that's a dumb use of money for the same framerate with RT. RT makes a big difference in lots of games, and you have to consider that all the big games from Sony, Capcom, Square and everyone else will have amazing RT. Resi 4 comes out in a few months and will look amazing, especially with RT, and Spider-Man 2 and Wolverine are coming.
TLOU Factions is coming too and will look amazing with RT. You will get way more enjoyment, immersion and fun from those games with RT. You should wait and see which cards offer what kind of RT, then make your decision. You can always lower the resolution or effects to play with RT at higher framerates. A 7700, and especially a 7800, will be a pretty strong card, so you will be good.

I've seen so few games where the difference is really appreciable. It's more that I'm fascinated with any new effects in a graphics-whorey kind of way. Control and Spider-Man are standouts, but mostly because normal reflections are not too hot. When it comes to lighting and shadows, the effect is... well, suffice to say, more often than not I can't pick out the difference aside from staring at a side-by-side. Non-RT techniques for light and shadow are too good to justify it. I could practically flip the RT pitch around and say that Tomb Raider features "time-consuming, manually tuned non-RT shadows that provide a huge boost in performance."

We'll see with the next round of games. Cyberpunk with several effects going at once shows that it can add up to a real next-level improvement, but idk if I'll be able to justify the cost when the time comes to choose between a $400 card and a $600 one for just RT.
 
 
I've seen so few games where the difference is really appreciable. It's more that I'm fascinated with any new effects in a graphics-whorey kind of way. Control and Spider-Man are standouts, but mostly because normal reflections are not too hot. When it comes to lighting and shadows, the effect is... well, suffice to say, more often than not I can't pick out the difference aside from staring at a side-by-side. Non-RT techniques for light and shadow are too good to justify it. I could practically flip the RT pitch around and say that Tomb Raider features "time-consuming, manually tuned non-RT shadows that provide a huge boost in performance."

We'll see with the next round of games. Cyberpunk with several effects going at once shows that it can add up to a real next-level improvement, but idk if I'll be able to justify the cost when the time comes to choose between a $400 card and a $600 one for just RT.
Yeah, don't hurry, you will find a great new card. The new AMD cards will be fire, hopefully for a good price. Only a few hours left; will you be watching live?
 
Yeah, don't hurry, you will find a great new card. The new AMD cards will be fire, hopefully for a good price. Only a few hours left; will you be watching live?

I'll be at work, but I'm going to go dark on all the news till I get home because I know it will be too distracting. I'm usually losing effectiveness by Thursday...
 
The RX6750XT is on sale for $403 CAD on AMD's direct site; damn, it's tempting.
It will probably be a few months, if not a year before you see anything more powerful for that price.

If you need a non stupidly expensive graphics card now, it's a great time to buy.
 
It will probably be a few months, if not a year before you see anything more powerful for that price.

If you need a non stupidly expensive graphics card now, it's a great time to buy.
I was going to get a 6650XT at $300 but I think this is worth the extra $100. Prices in Canada are fucked; major sellers are still selling 6650XTs for $500-$600.

Hell, even 6600XTs go for $400-$500.
 
Kudos if you made that yourself lol
I love the "completely ignore ray tracing benchmarks" bit hahaha so true

You can tell they're fucked on RT when they gloss over it

If I were Nvidia, I would arrange the release of Portal RTX and Cyberpunk 2077 Overdrive soon just to make a point

A day ago they posted this


Then have a ton of fan favorites remade in RTX Remix, and if they're the performance king, I think they'll be cozy.
 
RT is a struggle but I hope AMD's focus is to save the PC gaming industry by having a lower priced card that can deliver just the right performance for what is actually needed for the CPUs and screens people have. It's cool to see 300fps on a benchmark graph but, yeah, maybe not needed lol

Doubling the performance of the 6950 XT in both raster and raytracing will still leave a major graphics-pipeline bottleneck at the BVH raytracing stage, hence any raster performance gain is nearly pointless.

AMD's BVH raytracing workloads need to improve beyond 2X.

https://www.tomshardware.com/reviews/intel-arc-a770-limited-edition-review/5

For a given TFLOPS level, Intel Arc A770 LE's BVH raytracing results already beat AMD's RX 6750 XT.
 

From the article:
Performance, power and price will be revealed at the launch event. Unfortunately, our A0 performance profiling results were not indicative of production samples, so we have refrained from detailing our findings publicly. Silicon and board cost is definitely lower than NVIDIA's RTX 4090, so let's see where the two Navi 31 SKUs will be priced.
Seems like no one knows how performant these chips will be. We only know the reference board, and therefore the base cards should be much cheaper to make than the 4090.
 
Most excited I've been for an AMD GPU event in forever.

Really hoping they can close the gap on Nvidia feature-set wise. I just really need them to sort out their VR performance issues when paired with the Quest 2.
 