RDNA3 rumor discussion

Preliminary specs so far, as per the leaks:


AMD Radeon RX 7900 XT "Preliminary" Specifications:

| Graphics Card | AMD Radeon RX 7900 XTX | AMD Radeon RX 7900 XT | AMD Radeon RX 6950 XT | AMD Radeon RX 6900 XT |
|---|---|---|---|---|
| GPU | Navi 31 XTX | Navi 31 XT | Navi 21 KXTX | Navi 21 XTX |
| Process Node | 5nm + 6nm | 5nm + 6nm | 7nm | 7nm |
| Die Size | 308mm2 (GCD only), 533mm2 (with MCDs) | 308mm2 (GCD only), 533mm2 (with MCDs) | 520mm2 | 520mm2 |
| Transistors | TBD | TBD | 26.8 Billion | 26.8 Billion |
| GPU WGPs | 48 | 42 | 40 | 40 |
| Stream Processors | 12288 | 10752 | 5120 | 5120 |
| TMUs/ROPs | TBD | TBD | 320 / 128 | 320 / 128 |
| Game Clock | TBD | TBD | 2100 MHz | 2015 MHz |
| Boost Clock | >3 GHz | >3 GHz | 2310 MHz | 2250 MHz |
| FP32 TFLOPs | >75 TFLOPs | >65 TFLOPs | 23.65 TFLOPs | 23.04 TFLOPs |
| Memory Size | 24 GB GDDR6 | 20 GB GDDR6 | 16 GB GDDR6 | 16 GB GDDR6 |
| Infinity Cache | 96 MB | 80 MB | 128 MB | 128 MB |
| Memory Bus | 384-bit | 320-bit | 256-bit | 256-bit |
| Memory Clock | ~20 Gbps | ~20 Gbps | 18 Gbps | 16 Gbps |
| Bandwidth | ~960 GB/s | ~800 GB/s | 576 GB/s | 512 GB/s |
| Effective Bandwidth | TBD | TBD | 1728.2 GB/s | 1664.2 GB/s |
| TBP | ~400W | ~350W | 335W | 300W |
| PCIe Interface | PCIe 5.0 x16 | PCIe 5.0 x16 | PCIe 4.0 x16 | PCIe 4.0 x16 |
| Price | TBD | TBD | $1099 US | $999 US |
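For what it's worth, the TFLOPs and bandwidth figures in leak tables like this fall out of two standard formulas (FP32 TFLOPs = shaders × clock × 2 FMA ops; bandwidth = bus width ÷ 8 × data rate), so they can be sanity-checked. A quick sketch, using an assumed 3.1 GHz for the rumored ">3 GHz" boost clock:

```python
# Sanity-check the leaked spec-table math.
# FP32 TFLOPs = stream processors x boost clock (GHz) x 2 (one FMA = 2 FLOPs) / 1000
# Bandwidth (GB/s) = bus width (bits) / 8 x data rate (Gbps)

def fp32_tflops(shaders, boost_ghz):
    return shaders * boost_ghz * 2 / 1000

def bandwidth_gbs(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

# 7900 XTX (rumored): 12288 SPs, >3 GHz boost, 384-bit @ ~20 Gbps
print(fp32_tflops(12288, 3.1))   # ~76.2 -> consistent with ">75 TFLOPs"
print(bandwidth_gbs(384, 20))    # 960.0 GB/s

# 6950 XT (known): 5120 SPs, 2310 MHz boost, 256-bit @ 18 Gbps
print(fp32_tflops(5120, 2.31))   # ~23.65 -> matches the table
print(bandwidth_gbs(256, 18))    # 576.0 GB/s
```

So the leaked numbers are at least internally consistent; the real unknown is the final boost clock.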



 
What time is the conference today? I'm assuming it will be on Twitch and YouTube? Can we get the OP updated?
 
Imagine a time in which I could get rid of my Vega 56 without feeling like I have been forcefully taken up the chuff.
 
From the article:

Seems like no one knows how performant these chips will be. We only know the reference board, and the base cards should therefore be much cheaper to make than the 4090.
The 4090 die itself isn't expensive to make; it's probably only somewhere between $250-350 per chip. A large chiplet design like this will maybe save $100 over a monolith, and that's best case. Essentially meaningless in the flagship price tier. If AMD wants to undercut, they'll have to take a hit on the likely 200% margin Nvidia enjoys on the 4090, because they won't save it on the BoM.

The problem is nobody buying these flagship cards cares about price, they care about performance and features. If AMD can't match or beat, it really doesn't matter how much they cost, and they know full well Nvidia can easily match any price they hit thanks to their 10x higher economy of scale.
 
Last edited:
Most excited I've been for an AMD GPU event in forever.

Really hoping they can close the gap on Nvidia feature-set wise. I just really need them to sort out their VR performance issues when paired with the Quest 2.
As an owner of a 4090 FE, I am excited to see whether I made the right choice or not lol.

I don't think I'll regret it, since Nvidia is Nvidia and I was lucky enough to get an FE. But I love when competition happens.
 
Last edited:
The problem is nobody buying these flagship cards cares about price, they care about performance and features. If AMD can't match or beat, it really doesn't matter how much they cost, and they know full well Nvidia can easily match any price they hit thanks to their 10x higher economy of scale.
Yeah I think this is the big problem. If their flagship is like $1400, I think most people willing to spend that kind of money on a GPU will just spend the extra $200 on a 4090 for better ray tracing and DLSS.

If they can price their flagship at $1200, at least then it's competing at the same price as 4080 16GB where it'll be a much more favorable comparison.
 
The 4090 die itself isn't expensive to make; it's probably only somewhere between $250-350 per chip. A large chiplet design like this will maybe save $100 over a monolith, and that's best case. Essentially meaningless in the flagship price tier. If AMD wants to undercut, they'll have to take a hit on the likely 200% margin Nvidia enjoys on the 4090, because they won't save it on the BoM.

The problem is nobody buying these flagship cards cares about price, they care about performance and features. If AMD can't match or beat, it really doesn't matter how much they cost, and they know full well Nvidia can easily match any price they hit thanks to their 10x higher economy of scale.
Well the 4080 must exist for some reason, and AMD could potentially price match that.
 
Lmao

 
Last edited:
The current rumors suggest 2x-2.5x the ray-tracing performance of RDNA 2, but what about when compared to the RTX 40 series, and even Arc?
Apparently AMD gimped some of the RT hardware in RDNA 2 because it wasn't a cost-effective solution for the PS5 and Series X.

I expect more dedicated hardware this time around and some impressive performance gains for RT over RDNA 2. However, for AMD to beat Nvidia on RT would be like asking them to climb a mountain; the RT performance increase would have to be around 3 to 4x, which is almost impossible.
 
Apparently AMD gimped some of the RT hardware in RDNA 2 because it wasn't a cost-effective solution for the PS5 and Series X.

I expect more dedicated hardware this time around and some impressive performance gains for RT over RDNA 2. However, for AMD to beat Nvidia on RT would be like asking them to climb a mountain; the RT performance increase would have to be around 3 to 4x, which is almost impossible.
Gamers Nexus puts the 4090 over the 6950 XT by 2.44x in Cyberpunk 2077. If those rumors hold true, they would seriously catch up to the 4000 series.
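The catch-up claim is easy to check as a ratio, taking the 6950 XT as the 1.0 baseline. A quick sketch, assuming the rumored 2x-2.5x RDNA 3 uplift applies to that same Cyberpunk 2077 RT workload:

```python
# Ratio check for the ray-tracing catch-up claim. Baseline: 6950 XT = 1.0.
rtx_4090 = 2.44                    # 4090 vs 6950 XT in Cyberpunk 2077 RT (Gamers Nexus)
rdna3_low, rdna3_high = 2.0, 2.5   # rumored RDNA 3 uplift range over RDNA 2

# RDNA 3 relative to the 4090 under the rumored range:
print(rdna3_low / rtx_4090)    # ~0.82 -> still ~18% behind at the low end
print(rdna3_high / rtx_4090)   # ~1.02 -> roughly parity at the high end
```

So "seriously catch up" holds even at the pessimistic end of the rumor, with parity only at the optimistic end.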
 
Well the 4080 must exist for some reason, and AMD could potentially price match that.

The 4080 exists to sell 4090s; it's going to bomb catastrophically. No card has ever launched with a 50% core deficit and only a 15% price delta (especially in the over-$1000 category). It's going to be the most disastrous product launch in Nvidia's history. The bad news for consumers (good news for Nvidia) is that 4090 demand will more than double overnight, and they already sell every single unit they make before product pages can even update.
 
Last edited:
The rumors also suggest machine-learning cores of some sort, correct? Perhaps for FSR 3.0? I think the overall performance of RDNA 3 as an architecture will be within 85%-95% of the RTX 40 series, which I really don't think is that bad as long as the price is reasonable.
 
Last edited:
Hope the new ones are good and cheap enough. If they are good and below $1k, this is the first AMD card I'll get since I switched to Nvidia's 660.
 
Most excited I've been for an AMD GPU event in forever.

Really hoping they can close the gap on Nvidia feature-set wise. I just really need them to sort out their VR performance issues when paired with the Quest 2.
I thought most of the issue was the hardware encoder for sending the video, they broke it in a driver update but fixed it pretty soon after. In general AMD has worse hardware video encoding than Nvidia, really hope they improve that.
 
I thought most of the issue was the hardware encoder for sending the video, they broke it in a driver update but fixed it pretty soon after. In general AMD has worse hardware video encoding than Nvidia, really hope they improve that.

It's to do with the encoder that AMD uses; it would require a hardware and software change on their end, with this new series of cards, to remedy the situation.

The solution here sums up the situation nicely:

 
The 4080 exists to sell 4090's, it's going to bomb catastrophically. No card has ever launched with a 50% core deficit and only 15% price delta (especially in the over $1000 category). It's going to be the most disastrous product launch in Nvidia's history. Bad news for consumers (good news for Nvidia) is 4090 demand will more than double overnight, and they already sell every single unit they make before product pages can even update.
You assume people are smart. I had someone telling me the 4080 is better relative to the 4090 than the 3080 was to the 3090. I don't speak moron very well, but I think he thought the RAM ratio was the driver. In my case, it kept me from even thinking of going down from the xx90 tier.
 
"Ray tracing is the best thing ever and is the future of graphics"

Or

"Ray tracing sucks because not every game uses it and it doesn't look good anyway. You'd have to be a sheep to buy a GPU with good ray tracing performance over one with 3.5% better raster performance"


We will know the answer soon!
 
58 billion transistors for Navi 31. For reference, an AD102 (RTX 4090) has 76B, so something is gonna have to give performance- or feature-wise for sure.
 
Last edited:
58 billion transistors for Navi 31. For reference, an AD102 (RTX 4090) has 76B, so something is gonna have to give performance- or feature-wise for sure.

Chiplet design, though. Doesn't need as many.

But the answer is ray-tracing. :messenger_grinning_sweat:
 
Last edited:
Chiplet design, though. Doesn't need as many.

But the answer is ray-tracing. :messenger_grinning_sweat:

A chiplet design doesn't use fewer transistors, though. If anything, you need more than a monolithic design (more PHYs between the dies, etc.).
 
Last edited: