
AMD Radeon RX 7900 XT Flagship RDNA 3 Graphics Card Could Reach Almost 100 TFLOPs, Navi 31 GPU Frequency Hitting Over 3 GHz

tusharngf

Member
The AMD Radeon RX 7900 XT is turning out to be a very different RDNA 3 'Navi 31' GPU-powered beast in the latest rumors from Greymon55. The leaker reports that, based on new information, AMD has created a much more powerful design than previously anticipated.





AMD Radeon RX 7900 XT, The Flagship RDNA 3 'Navi 31' GPU Powered Graphics Card, Could Offer Over 3 GHz Clock Speeds & Almost 100 TFLOPs of Performance

Over the last couple of days, there have been several sightings of AMD RDNA 3 (GFX11) GPUs, which have started appearing in the LLVM project within the Linux operating system's drivers and development tools. It looks like AMD is working on at least four new GPUs for now, confirmed through a leak of their IDs, which is as follows (credits: @Kepler_L2):


  • GFX1100 - Possible Navi 31 GPU
  • GFX1101 - Possible Navi 32 GPU
  • GFX1102 - Possible Navi 33 GPU
  • GFX1103 - Possible Phoenix APU
Now, while these IDs don't reveal much information such as specs or configurations, they do tell us that preliminary support is being added, and soon these GPUs will begin testing on their respective platforms for optimized performance delivery. But this topic isn't about the leaked IDs; it's about the latest rumor surrounding the flagship RDNA 3 GPU, the Navi 31, which will power the Radeon RX 7900 series cards.



As per the latest rumor from Greymon55, it looks like the clock speeds have seen a major boost, from 2.5 GHz to hovering around and even above the 3 GHz mark. This means the flagship chip, with its 15360 stream processors, would be able to deliver almost 100 TFLOPs of FP32 compute performance (92 TFLOPs, to be precise). That is before overclocking, which would definitely help push it even closer to the 100 TFLOPs barrier.
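For reference, the 92 TFLOPs figure falls straight out of the standard peak-FP32 formula (each stream processor retiring 2 floating-point operations per clock via fused multiply-add). A quick sketch using the rumored numbers; the `fp32_tflops` helper is just illustrative:

```python
def fp32_tflops(shaders: int, clock_ghz: float, ops_per_clock: int = 2) -> float:
    """Peak FP32 compute in TFLOPs: shaders * ops/clock * clock (GHz) / 1000."""
    return shaders * ops_per_clock * clock_ghz / 1000.0

# Rumored Navi 31: 15360 SPs at ~3.0 GHz
navi31 = fp32_tflops(15360, 3.0)   # ~92.2 TFLOPs
# Rumored AD102: 18432 CUDA cores at ~2.85 GHz
ad102 = fp32_tflops(18432, 2.85)   # ~105.1 TFLOPs
print(round(navi31, 1), round(ad102, 1))
```

A 3.25 GHz overclock on the same shader count would land at almost exactly 100 TFLOPs, which is where the "100 TFLOPs barrier" talk comes from.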


Upcoming Flagship AMD, Intel, NVIDIA GPU Specs (Preliminary)

| GPU Name | AD102 | Navi 31 | Xe2-HPG |
|---|---|---|---|
| Codename | Ada Lovelace | RDNA 3 | Battlemage |
| Flagship SKU | GeForce RTX 4090 Series | Radeon RX 7900 Series | Arc B900 Series |
| GPU Process | TSMC 4N | TSMC 5nm + TSMC 6nm | TSMC 5nm? |
| GPU Package | Monolithic | MCD (Multi-Chiplet Die) | MCM (Multi-Chiplet Module) |
| GPU Dies | Mono x 1 | 2 x GCD + 4 x MCD + 1 x IOD | Quad-Tile (tGPU) |
| GPU Mega Clusters | 12 GPCs (Graphics Processing Clusters) | 6 Shader Engines | 10 Render Slices |
| GPU Super Clusters | 72 TPCs (Texture Processing Clusters) | 30 WGPs (Per MCD), 60 WGPs (In Total) | 40 Xe-Cores (Per Tile), 160 Xe-Cores (In Total) |
| GPU Clusters | 144 Streaming Multiprocessors (SM) | 120 Compute Units (CU), 240 Compute Units (In Total) | 1280 Xe VE (Per Tile), 5120 Xe VE (In Total) |
| Cores (Per Die) | 18432 CUDA Cores | 7680 SPs (Per GCD), 15360 SPs (In Total) | 20480 ALUs (In Total) |
| Peak Clock | ~2.85 GHz | ~3.0 GHz | TBD |
| FP32 Compute | ~105 TFLOPs | ~92 TFLOPs | TBD |
| Memory Type | GDDR6X | GDDR6 | GDDR6? |
| Memory Capacity | 24 GB | 32 GB | TBD |
| Memory Bus | 384-bit | 256-bit | TBD |
| Memory Speeds | ~21 Gbps | ~18 Gbps | TBD |
| Cache Subsystems | 96 MB L2 Cache | 512 MB (Infinity Cache) | TBD |
| TBP | ~600W | ~500W | TBD |
| Launch | Q4 2022 | Q4 2022 | 2023 |
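One thing the memory rows imply: on paper, Navi 31's 256-bit bus at ~18 Gbps has roughly half the raw bandwidth of AD102's 384-bit bus at ~21 Gbps, which is presumably why the rumored 512 MB Infinity Cache matters so much. A quick sketch of the standard bandwidth calculation (the `bandwidth_gb_s` helper is illustrative):

```python
def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) * per-pin data rate (Gbps)."""
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(384, 21))  # AD102:   1008.0 GB/s
print(bandwidth_gb_s(256, 18))  # Navi 31:  576.0 GB/s
```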

Source: https://wccftech.com/amd-radeon-rx-7900-xt-rdna-3-navi-31-gpu-graphics-card-100-tflops-3-ghz-rumor/
 
Last edited:

Chiggs

Gold Member
Meanwhile at Intel....

Acquiesce Whelp GIF
 

SeraphJan

Member
Curious how RT performance will play out. Nvidia and AMD cornering the market on space heaters.
RDNA 3's RT is rumored to be 3x better than RDNA 2's, while Nvidia's Lovelace is rumored to be a bit under 2x Ampere (something like 1.8x), so Nvidia might still have an RT lead, but not by much this time.

This gen looks more like: AMD slightly better at rasterization, Nvidia slightly better at RT, FSR 2 a bit inferior to DLSS 3, but AMD drawing way less power than Nvidia (which could hit 650W or even more), and their prices are likely to be similar.
 
Last edited:

DenchDeckard

Moderated wildly
I can't believe Intel hasn't got their GPUs out yet. The last news I heard, it was landing around 3070 performance... which would have been fine 3 months ago.

They gonna get pimp slapped if next gen arrives in time for their launch.
 

JohnnyFootball

GerAlt-Right. Ciriously.
RDNA 3's RT is rumored to be 3x better than RDNA 2's, while Nvidia's Lovelace is rumored to be a bit under 2x Ampere (something like 1.8x), so Nvidia might still have an RT lead, but not by much this time.

This gen looks more like: AMD slightly better at rasterization, Nvidia slightly better at RT, FSR 2 a bit inferior to DLSS 3, but AMD drawing way less power than Nvidia (which could hit 650W or even more), and their prices are likely to be similar.
I'll believe it when I see it, as RT performance for any GPU still leaves a LOT to be desired.
 
Last edited:
Dat 32GB vram mmmmmm
Don't you know, you only need 8GB 🤭

But seriously, 32GB will never be used by the card in gaming; it's good for 3D modeling and production stuff though. For gaming, all the 4070+ tier cards should have 16GB, and Nvidia can stop being a cheap con artist.
 

winjer

Gold Member
The big question is whether RDNA 3 and Ada Lovelace will revive the GPU mining market.
If it does, the remaining consumers are f*****
 

Dream-Knife

Banned
The big question is when will the Radeon division get their drivers sorted out. IIRC idle memory spikes are still happening.
 
Last edited:
Why would you as a consumer not want more options?
Yeah man, wtf is wrong with people. We've already heard Intel might not have the strongest cards in their gen 1 lineup, but if the price-to-performance is there and the drivers are good, why not buy Intel?

Esp. since they have a real dlss competitor.

Fanboys of cpu/graphics are even dumber than console fanboys
 
Last edited:
RDNA 3 RT rumored to be 3 times better than RDNA 2, Meanwhile Nvidia Lovelace is rumored to be not far from 2 times (something like 1.8 times) compared to Ampere, so Nvidia might still have a RT lead, but not by much this time.

This Gen GPU is more like this - AMD had slight better rasterization, Nvidia had slightly better RT, FSR 2 is a bit inferior to DLSS 3, but AMD draw way less power compare to Nvidia (which could up to 650W and even more), and their price is likely to be similar

Some of the leakers are backtracking on the 3x RT performance claim. I guess we'll have to wait and see.

My hunch is similar to yours: I think AMD will be ahead in rasterisation and Nvidia will edge them out in RT.

However DLSS 3 vs FSR 2? Now that's where things get interesting. DLSS 2 is already such a big advantage.
 
Would anybody be able to tell the difference between 92 TFLOPS and 105 TFLOPS? I don't think TFLOPS is relevant anymore...

I personally think it all comes down to the cost/performance ratio; if you can get a stable 4K 60fps with HDR, VRR, and RT, I think AMD will win.

If Sony were to make a PS5 Pro, the TFLOPS gap would still not be minuscule even if Sony tried, because the microarchitecture wouldn't allow it. 10.3 TFLOPS to 80-100 TFLOPS is fucking huge...

Does RDNA 3 have machine learning (aka deep learning, aka neural) cores like NVIDIA? Will Intel have ML cores built into the GPU as well? AMD had better not forget.
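To put the jump in perspective, here's a back-of-the-envelope comparison of the rumored Navi 31 figure against the PS5 GPU's ~10.28 TFLOPs (36 CUs x 64 SPs at 2.23 GHz), taking the rumor at face value:

```python
# PS5 GPU: 36 CUs * 64 SPs/CU * 2 FP32 ops/clock * 2.23 GHz
ps5_tflops = 36 * 64 * 2 * 2.23 / 1000       # ~10.28 TFLOPs
# Rumored Navi 31: 15360 SPs * 2 ops/clock * 3.0 GHz
navi31_tflops = 15360 * 2 * 3.0 / 1000       # ~92.16 TFLOPs

print(round(navi31_tflops / ps5_tflops, 1))  # ~9.0x the PS5, on paper
```

That is a peak-rate comparison only; as posters in this thread point out, TFLOPS across different microarchitectures don't translate directly into frame rates.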
 

Klik

Member
Would love to see RTX 4080 around 60-80TF.

I bought a used RTX 3060 because I will upgrade to a new one when it comes out this year. I know we shouldn't compare consoles vs. PC, but an almost 7x-8x more powerful GPU than the PS5 is amazing. Unreal Engine 5 and next-gen graphics are going to be incredible on PC.

Imagine modded GTA 6..😱
 
Last edited:

Klik

Member
Would anybody be able to tell the difference between 92 TFLOPS and 105 TFLOPS? I don't think TFLOPS is relevant anymore...

I personally think it all comes down to the cost/performance ratio; if you can get a stable 4K 60fps with HDR, VRR, and RT, I think AMD will win.

If Sony were to make a PS5 Pro, the TFLOPS gap would still not be minuscule even if Sony tried, because the microarchitecture wouldn't allow it. 10.3 TFLOPS to 80-100 TFLOPS is fucking huge...

Does RDNA 3 have machine learning (aka deep learning, aka neural) cores like NVIDIA? Will Intel have ML cores built into the GPU as well? AMD had better not forget.
I've always wondered if 4K is worth it on PC right now, or with the RTX 4xxx series. Seems like 1440p 144Hz would be better? I don't know if there's much difference between 1440p and 4K on a 27" monitor...
 
Last edited:
I've always wondered if 4K is worth it on PC right now, or with the RTX 4xxx series. Seems like 1440p 144Hz would be better? I don't know if there's much difference between 1440p and 4K on a 27" monitor...

Advancements in QD-OLED, MiniLED, MicroLED, OLED, HDMI 2.1, HDR10+, Dolby Vision, and AI upscaling processors should make 4K more appealing...
 
What pleases me is that the next generation of APUs, thanks to renewed competition from Intel, will not lag behind anymore. It'll be Zen 4 + RDNA 3, and the iGPU part will be bigger, it seems.
These new APUs should eat further into the low-end segment.

It's probably true but we are past the point where it matters.

TF is the new bits

But RDNA flops aren't inflated like Ampere's flops.
The only question is whether the memory will keep up, because everything points to RDNA 3 leaning really heavily on Infinity Cache (L3) and stacked cache (L4?).
 
Last edited:
What pleases me is that the next generation of APUs, thanks to renewed competition from Intel, will not lag behind anymore. It'll be Zen 4 + RDNA 3, and the iGPU part will be bigger, it seems.
These new APUs should eat further into the low-end segment.



But RDNA flops aren't inflated like Ampere's flops.
The only question is whether the memory will keep up, because everything points to RDNA 3 leaning really heavily on Infinity Cache (L3) and stacked cache (L4?).
Ampere isn't inflated either.

It's just that more things matter than TF. I'm sure you know of the Vega vs. Pascal battle, where Vega had much more "TF" but lost.
 

ZywyPL

Banned
I've always wondered if 4K is worth it on PC right now, or with the RTX 4xxx series. Seems like 1440p 144Hz would be better? I don't know if there's much difference between 1440p and 4K on a 27" monitor...

For the desktop experience, 1440p 144Hz is absolutely, infinitely better, but the living room and its 4K TV sets are where all this power will go.

But still, with 80-100TF and games built around 10-12TF, I guess 4K@120 will be like a walk in the park.
 
Last edited:

Haint

Member
RDNA 3 RT rumored to be 3 times better than RDNA 2, Meanwhile Nvidia Lovelace is rumored to be not far from 2 times (something like 1.8 times) compared to Ampere, so Nvidia might still have a RT lead, but not by much this time.

This Gen GPU is more like this - AMD had slight better rasterization, Nvidia had slightly better RT, FSR 2 is a bit inferior to DLSS 3, but AMD draw way less power compare to Nvidia (which could up to 650W and even more), and their price is likely to be similar

Nobody buying a $1500 100TF GPU gives a flying fuck about power efficiency. Same goes for anyone buying a desktop or dGPU at any price. Literally no one's going to opt for an objectively slower part to save 100W under load. Unlimited power is the whole point of having a 3 cubic foot box sitting on your desk. You guys who continue to push this narrative for AMD are why they hold less than 5% dGPU marketshare. Based on the OP chart the 4090 will have a 30%+ advantage over a 7900XT when factoring in its RT/DLSS gains. Even if AMD gave them away, people would just sell them and buy 4090s with the proceeds.
 
Last edited:
Intel probably won't compete much in the gaming space anyway, at least at first. But, they might do wonders for availability if lower-level production users snatch them up.
They really got into the graphics business (more than they already were, anyway) for the server side of things.

But I agree, their entering the card arena will help availability and eventually reduce prices, I think. Intel is ramping up by building new fab plants.
 

FingerBang

Member
The 3000 series did not wipe AMD's 6000 series. Without DLSS the 6900XT would be the best price/performance card
This is so true. Hard to believe AMD has been able to match and even surpass their competition in the last few years. I am expecting the 7000 series to outperform the RTX 4000 series. They might still have an advantage when it comes to real-time ray tracing though.

Oh well, more than happy to upgrade my 3080 FE if there's availability!
 
That is about 10 PS5s.


All corporations are greedy. Did you not just watch what Nvidia, AMD, MS, and Sony have all done in the last year? At least Intel is boosting American manufacturing.
I was thinking straight perf. Say a game runs at 4K 30 on PS5; you might get 150 fps on this at the same settings.
 

Haint

Member
The 3000 series did not wipe AMD's 6000 series. Without DLSS the 6900XT would be the best price/performance card

It couldn't matter any less to the buying public. The desktop 3080 alone (a massively supply-constrained, $1000+ street-priced super-enthusiast GPU) has sold around 2.5x more units than AMD's entire 6XXX line COMBINED. The 3060 with ample supply would tower over the entire RX 6XXX family by around 20x. Every generation AMD seeds these rumors of grand performance and superior power efficiency, and every generation they get absolutely dirt stomped by Nvidia. To really drive home how untouchable their monopoly is: Nvidia's hardware margins are insane; they could cut MSRPs by like 70% and still make a handsome profit off their cards. Meaning AMD would have to sell cards approaching twice as fast at half the price to even have a prayer of gaining meaningful marketshare. And given such a miracle scenario, Nvidia would very likely eat multi-billion dollar losses to undercut AMD and maintain their dominance.
 
Last edited:

Sho_Gunn

Banned
It couldn't matter any less to the buying public. The desktop 3080 alone (a massively supply-constrained, $1000+ street-priced super-enthusiast GPU) has sold around 2.5x more units than AMD's entire 6XXX line COMBINED. The 3060 with ample supply would tower over the entire RX 6XXX family by around 20x. Every generation AMD seeds these rumors of grand performance and superior power efficiency, and every generation they get absolutely dirt stomped by Nvidia. To really drive home how untouchable their monopoly is: Nvidia's hardware margins are insane; they could cut MSRPs by like 70% and still make a handsome profit off their cards. Meaning AMD would have to sell cards approaching twice as fast at half the price to even have a prayer of gaining meaningful marketshare. And given such a miracle scenario, Nvidia would very likely eat multi-billion dollar losses to undercut AMD and maintain their dominance.
None of this matters when it comes to actually playing video games lol. AMD is still pushing great (if not better) cards, especially with FSR improving.
 

Ozriel

M$FT
Intel probably won't compete much in the gaming space anyway, at least at first. But, they might do wonders for availability if lower-level production users snatch them up.

I think their major contribution could be to more affordable gaming ultrabooks.

Though NVIDIA's done a good job at pushing out mobile GPUs at multiple price points.
 