
Radeon RX 7800 XT Specs Leaked. Only 60 CUs.

Leonidas

AMD's Dogma: ARyzen (No Intel inside)

Not looking good. 6800 XT had 20% more CUs. Sad to see AMD give such a GPU the x800 XT name.

This could actually be worse than the predecessor...
 
Last edited:

Bojji

Gold Member

Not looking good. 6800 XT had 20% more CUs. Sad to see AMD give such a GPU the x800 XT name.

This could actually be worse than the predecessor...

It will be faster with higher clocks, but not much faster (in RT it will be).
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
It will be faster with higher clocks, but not much faster (in RT it will be).
I'm not so sure; the 7600 barely beat the 6650 XT, and those had the same number of cores.

The 7600 didn't beat the 6650 XT by 20%, so how could the 7800 XT beat the 6800 XT when the 6800 XT has 20% more CUs? It just doesn't seem possible to me, but I'd like to be wrong here, as it would suck to see a worse product launch 3 years later...
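Napkin math, assuming raster performance scales roughly with CU count × game clock (flat IPC; the clock figures below are rumored/illustrative, not confirmed specs):

```python
# Back-of-envelope scaling model: perf ~ CUs * game clock (assumes flat IPC).
# Clock figures are illustrative guesses, not confirmed specs.
cards = {
    "6800 XT": {"cus": 72, "clock_ghz": 2.25},  # RDNA 2
    "7800 XT": {"cus": 60, "clock_ghz": 2.40},  # rumored RDNA 3
}

def relative_perf(card: dict) -> float:
    return card["cus"] * card["clock_ghz"]

ratio = relative_perf(cards["7800 XT"]) / relative_perf(cards["6800 XT"])
print(f"7800 XT vs 6800 XT: {ratio:.2f}x")  # ~0.89x, i.e. ~11% slower
```

Even handing the 7800 XT a few hundred MHz extra only gets it back to parity under this model; it needs a real per-CU uplift to actually win.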
 
Last edited:

//DEVIL//

Member
Just when you think Nvidia handed the middle and entry-class market to AMD on a gold platter, AMD comes back and says nope, we're gonna fuck it up even more than you. You are not allowed to be number 1 in fucked logic, Nvidia!! We, AMD, will WIN!!
 

GHG

Gold Member
When everything is being designed around a 4 TF box, there's very little incentive or demand for gamers on older hardware to upgrade. Basically, other than at the enthusiast end of the market (where there's a subset of people who are not price sensitive and were always going to upgrade regardless), there's not much demand across the rest of the stack. The products, along with both Nvidia's and AMD's tardiness in releasing lower-end hardware this gen, are reflective of this.

The fact that price-to-performance is now linear across multiple generations of cards is testament to that, and unfortunately it looks to be here to stay for a while. They know most people don't need/want these products, so they are just doing the bare minimum.
 
Last edited:

Bojji

Gold Member
I'm not so sure; the 7600 barely beat the 6650 XT, and those had the same number of cores.

The 7600 didn't beat the 6650 XT by 20%, so how could the 7800 XT beat the 6800 XT when the 6800 XT has 20% more CUs? It just doesn't seem possible to me, but I'd like to be wrong here, as it would suck to see a worse product launch 3 years later...

The 6650 XT has a super high clock for RDNA 2; the 7800 XT will have a few hundred MHz advantage over the 6800 XT.
 

simpatico

Member
AMD and Nvidia need to be investigated for collusion. Full stop.

I'm still rocking a 1080 and getting by at 1080p no problem. If I think back to when I had my 680, there was no way people with 280s were playing the modern stuff of that time at good settings. Something has gone horribly wrong.
 
Last edited:

twilo99

Gold Member
When everything is being designed around a 4 TF box, there's very little incentive or demand for gamers on older hardware to upgrade. Basically, other than at the enthusiast end of the market (where there's a subset of people who are not price sensitive and were always going to upgrade regardless), there's not much demand across the rest of the stack. The products, along with both Nvidia's and AMD's tardiness in releasing lower-end hardware this gen, are reflective of this.

The fact that price-to-performance is now linear across multiple generations of cards is testament to that, and unfortunately it looks to be here to stay for a while. They know most people don't need/want these products, so they are just doing the bare minimum.

Games being designed around 10 TF boxes is not that much better considering where the 4090 sits... the major problem is that games are being designed around consoles, and not PCs.
 

RoboFu

One of the green rats
Blah, it's refresh tier… cut down on CUs to cut cost and give just a tad more performance.
 

GHG

Gold Member
AMD and Nvidia need to be investigated for collusion. Full stop.

I'm still rocking a 1080 and getting by at 1080p no problem. If I think back to when I had my 680, there was no way people with 280s were playing the modern stuff of that time at good settings. Something has gone horribly wrong.

This is exactly why the situation is what it is. It's not collusion, it's just them responding to what they see and anticipate in the market. Unless you have a desire to be on the bleeding edge (4K 120+ Hz), most gamers with cards from the last couple of generations will be content to continue playing on their current hardware, and they are still doing so at commendable settings.

AI is Nvidia and AMD's new best friend, just as crypto was before it. That's where the demand is coming from, and that's the market they are both currently responding to.

Games being designed around 10 TF boxes is not that much better considering where the 4090 sits... the major problem is that games are being designed around consoles, and not PCs.

Most games being designed around consoles has been the case for a long while now. The issue is that the baseline is much lower than it would typically be for a gen-on-gen increase. Hence you have a ton of gamers happily and comfortably soldiering on with 2-3 generation old xx60 cards deep into a new generation of consoles (we are now approaching 3 years in). That's unheard of.
 
Last edited:

M1987

Member
The 7800 XT is no better than a 6800 XT? The 4070 got a lot of shit at launch, but at least it was on par with an 80-class card.
 

aclar00

Member
I don't know squat about GPUs, but maybe it's time for a radical architectural change? I can only imagine the chaos that would cause in the PC space, though.
 

64bitmodels

Reverse groomer.
the major problem is that games are being designed around consoles, and not PCs.
[Curb Your Enthusiasm Bingo GIF]
 

FireFly

Member
It's always been rumoured to be 60 CUs, which is a similar cutback to the one from the 4070 Ti to the 4070. Not sure why anyone expected more.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
It's always been rumoured to be 60 CUs, which is a similar cutback to the one from the 4070 Ti to the 4070. Not sure why anyone expected more.
The problem is the naming.

They've called it the 7800 XT; as such it will be compared to the 6800 XT, and it seems there is a good chance the 7800 XT will lose that comparison...

This could be the worst gen-on-gen improvement of this generation. At least the 4060 Ti was ~10% faster than the 3060 Ti on average at 1080p/1440p.
 

octiny

Banned
It will be faster with higher clocks, but not much faster (in RT it will be).

It'll be +10% over an RX 6800 (60 CU non-XT) w/ similar RT, at best. So worse than a 6800 XT (72 CU).

An RX 7600 (32 CU) barely beats a 6650 XT (32 CU) in both RT & rasterization, and that's with a 100-150 MHz higher in-game clock average.

RX 7600 vs RX 6650 XT

A 7900 GRE (80 CU) is 15%-30% slower than a 7900 XT (84 CU) according to a leaked in-depth review, which puts it on par with a 6900 XT (80 CU) and only 5%-12% faster than a 6800 XT (72 CU).

7900 XT vs 7900GRE vs 6800 XT

There is hardly an actual IPC uplift going from RDNA 2 to RDNA 3, if any.

RDNA 3 only fares better at the high end because AMD crammed in so many more CUs than its high-end RDNA 2 counterparts had. Even then, that has led to highly underutilized CUs and poor efficiency.
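To make the CU-for-CU point concrete, here's the normalization I'm doing in my head (the relative-performance and clock figures are rough placeholders based on the deltas above, not measured data):

```python
# Normalize relative performance by CU count and in-game clock to eyeball
# per-CU, per-MHz throughput ("IPC"). All inputs are rough placeholder
# figures based on the deltas discussed above, not benchmark data.
def per_cu_per_mhz(rel_perf: float, cus: int, clock_mhz: int) -> float:
    return rel_perf / (cus * clock_mhz)

# 6650 XT as the 1.00 baseline; 7600 ~3% faster despite ~125 MHz higher clocks.
rdna2 = per_cu_per_mhz(1.00, cus=32, clock_mhz=2600)
rdna3 = per_cu_per_mhz(1.03, cus=32, clock_mhz=2725)

print(f"RDNA 3 per-CU/MHz vs RDNA 2: {rdna3 / rdna2:.2f}x")  # ~0.98x: no real uplift
```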

Just an absolutely poor outing by AMD in the low- to mid-range segment.

Edit:

TL;DR: Grab the higher-end RDNA 2 cards before they're officially gone.
 
Last edited:

hinch7

Member
Blah, it's refresh tier… cut down on CUs to cut cost and give just a tad more performance.
Nah, they messed with the naming... and also decided to charge more, because that's how Nvidia laid out its pricing scheme this generation.

They practically pulled an 'Nvidia' when they announced the 7900 XT, which is actually their 80-tier class card, for $900. But they got away with it.

This is more like their 70-tier card if you go by actual specifications (check TPU for Navi 32 > 22). And you know it's going to be priced like a 4070 because, well, modern AMD.
 
Last edited:

StereoVsn

Member
AMD just shit the bed this generation. The 7900 XT and 7900 XTX aren't bad, but they're overpriced and inefficient. The rest of the cards are just terrible.
 

Loxus

Member
How do you mean, 'only'?
It's been known to have 60 CUs for a year now.

AMD's RDNA 3 Graphics
AUG 12, 2022

Navi32

  • gfx1101 (Wheat Nas)
  • Chiplet - 1x GCD + 4x MCD (0-hi)
  • 30 WGP (60 legacy CUs, 7680 ALUs)
  • 3 Shader Engines / 6 Shader Arrays
  • Infinity Cache 64MB (0-hi)
  • 256-bit GDDR6
  • GCD on TSMC N5, ~200 mm²
  • MCD on TSMC N6, ~37.5 mm²
Coming in 2023, Navi32 is a smaller version of Navi31, reusing the same MCDs. Navi32 will also be coming to mobile as a high-end GPU offering in AMD Advantage laptops. There were plans for a 128MB (1-hi) version; however, it might not be productized due to the aforementioned costs. Thus Navi32's 64MB is also smaller than Navi22's 96MB.
 
Last edited:

64bitmodels

Reverse groomer.
So much like Nvidia's stack strategy, it's getting eerie.

Nvidia left the door wide open in the mid range and AMD doesn't take the chance. It smells like collusion.
This is completely unrelated, but where did you get your "Buggy Loop" name from?
 

Buggy Loop

Member
this is completely unrelated but where did you get your "buggy loop" name from?

I was playing tons of Joint Strike Fighter on Mplayer ages ago, winning tournaments and duels like there was no tomorrow. People got suspicious that I was doing something quirky with the game's loop, and one wingman nicknamed me that in jest, just to trigger the others.

I've had this name for so long now.
 

64bitmodels

Reverse groomer.
I was playing tons of Joint Strike Fighter on Mplayer ages ago, winning tournaments and duels like there was no tomorrow. People got suspicious that I was doing something quirky with the game's loop, and one wingman nicknamed me that in jest, just to trigger the others.

I've had this name for so long now.
ohhhh.

I was thinking it was about Sonic because... y'know, the games have lots of loops in them and they aren't known for being the most stable pieces of software. A "buggy loop" isn't uncommon in those games, lol.
 

hlm666

Member
Absolutely disgusting. WTF is the point of the chiplet design revolution if you can't stack CUs on top of each other? These cards should've been 120-160 CUs.

AMD is slipping.
I imagine power and heat are why; the 7900 XTX with 96 CUs can draw up to 410 watts. That's only around 30 watts less than a 4090 in the same scenario.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
How do you mean, 'only'?
The 6800 XT has 20% more, that's how. This is the first card in AMD's lineup to go backwards on CU count vs. last gen.

AMD naming/branding is meaningless.

Last gen, the 6800 XT was on par with a 3080.

This gen, the 7800 XT could lose to the 4070.

Imagine if Nvidia released an RTX 4080 with the same performance as the RTX 3080; they'd get roasted to hell and back...
 
Last edited:

octiny

Banned
Still can't decide if the 6950 XT is better value than the 6800 XT at the current prices, but I think they are both pretty great.

If they are within $100 of each other, the 6950 XT is the better buy.

I have a 4090 & a 6800 XT, & while I can get my 6800 XT to perform similarly to a stock 6950 XT w/ a massively tuned OC, I could do the same w/ the 6950 XT & more, thanks to the extra clock headroom provided by the XTXH die used in all 6950 XTs. Clock for clock, for every MHz gained a 6950 XT will stretch its lead even further due to more bandwidth & 8 more CUs (rough math below).

But again, that's just me, while also taking into consideration that the prices that used to separate them were closer to $350+.
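A minimal sketch of that clock-for-clock scaling (the OC clock figures are my own illustrative assumptions, not measured results):

```python
# Compare a tuned 6800 XT against a tuned 6950 XT under a simple
# CUs * clock scaling model. Clock figures are illustrative OC assumptions;
# real-world gains also depend on memory bandwidth and power limits.
def scale(cus: int, clock_ghz: float) -> float:
    return cus * clock_ghz

tuned_6800xt = scale(72, 2.55)  # assumed heavy OC on the regular die
tuned_6950xt = scale(80, 2.70)  # XTXH die: assumed higher clock ceiling

print(f"Tuned 6950 XT advantage: {tuned_6950xt / tuned_6800xt:.2f}x")  # ~1.18x
```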
 

twilo99

Gold Member
If they are within $100 of each other, the 6950 XT is the better buy.

I have a 4090 & a 6800 XT, & while I can get my 6800 XT to perform similarly to a stock 6950 XT w/ a massively tuned OC, I could do the same w/ the 6950 XT & more, thanks to the extra clock headroom provided by the XTXH die used in all 6950 XTs. Clock for clock, for every MHz gained a 6950 XT will stretch its lead even further due to more bandwidth & 8 more CUs.

But again, that's just me, while also taking into consideration that the prices that used to separate them were closer to $350+.

I didn't realize they were so close in price... the 6950 XT is a no-brainer at only $60 more, at least that's what I'm seeing on Amazon.

How much is the 7800 XT supposed to cost?

I guess RDNA 3 will outperform the old stuff in RT, but if you don't care about that...
 

octiny

Banned
I didn't realize they were so close in price... the 6950 XT is a no-brainer at only $60 more, at least that's what I'm seeing on Amazon.

How much is the 7800 XT supposed to cost?

I guess RDNA 3 will outperform the old stuff in RT, but if you don't care about that...

If the 7800 XT, or whatever they call it, comes in at anything above $449, it's dead on arrival. 60 CUs means it's a direct replacement for the non-XT 6800, which can currently be had for $400-$450. The 7800 XT will more than likely be around 10% better than the non-XT, like I said in my first post, but fall short of a 6800 XT.

RDNA 3 CU for CU is extremely similar to RDNA 2, as shown in the performance data for both the 7600 (32 CU) vs 6650 XT (32 CU) & the 7900 GRE (80 CU) vs 6800 XT (72 CU). The 7900 GRE (essentially a 6900 XT) is only 10% faster than a 6800 XT in RT while having 8 more CUs & a higher in-game clock. The 7900 GRE is what the 7800 XT should've been at the very least, yet here we are.

This notion that RDNA 3 has any meaningful CU-for-CU improvement in RT performance vs RDNA 2 was shot down when the 7600 released & subsequently the 7900 GRE (80 CU, 256-bit bus, China only). We can already extrapolate how a 60 CU 7800 XT will perform in RT simply by analyzing the other two cards: it'll go head to head (+10% from higher clocks/bandwidth) w/ a non-XT 6800 (60 CU); see the napkin math below. The 7900 XT & 7900 XTX will be the only ones w/ meaningful RT improvements, due to their higher CU counts & 384-bit bus.

If AMD is really calling the 60 CU RDNA 3 card a 7800 "XT" then I really hope the media calls them out for it like they did w/ Nvidia & the two "4080s". Regardless of its pricing, it's only going to confuse consumers into thinking it's better than a 6800 XT (which is probably their intention). I digress though; it is what it is.
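That extrapolation as a minimal sketch (the baseline figures are rough placeholders from the numbers above, not benchmarks):

```python
# Extrapolate a 60 CU RDNA 3 card's RT performance from the RX 6800
# (60 CU RDNA 2) baseline, assuming flat CU-for-CU throughput plus ~10%
# from higher clocks/bandwidth. All figures are illustrative placeholders.
rx6800_rt = 1.00        # non-XT 6800 as the RT baseline
clock_bw_uplift = 1.10  # rough gain from higher clocks & bandwidth
rx6800xt_rt = 1.20      # 6800 XT, ~+20% under linear scaling of its 72 CUs

rx7800xt_rt = rx6800_rt * clock_bw_uplift
print(f"Estimated 7800 XT RT vs RX 6800: {rx7800xt_rt:.2f}x")  # ~1.10x
print(f"Still behind a 6800 XT: {rx7800xt_rt < rx6800xt_rt}")  # True
```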
 
Last edited: