Radeon RX 9060 XT features 2048 cores, boost clock of 3.2 GHz

winjer

Member

Just like the RTX 5060 Ti, the Radeon RX 9060 XT will feature two memory configurations of 8GB and 16GB. The difference compared to GeForce is that AMD is sticking to GDDR6 technology and clocks of 20 Gbps.

Based on the most recent information we have from AMD board partners, the RX 9060 XT will launch with 2048 Stream Processors. This is, of course, nothing surprising, because the card was meant to use the Navi 44 GPU, which has half the core count of Navi 48.

We also have an update on clocks, and it looks very interesting. First, a reminder that the RX 7600 XT, the predecessor to the RX 9060 XT, featuring the Navi 33 XT GPU, had a game clock of 2470 MHz and a boost clock of 2755 MHz. The RDNA4 update will have much higher clocks. According to our information, the RX 9060 XT will ship with a 2620 MHz game clock and a 3230 MHz boost clock. But that's not all: we also learned that some OC variants will have a 3.3 GHz boost.

The RX 9060 XT will require at least a 500W power supply, and some models will call for 550W or more. We have yet to hear of an RX 9060 XT with a 16-pin power connector, as most specs we saw are for 8-pin variants. It's also worth adding that the RX 9060 XT will have three display connectors, not four like the RX 9070 series. This is likely due to the limitations of Navi 44.

That is about 26 TFLOPs, with VOPD.
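For the curious, a quick back-of-the-envelope check of that figure (my own sketch, not an official spec; it assumes one FMA, i.e. 2 FLOPs, per stream processor per clock, and that VOPD dual-issue doubles it, as on RDNA 3):

```python
# Rough FP32 throughput estimate for the rumored RX 9060 XT.
# Assumes each stream processor does one FMA (2 FLOPs) per clock and
# that VOPD dual-issue doubles that; rumor math, not a confirmed spec.
stream_processors = 2048
boost_clock_ghz = 3.23
flops_per_clock = 2   # one FMA = 2 FLOPs
vopd_factor = 2       # dual-issue doubling

tflops = stream_processors * flops_per_clock * vopd_factor * boost_clock_ghz / 1000
print(f"~{tflops:.1f} TFLOPs")  # ~26.5 TFLOPs
```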

 
Success or failure of these low-end GPUs depends solely on pricing. If the 8GB version costs $250 and the 16GB $300 street price, they will sell OK-ish, although I think the true low-end chad of a GPU is gonna be some 9070 GRE model with 12GB of VRAM :)
 
Just let 8GB die already.
I understand 8GB for entry level products like an RTX 5050, but come on, mainstream cards should start at 12GB and anything on the level of a 5060 Ti should be 16GB. Enough with frugality.

Yeah, I don't think we need 16GB for 1440p cards. But if it's cheap…
Maybe not now, but in a few years down the line, if you wanna keep your high-res textures, even 12GB might not cut it.
 
I understand 8GB for entry level products like an RTX 5050, but come on, mainstream cards should start at 12GB and anything on the level of a 5060 Ti should be 16GB. Enough with frugality.


Maybe not now, but in a few years down the line, if you wanna keep your high-res textures, even 12GB might not cut it.
Diablo IV easily surpasses 12GB on my rig. It's not outside the realm today, hell, even a year ago, to see 12GB+ usage at 1440p.
 
That will be for the 7900 GRE.

Are we positive they'll make a GRE? It was a China exclusive they then decided to release worldwide. Also, if they do it, I assume it's gonna be something like GSE for Golden Snake/Serpent Edition because 2025 is the year of the snake.
 
Are we positive they'll make a GRE? It was a China exclusive they then decided to release worldwide. Also, if they do it, I assume it's gonna be something like GSE for Golden Snake/Serpent Edition because 2025 is the year of the snake.

TechPowerUp just added it to the GPU-Z utility. So it's coming.
The question is when.

 
I never agreed with the "only for 4K resolution" myth. You can tell the difference at 1440p most of the time just fine.
Maybe, but 12GB of dedicated VRAM, which is about what the PS5 has for everything, is enough for a mid-range 1440p card. I mean, 16GB is better of course, but I'm looking for the price sweet spot.
 
Maybe, but 12GB of dedicated VRAM, which is about what the PS5 has for everything, is enough for a mid-range 1440p card. I mean, 16GB is better of course, but I'm looking for the price sweet spot.
I have a 12GB card myself and it's been fine so far. But if I was buying one today to keep for 4 years or so, I'd be a little worried.
 
You don't need a 4K display for high-quality textures. Texture resolution isn't the same as screen resolution.
PS5 tier textures are fine.
I have a 12GB card myself and it's been fine so far. But if I was buying one today to keep for 4 years or so, I'd be a little worried.
8 years, sure. But this gen can last 4 more years. And for high-end graphics you'd need even more, since RT consumes VRAM, so 16GB wouldn't cut it either.
 
PS5 tier textures are fine.
Not on a card coming out almost 5 years later, they aren't.
8 years, sure. But this gen can last 4 more years. And for high-end graphics you'd need even more, since RT consumes VRAM, so 16GB wouldn't cut it either.
16GB would be enough 99% of the time. If you crank RT high enough to break that budget, then your 5060 Ti probably can't run those settings anyway, regardless of VRAM.
 
Success or failure of these low-end GPUs depends solely on pricing. If the 8GB version costs $250 and the 16GB $300 street price, they will sell OK-ish, although I think the true low-end chad of a GPU is gonna be some 9070 GRE model with 12GB of VRAM :)

An 8GB card from AMD is pretty much useless at this point. AMD uses more memory than NVIDIA in games, and the 8GB version of the 4060 Ti is already not enough for some new games, even at medium 1080p.



Add to that:

- 128-bit bus
- "slow" memory
- most likely cut-down PCIe lanes...

This is a fucking disaster. The 16GB version will be a decent GPU (depending on price), but 10GB is the MINIMUM right now for PS5 ports. Only fucking Intel seems to know this.
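For context, a quick sketch of what that bus and memory combo works out to, using only the rumored figures from this thread (nothing official):

```python
# Peak memory bandwidth = (bus width in bits / 8) * data rate in Gbps.
# 128-bit and 20 Gbps GDDR6 are the rumored RX 9060 XT figures.
bus_width_bits = 128
data_rate_gbps = 20

bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(f"{bandwidth_gb_s:.0f} GB/s")  # 320 GB/s, vs 288 GB/s on the RX 7600 XT (18 Gbps)
```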

That will be for the 7900 GRE.


This GPU could be quite nice.
 

According to Taiwanese publication Benchlife, known for its accurate reporting on upcoming hardware, there is no change regarding the Radeon RX 9060 XT memory configurations.

Earlier, a rumor spread by one of the tech YouTubers suggested that AMD might be considering canceling the 8GB memory configuration for the Radeon RX 9060 XT graphics card. This model, which we've been reporting on over the past few weeks, is AMD's response to the GeForce RTX 5060 Ti. Both models were planned to feature either 8GB or 16GB memory options.

NVIDIA has already launched its graphics card, but to the surprise of reviewers, the company chose not to supply any 8GB models for review. It was clear NVIDIA expected this version to underperform, which was later confirmed by independent reviews using cards obtained without NVIDIA's involvement.

 
That useless 8GB card will only hurt them in the long run. Is GDDR6 that expensive, AMD?
It's like $2 per GB, so the extra 8GB is roughly $16; literally a fast-food combo is the price difference between disposable e-waste and a useful product.
 
It's still not that problematic to have a card with 8 GB of VRAM. The trick is to keep it cheap, something like $200.

It won't be. My guess is that the quantities of the 8 GB model will be small (even by AMD standards), and if it sits on store shelves, it sits on store shelves.

I wouldn't be surprised if there is no official MSRP for either model.
 
It's still not that problematic to have a card with 8 GB of VRAM. The trick is to keep it cheap, something like $200.
The problem is calling it the same as the other model; it tricks many people into buying something they expect to be way better. It's BS.
 
If it's like the previous generation, it will be RX 9060 and RX 9060 XT.
No, they probably already made those chips and don't know where to put them. They can't call it a 9060 or 9050 XT because it's the same 9060 XT chip with less VRAM (is it, though?), so somehow they have to get rid of them and minimize the losses. That's my assumption.
 
No, they probably already made those chips and don't know where to put them. They can't call it a 9060 or 9050 XT because it's the same 9060 XT chip with less VRAM (is it, though?), so somehow they have to get rid of them and minimize the losses. That's my assumption.

But it's the same case with the 7600 and 7600 XT. They're the same card, but one with 8 GB and the other with 16 GB.
 
I just got in on a pre-order for a 9070 for $669 ($731 after tax). While that's 22% over the announced MSRP, it's not the $800+ of the last month or so.

The only scare is that the vendor or Amazon may cancel my order before the product ships, which, if these tariffs continue, means I will have to sit out this entire generation. Why? Because I've already compromised my principles, from $400 in 2020 to $500, and now to $600 in 2025. Anything beyond this would be a bridge too far for me. Going beyond a car payment plus insurance for a GPU, and almost approaching a house's monthly mortgage, is just pure stupidity of value. Let's hope it ships.
 
I'm surprised about the x16 bus, this is a big +

Naming sucks (both with the same name)... -

Price is OK - assuming it's real

8GB model shouldn't exist.
 
Curious to see the benchmarks; this looks like a perfect fit for my console-like desktop to live next to my PS5 in the living room. 16GB for me.
 
You know it won't be $350 street price. Even in the US it will for sure be over $400, likely close to $450, the same way the 9070/XT are nowhere near their MSRP prices.
I think the price is going to settle for the lower-tier cards sooner rather than later. You can find a near-MSRP 5060 Ti 16GB, for example.

So it might be, say, $30-$50 more, but it's not going to be $100 more, IMO. It all depends on AMD's output, though. Availability in the US for the 9070 series has not been great lately.
 
Cards aren't amazingly priced but sometimes you can get one for $700-750.
In my country the cheapest 9070 non-XT models cost $700-725, which includes 20% VAT ($583-604 without VAT), but within a month they will get cheaper, to $630-650.
9070 XT: around $760-800 with 20% VAT.
5070 Ti: around $960-1000 with 20% VAT.
 
In my country the cheapest 9070 non-XT models cost $700-725, which includes 20% VAT ($583-604 without VAT), but within a month they will get cheaper, to $630-650.
9070 XT: around $760-800 with 20% VAT.
5070 Ti: around $960-1000 with 20% VAT.

In Poland, the cheapest I see:

9070 - 2823 zł = $753.18
9070 XT - 3129 zł = $834.82
5070 Ti - 3725 zł = $993.83
5070 - 2559 zł = $682.74

Of course, all this is with 23% VAT. After prices stabilize, the 9070 XT is the best value for sure.
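If anyone wants to redo the math, a quick sketch (the zł/$ rate here is just the one implied by those conversions, not an official quote):

```python
# Convert the Polish gross prices above to USD and back out the 23% VAT.
# The exchange rate is implied by the 9070 conversion, not an official figure.
VAT_RATE = 0.23
PLN_PER_USD = 2823 / 753.18  # ~3.75, implied by the list above

prices_pln = {"9070": 2823, "9070 XT": 3129, "5070 Ti": 3725, "5070": 2559}
for card, gross_pln in prices_pln.items():
    gross_usd = gross_pln / PLN_PER_USD
    net_usd = gross_usd / (1 + VAT_RATE)  # price with the 23% VAT stripped
    print(f"{card}: ${gross_usd:.2f} gross / ${net_usd:.2f} ex-VAT")
```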
 