Radeon RX 9060 XT features 2048 cores, boost clock of 3.2 GHz

winjer

Member

Just like the RTX 5060 Ti, the Radeon RX 9060 XT will feature two memory configurations of 8GB and 16GB. The difference compared to GeForce is that AMD is sticking to GDDR6 technology and clocks of 20 Gbps.

Based on the most recent information we have from AMD board partners, the RX 9060 XT will launch with 2048 Stream Processors. This is, of course, nothing surprising, because the card was meant to use the Navi 44 GPU, which has half the core count of Navi 48.

We also have an update on clocks, and it looks very interesting. First, a reminder that the RX 7600 XT, the predecessor to the RX 9060 XT, featuring the Navi 33 XT GPU, had a game clock of 2470 MHz and a boost clock of 2755 MHz. The RDNA4 update will have much higher clocks. According to our information, the RX 9060 XT will ship with a 2620 MHz game clock and a 3230 MHz boost clock. But that's not all: we also learned that some OC variants will have a 3.3 GHz boost.

The RX 9060 XT will require at least a 500W power supply, and some models will need 550W or more. We have yet to hear about an RX 9060 XT with a 16-pin power connector, as most specs we have seen are for 8-pin variants. It's also worth adding that the RX 9060 XT will have three display connectors, not four like the RX 9070 series. This is likely due to the limitations of Navi 44.

That is about 26 TFLOPs, with VOPD.
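For anyone wondering where that number comes from, a quick bit of napkin math, assuming the leaked 2048 SPs and 3.23 GHz boost and treating VOPD dual-issue as a flat 2x (real shaders rarely get the full benefit):

Code:
# rough FP32 estimate from the leaked RX 9060 XT specs above
stream_processors = 2048
boost_clock_ghz = 3.23
flops_per_sp_per_clock = 2   # one FMA counts as 2 FLOPs
vopd_factor = 2              # RDNA3/RDNA4 dual-issue, best case only

tflops = stream_processors * flops_per_sp_per_clock * boost_clock_ghz / 1000
print(f"{tflops:.1f} TFLOPS single-issue, {tflops * vopd_factor:.1f} TFLOPS with VOPD")
# prints: 13.2 TFLOPS single-issue, 26.5 TFLOPS with VOPD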

 
Success or failure of those low-end GPUs depends solely on pricing. If the 8GB version costs 250 and the 16GB 300 at street price, they will sell OK-ish, although I think the true low-end chad of a GPU is going to be some 9070 GRE model with 12GB of VRAM :)
 
Just let 8GB die already.
I understand 8GB for entry level products like an RTX 5050, but come on, mainstream cards should start at 12GB and anything on the level of a 5060 Ti should be 16GB. Enough with frugality.

Yeah, I don't think we need 16GB for 1440p cards. But if it's cheap…
Maybe not now, but in a few years down the line, if you wanna keep your high-res textures, even 12GB might not cut it.
 
I understand 8GB for entry level products like an RTX 5050, but come on, mainstream cards should start at 12GB and anything on the level of a 5060 Ti should be 16GB. Enough with frugality.


Maybe not now, but in a few years down the line, if you wanna keep your high-res textures, even 12GB might not cut it.
Diablo IV easily surpasses 12GB on my rig. It's not outside the realm today, hell even a year ago, to see 12GB+ usage at 1440p.
 
That will be for the 7900 GRE.

Are we positive they'll make a GRE? It was a China exclusive they then decided to release worldwide. Also, if they do it, I assume it's gonna be something like GSE for Golden Snake/Serpent Edition because 2025 is the year of the snake.
 
Are we positive they'll make a GRE? It was a China exclusive they then decided to release worldwide. Also, if they do it, I assume it's gonna be something like GSE for Golden Snake/Serpent Edition because 2025 is the year of the snake.

TechPowerUp just added it to the GPU-Z utility, so it's coming.
The question is when.

 
I never agreed with the "only for 4k resolution" myth. You can tell the difference at 1440p most of the time just fine.
Maybe, but 12GB of dedicated VRAM, which is about what the PS5 has available for everything, is enough for a mid-range 1440p card. I mean, 16GB is better of course, but I'm looking for the sweet spot on price.
 
Maybe, but 12GB of dedicated VRAM, which is about what the PS5 has available for everything, is enough for a mid-range 1440p card. I mean, 16GB is better of course, but I'm looking for the sweet spot on price.
I have a 12GB card myself and it's been fine so far. But if I was buying one today to keep for 4 years or so, I'd be a little worried.
 
You don't need a 4K display for high-quality textures. Texture resolution isn't the same as screen resolution.
PS5 tier textures are fine.
I have a 12GB card myself and it's been fine so far. But if I was buying one today to keep for 4 years or so, I'd be a little worried.
8 years, sure. But this gen can last 4 more years. And for high-end graphics you'd need even more, since RT consumes VRAM, so 16GB wouldn't cut it either.
 
PS5 tier textures are fine.
Not on a card coming out almost 5 years later, they aren't.
8 years, sure. But this gen can last 4 more years. And for high-end graphics you'd need even more, since RT consumes VRAM, so 16GB wouldn't cut it either.
16GB would be enough 99% of the time. If you crank RT high enough to break that budget, then your 5060 Ti probably can't run those settings anyway, regardless of VRAM.
 
Success or failure of those low-end GPUs depends solely on pricing. If the 8GB version costs 250 and the 16GB 300 at street price, they will sell OK-ish, although I think the true low-end chad of a GPU is going to be some 9070 GRE model with 12GB of VRAM :)

An 8GB card from AMD is pretty much useless at this point. AMD uses more memory than Nvidia in games, and the 4060 Ti in its 8GB version is already not enough for some new games, even at medium 1080p:



Add to that:

- 128-bit bus
- "slow" 20 Gbps memory (napkin math on the bandwidth below)
- most likely cut-down PCIe lanes...

This is a fucking disaster. The 16GB version will be a decent GPU (depending on price), but 10GB is the MINIMUM right now for PS5 ports. Only fucking Intel seems to get this.
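Napkin math on that memory setup, assuming the rumored 128-bit bus and the 20 Gbps GDDR6 from the article (before whatever Infinity Cache does to soften it):

Code:
# rough memory bandwidth for the rumored RX 9060 XT config
bus_width_bits = 128
gddr6_speed_gbps = 20   # per pin, from the leak above
bandwidth_gb_s = bus_width_bits * gddr6_speed_gbps / 8
print(f"{bandwidth_gb_s:.0f} GB/s")   # prints: 320 GB/s

For reference, the RX 7600 XT's 18 Gbps on the same 128-bit bus works out to 288 GB/s, so "slow" here is relative.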

That will be for the 7900 GRE.


This GPU could be quite nice.
 