RDNA3 rumor discussion

The 4870 was about 85-90% of the GTX 280 in performance but was half the price. NV dropped the 280 by around $200 two weeks after launch.

AMD still didn't really gain much and would have been better off making a bigger die with 50% more execution units and clearly taking the performance crown for the same price as the GTX280.
So that's something that happened many years ago, and with an Nvidia that didn't yet have its head fully up its own ass...

I just can't find any PR talk that could justify a price cut one week after AMD's presentation...
 
Fo real. :/

So AMD is supposed to give everyone the exact same performance, for way less money? Sorry, no.

Right now, you can get a 6650 XT for like 280 bucks new. It's as fast as a 3060 Ti in raster, has the same 8GB of VRAM, has FSR2 instead of DLSS (which is only marginally less effective), and the ray tracing isn't as good (as if it's really that useful on a 3060 Ti anyway).

Getting better price to performance is a trade-off. You can't expect EVERYTHING to be as good or better at a significantly lower price, ffs. If that's what it takes for people to even consider AMD, then they aren't serious about wanting competition.
But nobody is expecting to have absolute parity at the same price.

We know they have worse rtx
We know they have worse reconstruction tech
We know drivers are probably a bit worse
And probably something else

When you know all of this, expecting a lower price with similar performance is not out of this world.

They are not really on par with Nvidia, so it is clear that nobody wants to pay the same price and we expect lower prices.
 
If the AMD cards are worse overall, they can't expect to stay close to the MSRP of the competition.

Otherwise, I'd just spend that little extra and go with the better option. It's very clear to me, at least.
 
But nobody is expecting to have absolute parity at the same price.

We know they have worse rtx
We know they have worse reconstruction tech
We know drivers are probably a bit worse
And probably something else


When you know all of this, expecting a lower price with similar performance is not out of this world.

They are not really on par with Nvidia, so it is clear that nobody wants to pay the same price and we expect lower prices.
I don't think it's a matter of being better or worse, but of how much.

If AMD's top of the line card offers 95% the performance of a 4090 and costs just 10% less, no one is going to buy it.

What if they offer 20% less for $1000 though? I know I'd be interested. They need to do with Nvidia what they did with Intel. Ryzen 1 was better than the 7th-gen Core series at multitasking and worse in gaming, but still a decent CPU overall. It also offered a much cheaper price per core. Can AMD offer, let's say, similar raster performance for substantially less?
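To put rough numbers on that value argument, here's a back-of-envelope sketch; the $1,599 figure is the 4090's US MSRP, and the two AMD scenarios are the hypothetical ones above, not real prices:

```python
# Rough perf-per-dollar comparison for the hypothetical scenarios above.
# The only real number here is the 4090's $1,599 US MSRP; everything else is made up.
NV_PERF, NV_PRICE = 1.00, 1599.0  # RTX 4090 baseline (performance normalized to 1.0)

scenarios = {
    "95% perf, 10% cheaper": (0.95, NV_PRICE * 0.90),
    "80% perf at $1000":     (0.80, 1000.0),
}

nv_value = NV_PERF / NV_PRICE  # frames per dollar, so to speak
for name, (perf, price) in scenarios.items():
    print(f"{name}: {(perf / price) / nv_value:.2f}x the perf/dollar of a 4090")
```

The first case works out to only ~1.06x the 4090's perf per dollar, while the second is ~1.28x, which is basically why the barely-cheaper option wouldn't move anyone.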

But that's not the real question. Nvidia is held back by the crazy amount of Ampere cards, and so is AMD according to sources. Lowering the price of their cards means having to take a loss on that stock. According to MLID and others, that's the real reason behind unlaunching the 4080 12GB. They are doing what they did back with Turing, keeping the same price to performance ratio.

We should ask instead: Is AMD ready to lose money on that stock to gain market share and sell 7000 series cards?
 
We should ask instead: Is AMD ready to lose money on that stock to gain market share and sell 7000 series cards?

AMD helping Nvidia raise prices by not starting a price war (which would eat away at their margins anyway) chokes the PC adoption rate and makes consoles more attractive, which is one of their higher-margin areas.

 
On Intel, I think it's obvious that Arc is either cancelled or significantly toned down in scale (as a business, at least for the DIY GPU segment) given the absolute shitshow of a marketing campaign.
That's what MLID made you think after multiple videos. He claimed he had sources saying Intel will soon close their GPU division, and now he throws a fit when people point that out. Last year he claimed that Intel was only making server GPUs and everything else was marketing. There is absolutely no reason for them to cut down their GPU division after coming this far and being able to manufacture competitive hardware (although not good drivers).
Let's not forget this guy hyped up RDNA 2 by claiming that all the benchmark leaks were false because "AMD is feeding incorrect info to AIB manufacturers", "AMD was not impressed with 3090 performance", or "Sources connected to AMD are starting to sound more bullish than a month ago".
 
The only way for AMD to gain market share is by aggressively undercutting Nvidia or bringing out products that are significantly superior to Nvidia's.
 
But nobody is expecting to have absolute parity at the same price.

We know they have worse rtx
We know they have worse reconstruction tech
We know drivers are probably a bit worse
And probably something else

When you know all of this, expecting a lower price with similar performance is not out of this world.

They are not really on par with Nvidia, so it is clear that nobody wants to pay the same price and we expect lower prices.

But they ARE cheaper. The pricing of the RDNA2 cards was sus... they did price the stuff like it was completely on par. I think that had to do with the shortages. It was just money on the table.

But we started coming out of it and they made meaningful price cuts all through the RDNA2 range. Look at a 6650 compared to a 3060. Or heck, a 6600 vs a 3050!

and...
We know they have worse rtx ----> Yes. This is an embarrassing gulf.
We know they have worse reconstruction tech ---> Not really. imo, the difference between FSR2.1 and DLSS is slight. The main difference is how widely it's supported.
We know drivers are probably a bit worse ---> Probably? Maybe??
And probably something else ---> ????? I guess productivity performance?

They are missing features but on basic performance per dollar, they are doing well at the moment. Absolute parity at the same price would be a reasonable expectation. Getting something with fewer features for less money is also reasonable.
 
But they ARE cheaper. The pricing of the RDNA2 cards was sus... they did price the stuff like it was completely on par. I think that had to do with the shortages. It was just money on the table.

But we started coming out of it and they made meaningful price cuts all through the RDNA2 range. Look at a 6650 compared to a 3060. Or heck, a 6600 vs a 3050!

and...
We know they have worse rtx ----> Yes. This is an embarrassing gulf.
We know they have worse reconstruction tech ---> Not really. imo, the difference between FSR2.1 and DLSS is slight. The main difference is how widely it's supported.
We know drivers are probably a bit worse ---> Probably? Maybe??
And probably something else ---> ????? I guess productivity performance?

They are missing features but on basic performance per dollar, they are doing well at the moment. Absolute parity at the same price would be a reasonable expectation. Getting something with fewer features for less money is also reasonable.
We don't know if the 7000 series is gonna be cheaper tho, and more importantly, by how much.

I was more talking about the future, not the 6000 series.

Also, is FSR really on par with DLSS 2/3? People have different opinions about it...
 
We don't know if the 7000 series is gonna be cheaper tho, and more importantly, by how much.

I was more talking about the future, not the 6000 series.

Also, is FSR really on par with DLSS 2/3? People have different opinions about it...

Depends on the eye of the beholder, I suppose, on FSR2. Importantly, FSR 1 and 2 are not the same thing and are still offered as parallel solutions. FSR1 is really nothing more than a decent upscaler. FSR2 is a proper reconstruction technique. The 2.1 version has a little less ghosting and came relatively quickly after the initial 2.0 release.

If you take a look at a standard zoom/freeze comparison, DLSS is still ahead, particularly in distant, fine detail. In practice, DLSS, FSR, and XeSS are really close. It's a wash for me.

The price of the 7000 series is totally up in the air, and there are good arguments both for a token undercut like the 6000 series at launch and for a serious one like the 6000s after the post-crypto price slashing.

Another key question, or the other side of the MSRP coin if you will, is how much people think the appropriate AMD price is. It seems like many think a Radeon card with the same raster and, let's say, 75% of the ray tracing would need to be something like half as much as the Nvidia card. That's crazy.
 
Depends on the eye of the beholder, I suppose, on FSR2. Importantly, FSR 1 and 2 are not the same thing and are still offered as parallel solutions. FSR1 is really nothing more than a decent upscaler. FSR2 is a proper reconstruction technique. The 2.1 version has a little less ghosting and came relatively quickly after the initial 2.0 release.

If you take a look at a standard zoom/freeze comparison, DLSS is still ahead, particularly in distant, fine detail. In practice, DLSS, FSR, and XeSS are really close. It's a wash for me.

The price of the 7000 series is totally up in the air, and there are good arguments both for a token undercut like the 6000 series at launch and for a serious one like the 6000s after the post-crypto price slashing.

Another key question, or the other side of the MSRP coin if you will, is how much people think the appropriate AMD price is. It seems like many think a Radeon card with the same raster and, let's say, 75% of the ray tracing would need to be something like half as much as the Nvidia card. That's crazy.
A good price for a 7800 that performs like a 4080 16GB with like 70% of the RT would be around 800-900 dollars, maybe?

If you consider that in Europe that shit is gonna translate into 1000-1200 euros (and even more for third-party cards), there's really no win for me; I'd still be paying more than 1000 euros for a slightly worse 4070 Ti (if the gaffers are right when they troll the 4080 16GB).

I paid 450 euros for my current 2070 Super/Ti, just to put things in perspective.
 
A good price for a 7800 that performs like a 4080 16GB with like 70% of the RT would be around 800-900 dollars, maybe?

If you consider that in Europe that shit is gonna translate into 1000-1200 euros (and even more for third-party cards), there's really no win for me; I'd still be paying more than 1000 euros for a slightly worse 4070 Ti (if the gaffers are right when they troll the 4080 16GB).

I paid 450 euros for my current 2070 Super/Ti, just to put things in perspective.

The price jump is fucking insane. And I'm on the same page that an $800-900 7800 would be a good deal compared to the 4080, but still a fat price hike. The 6900 XT was $999 on release, and $699 after the cut. Idk. We'll have to see what they do. The hope is that they actually want to increase their market share, which everyone takes for granted that they do. AMD might be way more interested in keeping margins as high as possible. Ironically, that would be like Nvidia bringing prices up instead of AMD bringing prices down. Ugh.

One interesting thing to look at is eBay prices, where you can see prices sort of normalized to the real supply and demand. A 6600 XT goes for the low $200s. A 3060 hovers more around $300.
 
They will probably be on par or better in raster, at a better price.

But Nvidia is safe; there are enough stupid people in this world who will buy their products for a reconstruction tech and a few real-time reflections in a few games that tank performance and that they can't even see in motion.
 
They will probably be on par or better in raster, at a better price.

But Nvidia is safe; there are enough stupid people in this world who will buy their products for a reconstruction tech and a few real-time reflections in a few games that tank performance and that they can't even see in motion.

Don't forget Hairworks tho! Deal breaker!!!
 
Do people really think that if AMD is aggressive with prices, Nvidia is gonna lower their prices the day after?

How would they justify that with PR talk??

And has something similar even happened before? AMD having aggressive prices and Nvidia swallowing their pride and lowering their prices the day/week after?
They literally unlaunched an entire SKU weeks before it was due to launch.
Nvidia have done this bullshit before and they'll do it again.
 
"Together we advance gaming" doesn't seem like a tame "don't expect much" type of announcement...

fingers crossed for something actually competitive.

RDNA2 was competitive with Ampere, at rasterization. And drivers have improved a lot in the last year.
All that is missing is to add some tensor units, something AMD has in their CDNA GPUs.
And to improve the RT performance.
 
RDNA2 was competitive with Ampere, at rasterization. And drivers have improved a lot in the last year.
All that is missing is to add some tensor units, something AMD has in their CDNA GPUs.
And to improve the RT performance.
Since I've always had Nvidia for the past 10 years or so, what are the most common driver problems with AMD GPUs?

Like just inferior performance in some games, or downright graphical glitches, or worse, like crashes, PC restarts, etc.?
 
Since I've always had Nvidia for the past 10 years or so, what are the most common driver problems with AMD GPUs?

Like just inferior performance in some games, or downright graphical glitches, or worse, like crashes, PC restarts, etc.?

I want to know, too. I've never tried an AMD card on Windows. The open source Linux driver is tits, besides taking longer to get features in. For normal use and reliability you can pretty much forget it exists; I've never even installed it. It's all rolled into the OS.
 
Since I've always had Nvidia for the past 10 years or so, what are the most common driver problems with AMD GPUs?

Like just inferior performance in some games, or downright graphical glitches, or worse, like crashes, PC restarts, etc.?
Never ever had any driver issues on AMD cards. Have a 570 and two 6600xt's right now.
 
Never ever had any driver issues on AMD cards. Have a 570 and two 6600xt's right now.

Hey, I just upgraded from a 570 to a 6600 XT.
That 8GB 570 was $179 new with RE2 and DMC5 right when they came out four years ago. The 6600 XT was under $200 used on eBay. The 570 was a great value. The 6600 XT? Idk. Compared to used 3060s at least, it was a screamin' deal.
 
Hey, I just upgraded from a 570 to a 6600 XT.
That 8GB 570 was $179 new with RE2 and DMC5 right when they came out four years ago. The 6600 XT was under $200 used on eBay. The 570 was a great value. The 6600 XT? Idk. Compared to used 3060s at least, it was a screamin' deal.
Same, two of my kids have 560s and I got the 6600 XTs for around $200 each a while back. The third kid's 570 may get upgraded to a 6700 XT if it keeps going down in price.

I myself am on a GTX1080 and it's been great too, but it's old. I'd like to see what the RTX 4070 comes to and AMD's cards. My PSU is 650W and I don't feel like upgrading it.
 
If AMD really wants a shot at a bigger piece of the market they should not only price competitively but also avoid the typical delays on the mid to low-end cards. Honestly they should launch the 7700 and 7600 the same day as the big boys. Can't get much better than only needing to compete against the last-gen offerings of your rival.
 
Hoping the 7800XT delivers in both rasterization and RT performance.

Should it cost under £800 and perform similarly to an RTX 4080, I'd be in. Even if RT is lagging behind a bit, if AMD undercuts them a lot it could be quite lucrative for them.
 
If AMD really wants a shot at a bigger piece of the market they should not only price competitively but also avoid the typical delays on the mid to low-end cards. Honestly they should launch the 7700 and 7600 the same day as the big boys. Can't get much better than only needing to compete against the last-gen offerings of your rival.

Unfortunately chip fabrication doesn't work like that. The chips tape out when they tape out.
 
Unfortunately chip fabrication doesn't work like that. The chips tape out when they tape out.
I understand that they can't sell them if they don't have the chips available to sell. What I'm saying is they should have planned to attack the lower end of the market first or at the same time as the high-end. Nvidia almost always hits the top market first and works down. Give yourself those months with no competition to speak of and price well. Honestly, I would have waited a bit on the high-end and tried to really saturate the lower-end with as many cards as possible.
 
I understand that they can't sell them if they don't have the chips available to sell. What I'm saying is they should have planned to attack the lower end of the market first or at the same time as the high-end. Nvidia almost always hits the top market first and works down. Give yourself those months with no competition to speak of and price well. Honestly, I would have waited a bit on the high-end and tried to really saturate the lower-end with as many cards as possible.

oh heck no i'd totally go for the big spenders right off the bat lol. NEW THING HYPE PRICE BE DAMNED LETS GO
 
oh heck no i'd totally go for the big spenders right off the bat lol. NEW THING HYPE PRICE BE DAMNED LETS GO
That's the way it's generally done, yeah. It's just that AMD has never made a dent with that strategy. The only recent cards they've had that scratched the surface at all were the 480/580 and the 5700, and in both cases they didn't have a high-end product to launch, which pushed them to jump out in front of Nvidia on the lower end. They'll probably never even hit a quarter of a percent on the high end (in terms of an individual card on the Steam survey). At least the 480/580 made a splash and were a bit more successful in the years right after launch. When they launch alongside Nvidia they do nothing on the low end either.
 
Every time an AMD card's about to launch it's hailed as the second coming. Then they launch and performance comes in a distant second.
 
RDNA2 was competitive with Ampere, at rasterization. And drivers have improved a lot in the last year.
All that is missing is to add some tensor units, something AMD has in their CDNA GPUs.
And to improve the RT performance.
Why would you want tensor units in a 50+ TFLOPs GPU?
 
Any chance of a 7800 beating a 4080 16gb in rasterization??

Weren't people saying that without DLSS3 that card kinda stinks on its own?

Imagine if AMD can surpass its performance at a 200-300 dollar lower price.

I'm pretty sure that many people don't give 2 fucks about rtx parity...
 
AMD is in a very good position right now. Their 6000 cards were head to head, except for RT, which was lagging behind. They probably have amazing cards now, and they know all the good and bad about Nvidia's new cards through all the reviews and benchmarks.
They can plan their show perfectly now. Even if they lose at the top against the 4090, in the entry and mid market they could do very, very well, better than now. I have a strong feeling that these cards are the breakthrough for AMD graphics cards.
 
Since I've always had Nvidia for the past 10 years or so, what are the most common driver problems with AMD GPUs?

I want to know, too. I've never tried an AMD card on Windows.
There are no driver issues; it's a meme people haven't been able to let go of.
I've been on AMD (ATI) since the X700 XT, after three 6600 GTs died on me.

AMD is currently using a unified driver model. I upgraded from a Vega to a 6700 XT and didn't have to update/install anything; the driver instantly recognised the new card and it has been working perfectly for months.
 
While I can't jump ship to AMD because I have a Gsync monitor that needs supporting, I really want RDNA3 to CRUSH Nvidia at all performance tiers because:
1) I want Nvidia to be forced to drop prices and release competitively priced 4060s and 4050s (if they ever go that low this gen)
2) Competition is good - I want a more bipolar market than we have now. Hell, I want Intel to compete as well!
3) I remember the prior generations when AMD (ATI) was the gaming market leader and I'm nostalgic. (2004, 2005, 2010)
 
There are no driver issues; it's a meme people haven't been able to let go of.
I've been on AMD (ATI) since the X700 XT, after three 6600 GTs died on me.

AMD is currently using a unified driver model. I upgraded from a Vega to a 6700 XT and didn't have to update/install anything; the driver instantly recognised the new card and it has been working perfectly for months.
So, no more "Display driver stopped responding and has recovered"?
 
Feature parity with Nvidia and Intel. Things like XeSS and DLSS.
But no game is using tensor units for AI or inference. There's only DLSS2 and XMX-accelerated XeSS but those have very little advantage over FSR2.x, which you can even implement in virtually all DLSS2 games by replacing a DLL file.
Sure, DLSS2.x is marginally better than FSR 2.0 but the latter is still in its infancy so there's still room for improvement.

If anything, history has shown us that dedicated tensor cores aren't a great use of die area on gaming GPUs.


I would agree that RDNA3 does need to accelerate more stages in the RTRT pipeline to have a lower performance hit, but I don't see AMD needing to bring tensor cores at all.
 
But no game is using tensor units for AI or inference. There's only DLSS2 and XMX-accelerated XeSS but those have very little advantage over FSR2.x, which you can even implement in virtually all DLSS2 games by replacing a DLL file.
Sure, DLSS2.x is marginally better than FSR 2.0 but the latter is still in its infancy so there's still room for improvement.

If anything, history has shown us that dedicated tensor cores aren't a great use of die area on gaming GPUs.

I would agree that RDNA3 does need to accelerate more stages in the RTRT pipeline to have a lower performance hit, but I don't see AMD needing to bring tensor cores at all.

No game is using tensor units for AI or other things, because consoles don't have them. Only DP4A.

DLSS is still the best of all the upscaling techs. Probably because it has tensor units doing that extra bit to polish the final result.
In terms of die space, tensor units don't seem to cost that much. And a GPU doesn't need many of them for DLSS.
 
Any chance of a 7800 beating a 4080 16gb in rasterization??

Weren't people saying that without DLSS3 that card kinda stinks on its own?

Imagine if AMD can surpass its performance at a 200-300 dollar lower price.


I'm pretty sure that many people don't give 2 fucks about rtx parity...
If Nvidia sticks with current prices there's no need to imagine, tbh. If you look at BOM costs, RDNA 3 will have advantages for AMD in terms of what they can sell it for: smaller dies, better prices from TSMC, and cheaper GDDR6 vs GDDR6X, stack for stack.

And they don't have a shedload of old stock floating around (RTX 3000 series) needing to be sold first, which is probably one of the big reasons why the 4080 12GB got cancelled.


I'd say there's a high chance it will be possible. We saw how performant the RDNA 2 cards were. And this time we're getting huge bumps in frequency, which is said to reach 3GHz and beyond, and a shader uplift of 2-2.5x depending on the SKU (assuming the rumors are true). It's up to AMD whether they want to price competitively or just slightly lower than Nvidia and cash in.
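For what it's worth, here's a rough sketch of what those rumored numbers would imply for theoretical compute. It uses the standard FP32 estimate of 2 ops per shader per clock and Navi 21's known 5,120 shaders as the baseline; the ~3GHz clocks and 2-2.5x shader uplift are just the rumors mentioned above, not confirmed specs.

```python
# Back-of-envelope theoretical FP32 throughput: TFLOPS = 2 * shaders * clock_GHz / 1000.
# The Navi 21 row is the known config; the RDNA 3 rows use the rumored 2x-2.5x
# shader uplift and ~3 GHz clocks discussed above -- pure speculation, not specs.
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000

configs = [
    ("Navi 21 (6900 XT class, known)",  5120, 2.3),
    ("Rumored RDNA 3, 2.0x shaders",   10240, 3.0),
    ("Rumored RDNA 3, 2.5x shaders",   12800, 3.0),
]

for name, shaders, clock in configs:
    print(f"{name}: {shaders} shaders @ {clock} GHz ~ {fp32_tflops(shaders, clock):.1f} TFLOPS")
```

That's the gap between roughly 24 TFLOPS and something in the 60-77 TFLOPS range on paper, though paper TFLOPS obviously don't translate 1:1 into game performance.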
 
DLSS is still the best of all the upscaling techs. Probably because it has tensor units doing that extra bit to polish the final result.
In terms of die space, tensor units don't seem to cost that much. And a GPU doesn't need many of them for DLSS.

DLSS is better, but is the difference in motion perceivable to more than say 0.5% of gamers?
Let's say the tensor units take only 20mm^2 of die area in a 400mm^2 GPU (5% of the total). At 5nm that's probably enough for a bunch more execution units and cache that would make the GPU run >5% faster overall.

Everyone can see a 5% faster GPU in bar charts. Not everyone can see that DLSS 2 looks better, when it takes Digital Foundry to do 300% zoom screenshots with arrows pointing to small circles.
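Spelling out that trade-off with the same made-up numbers (the 20mm^2 and 400mm^2 figures are just the assumptions in this post, not measured die areas):

```python
# Illustrates the die-area trade-off above using this post's assumed numbers.
# 20 mm^2 for tensor units and 400 mm^2 total die are assumptions, not measurements.
DIE_AREA_MM2    = 400.0
TENSOR_AREA_MM2 = 20.0

tensor_share = TENSOR_AREA_MM2 / DIE_AREA_MM2          # fraction of die spent on tensor units
print(f"Tensor units: {tensor_share:.1%} of the die")   # -> 5.0%

# If that area went to more shaders/cache instead, and performance scaled roughly
# linearly with the extra area, the whole GPU would be ~5% faster in every game,
# not just in the subset of titles that ship with DLSS support.
print(f"Hypothetical across-the-board uplift: ~{tensor_share:.0%}")
```

Whether performance really scales linearly with that extra area is its own assumption, but that's the shape of the argument.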


Some sites have reported 24GB.
We're less than 2 weeks away from the announcement so it's silly season.
Rumor websites will pick up and run with every single bit of trash they find because they know people are paying attention to RDNA3 news.

IMO just stick to Skyjuice's specs on Angstronomics; they cover pretty much everything but clocks.
 
DLSS is better, but is the difference in motion perceivable to more than say 0.5% of gamers?
Let's say the tensor units take only 20mm^2 of die area in a 400mm^2 GPU (5% of the total). At 5nm that's probably enough for a bunch more execution units and cache that would make the GPU run >5% faster overall.

Everyone can see a 5% faster GPU in bar charts. Not everyone can see that DLSS 2 looks better, when it takes Digital Foundry to do 300% zoom screenshots with arrows pointing to small circles.

Has there ever been a confirmation of how much die space the tensor units use?
I've seen some people speculating, but never real numbers. I'm honestly very curious about that.
 