GreatnessRD
Member
I believe this hot take is all cap. I also believe AMD's offering will sit right in between the 70 and 80. I guess the 70Ti if you will. And that should be great if the price is right.
However I agree that if AMD doesn't have a proper answer to RayTracing and especially DLSS it might be a generation where the feature set easily decides in favor of Nvidia.
My personal experience was with DLSS 1.0, though.

So, with that Doom demo, we have the 3080 at 2080 Ti + 20-30%.
As expected.
The only unexpected bit is the pricing: modest, no hikes, no Fools Edition bullshit.
But only up to 3080.
Hm, what could have possibly caused it.
The <resolution> DLSS is a nonsensical concept that is highly misleading.
There is also something wrong with hyping technology that in your personal experience actually sucks.
As if you were a victim of a FUD campaign or something.
Peace and Harmony.
Between 3070 and 3080, closer to 3080 (so that a 3070 Ti is still beaten) is what was leaked by that at-times-spot-on dude.
TPU (with orgasmic title) said so:
Throughout the video, NVIDIA compared the RTX 3080 to the previous-gen flagship, the RTX 2080 Ti, with 20-30% performance gains shown for Ampere.
TPU
Yeah, before this I would have been all for a cheaper (maybe slightly weaker) AMD alternative, but if it's missing, or has worse versions of, what are looking more and more like must-have features, I can't really get behind them.
Almost. FidelityFX (like most if not all AMD stuff) is cross-GPU: it runs on AMD, runs on NV, runs on Intel.

But knowing this industry they are both copying each other
I would expect a doubled-up 5700XT with RDNA2 upgrades to pass a 3070 with relative ease if any of the claims about RDNA2 are true. Considering what we've seen from the consoles, RDNA2 seems decent enough, but of course that doesn't necessarily translate into Big Navi being a success in itself. They could have all sorts of issues, from power consumption to drivers, but currently one of the biggest problems they seem to have is the lack of an answer to DLSS 2. We also don't really know how their ray-tracing solution performs vs. RTX. There are a ton of question marks around Big Navi, and I don't have a lot of confidence they're up to the task of catching up on all fronts. Even in the very best case they're only on par, in theory, with one or two cards, in mainstream games that don't have DLSS2/RTX in them. That's a lot of caveats.
While I think Nvidia certainly made a much better offer this time than with the 20 series, from a tech standpoint this wasn't really a big surprise. They doubled up on a full node jump, which was to be expected. The reason they didn't push hard on the pricing at the low end was likely two-fold: to pre-emptively counter Big Navi, which they likely estimated to slot in between the 3070/80, and the fact that the 20 series didn't move that well and people won't actually pay just any price they ask. What's curious about Nvidia's lineup is that the 3080 and 3090 use the same chip while having a massive price difference. Usually the xx80 has been a 04 chip. I wonder how big of a chip GA102 actually is. It might end up being costly for Nvidia to have the 3080 be a cut GA102 if Big Navi ends up being in that performance class with a leaner die. Of course Nvidia has plenty of room to play with any Super/Ti cards they'll likely introduce after AMD shows its cards. A lot of this also depends on how TSMC 7nm vs. Samsung 8nm plays out. We'll probably never really know how much AMD or Nvidia pay for their chips from either fab, but at least density and power should be easy to compare.
On another note, people need to seriously ignore these speculative hardware channels. They're so embarrassingly wrong most of the time it's painful to hear them talk about things they clearly don't understand much at all. GPU fanboyism is one of the most ridiculously dumb things people could be wasting their time with.
I think it will be, and that's short of 3080 and 3090, but that's expected. If they price smart they have a winner. Next round they could be in a better position.

There is no way in hell it's going to sit at 3070 performance; that's 2080 Ti output. That's straight up a 30-40% increase over their 5700XT.
I think it will be, and that's short of 3080 and 3090, but that's expected. If they price smart they have a winner. Next round they could be in a better position.
But Nvidia had a really strong showing .
Holy wishful thinking batman!
We went from extremely optimistic "AMD will have a 2080ti competitor at a lower price" to "AMD will triple their performance" in a matter of days. The combined strength of AMD fanboyism and console fanboyism is creating some entertaining content.
AMD is already on the 7nm process, their last super expensive card was competing with a 1080ti, they have not shown any additional hardware to handle ray tracing and they have not shown additional hardware that would allow for technology akin to DLSS.
The on-a-number-of-occasions-spot-on leaker said "A104 cannot compete with Big Navi", "not even the Ti version".

I expect their Big Navi to sit at 3080 or a bit above it; Nvidia probably expects the same.
2080Ti is about 30-45% faster than the 250mm2 RDNA1 5700XT.

We went from extremely optimistic "AMD will have a 2080ti competitor at a lower price" to "AMD will triple their performance"
they have not shown any additional hardware to handle ray tracing
Dude, you do not need to invent reasons to stick with filthy green.

akin to DLSS.
And you think that, given the failure rates on their recent "high end" entries, they can just scale the die and call it a day?

Nvidia will just counter it with a price drop or a new series of budget cards. They already did this, this generation.
5700xt die size 251 mm2
2080ti die size 754 mm2
And you think it's going to sit at only a 30% gain for their next series of actual high-end cards?
Zero chance.
We need to look at the solution as a whole and take ray tracing into account.

2080Ti is about 30-45% faster than 250mm2 RDNA1 5700XT.
Your "triple performance" reference is bazinga.
AMD's first ray tracing demo
Everything is so shiny o_O If interested in explanations of the demo: neogaf.com
Dude, you do not need to invent reasons to stick with filthy green.
solution as a whole
they have not shown any additional hardware to handle ray tracing
all seen that demo
And you think that, given the failure rates on their recent "high end" entries, they can just scale the die and call it a day?
Even if they had the 50% increase over the 5700XT, how does additional acceleration for ray tracing and AI play into that?
This is the same type of wishful thinking that AMD fanboys fall into with every hardware cycle.
Bruh Lisa has just been getting fashion tips from her cousin Jensen

Lisa being too occupied buying fancy clothes while her new GPUs can't even compete with old GPUs...
now that you say this, the leather jacket indeed looks rather suspicious...

Bruh Lisa has just been getting fashion tips from her cousin Jensen
The guy who wrote the article can't count. The performance gains are between 40-50%. There isn't a single frame with a 20% advantage for the 3080.

TPU (with orgasmic title) said so:
Throughout the video, NVIDIA compared the RTX 3080 to the previous-gen flagship, the RTX 2080 Ti, with 20-30% performance gains shown for Ampere.
TPU
Anybody believing Big Navi can't beat a 3070 is an idiot. The 2080 Ti is only about 25% faster than the SX's GPU with 52 CU's at 1825MHz. Big Navi will destroy the 2080 Ti/3070 but might not be up to par with the 3080. Seriously, people expect AMD to only match what NVIDIA did over 2 years ago? Come on man.

I like Coreteks for his speculative stuff as much as the next guy here, but I wouldn't trust him on this one.
The most reliable leaker we had for Ampere had this to say about the 3070 vs RDNA2:
Jesus Christ why do you keep lying?

2080Ti is about 30-45% faster than 250mm2 RDNA1 5700XT.
Your "triple performance" reference is bazinga.
I don't understand why

Jesus Christ why do you keep lying?
49% according to TPU's actual benchmarks and not just an idiot writing a false headline that is easily disprovable by watching 5 seconds of the video.
At 4K mind.

Jesus Christ why do you keep lying?
49% according to TPU's actual benchmarks and not just an idiot writing a false headline that is easily disprovable by watching 5 seconds of the video.
Because there are CPU bottlenecks at 1080p and nobody cares about the 1080p performance of a 2080 Ti.

At 4K mind.
45% at 1440p
35% at 1080p
Overall about 40% faster.
Also bearing in mind that TechPowerUp is one of many benchmarking sites.
That doesn't seem too unreasonable an ask for Radeon to beat with literally double the CU's with their largest chip, does it?
Doubling the 5700XT's performance would put a GPU at 33% faster than a 2080Ti at worst.
AMD don't need to triple their performance at all.
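For what it's worth, the arithmetic behind that "33% at worst" figure is easy to sanity-check. A minimal sketch in Python, using the 2080 Ti uplift percentages quoted in this thread (TPU's relative-performance numbers, not independent benchmarks):

```python
# Relative-performance arithmetic from the thread (illustrative only).
# Each value is "how much faster the 2080 Ti is than the 5700 XT"
# at that resolution, per the numbers quoted above.
uplifts = {"1080p": 1.35, "1440p": 1.45, "4K": 1.49}

for res, uplift in uplifts.items():
    # A hypothetical card at exactly 2x 5700 XT performance,
    # expressed relative to the 2080 Ti at this resolution.
    vs_2080ti = 2.0 / uplift
    print(f"{res}: 2x 5700 XT = {vs_2080ti:.2f}x the 2080 Ti "
          f"(+{(vs_2080ti - 1) * 100:.0f}%)")
```

With the 49% 4K figure this comes out to about +34% over the 2080 Ti; a flat 50% uplift gives exactly the +33% quoted above.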
4K and high resolutions in general are memory-bandwidth starved, a problem that "Big Navi" should not have.

Overall about 40% faster.
btarrung is the guy doing the benchmarks, dumbo. (Have you noticed that the older card runs on very old driver versions? Why would that be, hmm...)

Jesus Christ why do you keep lying?
49% according to TPU's actual benchmarks and not just an idiot writing a false headline that is easily disprovable by watching 5 seconds of the video.
The idiocy.

btarrung is the guy doing the benchmarks, dumbo. (Have you noticed that the older card runs on very old driver versions? Why would that be, hmm...)
You've cherry-picked 4K results where the 5700XT is severely bandwidth starved, something that won't be the case with Big Navi.
CPU benchmarks show barely any difference even at 1080p with a 2080Ti. Certainly not something explaining the 15% difference.

1080p doesn't take advantage of what the 2080 Ti can do.
It isn't, but why do you have to outright lie about the numbers? No one gives a damn about the 2080 Ti at 1080p, and within the context we are discussing, saying "35-45%" is straight up dishonest. The most common resolution used for the 2080 Ti has it beating the 5700XT by 49%.

A 250mm2, 10 billion transistor chip with laughable mem bandwidth getting "murdered" by a 754mm2, 18.6 billion transistor chip is a notable event.
Fuck off.

outright lie about the numbers?
Of course it does, it just can't do much at 4K. It isn't being held back, it plain isn't strong enough, mr dishonest.

4K doesn't take advantage of what the 5700XT can do.
A bigger die means higher failure rate.

Failure rates mean nothing. They could have had bad memory modules from shipments of a provider that made most of their cards duds (the 290 cards had this), which can be solved with minor adjustments, or your architecture could be a complete dud.
If it was leaning more toward the latter, we would not be seeing cards on the market now at the price they are asking, or even available at all. So it really doesn't seem to be much of an issue.
About ray tracing: you grow the SoC by adding ray tracing blocks into it, which is exactly what Nvidia does. That's how. The Xbox is proof of this.
There is no wishful thinking, its basically how the world works.
I estimate that, ray-tracing-wise, the RDNA2 flagship will sit around 2080 Ti performance, with raster performance around 3080 or upwards if they push it as a halo card, a tiny bit lower if they skimp and just want something more affordable.
When I was making this thread, I wondered if I actually needed the Drama tag or not. Turns out I did.

Let's cool it on the antagonism. If you have proof, provide receipts; don't poke and prod to get a rise out of each other.
Based on the XSX chip, which has hardware BVH acceleration, we know that 56 CUs take up around 300 mm^2 of space. Going from 56 to 80 is an increase of 43%, so crudely scaling everything by 1.43, that would be 429 mm^2, which is already below the estimated 505 mm^2. Based on the Minecraft demo, it seems the XSX GPU is about on par with a 2070 for ray tracing. Add another 54% (52 active to 80 CUs) and you are at 2080 Ti levels, not accounting for clock-speed improvements. So it makes sense to think that Navi 21 could be between the 3070 and 3080 for ray tracing, especially in "hybrid" titles that rely on conventional rasterisation.

A bigger die means higher failure rate.
And if they add ray tracing cores to the silicon, then they will not have room to magically double the number of streaming processors, which logically means that they will not be able to increase rasterisation performance in line with the wishful thinking here. And that still doesn't take into account separate cores dedicated to ML, which reportedly they will not have.
Let's not forget, that in software based ray tracing (Neon Noir tech demo from Crytek), 5700XT fares substantially worse than 1080ti, and 2080ti is around twice as fast.
So tell me again about how AMD will magically multiply their performance on every front because of using a bigger die (which will still be considerably smaller than the one NVidia is using).
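The CU-scaling estimate a few posts up can be written out the same way. A rough sketch where the 300 mm^2, 56/52 CU, and 80 CU figures are the thread's own assumptions (not confirmed specs), and everything is scaled linearly:

```python
# Crude linear-scaling estimate from the thread's XSX-based figures.
xsx_cus_physical = 56      # physical CUs assumed on the XSX GPU portion
xsx_cus_active = 52        # active CUs as shipped
xsx_gpu_area_mm2 = 300     # rough area attributed to those 56 CUs

big_navi_cus = 80          # rumored Navi 21 CU count

# Area: scale linearly with physical CU count.
area_scale = big_navi_cus / xsx_cus_physical      # ~1.43
est_area = xsx_gpu_area_mm2 * area_scale          # ~429 mm^2

# Ray tracing throughput: scale with active CUs, ignoring clocks.
rt_scale = big_navi_cus / xsx_cus_active          # ~1.54

print(f"area scale: {area_scale:.2f} -> ~{est_area:.0f} mm^2")
print(f"RT throughput vs XSX: {rt_scale:.2f}x")
```

This reproduces the 1.43x area and ~54% ray tracing figures in the post above; real chips obviously don't scale purely linearly (uncore, memory controllers, and clocks all move the result).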