> Test it in Control
That game that looks like it's from the mid-2000s, eh? And uses that RTX thing most devs avoid. Inspiring.
> ML reigns supreme in pattern recognition of any kind.
This is patently wrong. A NN is a bunch of math equations of a certain kind. By increasing its complexity you could...
> ...what NN can do nowadays without much manual input.
A NN can do exactly the same things it could do decades ago.
> AMD has nothing like DLSS.
As far as I know, people who always buy NV tend to have very sporadic knowledge of GPU tech.
It's a fallacy. I.e. it may look much better if the developer was not accurate enough with their MIP levels for 4K, but overall it cannot be better than native by definition.
Only if the NN is trained per game can it look better than native, for obvious reasons.
So we've had Ampere revealed, and we roughly know what to expect.
RTX 3090 (from $1499) - complete overkill SKU.
RTX 3080 (from $699) - NVIDIA claims 2x the performance of the 2080 (not the Ti, thanks to those who corrected me). Looks like an over-promise, but a 70-80% gain is believable from the numbers.
RTX 3070 (from $499) - NVIDIA claims better than the 2080 Ti, probably as long as you don't go above 1440p.
On the AMD/RDNA2/Big Navi side we still don't really know.
We've got something of a polarisation among tech YouTubers/leakers, with some thinking Ampere has killed AMD's chances and others thinking AMD is quietly waiting to pounce.
On the understanding that, at this moment in time, we really have no idea, let's think about what could happen in a few months time. I'll go first.
I think AMD will release a 16GB GPU that destroys the 3070 but doesn't quite beat the 3080. I think they'll release it for $499.
It would be crazy if they also released a 24GB or 32GB (is that even possible?) variant for around $899.
It's perfectly within the realms of possibility that NVIDIA then releases a 3070 Super/Ti later on with 16GB and higher clocks (it's not quite as power hungry as the 3080/90).
Question for everyone:
Why do we think NVIDIA has been so aggressive on pricing?
That game that looks like it's from the mid-2000s, eh? And uses that RTX thing most devs avoid. Inspiring.
I always suspected you to be a troll. Now, I'm convinced.
My knowledge is fine, thank you. I don't want to exclude AMD here, but the new 3000 Series is looking good over here, that's all I'm saying.
This is patently wrong. A NN is a bunch of math equations of a certain kind. By increasing its complexity you could...
A NN can do exactly the same things it could do decades ago.
There is no free lunch: you train a NN, it picks up biases.
If you train per game, you can teach the NN to be biased toward the way the art works in that particular game (e.g. Team Fortress visuals vs God of War).
The more generic you go, the less help the NN brings.
Now, that was the theory. When applied in practice (DLSS 1.0), we saw that FidelityFX (which also runs on NV hardware, mind you) beat all that data-center magic.
And now we have a somewhat boosted TAA being sold as the second coming.
For the love of god, please stop referring to it by target resolution; it is terribly misleading.
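To make the bias point concrete, here is a minimal sketch in plain numpy (a toy I made up; it has nothing to do with NVIDIA's actual DLSS network, training data, or API): a tiny network trained only on hard-edged, cartoon-like content learns a "sharpen everything" prior and then applies it to smooth content where it does not belong, while the same network trained on matching content does not.

```python
# Toy illustration only: a "learned upscaler" is just a parameterised function
# fitted to example pairs, so the content it is trained on determines its bias.
# All data and names here are invented; this is not how DLSS is implemented.
import numpy as np

rng = np.random.default_rng(0)

def make_pairs(n, cartoon):
    """Low-res 1D patches (4 samples) -> the missing centre sample.
    'cartoon' content has hard edges at full resolution, smooth content does not."""
    lo = rng.uniform(-1.0, 1.0, size=(n, 4))
    mid = 0.5 * (lo[:, 1] + lo[:, 2])
    hi = np.sign(mid) if cartoon else mid
    return lo, hi

def train(lo, hi, hidden=16, steps=3000, lr=0.1):
    """Fit a one-hidden-layer tanh network with plain full-batch gradient descent."""
    W1 = rng.normal(0.0, 0.5, (4, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    n = len(hi)
    for _ in range(steps):
        h = np.tanh(lo @ W1 + b1)
        err = (h @ W2 + b2).ravel() - hi          # prediction error
        dW2 = h.T @ err[:, None] / n
        db2 = np.array([err.mean()])
        dh = (err[:, None] @ W2.T) * (1.0 - h ** 2)
        dW1 = lo.T @ dh / n
        db1 = dh.mean(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2

def mse(params, lo, hi):
    W1, b1, W2, b2 = params
    pred = (np.tanh(lo @ W1 + b1) @ W2 + b2).ravel()
    return float(np.mean((pred - hi) ** 2))

net_cartoon = train(*make_pairs(2000, cartoon=True))   # trained on hard-edged content
net_smooth  = train(*make_pairs(2000, cartoon=False))  # trained on smooth content
test_lo, test_hi = make_pairs(500, cartoon=False)      # evaluate both on smooth content

print("trained on mismatched content, MSE:", round(mse(net_cartoon, test_lo, test_hi), 4))
print("trained on matching content,   MSE:", round(mse(net_smooth,  test_lo, test_hi), 4))
```

Same equations, same architecture; only the training set changed, and that is exactly where the bias comes from.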
As far as I know, people who always buy NV tend to have very sporadic knowledge of GPU tech.
> This is patently wrong. A NN is a bunch of math equations of a certain kind...
Yeah, educate yourself on that, please....
> A NN can do exactly the same things it could do decades ago.
Aw gee, if only the hardware had evolved since then...
> If you train per game, you can teach the NN to be biased toward the way the art works in that particular game (e.g. Team Fortress visuals vs God of War).
This is the same bullshit you started with the RT discussion.
> The more generic you go, the less help the NN brings.
> And now we have a somewhat boosted TAA being sold as the second coming.
Wow... that is so ignorant I don't even know what to say to this obvious troll attempt.
> That game that looks like it's from the mid-2000s, eh? And uses that RTX thing most devs avoid. Inspiring.
More troll nonsense....
> Now, I'm convinced.
> More troll nonsense....
Fuck off with brigading assholery.
> Yeah, educate yourself on that, please....
Your statement was about math, a front on which you are lacking, I guess.
I work with that tech professionally and that is exactly what it does best...the absolutely easiest use case.
> This is the same bullshit you started with the RT discussion.
> Wow... that is so ignorant I don't even know what to say...
"Thank you for telling me something I didn't know" would be good enough. Or just nothing.
Well they launched the 780 Ti for $699 the same month as the PS4 launched for $399 (and also when AMD had the R9 290X which was a pretty decent card) so that doesn't really hold true unless you go all the way back to PS3.
Ofc there is. You don't actually think Nvidia ever shuts their NN down or stops adapting their sample base, do you?
Correct. Which is why you have specialized hw to apply it in time.
I think that's a given. You don't just set a flag and have flawless TAA either.
I don't know; let's say at the top-end estimate, 80% is probably "reasonably competitive", with Ampere taking a definite lead. But who knows, maybe it could be 90+% of Ampere? Would that be competitive in your view, or would only a 5% loss to Ampere be competitive? I'm honestly asking, because I don't really know how the RT will play out; it's still too early to tell, but it is likely that Ampere will beat out RDNA2 in RT. The speculation at the moment, I guess, is "by how much?" and "is that amount enough/competitive?"
> It's generic.
And you think that means it's not constantly trained and expanded? I guarantee you that this system hasn't had a free day in the last 3 years.
> Anyway, fine-tuning is per-game training.
With the system as it's supposed to/marketed to work, it's probably more like feedback to Nvidia with a follow-up extension/adjustment of the sample base, if at all necessary.
> Question for everyone: Why do we think NVIDIA has been so aggressive on pricing?
> I'm dismissing the argument because it's been wrong two generations in a row. What's there to be said exactly? I'm being condescending because you didn't even bother addressing my argument and instead stated exactly why the belief that they'd be competitive based on price is stupid by bringing up garbage cards like the Fury X to prove a point.
I didn't bother addressing your argument? I made a cohesive and logical argument as to why it might be the case that nVidia is reacting to AMD being at least present at the high end. There are reasons as to why this time it's more than just being present, and I didn't get to that in that reply, but I will in this one.
> People need to stop with this "NVIDIA is aggressive with prices because they know AMD is cooking up something good".
Not an argument.
> That reasoning is stupid, people made the same one regarding the 980 Ti and 1080 Ti,
I never saw anyone make the argument back then that the prices of the 980 Ti and 1080 Ti were good/cheap because AMD was going to come out with something better. No one considered them cheap. In fact, the pricing of all these cards was in line with their previous generation. The 980 Ti and 1080 Ti had almost the same release price, and so did the 780 Ti. Those prices were expected and normal. So, again, not an argument, but simply a lie about the past.
> AMD ended up having no answer.
Previous events are not a guarantee of future results. But the fact that AMD's products were not up to par is only of partial significance to your 'argument'. At least AMD participated. And I'll say it again: the one moment AMD did not have a product at the high end, nVidia boosted prices significantly. What explanation do you have for that...?
> They might or might not have something great in the pipeline, NVIDIA being aggressive with prices isn't indication of that at all.
It's not just the pricing.... I have a few questions for you.
Won't bother debunking the nonsense of your post.
You use my so-called not addressing your arguments as an excuse to be condescending (even though I actually did), and now you reply with something like this? Hypocrite. How low can you go?
I won't bother addressing it because it wouldn't even be productive. I think NVIDIA lowering their prices has little to do with them being afraid of AMD based on precedents.
I guess you're out of ammo. And too arrogant to admit it. Guess we're done here.
The most important question:
Every 80-class GPU has been a 104 chip, while every 80 Ti/Titan-class GPU has been a 102. Suddenly, the non-Ti 80-class GPU is a 102 chip and we don't have a Titan. Why has nVidia shifted its 102 chip down one tier?
> I think AMD can be competitive based solely on what RDNA2 has shown us thus far. None of the stupidity about NVIDIA being afraid and lowering their prices.
It's not a question of fear. It's a question of pricing products to compete against the presumed performance of their AMD counterparts. If you have a monopoly in a market, you increase pricing to maximise revenue. If you don't, then you have to price match.
Because we've seen this exact scenario in the past.
I don't see what is super controversial about this.
> If we go back to the first GTX Titan GPU, launched in February 2013, Nvidia used their GK110 chip (first seen in the Quadro K6000—sound familiar?), only with one SMX disabled. At the time, Nvidia had the fastest and second fastest GPUs for gaming, and they weren't in any hurry to reduce prices. It wasn't until October of 2013 that AMD finally had a viable competitor to the GTX 780 and Titan.
> When AMD launched their first Hawaii GPU, the R9 290X, they laid claim to the title of the fastest gaming GPU, beating out the GTX Titan in most games. Nvidia's response came one month later with the GTX 780 Ti, which had half the memory of the GTX Titan (and 'slow' FP64 support) but included the fully enabled GK110 chip, along with higher clocks, all at a lower price. Nvidia closed the performance gap, and even if they didn't outright win, they at least had a viable claim to the throne.
> Back to the present, we know that AMD is prepping Vega 10 for release—it might make it out in 2016, but 'early' 2017 is more probable. Whether they end up calling it the RX 490, RX Fury, or something else isn't important; Vega will come out, and it could be a performance monster. Best indications are it will have 16GB of HBM2 and 4,096 cores, with higher clocks and significantly better performance than Fury X.
> Nvidia spoiled the launch of the Fury X by releasing the GTX 980 Ti. They had more memory, overall better performance (even if there are a few cases where Fury X beat the 980 Ti), and the cost of manufacturing GM200 is significantly lower than Fiji. Looking at GP102 and Vega, assuming the rumors are anywhere close to accurate, Nvidia is going to try to do the same again with the 1080 Ti.
It's really sad how graphics have gone backwards since 2005.
> Because we've seen this exact scenario in the past. Here you have an article supposing the reason for the then-unannounced 1080 Ti. It was apparently priced in a way that made buying the Titan X foolish, in anticipation of AMD having a monster in the works. Something which never happened.
Well, as I see it, there are two claims:
All models are more expensive than their old counterparts
Aren't the launch prices of the 3070/80 identical to the launch prices of the 2070/80, FE were even a bit more if I'm not mistaken. How are those more expensive? Or are you using current prices?
1080 MSRP was $599 (not counting the Founders Edition crap; everyone waits for 3rd party anyway). We'll see if 3rd-party 3080s end up $100 cheaper or not, but I've not heard anything of the sort, and I don't think they specified the announced prices as FE-only either.
Founders were $599 and $799 respectively for the 2070 and 2080, so they (the new cards) are definitely cheaper.
Kinda lame that the 3070 only has 8GB when the 2070 had the same 2 years ago. 2080 Super was $699 at launch.
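For reference, here are the launch prices quoted in this thread put side by side (a quick sketch using only the figures people posted above, not a verified MSRP list):

```python
# Launch prices in USD as quoted in the posts above; these are the thread's
# figures, not verified MSRPs.
launch_price = {
    "GTX 1080": 599,
    "RTX 2070 FE": 599,
    "RTX 2080 FE": 799,
    "RTX 2080 Super": 699,
    "RTX 3070": 499,
    "RTX 3080": 699,
    "RTX 3090": 1499,
}

# Same-tier, generation-to-generation comparison per the argument above.
for new, old in [("RTX 3070", "RTX 2070 FE"), ("RTX 3080", "RTX 2080 FE")]:
    delta = launch_price[new] - launch_price[old]
    print(f"{new}: ${launch_price[new]} vs {old}: ${launch_price[old]} ({delta:+d})")
```

On those numbers, both new cards come in $100 under the FE launch prices of their direct predecessors, which is the point being argued above.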
Nvidia knows their customers. They know how to market to those customers. It is pretty simple, too!
The 3090 is all mindshare. Barely anyone will buy one compared to the number of PC gamers purchasing new GPUs. Hell, if you go by the Steam charts, barely anyone has a 20xx card, but that is not the point. The point is that it's their show horse. It's something for their customers to get behind and use in online arguments.
For whatever reason AMD has been ignoring this. It doesn't matter what it costs as long as it's a monster. They are playing too close to sales numbers and statistics. They see that most people buy in the low to mid-end range, so that is where they focus, while totally ignoring that critical mindshare aspect.
100% this. Not many Steam users have 20xx cards, but the vast majority does have an Nvidia card. That's how it works: it doesn't matter how much you can spend, 200 or 2000, you'll always take your money to whoever is the best. AMD, on the other hand, basically comes straight out saying their entire lineup is nothing but 1080p cards, with one 1440p card... Which is ironic: there are so many more FreeSync displays out there than G-Sync ones, and they're cheaper at the same time, while AMD is unable to provide GPUs that can actually utilize them. That's such a huge missed opportunity; they could have been THE go-to company for high-refresh gaming.
> I think AMD can be competitive based solely on what RDNA2 has shown us thus far. None of the stupidity about NVIDIA being afraid and lowering their prices.
They will beat the 3070 without a doubt. Can they match the 3080? Perhaps, but this is not a guarantee.
> GTX 780 was based on the same big GPU as the first generation Titan/780 Ti, they're simply going back to their roots.
Well there you go.
They've done it to beat AMD. Same reason they are rumored to be coming out with a new 103 GPU this gen, which will also most likely beat RDNA2.
AMD is barely in the top 10, with the RX 580 in rank 9, and its next card is the RX 570, at rank 16. As for the RX 5700 XT, it doesn’t appear until way down the list, somehow below the 2080 Ti despite being about 1/3 of its price. The 5700 XT has about 0.88% of the market.
> I guess it's weird to see the AMD superfans digging in their heels even harder now that Nvidia is showing the biggest advantage in technology they have ever had over AMD.
Not everything is a console/hardware war. What is really weird is the lack of critical thinking towards nVidia. But I guess that was always the case...
> I find it hilarious that the 3090 is viewed as an overkill SKU in anything other than power draw.
I share your sentiment. The thing is, they know exactly what to do to make the masses gobble it up. The RTX 3080 and RTX 3090 are the exact same chip. So it's ludicrous that one should be twice the price of the other, but that's how nVidia operates.
The thing is unlikely to be much more than 20% faster than a 3080, but at more than twice the price.
Nvidia really know how to market a product. Damn.
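To put the "about 20% faster at more than twice the price" point in numbers (back-of-the-envelope, using the $699 and $1499 figures from earlier in the thread and the ~20% guess above; none of this is benchmarked):

```python
# Rough perf-per-dollar comparison using the thread's own numbers:
# 3080 at $699 as the baseline, 3090 at $1499 with an assumed ~20% performance lead.
price_3080, price_3090 = 699.0, 1499.0
perf_3080, perf_3090 = 1.00, 1.20  # relative performance; the 20% is an assumption

value_3080 = perf_3080 / price_3080
value_3090 = perf_3090 / price_3090
print(f"3090 performance per dollar vs 3080: {value_3090 / value_3080:.2f}x")
```

Under that assumption the 3090 delivers a little over half the performance per dollar of the 3080, which is exactly why it reads as a mindshare/halo product rather than a value one.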
Some very interesting leaks here...;