"2 GPUs" ?
"2 GPUs" ?
"2 GPUs" ?
so good
So where will my 290 fall with these new cards?
I'm looking at building a PC at some point in the next 12 months, likely Q4 2016 or Q1 2017, to hopefully be able to do 1080/60 at Ultra in basically every AAA game released up until 2013 (and beyond when possible, of course) and handle indies and non-graphical powerhouses for a good 5-7+ years. I had been thinking that I'd pick up a 1070, as when I compared the 970 to the comparable AMD model, the 970's power draw was way lower. However, looking at those rumored specs, it seems like both Polaris 11 and 10 have great (i.e. low) power draw while also having great performance, which is making me think that perhaps I should look at AMD instead.
I'm all new to this (mostly a console gamer and probably always will be), but my 2010 Macbook Pro isn't exactly ideal for experiencing a lot of the PC games I'm interested in, so I'm looking at these AMD and Nvidia product launches with a lot of interest. I'm really hoping both companies put out compelling products.
If you're planning to keep your video card for a while, make sure your due diligence is on point. Nvidia has developed a trend of performance falling off a cliff after 18 months.
Technically speaking, the performance itself doesn't actually "fall off a cliff". It's just that nVidia adheres quite strictly to an aggressive engineered obsolescence model, turning various things on and off in their software packs/drivers (or removing them altogether). It's why you end up with Kepler Titan Blacks running worse than one would expect by reading the specs.
To me it is too obvious and perfect that the leaks line up nicely as semi-custom parts: Polaris 11 = NX and Polaris 10 = PS4K.
Polaris 10 [Ellesmere]
target - 1440p60 DX12 gaming for desktop
36CU 2304 SPs & 40CU 2560 SPs
~100-110W
Exactly 2X PS4 GPU for the 36CU version
Polaris 11 [Baffin]
target - 1080p60 gaming for desktop & laptops
2 GPUs
16CU 1024 SPs & 20CU 1280 SPs
4GB of GDDR5/X
~50W
16 CUs + the new architecture = a bit more powerful than the PS4, plus nice and low-powered.
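Incidentally, the SP figures in these leaks follow directly from GCN's layout: each compute unit holds 64 shader processors, so SPs = CUs × 64. A quick sanity check of the numbers above (purely illustrative):

```python
# GCN puts 64 shader processors (SPs) in each compute unit (CU),
# so a leaked SP count should always be the CU count times 64.
for cus in (36, 40, 16, 20):
    print(cus, "CUs ->", cus * 64, "SPs")
# 36 CUs -> 2304 SPs
# 40 CUs -> 2560 SPs
# 16 CUs -> 1024 SPs
# 20 CUs -> 1280 SPs
```

All four line up with the rumored 2304/2560 and 1024/1280 SP configurations, so at least the leaks are internally consistent.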
I think AMD has also "benefited" in this regard by using the same general architecture for a while now with relatively minor revisions, allowing older GCN cards to benefit from driver improvements for newer ones. We'll just have to see whether that holds true for Polaris and beyond.
That too. An improvement for one series of GCN cards translates into improvements for all of them (with some exceptions, of course, like GPUs with color compression, which tend to benefit more).
Technically speaking, the performance itself doesn't actually "fall off a cliff". It's just that nVidia adhere quite strictly to an aggressive engineered obsolescence model via turning various things on and off in their software packs/drivers (or removing them altogether). It's why you end up with Kepler Titan Blacks running worse than one would expect by reading the specs.
What a bunch of BS.
If this was true, older games would run worse now than they did in the past on Kepler, and tests have shown that this isn't the case at all.
It's just a case of consoles using AMD and AMD reaping the benefits. Even Maxwell is falling behind. You think Nvidia is already killing Maxwell performance?
umm, what?
Err... older games aren't the ones affected. It's mostly in newer titles that the phenomenon of older AMD cards performing (nearly) equally with nVidia's older cards (whereas in the past they would've been decidedly slower) is noticeable.
A Kepler Titan Black (basically a 6GB version of the 780 Ti) could effortlessly pummel a 6GB 7970 GHz Edition into the dirt when it launched. On recent titles? The difference isn't as wide anymore. And the Titan Black was crazy expensive.
Which I think is the point he was demonstrating. With newer titles and drivers playing to GCN's and DX12's strengths, even my older 7970 is reaping the rewards. If AMD continues this forward-thinking trend with Polaris, then it should be very competitive with Nvidia, if current benchmarks can be used as evidence.
However, there is also the fact that GCN, when conceived, was generally more forward-looking than Kepler.
There are three separate issues.
1. GCN, in its details, was more forward-looking than Nvidia's architecture at the time. At a higher level, Nvidia was already SIMD-based and looking at widespread compute.
2. The iterative strategy across GCN revisions allowed AMD to take whatever optimizations they found and apply them globally, whereas Nvidia has to optimize various architectures in different ways.
3. GCN GPUs do much more work at the hardware level; Nvidia has a lot more resources, such that they can do a lot of the work in drivers (except when they decide not to).
What's really crazy is that my three year old Radeon 7970 is giving roughly the same performance as a GTX 970 in newer DX12 supported titles. As time has gone on the card has only gotten better, which is just an insane return on investment
This, so much.
AMD cards age like fine wine.
Nvidia cards age like the kind of wine that sours into vinegar!
I went from AMD to Nvidia to AMD. I was thinking of going back to Nvidia because of Maxwell's TDP, but now it looks like I'll go AMD again. Holy shit if the TDP for Polaris 10 is really only around 100-110W!
The 8800GT lasted quite a bit.
Now AMD's GCN cards are doing something similar.
Yes, while the 8800GT (and its incarnations) is still the longest-lived card, it speaks volumes that my 7950 from 3 years ago can still trade blows with some of the newer cards.
The article I saw (don't remember if it was linked here, probably was) mentioned Polaris 10 was going to be the 480/X and would probably hit at $300 or less to replace the 380 series.
I'd almost certainly be in for a 480 or 480X in that case, almost sounds too good to be true though.
The 480 aligns more with the 390/X than anything.
That's why it feels too good to be true. I can't see them dropping the price for that level of performance that much yet, regardless of branding.
If AMD are going to be using Polaris for their R7 460 and R7 470 cards as well, I will skip the R7 370 completely.
You might as well sit back and wait a bit. I was planning on an upgrade mid-2015, but I held back after rumors of Polaris started showing up.
Based on available info you have it backwards.
AMD and Nvidia
Seems like Polaris 11 and GP104 will both be announced at Computex 2016. The difference is I expect Polaris 11 (R7 470 and R9 480) to be R7 380[X]- and R9 390-level cards with better energy efficiency, while I imagine GP104 (GTX 1080 and GTX 1070) will most likely bring noticeable performance increases; Polaris is more a revision of GCN, whereas Pascal is a new architecture.
Yes, while the 8800GT (and its incarnations) is still the longest-lived card, it speaks volumes that my 7950 from 3 years ago can still trade blows with some of the newer cards.
The 7970/7950 were released in December 2011, so they are well over 4 years old and still in the mainstream performance bracket with the GTX 960 and 380(X).
Where are all those neogaf forum wankers claiming better support from nvidia now?
To be fair, I don't believe the PS4K will use Polaris.
I guess Sony/AMD will use the same APU architecture as in the PS4, with increased CUs and clocks using 14nm.
Something like...
APU 14nm
8 Jaguar cores @ 1.8GHz
32 CUs @ 900MHz
That will up the actual PS4's performance from 1.84TF to 3.68TF.
That's exactly twice the power.
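For what it's worth, the arithmetic checks out against GCN's standard peak-throughput formula (64 SPs per CU, 2 FLOPs per SP per clock thanks to fused multiply-add). A rough sketch using the speculated figures above, which are guesses, not confirmed specs:

```python
# Peak single-precision throughput of a GCN GPU, in TFLOPS:
# CUs x 64 SPs per CU x 2 FLOPs per clock (FMA) x clock in GHz.
def gcn_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

print(gcn_tflops(18, 0.8))  # launch PS4 (18 CUs @ 800MHz): 1.8432 TF
print(gcn_tflops(32, 0.9))  # speculated PS4K (32 CUs @ 900MHz): 3.6864 TF
```

And "exactly twice" really does hold, since 32 × 0.9 = 2 × (18 × 0.8).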
It seems like the 470 and 480 are rumored to be unveiled soon. As someone who is unfamiliar with AMD's naming but a bit more familiar with how Nvidia labels theirs, which models in Nvidia's lineup are the x70 and x80 generally comparable to? Based on GPU boss, it seems like Nvidia's numbers are generally 20 below AMD's (so 970 vs. 390, 960 vs. 380, 950 vs. 370, etc.). Is that right?