AMD Polaris architecture to succeed Graphics Core Next


Here is a better example that's more up to date than using 290X model cards:

[image: rumored Polaris spec chart]
 
I'm looking at building a PC at some point in the next 12 months, likely Q4 2016 or Q1 2017, to hopefully be able to do 1080/60 at Ultra in basically every AAA game released up until 2013 (and beyond when possible, of course) and handle indies and games that aren't graphical powerhouses for a good 5-7+ years. I had been thinking that I'd pick up a 1070, as when I compared the 970 to the comparable AMD model, the 970's power draw was way lower. However, looking at those rumored specs, it seems like both Polaris 11 and 10 have great (i.e. low) power draws while also having great performance, which is making me think that perhaps I should look at AMD instead.

I'm all new to this (mostly a console gamer and probably always will be), but my 2010 MacBook Pro isn't exactly ideal for experiencing a lot of the PC games I'm interested in, so I'm looking at these AMD and Nvidia product launches with a lot of interest. I'm really hoping both companies put out compelling products.
 
I'm looking at building a PC at some point in the next 12 months, likely Q4 2016 or Q1 2017, to hopefully be able to do 1080/60 at Ultra in basically every AAA game released up until 2013 (and beyond when possible, of course) and handle indies and games that aren't graphical powerhouses for a good 5-7+ years. I had been thinking that I'd pick up a 1070, as when I compared the 970 to the comparable AMD model, the 970's power draw was way lower. However, looking at those rumored specs, it seems like both Polaris 11 and 10 have great (i.e. low) power draws while also having great performance, which is making me think that perhaps I should look at AMD instead.

I'm all new to this (mostly a console gamer and probably always will be), but my 2010 MacBook Pro isn't exactly ideal for experiencing a lot of the PC games I'm interested in, so I'm looking at these AMD and Nvidia product launches with a lot of interest. I'm really hoping both companies put out compelling products.
If you're planning to keep your video card for a while, make sure your due diligence is on point. Nvidia has developed a trend of performance falling off a cliff after 18 months.
 
If you're planning to keep your video card for a while, make sure your due diligence is on point. Nvidia has developed a trend of performance falling off a cliff after 18 months.

Oh, hmm, I hadn't thought something like that would be possible. Admittedly, I plan to use consoles as my primary way of experiencing blockbusters and I probably won't be trying to run anything ambitious past the first year, so I'm not too worried. However, I'll look into that.

I have a weird bias in favor of AMD (probably because I'm a console guy), but the power consumption I saw last gen kind of scared me. These rumored specs make me very excited for that reason.
 
If you're planning to keep your video card for a while, make sure your due diligence is on point. Nvidia has developed a trend of performance falling off a cliff after 18 months.
Technically speaking, the performance itself doesn't actually "fall off a cliff". It's just that nVidia adhere quite strictly to an aggressive engineered obsolescence model via turning various things on and off in their software packs/drivers (or removing them altogether). It's why you end up with Kepler Titan Blacks running worse than one would expect by reading the specs.
 
Technically speaking, the performance itself doesn't actually "fall off a cliff". It's just that nVidia adhere quite strictly to an aggressive engineered obsolescence model via turning various things on and off in their software packs/drivers (or removing them altogether). It's why you end up with Kepler Titan Blacks running worse than one would expect by reading the specs.

I think AMD has also "benefited" in this regard by using the same general architecture for a while now with relatively minor revisions, allowing older GCN cards to benefit from driver improvements for newer ones. We'll just have to see whether that holds true for Polaris and beyond.
 
To me it's almost too obvious and perfect how the leaks line up as semi-custom parts: Polaris 11 = NX and Polaris 10 = PS4K.

Polaris 10 [Ellesmere]
target - 1440p60 DX12 gaming for desktop
36CU 2304 SPs & 40CU 2560 SPs
~100-110W

Exactly 2X PS4 GPU for the 36CU version

Polaris 11 [Baffin]
target - 1080p60 gaming for desktop & laptops
2 GPUs
16CU 1024 SPs & 20CU 1280 SPs
4GB of GDDR5/X
~50W

16 CUs + the new architecture = a bit more powerful than the PS4, plus nice and low-powered.

If the rumor posted before is true (the GPU/APU is smaller than PS4's), PS4k can't be using Polaris 10. It has to be either Polaris 11 or a custom config. That also means that Polaris 11 is unlikely for the NX.

I just hope that Polaris 10 Pro is nice, cheap, and faster than my GTX 970 by a good amount so I can be free of the 3.5GB prison. :P
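For what it's worth, the "exactly 2X" line checks out if you assume GCN's standard 64 stream processors per CU (a quick sanity check; the CU counts are from the leak above, not confirmed):

```python
# GCN puts 64 stream processors (SPs) in each compute unit (CU).
SPS_PER_CU = 64

ps4_cus = 18          # PS4's GPU has 18 active CUs
polaris10_cus = 36    # smaller rumored Polaris 10 variant

ps4_sps = ps4_cus * SPS_PER_CU              # 1152 SPs
polaris10_sps = polaris10_cus * SPS_PER_CU  # 2304 SPs, matching the leak

print(polaris10_sps / ps4_sps)  # 2.0 -- exactly double the PS4's shader count
```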
 
I think AMD has also "benefited" in this regard by using the same general architecture for a while now with relatively minor revisions, allowing older GCN cards to benefit from driver improvements for newer ones. We'll just have to see whether that holds true for Polaris and beyond.
That too. An improvement for one series of GCN cards translates into improvements for all of them (with some exceptions, of course, like GPUs with color compression algorithms, which tend to benefit more).
 
Technically speaking, the performance itself doesn't actually "fall off a cliff". It's just that nVidia adhere quite strictly to an aggressive engineered obsolescence model via turning various things on and off in their software packs/drivers (or removing them altogether). It's why you end up with Kepler Titan Blacks running worse than one would expect by reading the specs.

Same end result no matter how you word it. As time goes by and newer games release, Nvidia GPUs just keep sliding down the performance tiers compared to their AMD counterparts.
 
What's really crazy is that my three-year-old Radeon 7970 is giving roughly the same performance as a GTX 970 in newer DX12 titles. As time has gone on, the card has only gotten better, which is just an insane return on investment.
 
Technically speaking, the performance itself doesn't actually "fall off a cliff". It's just that nVidia adhere quite strictly to an aggressive engineered obsolescence model via turning various things on and off in their software packs/drivers (or removing them altogether). It's why you end up with Kepler Titan Blacks running worse than one would expect by reading the specs.

What a bunch of BS.

If this was true older games would run worse now than they did in the past on Kepler and tests have shown that this isn't the case at all.

It's just a case of consoles using AMD and AMD reaping the benefits. Even Maxwell is falling behind. You think Nvidia is already killing Maxwell performance?
 
What a bunch of BS.

If this was true older games would run worse now than they did in the past on Kepler and tests have shown that this isn't the case at all.

It's just a case of consoles using AMD and AMD reaping the benefits. Even Maxwell is falling behind. You think Nvidia is already killing Maxwell performance?
Err... older games aren't the ones affected. It's mostly in newer titles that you notice older AMD cards performing (nearly) on par with nVidia's older cards, whereas in the past they would've been decidedly slower.
 
The new GPUs this year seem to be lining up almost identically with how the AMD 7000 series did at launch.

Remember the 7970 was $550 at release and the 7950 was $450. The GTX 680 was $500 and the GTX 670 was $400.

Things may very well line up like this again. There was no full Kepler chip until the 780 Ti the following year, and it was $699. AMD's 290X was $549.

Instead, both Nvidia and AMD pushed out dual-GPU cards, the 690 and the 7990, which were both priced at $1000.


I think we'll see something very similar this time as well. Prices for the 490X/490 and 1070/1080 may be a bit lower, but I'm very willing to bet we won't see the big dogs until next year. Both of those will be the HBM2 monsters.


Edit: Since I am comparing these GPUs, I would also point out the benchmarks between the 7970/7950 and the 6970, since that was also a node jump.

HARDOCP review of the 7970/7950

Even the 7950 was beating the 6970 by decent margins.
 
What a bunch of BS.

If this was true older games would run worse now than they did in the past on Kepler and tests have shown that this isn't the case at all.

It's just a case of consoles using AMD and AMD reaping the benefits. Even Maxwell is falling behind. You think Nvidia is already killing Maxwell performance?

No, Nvidia won't kill Maxwell performance as in making it worse. What they will do is stop optimizing its drivers when Pascal comes out, like they stopped optimizing Kepler when Maxwell came out.
 
A Kepler Titan Black (basically a 6GB version of the 780 Ti) could effortlessly pummel a 6GB 7970 GHz Edition into the dirt when it launched. On recent titles? The difference isn't as wide anymore. And the Titan Black was crazy expensive.

Never mind, I misinterpreted your post.
 
A Kepler Titan Black (basically a 6GB version of the 780 Ti) could effortlessly pummel a 6GB 7970 GHz Edition into the dirt when it launched. On recent titles? The difference isn't as wide anymore. And the Titan Black was crazy expensive.

However, there is also the fact that GCN, as conceived, was generally more forward-looking than Kepler.
 
However, there is also the fact that GCN, as conceived, was generally more forward-looking than Kepler.
Which I think is the point he was demonstrating. With new AMD cards excelling in their support of GCN and DX12, even my older 7970 is reaping the rewards. If AMD continues this forward-thinking trend with Polaris, then it should be very competitive with Nvidia, if current benchmarks are any evidence.
 
Which I think is the point he was demonstrating. With new AMD cards excelling in their support of GCN and DX12, even my older 7970 is reaping the rewards. If AMD continues this forward-thinking trend with Polaris, then it should be very competitive with Nvidia, if current benchmarks are any evidence.

There are three separate issues.

1. GCN in its details was more forward-looking than Nvidia's architecture at the time. At a higher level, Nvidia was already SIMD-based and looking at widespread compute.

2. The iterative strategy in GCN revisions allowed them to maximize whatever optimizations they made and apply them globally, whereas Nvidia has to optimize various architectures in different ways. If Pascal is an iteration on Maxwell, it may benefit from similar things (Maxwell was a substantial departure from Kepler).

3. GCN GPUs do much more work at the hardware level; Nvidia has a lot more resources, such that they can do a lot of work in drivers (except when they decide not to).
 
There are three separate issues.

1. GCN in its details was more forward-looking than Nvidia's architecture at the time. At a higher level, Nvidia was already SIMD-based and looking at widespread compute.

2. The iterative strategy in GCN revisions allowed them to maximize whatever optimizations they made and apply them globally, whereas Nvidia has to optimize various architectures in different ways.

3. GCN GPUs do much more work at the hardware level; Nvidia has a lot more resources, such that they can do a lot of work in drivers (except when they decide not to).

3 is the big one IMO.
 
Dunno why they would suddenly drop the ball on Windows drivers; their driver reputation contributed to their massive mindshare. But I guess they are too far in the lead to care.
 
What's really crazy is that my three-year-old Radeon 7970 is giving roughly the same performance as a GTX 970 in newer DX12 titles. As time has gone on, the card has only gotten better, which is just an insane return on investment.

AMD cards age like fine wine.

Nvidia cards age like the kind of wine that sours into vinegar :(
 
I went from AMD to Nvidia to AMD. I was thinking of going back to Nvidia because of Maxwell's TDP, but now it looks like I'll go AMD again. Holy shit if the TDP for Polaris 10 is only around 100-110W!

AMD cards age like fine wine.

Nvidia cards age like the kind of wine that sours into vinegar :(

The 8800GT lasted quite a bit.

Now AMD's GCN cards are doing something similar.
 
I went from AMD to Nvidia to AMD. I was thinking of going back to Nvidia because of Maxwell's TDP, but now it looks like I'll go AMD again. Holy shit if the TDP for Polaris 10 is only around 100-110W!

The article I saw (don't remember if it was linked here, probably was) mentioned Polaris 10 was going to be the 480/X and would probably come in at $300 or less to replace the 380 series.

I'd almost certainly be in for a 480 or 480X in that case, almost sounds too good to be true though.
 
I went from AMD to Nvidia to AMD. I was thinking of going back to Nvidia because of Maxwell's TDP, but now it looks like I'll go AMD again. Holy shit if the TDP for Polaris 10 is only around 100-110W!



The 8800GT lasted quite a bit.

Now AMD's GCN cards are doing something similar.

Yes, while the 8800GT (and its incarnations) still holds the record for longevity, it speaks volumes that my 7950 from 3 years ago can still trade blows with some of the newer cards.
 
The article I saw (don't remember if it was linked here, probably was) mentioned Polaris 10 was going to be the 480/X and would probably come in at $300 or less to replace the 380 series.

I'd almost certainly be in for a 480 or 480X in that case, almost sounds too good to be true though.

The 480 aligns more with the 390/X than anything.
 
The 480 aligns more with the 390/X than anything.

That's why it feels too good to be true. I can't see them dropping the price for that level of performance that much yet, regardless of branding. Especially if they don't have a higher-end GPU coming soon, it seems more like the 480X would come in at about $400 and the 480 at maybe $350.
 
That's why it feels too good to be true. I can't see them dropping the price for that level of performance that much yet, regardless of branding.

You could say the same about cards like the 970 at launch--it sold like crazy because it was a very powerful card for the price segment it launched in. Every generation there's usually one card that is a complete steal.
 
You might as well sit back and wait a bit. I was planning on an upgrade mid-2015, but I held back after rumors of Polaris started showing up.

AMD and Nvidia

Seems like Polaris 11 and GP104 will both be announced at Computex 2016. The difference is that I expect Polaris 11 (R7 470 and R9 480) to be R7 380[X]- and R9 390-level cards with better energy efficiency, while I imagine GP104 (GTX 1080 and GTX 1070) will most likely bring noticeable performance increases--Polaris is more a revision of GCN, whereas Pascal is a new architecture.
 
AMD and Nvidia

Seems like Polaris 11 and GP104 will both be announced at Computex 2016. The difference is that I expect Polaris 11 (R7 470 and R9 480) to be R7 380[X]- and R9 390-level cards with better energy efficiency, while I imagine GP104 (GTX 1080 and GTX 1070) will most likely bring noticeable performance increases--Polaris is more a revision of GCN, whereas Pascal is a new architecture.
Based on the available info, you have it backwards.
 
Yes, while the 8800GT (and its incarnations) still holds the record for longevity, it speaks volumes that my 7950 from 3 years ago can still trade blows with some of the newer cards.
The 7970/7950 were released in December 2011, so they are well over 4 years old and still in the mainstream performance bracket with the GTX 960 and 380(X).
 
To be fair, I don't believe the PS4K will use Polaris.

I guess Sony/AMD will use the same APU architecture as the PS4, with more CUs and a higher clock on 14nm.

Something like...

APU, 14nm
8 Jaguar cores @ 1.8GHz
32 CUs @ 900MHz

That will up actual PS4 performance from 1.84TF to 3.68TF.

That's exactly twice the power... I guess it will be some combination of CU count and clock near this config, maybe 30 CUs @ 1GHz.
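The arithmetic behind those TF figures, using the standard GCN peak-FLOPS formula (CUs x 64 SPs x 2 FLOPs per clock); the CU/clock combos are my guesses above, not confirmed specs:

```python
def gcn_tflops(cus, clock_mhz):
    """Peak FP32 TFLOPS for a GCN GPU:
    CUs x 64 SPs x 2 FLOPs per clock (fused multiply-add), MHz in, TF out."""
    return cus * 64 * 2 * clock_mhz / 1e6

print(gcn_tflops(18, 800))   # 1.8432 -- stock PS4 (~1.84TF)
print(gcn_tflops(32, 900))   # 3.6864 -- the config above, exactly 2x
print(gcn_tflops(30, 1000))  # 3.84   -- the 30 CU @ 1GHz alternative
```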
 
Where are all those NeoGAF forum wankers claiming better support from Nvidia now?

They still will, even as people continue to post the impressive performance AMD cards are getting, especially in DX12. I've been an Nvidia user for over a decade, and I realize that they are fast becoming the Apple of graphics processing. However, brand loyalty is an incredibly powerful thing, as you can see with GAF's insane fixation on Sony. Just as Sony can do no wrong, to many here Nvidia can also do no wrong. Basically, at this point it's an Android vs. Apple kind of dynamic. A whole bunch of cognitive dissonance.
 
To be fair, I don't believe the PS4K will use Polaris.

I guess Sony/AMD will use the same APU architecture as the PS4, with more CUs and a higher clock on 14nm.

Something like...

APU, 14nm
8 Jaguar cores @ 1.8GHz
32 CUs @ 900MHz

That will up actual PS4 performance from 1.84TF to 3.68TF.

That's exactly twice the power.

Although that is possible I guess, why would Sony pay AMD to design a new GPU based on old tech when Polaris is right there?

Also, it is claimed that Jaguar already bottlenecks the PS4, so how would doubling GPU performance be a good idea?
 
It seems like the 470 and 480 are rumored to be unveiled soon. As someone who is unfamiliar with AMD's naming but a bit more familiar with how Nvidia labels theirs, which models in Nvidia's lineup are the x70 and x80 generally comparable to? Based on GPU boss, it seems like Nvidia's numbers are generally 20 below AMD's (so 970 vs. 390, 960 vs. 380, 950 vs. 370, etc.). Is that right?
 
Based on available info you have it backwards

I'm basing that on their marketing and that Pascal has changes to the way it handles compute which will close the DX12 gap a bit, and will probably be more energy efficient by virtue of going from 28nm to 16nm. Polaris 11 is touting energy efficiency above all and is a smaller die than Hawaii/Grenada XT, it is clearly intended to be an iterative improvement on the 390 rather than hold a candle to the flame that is Fury.

It seems like the 470 and 480 are rumored to be unveiled soon. As someone who is unfamiliar with AMD's naming but a bit more familiar with how Nvidia labels theirs, which models in Nvidia's lineup are the x70 and x80 generally comparable to? Based on GPU boss, it seems like Nvidia's numbers are generally 20 below AMD's (so 970 vs. 390, 960 vs. 380, 950 vs. 370, etc.). Is that right?

That's a relatively new thing--implemented in the Rx 200 series. AMD changes their numbering scheme often for no apparent reason (HD 4870 -> HD 5870 -> HD 6970 -> HD 7970 -> 290[X] -> 390[X]/Fury X) compared to Nvidia (280 -> 480 -> 580 -> 680 -> 780 [Ti] -> 980 [Ti]). Maybe it's an incremental move towards a similar scheme as Nvidia, but we're not entirely sure Nvidia is staying with the same scheme, as usually once they pass 9's they restart (or at least they did with the 9000 series).
 