AMD Polaris architecture to succeed Graphics Core Next

It's funny, I brought this up a long time ago in a PC thread about longevity on NV cards. My GTX 285 and GTX 570 went 3-4 years where I was still getting gains when they released drivers for certain games.

I remember part of the PDF changelog was that it showed performance increases in each game with whatever card from a specific series.

Now NV just does Game Ready drivers, and most of the time I don't see improvements in older games; I actually see more issues.

Now it seems AMD is taking the approach NV started when it kicked off DX10 cards: meaningful driver updates, and video card options that will last you longer than two years performance-wise.
 
AMD-GAF, you might like this :
Anandtech said:
ASUS Announces Three New Displays with Adaptive-Sync Technology

Finally, the ASUS MG248Q is designed for gamers who value high dynamic refresh rates most of all other features. This display will be the first 24” monitor from the company which supports up to 144 Hz refresh rate as well as Active-Sync technology. The monitor uses a TN panel with 1920×1080 resolution, 100,000,000:1 ASUS smart contrast ratio and 350 cd/m2 brightness, offering slightly better specifications and a more aggressive visual design compared to the VG247H and the VG248QE. The display supports dynamic refresh rates between 40 and 144 Hz, according to ASUS, which is a very decent range. As an added bonus, thanks to the extremely high refresh rate, the MG248Q could be used with NVIDIA's 3D Vision stereo-3D kit.
This sounds very promising. No ultra-wide screen yet, but this huge adaptive refresh rate range is definitely something I would take into account when looking for a monitor.
 
This is the first time I've heard or read about Nvidia cards aging poorly. I was planning on getting a Pascal card in the summer, but now I think I'm convinced to go with an AMD card instead.
 
- smaller Polaris 11 rendered "4K VR content" while being passively cooled. :)
https://twitter.com/ryanshrout/status/709761872778514433
I think passive = no additional power supply, in this context.
We'll see how the new generation changes things around, especially with DX12, which seems to favor AMD's current cards at the moment.
I think it will all come down to how GameWorks for DX12 is implemented in games and how many publishers want to use it.
Going by NV's 80% market share, many will. :/
https://www.youtube.com/watch?v=O7fA_JC_R5s
GAF at large is still very pro-Nvidia and there are a few dedicated posters who work very hard to keep it that way.
I had to laugh so hard at this.
I always think that way too when reading the pamphlets of a certain supposedly Russian poster.
I definitely want to go AMD for my next card. The problem is that Freesync monitors are not getting the same treatment as G-Sync ones.

For example, I want a Freesync version of the ASUS PG348Q: ultrawide, 34-inch, G-Sync, 100 Hz, curved.
DisplayPort 1.3 supports UHD resolutions at up to 120 Hz (DP 1.2 only up to 60 Hz).
And the new cards from AMD and NV will support it.
So there should be a new range of UHD monitors coming up once they launch.
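As a rough sanity check (ignoring blanking intervals and link overhead, and assuming 8 bits per colour channel), here is a quick sketch of why DP 1.2's effective data rate only has room for UHD at 60 Hz while DP 1.3 has headroom for 120 Hz:

```python
# Rough bandwidth estimate for uncompressed UHD video over DisplayPort.
# Simplified on purpose: ignores blanking intervals and protocol overhead,
# and assumes 24 bits per pixel (8 bits per RGB channel).

DP12_GBPS = 17.28  # DP 1.2 (HBR2) effective payload rate, Gbit/s
DP13_GBPS = 25.92  # DP 1.3 (HBR3) effective payload rate, Gbit/s

def uhd_bandwidth_gbps(refresh_hz, bits_per_pixel=24):
    """Approximate data rate needed for 3840x2160 at the given refresh rate."""
    pixels = 3840 * 2160
    return pixels * bits_per_pixel * refresh_hz / 1e9

for hz in (60, 120):
    need = uhd_bandwidth_gbps(hz)
    print(f"UHD @ {hz:3d} Hz needs ~{need:.1f} Gbit/s "
          f"(DP 1.2: {DP12_GBPS} Gbit/s, DP 1.3: {DP13_GBPS} Gbit/s)")
```

The 120 Hz case comes out around 24 Gbit/s, which is over DP 1.2's limit but fits (just barely, once real blanking is added back) inside DP 1.3.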
 
The AMD 390X was 2816:176:64 (unified shaders : texture mapping units : render output units).
The GTX 980 was 2048:128:64.

The AMD 480X full chip is rumored at 2560:?:?.

There is no way this chip won't at least perform better than the 390X.

We just need more info. More leaks!
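For a rough sense of what those shader counts mean, the usual peak-FP32 arithmetic is shaders × clock × 2 (one fused multiply-add counts as two FLOPs). The 480X clock is unknown, so the sketch below plugs in the reference clocks of the released cards plus a purely hypothetical clock for the rumoured 2560-shader part:

```python
# Peak single-precision throughput = shaders * clock * 2 (one FMA = 2 FLOPs).
# Clocks for the released cards are their reference engine/boost clocks;
# the "480X" clock below is a placeholder guess, not a leak.

def peak_tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 1e6 * 2 / 1e12

cards = {
    "R9 390X (2816 SP @ 1050 MHz)":                     (2816, 1050),
    "GTX 980 (2048 SP @ 1216 MHz boost)":               (2048, 1216),
    "rumoured 480X (2560 SP @ ~1200 MHz, hypothetical)": (2560, 1200),
}

for name, (sp, mhz) in cards.items():
    print(f"{name}: ~{peak_tflops(sp, mhz):.1f} TFLOPS")
```

Peak TFLOPS obviously isn't the same as game performance, but it shows why a 2560-shader chip at FinFET clocks should land above the 390X on paper.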
 
Passively cooled means cooled without active coolers. It doesn't mean anything else.

Ryan knows what he's talking about, and IMO a ~50 W GPU would not require much cooling anyway.

I know what it means, but he only said "passively"; I can't find "cooled" anywhere in his tweet. ;)
AMD-GAF, you might like this
Interesting, but I'm still waiting for 27", IPS, 1440p, 144 Hz and Freesync with a range of at least 35-144.
 
Adaptive-Sync is the VESA standard for variable refresh rates.
It's an open standard; everyone can use it.

AMD uses it and calls the feature Freesync.
 
If you say "AMD" three times out loud, dr_rus will appear in your mirror with a 3000-word post about why Nvidia is better. ;)

JUST MESSING WITH YOU, DR_RUS.


On topic: this only confirms the 480 and 470 range, right? I would imagine the 490 and 490X versions will be announced at Computex?
 
Adaptive-sync? Is that Freesync?

It is Adaptive-Sync. That is the VESA standard upon which Freesync works. Adaptive-Sync is the standard implemented by monitors; Freesync is the specific implementation on AMD GPUs that uses that standard to provide dynamic refresh rates.

For example, Intel is rumored to start supporting Adaptive-Sync, but it won't be called Freesync.
 
If you say "AMD" three times out loud, dr_rus will appear in your mirror with a 3000-word post about why Nvidia is better. ;)

JUST MESSING WITH YOU, DR_RUS.


On topic: this only confirms the 480 and 470 range, right? I would imagine the 490 and 490X versions will be announced at Computex?

Nah, I only appear when someone starts saying bullshit about how AMD is better than everything including sliced bread and how NV can't make even one transistor work at 1 Hz.

I'm actually quite happy that AMD has had this streak of success lately, as their prospects last year prior to the Fury and 300 series launch were rather grim. Hopefully this will bring them more users and put some pressure on NV. And I certainly hope that Polaris will be good enough to keep that trend going for this year and the next, as the market needs competition more than anything else.
 
Interesting.

Which is better, this adaptive sync or G-Sync?
Here's a video comparing G-Sync with adaptive sync using a high-speed camera. https://www.youtube.com/watch?v=MzHxhjcE0eQ
 
Other than the TN panel, the MG248Q sounds perfect for me. I don't want a monitor that's too big, and I'm fine with 1080p at the smaller-ish size since it doesn't require a monster GPU to drive. The high refresh rate would be nice though.

Might be nice to pair with a new GPU, may need to start putting money aside.
 
Which is better, this adaptive sync or G-Sync?
I think you mean Freesync vs G-Sync?

Adaptive sync is used by both. Without adaptive sync, G-Sync wouldn't be possible. It was introduced back in 2008 as part of the eDP standard, the embedded DisplayPort protocol used for internally connecting display panels digitally. It was originally introduced by Intel as a way to save power.

Since there was no way to use adaptive sync without access to eDP (which external monitors don't use), Nvidia extended the DP protocol on its own and created its own control board to make use of it, which monitor manufacturers have to buy from them. AMD went the other way, pushing for adaptive sync to be accessible through standard DP (and soon HDMI) as well.
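To make the dynamic-refresh idea concrete, here is a purely illustrative sketch (not AMD's or Nvidia's actual driver logic) of how a variable refresh scheme might map frame times onto a monitor's supported window, e.g. the 40-144 Hz range mentioned earlier, with frame repetition below the minimum, similar in spirit to AMD's Low Framerate Compensation:

```python
# Illustrative only: map each frame's render time onto a panel's VRR window.
# The window values are the 40-144 Hz range quoted for the ASUS monitor above.

def effective_refresh_hz(frame_time_ms, vrr_min_hz=40, vrr_max_hz=144):
    """Return the rate the panel would be driven at for one frame."""
    frame_rate = 1000.0 / frame_time_ms
    if frame_rate > vrr_max_hz:
        # Frames arrive faster than the panel can refresh: cap at the maximum.
        return vrr_max_hz
    if frame_rate >= vrr_min_hz:
        # Inside the VRR window: refresh exactly when the frame is ready.
        return frame_rate
    # Below the window: repeat the frame at a multiple that lands back inside
    # the supported range (frame doubling, tripling, ...).
    multiple = 2
    while frame_rate * multiple < vrr_min_hz:
        multiple += 1
    return min(frame_rate * multiple, vrr_max_hz)

for ft in (5.0, 10.0, 16.7, 33.3, 50.0):  # frame times in milliseconds
    print(f"{ft:5.1f} ms frame -> panel driven at {effective_refresh_hz(ft):6.1f} Hz")
```

The point is just that the larger the window, the less often the scheme has to fall back to capping or frame doubling.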
 
I think you mean Freesync vs G-Sync?

Adaptive sync is used by both. Without adaptive sync, G-Sync wouldn't be possible. It was introduced back in 2008 as part of the eDP standard, the embedded DisplayPort protocol used for internally connecting display panels digitally. It was originally introduced by Intel as a way to save power.

Since there was no way to use adaptive sync without access to eDP (which external monitors don't use), Nvidia extended the DP protocol on its own and created its own control board to make use of it, which monitor manufacturers have to buy from them. AMD went the other way, pushing for adaptive sync to be accessible through standard DP (and soon HDMI) as well.

Funny thing: Nvidia's refusal to support Adaptive-Sync (Freesync) is what finally pushed me over the edge. There are other things as well, but that was a huge no-no for me.
 
Technically speaking, the reason why Kepler cards aren't performing well these days is that most devs aren't optimizing their engines for Kepler (or even NV cards in general), as their main optimization target is the GCN hardware in modern consoles. NV software isn't really to blame here.
The console devs only said they moved to a DX11 model, not specifically to GCN, I don't think.
 
Interesting.

Which is better, this adaptive sync or G-Sync?
The main functional difference is that G-sync works with borderless fullscreen mode. G-sync monitors generally also have larger effective refresh rate ranges, and better pixel response characteristics across those ranges. Of course this is model specific, but it does occur pretty much every time the same panel is available from the same manufacturer in a G-sync and adaptive sync version.

The console devs only said they moved to a DX11 model, not specifically to GCN, I don't think.
Console developers are writing their shaders primarily with console hardware in mind. Console hardware is GCN. That this has some performance effect is rather obvious (i.e. it is something I predicted before the consoles even launched).
 
So is AMD going to be able to compete with Nvidia at all? I need to be able to troll my Nvidia friends but I can't do that if it doesn't at least have raw power.
If the rumors are true, AMD will release new cards in the lower-end segment, so the two won't directly compete, at least for 2016 (or you could say Nvidia keeps the performance crown). Hopefully sht gets real in 2017.
 
AMD-GAF, you might like this :

This sounds very promising. No ultra-wide screen yet, but this huge adaptive refresh rate range is definitely something I would take into account when looking for a monitor.

Actually, that's the downside to that monitor. AOC's 24-inch already has a 35-144 range, while Nixeus has one with a 30-144 range. I'm using a GTX 970 right now, but I'm considering getting a Freesync monitor and upgrading to Polaris. I'd probably go for the Nixeus myself - apparently there's no negative ghosting on it like some other 1 ms monitors, plus I like its subdued design. Nice to see ASUS jumping in at this range though.
 
The main functional difference is that G-sync works with borderless fullscreen mode. G-sync monitors generally also have larger effective refresh rate ranges, and better pixel response characteristics across those ranges. Of course this is model specific, but it does occur pretty much every time the same panel is available from the same manufacturer in a G-sync and adaptive sync version.

Console developers are writing their shaders primarily with console hardware in mind. Console hardware is GCN. That this has some performance effect is rather obvious (i.e. it is something I predicted before the consoles even launched).

As well as engines generally ignoring Nvidia's strengths, since they happen to be AMD's weaknesses.
 
Need to know more. I'm buying a gpu soon and I don't care about AMD or Nvidia. I just want the better product.
 
Need to know more. I'm buying a gpu soon and I don't care about AMD or Nvidia. I just want the better product.

Well, even though AMD "invented" HBM, Nvidia is going to reach the market first again with the high end product as usual. So if you're buying a GPU soon, your choices will be Nvidia and Nvidia.
 
Well, even though AMD "invented" HBM, Nvidia is going to reach the market first again with the high end product as usual. So if you're buying a GPU soon, your choices will be Nvidia and Nvidia.

I didn't realize you're from the future. Please show me these products that have been released with HBM.
 
I hope we find something out on these things soon. I'm tired of Nvidia lately with their shoddy updates and lack of support for slightly older cards, and I would love to go team red for a change. Hopefully the rumors aren't true about no true gaming or enthusiast card until next year.
 
I didn't realize you're from the future. Please show me these products that have been released with HBM.

Probably talking about the Tesla P100. A product that retails for thousands and sells in low volumes.

There are no announced consumer (or even prosumer) GPUs that use HBM 2.
 
Probably talking about the Tesla P100. A product that retails for thousands and sells in low volumes.

There are no announced consumer (or even prosumer) GPUs that use HBM 2.

Yeah, a supercomputer GPU with 30% yields that retails for $15k doesn't count. Show me a consumer GPU with HBM2, not rumors of one. A GPU.
 
The console devs only said they moved to a DX11 model, not specifically to GCN, I don't think.

They moved to whatever new h/w they've got, and this h/w is GCN, which can be viewed as DX11 or DX12 hardware depending on what you're doing with it. Consider that the PS4 doesn't have any DX in it, so you can't really "move to DX11" for a PS4 engine. Then, DX11 is just software, and even inside the general "DX11 h/w" generation there are many different h/w implementations with rather different capabilities. Some h/w is bad at some stuff (Kepler is bad at atomics, for example, while GCN is bad at geometry throughput), and if you're not considering this in your renderer, then that h/w will run it badly even though technically it's the same "DX11 h/w".

I hope we find something out on these things soon. I'm tired of Nvidia lately with their shoddy updates and lack of support for slightly older cards, and I would love to go team red for a change. Hopefully the rumors aren't true about no true gaming or enthusiast card until next year.

You mean Fermi cards, which are five and a half years old? Because there is no lack of support for anything after that.

For now the rumors point to Polaris 10 being a Hawaii replacement and Polaris 11 a Tonga replacement. NV's lineup seems to be a step higher, with GP104 replacing GM200 and GM204, and GP106 replacing GM204 and GM206. In any case, I'm pretty sure that Polaris 10 and GP104 will compete at some price point while providing performance close to what the 980 Ti/Fury X provide, so I don't see how that is a "lack of gaming / enthusiast card", really. This cycle will be about upping performance a bit while severely cutting down power consumption - and this is great, as you'll be able to buy a cool, quiet card with top-end performance soon.
 
Check this out, yo:

[Image: AMD Radeon 2016-2017 Polaris / Vega / Navi roadmap]


Mostly the same as the roadmap from Capsaicin but if you like tinfoil, especially on hats, you probably noticed the Fury section is smaller than the Radeon 300 section under 2015. Meanwhile, Polaris 10 and 11 split the 2016 section down the middle.

Videocardz.com speculates this might mean Polaris 10 will occupy both the high end of the Radeon 400 series as well as the successor brand to the Fury. On the other hand, you could say Fury is smaller because it had fewer cards and this just means Polaris 10 and 11 will deliver an equal amount of offerings. I'm more inclined to go with the first interpretation since it echoes what AMD has pretty much been saying all along:

Apart from this mobile-centric small Polaris GPU we were also made aware of an enthusiast version that I’d mentioned above which is also coming out. This “enthusiast” Polaris GPU has since been shown to journalists yesterday at CES. It has been described as the “successor” to the R9 Fury X and R9 390X graphics cards, so it’s clearly a high-end part. This is all part of AMD’s plan to release several SKUs based on each GPU to cover the entire market and regain market share, Su affirmed. From entry-level graphics products to mid-range and high-end parts. Which is how AMD and Nvidia have always chosen to address the different market segments, so no real surprises here.

Just don't tell rus, since he might send it to Nvidia to help them triangulate the new power levels. Also isn't P10 supposed to be the GPU PS4K is using?
 
Check this out, yo:

[Image: AMD Radeon 2016-2017 Polaris / Vega / Navi roadmap]


Mostly the same as the roadmap from Capsaicin but if you like tinfoil, especially on hats, you probably noticed the Fury section is smaller than the Radeon 300 section under 2015. Meanwhile, Polaris 10 and 11 split the 2016 section down the middle.

Videocardz.com speculates this might mean Polaris 10 will occupy both the high end of the Radeon 400 series as well as the successor brand to the Fury. On the other hand, you could say Fury is smaller because it had fewer cards and this just means Polaris 10 and 11 will deliver an equal amount of offerings. I'm more inclined to go with the first interpretation since it echoes what AMD has pretty much been saying all along:



Just don't tell rus, since he might send it to Nvidia to help them triangulate the new power levels. Also isn't P10 supposed to be the GPU PS4K is using?

This picture doesn't mean anything as I doubt that they'll cover the whole lineup for 2016 with just two Polaris chips.
 
My guess is Polaris 10 and 11 are enough to replace all of AMD's current offerings except the Pro Duo. Both chips will likely have two SKUs at first, with a third added later. I don't think AMD needs a ton of SKUs on the desktop to replace their current lineup, which consists of a mess of old chips with varying feature sets. Polaris 10 will likely only match the Furies in performance, in a much more affordable way, and for an actual upgrade over the Fury X we'll have to wait for Vega 10 and 11.
 
This picture doesn't mean anything as I doubt that they'll cover the whole lineup for 2016 with just two Polaris chips.

I know it sounds weird to make a full lineup out of two chips, but they've been expressly saying that's what they would do, first with that CES quote above and then again at GDC:

“What you’ll see us do is completely different with Polaris 10 and 11. We are really focusing on trying to bring FinFET technology with its amazing performance per watt to as many segments as possible. As many ranges of performance as possible. I can tell you Ryan, you and your readers, you’ll be pleased at what we’re going to do with this thing and you’ll be surprised.
[…]
“We’re looking at the entire gamut of players, how many millions of them are there, what they buy, the performance per dollar aspect.
 
I know it sounds weird to make a full lineup out of two chips, but they've been expressly saying that's what they would do, first with that CES quote above and then again at GDC:

Well, let's see what they'll do but I think that they won't be able to do it.

Case one: the Radeon Pro Duo will obviously be in the lineup (unless they're giving that thing a three-month sales window, which is very unlikely), and for some reason they've decided not to show it on this image, which already makes it inaccurate.

There's also this: AMD Radeon M400 Series, rebrands confirmed by drivers. Everything below M380 seems to be a rebrand in M400 series and I expect the same to be true for desktop R400 lineup. P10 will be used for 490X and 490, P11 will be used for 480X and 480. 470 and lower cards will be rebranded from R200/R300 series with Tonga being used for 470 and Pitcairn being used for 460 (which would make it the longest running GPU in history probably).
 
Well, let's see what they'll do but I think that they won't be able to do it.

Case one: the Radeon Pro Duo will obviously be in the lineup (unless they're giving that thing a three-month sales window, which is very unlikely), and for some reason they've decided not to show it on this image, which already makes it inaccurate.

There's also this: AMD Radeon M400 Series, rebrands confirmed by drivers. Everything below M380 seems to be a rebrand in M400 series and I expect the same to be true for desktop R400 lineup. P10 will be used for 490X and 490, P11 will be used for 480X and 480. 470 and lower cards will be rebranded from R200/R300 series with Tonga being used for 470 and Pitcairn being used for 460 (which would make it the longest running GPU in history probably).

Uhm, why? If the M series is a rebrand, does that have ANYTHING to do with the R series? Polaris is to succeed GCN, not be a rebrand of it.

Also, 14nm as a "rebrand" for 28nm? At the least you will get 30-40% lower power consumption, so I would not call that a rebrand even if it has the same performance as the R9 3xx series.

Additionally, if that were the case, GCN would be occupying the lower third of the roadmap in the 2016 section. It does not.
 
I think 'Vega' is simply the bigger and biggest (Greenland) GPUs, same architecture as Polaris, but designed for HBM2 and won't have a memory controller for GDDR5 / GDDR5-X.

Navi will be the next major architecture advancement.
 
Uhm, why? If the M series is a rebrand, does that have ANYTHING to do with the R series? Polaris is to succeed GCN, not be a rebrand of it.

Also, 14nm as a "rebrand" for 28nm? At the least you will get 30-40% lower power consumption, so I would not call that a rebrand even if it has the same performance as the R9 3xx series.

Additionally, if that were the case, GCN would be occupying the lower third of the roadmap in the 2016 section. It does not.

Because M and R are connected. Did you open the link? There are two Polaris slots on top of the whole M400 lineup.

Polaris isn't succeeding GCN; Polaris is just a codename for this iteration of the GCN architecture. It has GCN4 in it, as well as some "uncore" stuff linked to the codename (HDMI 2.0a, for example).

Exactly. Hence why I've said that this image doesn't mean anything, as it's obviously missing a lot of stuff in the 2016 column.
 
Because M and R are connected. Did you open the link? There are two Polaris slots on top of the whole M400 lineup.

Polaris isn't succeeding GCN; Polaris is just a codename for this iteration of the GCN architecture. It has GCN4 in it, as well as some "uncore" stuff linked to the codename (HDMI 2.0a, for example).

Exactly. Hence why I've said that this image doesn't mean anything, as it's obviously missing a lot of stuff in the 2016 column.

Well, if you are right, then I am going to face a huge dilemma when it comes to choosing a successor for my HD7770 :( I was hoping to be able to get a ~$299 card in 2016 that will be able to run VR stuff eventually. Guess I am stuck waiting one more year :P
 
I think 'Vega' is simply the bigger and biggest (Greenland) GPUs, same architecture as Polaris, but designed for HBM2 and won't have a memory controller for GDDR5 / GDDR5-X.

Navi will be the next major architecture advancement.
Probably, but ideally I won't have to wait too long for the 2017 drop of Vega cards. I'm going to have a lot of grey by then otherwise.
 