blastprocessor
The Amiga Brotherhood
Good video about GCN improvements in Polaris vs. Fiji/Hawaii/Tonga:
https://youtu.be/FvAW5UaRG_4
PS4 Neo could be a beast.
There's speculation on B3D that GP104 might not have inherited the architecture changes shown in the GP100 whitepaper; it might still be 128 ALUs per SM like Maxwell. Only a few more days till we know it all.
Hey guys,
I noticed the hype train at full steam about this call on 5/18, but I need to put the brakes on this for now.
This is a high-level partner webinar, and there will be no specific technical details discussed.
I want to set expectations here, and while this may seem incredibly disappointing, you'll know more about Polaris soon (no ™).
Pls no shooterino the messenger. While you may be tempted to downvote, please do the opposite to raise awareness.
Well explained and the guy seems knowledgeable.
Basically expects Polaris 10 to be Fury X performance and $299. Hope he is right.
That price/perf is just nuts, probably too good to be true.
Fury X for $299/350 would be insane. If I manage to sell my 7950 I could make the jump for <$300 and basically max out any game at 1080p.
I know I argued on another forum that AMD would not be giving Fury X perf for $299. Maybe $350.
But after the way he compared it to historically what happens on a node shrink and the relative performance between generations, that kind of ball-park looks to be on the money.
People are finding it difficult to comprehend because AMD, and Nvidia, are moving 2 full nodes forward (and skipping the half nodes in between). It's effectively 3-steps-in-1. That is why people are so impressed by the leaked synthetic benchmarks of the 1080, for example. Meanwhile, if you factor in all the different variables, the performance is simply as expected and, at least, not disappointing.
16FF+ is more or less the same as 20nm but with FinFETs. Much the same is true for GF's 14nm. It's not "2 full nodes forward".
I thought 16FF+ was actually smaller than their 20nm node. From what I remember, 20nm with FinFets was rebranded as 16FF and the actual smaller node is now called 16FF+. It's been a while since I've seen those leaked(?) slides, so I could be wrong.
Re-posting image. It is true for TSMC, but Samsung/GF's is smaller and Polaris is based on theirs (not sure LPE or LPP).
---
It's smaller but it's not a "node jump" smaller. The biggest benefit 16/14nm bring compared to 20nm are FinFETs and a significant leakage reduction, not the gate size.
http://anandtech.com/show/10326/the-nvidia-geforce-gtx-1080-preview/3
NVIDIA's loyal opposition, AMD's Radeon Technologies Group, has strongly hinted that they're not going to be releasing comparable high-performance video cards in the near future. Rather, the company is looking to make a run at the much larger mainstream market for desktops and laptops with their Polaris architecture, something that GP104 isn't meant to address.
I would be extremely happy with Fury X performance for $299 though.
That seems to be the optimistic outlook, along with a 40 CU card. Most rumors seem to be pointing to 32 CUs and 390X performance.
Afaik most rumors actually said 36/40 CUs. I think people started running with 32 because of their mobile cards.
I hate repeating myself again, but Fury (X) performance still seems way too optimistic. If we assume 40 CUs to be true, then that puts it at 390X performance as a baseline. The rest is up to architectural improvements, frequency and bandwidth efficiency. It's entirely possible to come close to Fiji (which isn't THAT much faster than Hawaii), but I'd rather be pleasantly surprised than end up with another meh.
And P10 not competing with GP104? I'm shocked. Shocked I tell you!
Fury X is still $600?
Wow, that needs a price drop.
I locked myself into AMD by using one of their proprietary technologies that nVidia doesn't support, so this news of no high-performance card hit really hard. I'm currently on a 290x, so I guess my best choice is to wait for a price drop on the 390x or Fury???
Maybe I'm reading this wrong, so I'd like someone to clarify this for me and tell me where I'm not getting it. I know that GlobalFoundries (although it's Samsung tech) will be making Polaris, but according to TSMC, their 16nm chips can provide 60% power savings over 28nm. Now, I know that TSMC isn't Samsung, but I doubt the difference will be huge. However, with the rumor that P10 will perform at 390 levels at half the draw, is it right of me to assume that P10 is more or less a 390 die shrink?
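A quick back-of-the-envelope check of that die-shrink reading. Assumptions not stated in the thread: the R9 390's board power is roughly 275 W, and TSMC's claimed "60% power savings" applies at equal performance (Samsung/GF's 14nm will differ somewhat):

```python
# Rough estimate: what would a 390-class chip draw after a node shrink
# that cuts power 60% at iso-performance? (Both inputs are assumptions.)
R9_390_POWER_W = 275.0      # approximate R9 390 board power
NODE_POWER_SAVING = 0.60    # TSMC's 16nm-vs-28nm marketing figure

shrunk_power = R9_390_POWER_W * (1.0 - NODE_POWER_SAVING)
print(f"Estimated power for a shrunk 390-class chip: {shrunk_power:.0f} W")
```

That lands around 110 W, which is in the same ballpark as the "390 performance at half the draw" rumor (~137 W), so the die-shrink interpretation at least passes a sanity check.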
Link to where TSMC says 60% power saving?
When you say locked to an AMD proprietary tech, do you mean Freesync?
Also, what resolution are you looking at for an upgrade?
I don't believe the 390x is a huge step up from the 290, so your best bet with AMD is Fury, I suppose, which should see a price drop with the 1070 launch.
There are rumors of Vega dropping later this year though. So keep that in mind.
Assuming you bought a freesync monitor/adaptive sync monitor?
Freesync isn't really proprietary, as it simply leverages adaptive sync in the monitor. I did the same, as I have a 970 and refused to pay the premium for Gsync. The nice thing is that the monitor has adaptive sync for when (hopefully) Nvidia finally supports it. In addition, Freesync monitors really are barely more expensive than any other decent-quality monitor. So it's just like buying a solid monitor that can support adaptive sync if you want it to, for a nominal fee. I figured if I end up going Nvidia again (I don't want to), I'll just use it like any other monitor, but I didn't have to pay an extra $200-300.
My plan is to ride out my 970 until the Vega card if the mid level AMD offerings don't spark my interest.
Not Freesync, but mixed-resolution Eyefinity (Surround).
Outside monitors are 1920x1200 (16:10) and the center is 2560x1080 (21:9). AMD handles it admirably well; nVidia doesn't have any support for something like this.
It hurts to wait till the end of the year, as most modern games are starting to not run at decent frame rates at my resolution (6400x1080).
Anyone thinking that the AMD cards that will be out this summer will only rival the 980 Ti and then they will push as hard as they can to have HBM2 cards out by the end of the year (before Nvidia's next year) to get the possible upper hand? Just a thought.
Cards at that high end rarely get significant price drops. They often have very limited production runs and usually sell out before a price drop would happen.
The 295x2 had a huge price drop, though.
AMD's "Polaris 10" GPU will feature 32 compute units (CUs) which TPU estimates – based on the assumption that each CU still contains 64 shaders on Polaris – works out to 2,048 shaders. The GPU further features a 256-bit memory interface along with a memory controller supporting GDDR5 and GDDR5X (though not at the same time heh). This would leave room for cheaper Polaris 10 derived products with less than 32 CUs and/or cheaper GDDR5 memory. Graphics cards would have as much as 8GB of memory initially clocked at 7 Gbps. Reportedly, the full 32 CU GPU is rated at 5.5 TFLOPS of single precision compute power and runs at a TDP of no more than 150 watts.
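The leaked figures above can be cross-checked with some quick arithmetic. Assumption not in the post: GCN shaders do 2 FLOPs per clock (one FMA), the usual convention for these TFLOPS figures:

```python
# Sanity-check the leaked Polaris 10 numbers from the post above.
cus = 32
shaders_per_cu = 64
shaders = cus * shaders_per_cu            # 32 * 64 = 2048 shaders

# Implied clock: FLOPS = shaders * 2 (FMA) * clock
target_tflops = 5.5
clock_mhz = target_tflops * 1e12 / (shaders * 2) / 1e6
print(f"Implied clock for 5.5 TFLOPS: {clock_mhz:.0f} MHz")

# Memory bandwidth: 256-bit bus at 7 Gbps per pin
bandwidth_gbs = 256 / 8 * 7               # bus width in bytes * data rate
print(f"Bandwidth: {bandwidth_gbs:.0f} GB/s")
```

So the leak implies a core clock of roughly 1343 MHz and 224 GB/s of memory bandwidth at launch, with GDDR5X leaving headroom for a faster variant later.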