AMD Radeon Fury X Series | HBM, Small Form Factor And Water Cooling | June 16th

Keep moving the goalposts. :)

Moving what where? You're talking about the 390X now; I'm saying that 275W for that card is hardly a great feature considering that its competition consumes 100W less. So saying stuff like "don't read too much into it" is completely pointless - the end result is still rather bad for AMD here.

Fury is a different beast however.
 
15W less power usage for the 390X, when it has had a small upclock, offers a rumoured 5-10% boost, and has an extra 4GB of GDDR5 over the 290X, is pretty decent IMO.
 
That's Nvidia's choice; there is no technical reason they can't have them.

Have AMD/ATI ever prevented custom coolers?

In the case of the 290 and 290X, if I recall correctly, it took until the next quarter before custom coolers hit the market. I would assume that vendors will have stuff ready by October at the latest. I believe in the case of the Titan X, Nvidia is allowing custom coolers now.
 
The AMD Caribbean Islands Family: Not Just a Rebrand

While AMD’s Fury X provides a halo for the Radeon brand, AMD still has other new
graphics cards in the Radeon 300 series including the following air cooled cards:

 R9 390 & R9 390X, each with 8GB of GDDR5 and 275W, for 4K gaming
 R9 380, with 2 or 4GB of GDDR5 and 190W, for 1440p gaming
 R7 370, with up to 4GB of GDDR5 and 110W, for everyday gaming
 R7 360, with up to 2GB of GDDR5 and 100W, for everyday gaming

AMD has been hard at work over the past year and a half optimizing and re-architecting
the microcontrollers within the ASICs themselves. Combined with improvements to
their manufacturing process, this has allowed AMD to squeeze more performance out of
each of its cards while maintaining the same price tier as each card's predecessor.


 The R9 390 and R9 390X replace the R9 290 and R9 290X and are both 300
GFLOPS faster than their predecessors (5,100 GFLOPS and 5,900 GFLOPS
respectively) without increasing power in typical workloads.
 The R9 380 also benefits from the maturing of the 28nm process technology and
AMD’s optimizations, gaining roughly 200 GFLOPS in compute performance: from
3,290 GFLOPS to 3,480 GFLOPS.
 The R7 370’s compute capability of 2,000 GFLOPS is also faster by 200
GFLOPS than its predecessor’s (R7 265) 1,800 GFLOPS.
 The R7 360 has a compute performance of 1,610 GFLOPS, slightly more than
the 1,536 GFLOPS of the R7 260.

In all cases, AMD increased performance and also added many features that previous
generations did not have. Some of those features are enabled through the driver and
others are done in hardware. But all of the GPUs listed above will support DirectX 12,
Vulkan, and Mantle graphics APIs.
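As a sanity check on the figures above, GCN's peak compute numbers follow from the usual 2 FLOPs per shader per cycle (fused multiply-add) times shader count times clock. A rough sketch, with the caveat that the shader counts and clocks below are rumoured/assumed specs, not confirmed ones:

```python
# Peak single-precision throughput for a GCN GPU:
# 2 FLOPs per shader per cycle (FMA) x shaders x clock (GHz).
# Shader counts and clocks below are rumoured/assumed, not official.
def peak_gflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz

cards = {
    "R9 390X": (2816, 1.050),  # comes out near 5,900 GFLOPS
    "R9 390":  (2560, 1.000),  # near 5,100 GFLOPS
    "R9 380":  (1792, 0.970),  # near 3,480 GFLOPS
    "R7 370":  (1024, 0.975),  # near 2,000 GFLOPS
}
for name, (sp, clk) in cards.items():
    print(f"{name}: {peak_gflops(sp, clk):,.0f} GFLOPS")
```

The results land within rounding distance of the GFLOPS figures quoted from the report, which suggests those numbers are just the theoretical peak rather than measured performance.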

http://www.moorinsightsstrategy.com...e-to-Radeon-by-Moor-Insights-and-Strategy.pdf

I missed this. Great news if the 390/X are more than just a straight rebadge. More performance, less power usage.
 
15W less power usage for the 390X, when it has had a small upclock, offers a rumoured 5-10% boost, and has an extra 4GB of GDDR5 over the 290X, is pretty decent IMO.

It's not like they had any options, as going over the 300W limit would put the card out of PCIe spec compliance. And 15W less isn't anything to brag about either, as a lot of custom 290/X cards drew even less than that in various tests a long time ago.

Let's be clear here: the Hawaii rebrand is anything but exciting. The only interesting things in the new lineup are the Fiji based cards.
 
Moving what where? You're talking about the 390X now; I'm saying that 275W for that card is hardly a great feature considering that its competition consumes 100W less. So saying stuff like "don't read too much into it" is completely pointless - the end result is still rather bad for AMD here.
Wachie is just completely unbearable in AMD threads. Everything which might remotely be interpreted as a negative statement needs to be relentlessly attacked, either directly or passive-aggressively. I get that it's hard to be a fan of a company constantly on the verge of going under, but I still had to put him on ignore at some point.
 
What could that mean in terms of real world performance (not very techy I'm afraid)? Is this meant to rival the 970 or does it slot more in between the 970 and 960?

I think the 390 will be around the same as a 970 at 1080p, slightly faster at 1440p and 4K.

Definitely won't slot in below the 970, if anything, above it.
 
What could that mean in terms of real world performance (not very techy I'm afraid)? Is this meant to rival the 970 or does it slot more in between the 970 and 960?
300 GFlops is an increase by ~6% over a stock 290. It's never that simple though with modern GPUs, since the listed specs are really more of a guideline. The actual performance you get depends on the specific load, power consumption and temperature limits (and thus transitively on cooling).

E.g. if the cards run cooler then it might well be that the advantage compared to a stock 290 (with default cooling) will be a bit higher than 6% in practice. Still nothing earth-shattering of course.

Estimating a comparison to any NV card is harder yet than to 290, since the exact location where it will end up will likely differ with each game.
However, given this latest comparison from Computerbase:
capturegwkvn.png

A ~6% increase should slot in pretty much on par with a 970 as a median across multiple games at 2560x1440.
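For reference, the ~6% figure falls straight out of the stock 290's theoretical throughput. A quick back-of-envelope check, assuming the stock 290's commonly cited 2560 shaders at 947 MHz:

```python
# Stock R9 290: 2560 shaders x 947 MHz x 2 FLOPs/cycle, roughly 4,849 GFLOPS.
# Shader count and clock are the commonly cited stock specs, used here
# only to put the +300 GFLOPS bump in relative terms.
base = 2 * 2560 * 0.947
print(f"+300 GFLOPS is a {300 / base:.1%} increase")
```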
 
http://www.moorinsightsstrategy.com...e-to-Radeon-by-Moor-Insights-and-Strategy.pdf

I missed this. Great news if the 390/X are more than just a straight rebadge. More performance, less power usage.

You are kidding yourself if you think this is any significant change - if they were upgraded to GCN 1.2 then AMD would be screaming about it from the rooftops.

You also missed this part:

But AMD is not stopping with just FreeSync, even though
they will support FreeSync in all Radeon 300 series R9 GPUs.

Funny thing: the R9 270 was downgraded to the R7 370 this time, so obviously it's just a coincidence that it's rumoured to be the same old GCN 1.0 Pitcairn, which doesn't support FreeSync ;)
 
I think the 390 will be around the same as a 970 at 1080p, slightly faster at 1440p and 4K.

Definitely won't slot in below the 970, if anything, above it.

300 GFlops is an increase by ~6% over a stock 290. It's never that simple though with modern GPUs, since the listed specs are really more of a guideline. The actual performance you get depends on the specific load, power consumption and temperature limits (and thus transitively on cooling).

E.g. if the cards run cooler then it might well be that the advantage compared to a stock 290 (with default cooling) will be a bit higher than 6% in practice. Still nothing earth-shattering of course.

Interesting, thanks guys! Definitely got my eyes on this, hell maybe even a 390X.
 
That's the reason for the delay I guess.

If so, that's a real dick move by AMD, not that I'd consider getting the Fury X anyway.

£550 at a minimum for the Fury X I imagine.

Retailers must be rubbing their hands, two chances to shoot fish in a barrel inside a month.

Wouldn't be surprised to see £600 Fury X available.

Wonder if there will be non-reference Fury X cards available.

I meant the normal fury, not the X, I hope it's in the mid 400s.
 
I have a 550w seasonic psu powering a 290x so i assume my psu is fine for the fury x if it is around the same power usage.
I am not familiar with water coolers though. Would the AIO on the Fury X be an issue for my PSU?
 
I have a 550w seasonic psu powering a 290x so i assume my psu is fine for the fury x if it is around the same power usage.
I am not familiar with water coolers though. Would the AIO on the Fury X be an issue for my PSU?
Water coolers really don't affect power usage all that significantly, but they do allow for designs/frequencies/voltages which are more power hungry.

Since everything any vendor tells you about power consumption is pretty much useless these days I'd just wait for reviews with actual measurements in load scenarios.
 
I have a 550w seasonic psu powering a 290x so i assume my psu is fine for the fury x if it is around the same power usage.
I am not familiar with water coolers though. Would the AIO on the Fury X be an issue for my PSU?

No. The power usage figure is the power usage for the card, which includes the AIO water cooler. It doesn't need plugging into your PSU either if that is what you mean.
 
I have a 550w seasonic psu powering a 290x so i assume my psu is fine for the fury x if it is around the same power usage.
I am not familiar with water coolers though. Would the AIO on the Fury X be an issue for my PSU?

I'm surprised your computer is stable considering the recommended requirement for the 290X is 650W. I'm guessing you're using something like an 84W Intel CPU or 95W AMD CPU that's not overclocked.
 
I'm surprised your computer is stable considering the recommended requirement for the 290X is 650W. I'm guessing you're using a low-powered Intel CPU.

They try to be safe and assume you have a power hungry CPU and bad PSU. They're kinda useless nowadays.

For a long time the best SFX PSU was 450W and tons of people have systems with it powering an OCd Intel CPU (no ITX AMD boards) and a Titan.
 
They try to be safe and assume you have a power hungry CPU and bad PSU.

For a long time the best SFX PSU was 450W and tons of people have systems with it powering an OC CPU and a Titan.

The thing is that the 290X by itself can draw up to 290W, so that leaves like 200W for the rest of the system if the Seasonic 550W is 90% efficient.

edit: Beaten.
 
The thing is that the 290X by itself can draw up to 290W, so that leaves like 200W for the rest of the system if the Seasonic 550W is 90% efficient.

edit: Beaten.

That's not how PSUs work.

If it's a 550W PSU then it can supply 550W, not 495W at 90% efficiency.

90% efficiency just means that at 550W of load it's drawing about 611W at the wall.
 
Cross post.

One day I'll understand why you put the images into the code tag instead of the quote tag


---


I'm super pumped for the Fury Nano; I really hope this is the end of an era and we put 10+ inch long cards behind us.

I need an HBM-powered card at the ~€300 price point; the first hardware vendor that can offer me this will have my sale.
 
Is the R7 370 actually an overclocked R7 265?
Looks like it... same core count too. That's quite a sneaky move to have fewer cores than the 270 unless they upclock it big time to compensate <_<

I'm super pumped for the Fury Nano; I really hope this is the end of an era and we put 10+ inch long cards behind us.

I need an HBM-powered card at the ~€300 price point; the first hardware vendor that can offer me this will have my sale.
I don't think it's that cheap... that card is targeted at SFF and perf/W rather than fighting price/perf with 970.
 
Do you think nvidia will price drop the 970 in response to this? Anytime soon?
I'd guess that depends on where exactly the performance of prospective 970 competitors in this lineup ends up, and how it affects their sales. It's not out of the question.

Whatever one is between £250-350 here in the UK is the one I'll probably get. I currently have a 290.
If you currently have a 290 already then it's very doubtful (unless the GBP conversion is exceptional) that whatever ends up in the £250-350 slot will be a worthwhile upgrade.
 
I don't think it's that cheap... that card is targeted at SFF and perf/W rather than fighting price/perf with 970.

I know the Fury Nano won't be that cheap; I was talking in general, even about the future. A card with ≥970 performance and ≥4GB of HBM is more than enough for my standards.
I think next year is when I'll be able to buy such a thing.
 
Future proofing is generally a fool's errand but statistically only a small minority upgrades their PC every year. The Smokey Tier gamers. Titan owners are also a rare breed but they're usually the kind of person who buys this stuff day 1 to guarantee maximum performance at any cost.

On the other hand, a lot of people take Nvidia to task for skimping on VRAM historically. Like the 680 being 2GB to AMD's flagship's 3, or the 780 being 3GB to the AMD flagship's 4. The 390X is shipping with 8GB of VRAM by default; is that crazy excessive? 4GB on their flagship card is undesirable technologically as well as from a marketing perspective. If it was easy for them to have more, they would have more, no doubt about it.

It's marketing. The 390X is a tuned 290X GPU...so if they didn't have more VRAM - there would be very, very little to differentiate it from their existing, 2-3 year old GPU.

8GB 290X cards have proven to add very little advantage except for very specific use cases. I doubt much will change with the 390X. It's the same story: not enough horsepower for the higher resolutions, therefore the VRAM is really pointless.

NVIDIA has been taken to task in the past - but this has never proven to actually be a performance issue except in extreme cases (mod xyz + resolution abc, etc.). I ran 3-way, 4-way SLI with 680 and the 780 generations - never an issue with VRAM even at 5760x1080 NV Surround.

Either way - not an AMD or NVIDIA "thing" per se - just a consumer "thing" - everyone wants more VRAM no matter what. There's nothing wrong with wanting more - but going back to my original argument, the 4GB HBM thing will be an issue for AMD regardless of whether it technically affects its cards because of this.
 
Future proofing is generally a fool's errand but statistically only a small minority upgrades their PC every year. The Smokey Tier gamers. Titan owners are also a rare breed but they're usually the kind of person who buys this stuff day 1 to guarantee maximum performance at any cost.

On the other hand, a lot of people take Nvidia to task for skimping on VRAM historically. Like the 680 being 2GB to AMD's flagship's 3, or 780 being 3GB to the AMD flagship's 4. The 390X is shipping with 8GB vram by default, is that crazy excessive? 4GB on their flagship card is undesirable technologically as well as from a marketing perspective. If it was easy for them to have more, they would have more, no doubt about it.
Most definitely, and especially since at that price you'll get a 12GB card anyway. More importantly, if such cards don't have the grunt for 4K gaming now, it won't miraculously show up 6 months or a year later (despite DX12). By then, there'll be much more powerful cards with newer technologies for much cheaper.
 
That's not how PSUs work.

If it's a 550W PSU then it can supply 550W, not 495W at 90% efficiency.

90% efficiency just means that at 550W of load it's drawing about 611W at the wall.

While it is true that the PSU can supply that amount, you definitely shouldn't be drawing that much power on a regular basis. One should target at least a 20% margin between the PSU's rated power and one's own power requirements. At 50-60% load a PSU is typically at its most efficient.
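To put numbers on both points, here's a minimal sketch of the wall-draw and headroom arithmetic (the 90% efficiency and the ~100W rest-of-system figure are illustrative assumptions, not measurements):

```python
def wall_draw(dc_load_w, efficiency):
    # A PSU's rating is its DC output capacity; efficiency only
    # determines how much extra AC power is pulled from the wall.
    return dc_load_w / efficiency

def headroom(psu_rating_w, dc_load_w):
    # Fraction of rated capacity left unused at a given DC load.
    return 1 - dc_load_w / psu_rating_w

# A 550W PSU at full DC load, assuming 90% efficiency:
print(round(wall_draw(550, 0.90)), "W at the wall")
# A 290X (~290W) plus an assumed ~100W CPU/system on that same PSU:
print(f"{headroom(550, 290 + 100):.0%} spare capacity")
```

Which is why the 550W Seasonic setup above works: the card plus a modest CPU still leaves a reasonable margin below the rating.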
 
Moving what where? You're talking about the 390X now; I'm saying that 275W for that card is hardly a great feature considering that its competition consumes 100W less. So saying stuff like "don't read too much into it" is completely pointless - the end result is still rather bad for AMD here.

Fury is a different beast however.
Let's recap a bit.

1. I post about new cards (300 series) being more power efficient and not reading too much into the 375W TDP sticker on the box referencing Dave.
Looks like the new cards are more power efficient - http://forums.anandtech.com/showpost.php?p=37481186&postcount=1456

Plus Dave at B3D is hinting to not read into (TDP) stickers on boxes.
2. You post
Sounds rather desperate, don't you think?
3. Nachmater joins in

Is this about the 375W figure that showed up on some boxes? It seemed to me that was just the maximum power 2x8-pins could deliver instead of the actual TDP.

The drop seems pretty significant though. I wonder if it's due to better binning or they went for a respin.
4. And then you say this "375W" is a "solid" number that can actually be used for comparison
No, it's desperate when a representative of a company is saying "don't look too much into this" instead of providing a solid number which we can actually use for comparisons.
5. From the Moor report, I point out the actual TDP values to be indeed lower and not anywhere close to 375W
For comparison, 290X TDP was 290W so its in fact slightly less.

http://www.moorinsightsstrategy.com...e-to-Radeon-by-Moor-Insights-and-Strategy.pdf

Yes, such a desperate move by the AMD guy telling to wait and not read too much into the 375W stickers. DESPERATE I SAY!
6. You move the comparison (or goalpost) to compare it with Maxwell
That's almost 100W more than a competition's product of the same range so yeah, I'd say that was desperate in any case.

Also LOL at promoting a Hawaii based card for 4K gaming.
This is clear as day for every one to see and judge.


Wachie is just completely unbearable in AMD threads. Everything which might remotely be interpreted as a negative statement needs to be relentlessly attacked, either directly or passive-aggressively. I get that it's hard to be a fan of a company constantly on the verge of going under, but I still had to put him on ignore at some point.
Coming from the person who defends Nvidia in every thread. It is why you find USoldier perfectly fine - comedy gold.
 
Hey guys, i like graphics cards, they're cool huh?

Really though, that's a very handsome reference shroud AMD has there, much better than the cheap looking plastic monstrosity of their old cards.

Wachie is just completely unbearable in AMD threads. Everything which might remotely be interpreted as a negative statement needs to be relentlessly attacked, either directly or passive-aggressively. I get that it's hard to be a fan of a company constantly on the verge of going under, but I still had to put him on ignore at some point.
 
Has this been posted yet?

Cooling the card is a closed loop liquid cooler that's capable of dissipating 500W of heat. On the Fury X, which has a typical board power dissipation of 250-275W, the result is an operating temperature of 50C at an unheard-of, whisper-quiet 32dB.

http://wccftech.com/amd-radeon-r9-f...id-cooling-pump-block-pictured/#ixzz3dKJNDeTc

Water is so the way to go. No reference cooler from any company has ever been nearly this good before.

The site also posts their mathematical performance guess: better than the 980 Ti, stock for stock.
 
Has this been posted yet?



http://wccftech.com/amd-radeon-r9-f...id-cooling-pump-block-pictured/#ixzz3dKJNDeTc

Water is so the way to go. No reference cooler from any company has ever been nearly this good before.

The site also posts their mathematical performance guess: better than the 980 Ti, stock for stock.

This bit is interesting:

Cooling the card is a closed loop liquid cooler that’s capable of dissipating 500W of heat. On the Fury X, which has a typical board power dissipation of 250-275W the result is an operating temperature of 50c at an unheard of whisper quiet 32dB.

In comparison the reference designed Nvidia NVTTM cooler used for the GTX Titan X and 980 Ti maintains temperatures at 84c at a noise level of 40dB. The combination of the over-engineered power delivery system as well as the incredibly capable cooling system make Fury X what AMD described as “an overclocker’s dream” graphics card.

Read more: http://wccftech.com/amd-radeon-r9-f...id-cooling-pump-block-pictured/#ixzz3dKNX8Lo1

I think by 'operating' they mean load temp. But this suggests the cards overclock much better than a Titan X, and are also faster at stock if the leaked benches are correct. The downside is that power draw is higher, but at least temps look great.
 
Hey guys, I have some questions about the Fury X,

1. Do I need to have a fully water cooled system in order to install the Fury X?

2. Since it is a water cooled card, do you need a certain case to install this type of card?

I am currently using the CM 690 II.

cm-690-2.jpg
 