AMD Radeon Fury X Series | HBM, Small Form Factor And Water Cooling | June 16th

There is no question that, at 2x the perf/W, Hawaii would stick out like a sore thumb. However, they are not in the same pricing tiers, so I doubt people would think of them as direct replacements, with the majority going for the 390 at around the $300 mark.

The efficiency in Nano will come with a price premium.

Since the Nano will be 4GB instead of 8GB like the 390/390X, I'm hopeful it will come in at $400-$449 rather than the $499 you assume. It may be that AMD knows a ~$400 price point would cannibalize 390/390X sales too much at the beginning of this cycle; they no doubt want to clear out that backlog of 290/290X silicon that has been rebranded. I expect the 300 series 8GB variants' price premium to be short-lived, though, once word of mouth about them being rebrands gets widely circulated. The extra RAM will carry the 300 series a little way, but the bottom line is that over the next 6-12 months more and more people are going to want the Fury series at a lower price point, not a rebrand. (You could argue this lack of compelling offerings at lower price points will create a lot of pent-up demand for the 400 series, which will hopefully be entirely Fury-based and finally ditch Bonaire, Pitcairn, Hawaii, etc. But that's, what, at least 18 months away?)

AMD really needs to find a way to get Nano prices down and crank up sales. Barring a surprise Tonga XT announcement, the Nano looks like their best bet to claw back some decent market share and create some mainstream excitement about the AMD brand again. As impressive as Fury XT truly is, they certainly aren't going to get significant market share back with a $650 GPU, and since nothing in the 300 stack is compelling enough to challenge the status quo, I don't see other options. They need to shake things up.
 
I don't see a brand-new >500mm² die with HBM coming anywhere close to the $400 price point you're hoping for.

Most likely the Nano is 56 CUs, i.e. 3584 SPs, at 800 MHz. That would put it slightly ahead of the 290X (5.73 TFLOPs vs 5.63 TFLOPs, about the same pixel rate, and 179 GTexels/s vs 176 GTexels/s) at around 145-150W.
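For anyone checking the math, those figures fall straight out of GCN's 2 FLOPs per shader per clock and one texel per TMU per clock (the CU count, TMU count, and clock above are the poster's guess, not confirmed specs):

```python
def tflops(shaders, core_mhz):
    # GCN: 2 FLOPs (one fused multiply-add) per shader per clock
    return shaders * 2 * core_mhz / 1e6

def gtexels_per_s(tmus, core_mhz):
    # one texel filtered per TMU per clock
    return tmus * core_mhz / 1e3

nano_guess = tflops(56 * 64, 800)     # 56 CUs x 64 SPs = 3584 -> ~5.73 TFLOPs
r9_290x    = tflops(44 * 64, 1000)    # 2816 SPs at 1 GHz    -> ~5.63 TFLOPs
nano_tex   = gtexels_per_s(224, 800)  # ~179 GTexels/s
x290_tex   = gtexels_per_s(176, 1000) # 176 GTexels/s
```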
 

You may be right, but if so the Nano won't be nearly as disruptive, or as big a help to AMD's bottom line, as it deserves to be. It may not be the most powerful of the Fury lineup, but it's certainly the most interesting. A low-profile version would be a game-changer for Steam machines.

In an ideal world Nano would come in at $450 & there would be 3 more cut down versions of Fury at the $350, $250, & $150 price points so we could say buh-bye to the re-brands. As it stands, Nano will likely be my next major GPU purchase until Arctic Islands drops. I just wish more people would get to experience Fury...if only for AMD's long term survival.

People need to understand something: AMD's brand is currently so under-appreciated (perhaps even borderline "toxic") that nVidia can sell the 4GB 960 (9-SIXTY, not 970) for more money than the 4GB 290. This is AMD's no. 1 problem and speaks to the heart of how badly AMD has been mismanaged away from being a successful company. They apparently have the financials to remain solvent through 2018-ish, so we're not talking about any immediate threat of them fading away before the 400 series and Zen drop. But ideally AMD needs this year to bring them back to some semblance of competitiveness, and an entire series of rebrands just isn't going to do it. The rebrand scheme seems basically designed to stem the tide of market share loss while they prepare the 400 series. Fury at $550 and $650 is essentially an "image cultivation" product; it will help, but it's not a market share winner. Nano at $450, and especially Tonga XT (380X) at $250, would help win back actual market share.

On a side note, does anyone else think AMD has a bit of a marketing snafu to overcome with the 4GB vs 8GB thing? It might be kind of a challenge to tout "4GB of HBM is the best!" for the Fury and then proclaim "8GB of memory on the R9 390 series cards is the best!" At the same time, the hypocrisy on this issue in the gaming press will probably begin to show: we saw many editorials in Jan/Feb of this year about how 3.5GB is enough, but I'm expecting to see new editorials on how 4GB isn't enough. Regardless of performance.
 
Yeah, the VRAM message is confusing.

If the benchmarks show they were right about 4GB being sufficient for 4K with the right memory management and HBM, then that dual Fiji is going to blow the Titan X away.
 
The great thing is that no matter what the facts or the "spin" are, people will claim 4GB of VRAM is not enough. The VRAM hysteria never stops, and it hasn't stopped since the early days.

12GB of VRAM is 95% marketing. It's simply not needed on a card with that much muscle (i.e., the GPU runs out of grunt long before the VRAM does).
 

It's useful for SLI rigs, where a modest amount, say 4GB, may be outstripped by the shader power of the multi-GPU setup. The 980 Ti has a better balance, although in a year or two you might see crazy high-end stuff doing much better on Titan X setups once 6GB is outpaced. It's easier to scale VRAM by doubling or halving than by a small increment (e.g. an 8GB Titan X or a 6GB cut-down Titan X), which is much less common.
 

I guess. In a year or two (more like 6 months) my SLI TITAN X setup will be sold off and running in someone else's secondary box. Future-proofing is STUPID, and it's one of the primary arguments of the "moar VRAM" crowd.

4K/5K/8K slideshows...no thanks. Give me 1440p/144Hz.
 
Now if only this kind of beast could've been in the PS4.

No more struggling to barely make it to 30FPS when trying to push the graphical envelope.
 

Well, then you would have complained about the PS4's $1299 price tag at launch...oh, and the CPU would be a horrendous bottleneck.
 
I guess. In a year or two (more like 6 months) my SLI TITAN X setup will be sold off and running in someone else's secondary box. Future-proofing is STUPID, and it's one of the primary arguments of the "moar VRAM" crowd.

4K/5K/8K slideshows...no thanks. Give me 1440p/144Hz.

Future-proofing is generally a fool's errand, but statistically only a small minority upgrades their PC every year. The Smokey Tier gamers. Titan owners are also a rare breed, but they're usually the kind of person who buys this stuff day one to guarantee maximum performance at any cost.

On the other hand, a lot of people take Nvidia to task for historically skimping on VRAM: the 680 shipping with 2GB to AMD's flagship's 3GB, or the 780 with 3GB to the AMD flagship's 4GB. The 390X is shipping with 8GB of VRAM by default; is that crazy excessive? 4GB on their flagship card is undesirable technologically as well as from a marketing perspective. If it were easy for them to have more, they would have more, no doubt about it.
 
Kind of annoying that not only did they not show benchmarks of their own, but it doesn't sound like they sent review cards out, or the reviews are just going to take a bit.

Completely unlike the 980ti launch where reviewers had the card before it was announced and reviews were ready to go.
 
We really don't know. Since AMD hasn't provided any real benchmarks, it's hard to say whether 4GB of HBM will be adequate for 4K. The real question is what happens once DX12 comes around: since you'll probably need two Furys for decent 4K gaming anyway, DX12 will take advantage of both cards' VRAM. That is, of course, if and when games actually start supporting DX12.
 
Now if only this kind of beast could've been in the PS4.

No more struggling to barely make it to 30FPS when trying to push the graphical envelope.

Er, if you're just putting this GPU in there, the CPU would be twice as big a bottleneck as it is now in terms of framerate, considering the disparity between the two components.

And that's without taking into account that devs would likely aim for 30 FPS anyway and max out the graphics.

If you can't buy this right now because you don't have a PC, there's always the hypothetical PS5 to wait for.

Compared to a part coming out now, the GPU in a console launching four years from now will blow it out of the water.
 
If this launch is anything like all previous AMD and Nvidia flagship releases, we won't see reviews until the launch date.

So we'll see next week... The wait is killing me though.
 

Yeah, I really need to buy a new card now. I'm running 3 OG Titans on an HDMI 2.0 display at 4K/60Hz @ 4:2:0 and it's ugly as shit. If the Fury doesn't support HDMI 2.0 I won't be able to buy it.

I really hate the fact that the plain-jane Fury (no X) isn't coming out until next month, or I might consider two of those (I still might).

I still want to see the numbers and see if 4GB of HBM VRAM is adequate for 4K gaming. I know with two cards and DX12 it probably won't matter, but it will take a while for games to start supporting it.
 

I thought it had been confirmed that the Fury line has HDMI 2.0 from the earlier picture leaks of the Fury X? Sites reporting the leaks say it has DisplayPort 1.2a and HDMI 2.0.

http://www.techpowerup.com/213428/amd-radeon-r9-fury-x-pictured-some-more.html
http://hexus.net/tech/news/graphics/82975-first-teaser-images-amd-radeon-r9-390x-published/
http://www.overclock3d.net/articles/gpu_displays/amd_radeon_fury_x_pictured/1
 
Cross post.

Code:
[url=http://imgur.com/M4PXHaW][img]http://i.imgur.com/M4PXHaWl.jpg[/img][/url]

Code:
[url=http://imgur.com/BEZhfUn][img]http://i.imgur.com/BEZhfUnl.png[/img][/url]

Code:
[url=http://imgur.com/jYj8Q8k][img]http://i.imgur.com/jYj8Q8kl.png[/img][/url]

Code:
[url=http://imgur.com/t3MHKIH][img]http://i.imgur.com/t3MHKIHl.jpg[/img][/url]

Code:
[url=http://imgur.com/d59PsNa][img]http://i.imgur.com/d59PsNal.jpg[/img][/url]

Code:
[url=http://imgur.com/j9qM3jI][img]http://i.imgur.com/j9qM3jIl.jpg[/img][/url]

Code:
[url=http://imgur.com/7ub843Y][img]http://i.imgur.com/7ub843Yl.jpg[/img][/url]

Code:
[url=http://imgur.com/Zqfq0Aa][img]http://i.imgur.com/Zqfq0Aal.jpg[/img][/url]

Code:
[url=http://imgur.com/IUhU2Xf][img]http://i.imgur.com/IUhU2Xfl.jpg[/img][/url]

Code:
[url=http://imgur.com/1hkAbNz][img]http://i.imgur.com/1hkAbNzl.jpg[/img][/url]

Code:
[url=http://imgur.com/dSs5OOT][img]http://i.imgur.com/dSs5OOTl.jpg[/img][/url]

Code:
[url=http://imgur.com/ngV0QmP][img]http://i.imgur.com/ngV0QmPl.jpg[/img][/url]

Code:
[url=http://imgur.com/ekdq1AQ][img]http://i.imgur.com/ekdq1AQl.jpg[/img][/url]

Code:
[url=http://imgur.com/iCARIwv][img]http://i.imgur.com/iCARIwvl.jpg[/img][/url]

Code:
[url=http://imgur.com/sP9C6Mn][img]http://i.imgur.com/sP9C6Mnl.jpg[/img][/url]

Code:
[url=http://imgur.com/T268zr5][img]http://i.imgur.com/T268zr5l.jpg[/img][/url]
 
I thought it had been confirmed that the Fury line has HDMI 2.0 from the earlier picture leaks of the Fury X? Sites reporting the leaks say it has DisplayPort 1.2a and HDMI 2.0.

http://www.techpowerup.com/213428/amd-radeon-r9-fury-x-pictured-some-more.html
http://hexus.net/tech/news/graphics/82975-first-teaser-images-amd-radeon-r9-390x-published/
http://www.overclock3d.net/articles/gpu_displays/amd_radeon_fury_x_pictured/1

How do they know the specs from a picture? I'm wondering if it's true HDMI 2.0 with full bit depth 4:4:4. At any rate, that's all I care about, that and the 4GB issue, since I have a 4K display.

Those "benchmarks" are bullshit numbers released by AMD and hardly objective, to say the least. I'm waiting for the review sites to post some real numbers.
 
Any idea why there is an asterisk after "Ultra Settings" on that benchmark shot?

That's a really good result compared to the Titan X, but it implies to me that something was disabled (maybe Nvidia GameWorks features?).
 
Cross post.


Well... There you go, I guess.
 
How do they know the specs from a picture? I'm wondering if it's true HDMI 2.0 with full bit depth 4:4:4. At any rate, that's all I care about, that and the 4GB issue, since I have a 4K display.

Those "benchmarks" are bullshit numbers released by AMD and hardly objective, to say the least. I'm waiting for the review sites to post some real numbers.

It's a new chip and not a rebrand, so it's very likely it has HDMI 2.0. I don't know if AMD cards will have issues similar to what Maxwell cards have with 4K 4:4:4/4:2:0, as it could be a driver problem and/or a TV problem. You won't know until there's a compatibility list from AMD or someone tests them out. If the leaked 3DMark benchmark is to be believed, then there isn't any issue with 4K and 4GB, but we will find out next week.
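For anyone wondering why 4K/60Hz forces 4:2:0 on older gear: a back-of-the-envelope sketch of the HDMI link budget (the 594 MHz figure is the standard CTA-861 4K60 pixel clock; this ignores deep color and audio overhead):

```python
def tmds_gbps(tmds_clock_mhz):
    """Aggregate HDMI TMDS bit rate: 3 data lanes x 10 bits per lane per clock."""
    return tmds_clock_mhz * 3 * 10 / 1000

# 4K60 at 8-bit 4:4:4: TMDS clock equals the 594 MHz pixel clock
full_chroma = tmds_gbps(594)  # ~17.82 Gbps, needs HDMI 2.0 (18 Gbps ceiling)

# 4:2:0 carries half the chroma data, so the effective clock drops to 297 MHz
subsampled = tmds_gbps(297)   # ~8.91 Gbps, fits HDMI 1.4 (10.2 Gbps ceiling)
```

Which is why HDMI 1.4-era silicon can only manage 4K60 at 4:2:0, and full 4:4:4 needs a true HDMI 2.0 port.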
 
Any idea why there is an asterisk after "Ultra Settings" on that benchmark shot?

That's a really good result compared to the Titan X, but it implies to me that something was disabled (maybe Nvidia GameWorks features?).
Typically the press deck has the details of the benchmark, test rig setup, etc. at the end.
 


And for those wondering about card length:

Fury X - 7.6 inches
Fury Nano - 6 inches
Titan X - 10.5 inches
 
I'm now just waiting for the reviews, the overclocking headroom, firm details on the Fury and Fury X differences outside of AIO cooler and finally, board partners offerings for different coolers.

My plan was to go custom WC.
 


And for those wondering about card length:

Fury X - 7.6 inches
Fury Nano - 6 inches
Titan X - 10.5 inches

Look at that Nano man.

Super interesting little proposition. It's just got a single fan cooling it as it sips half the power and will therefore generate much less heat. I wanna know price and performance.
 
That's almost 100W more than the competition's product in the same range, so yeah, I'd say that was desperate in any case.

Also, LOL at promoting a Hawaii-based card for 4K gaming.

Definitely not the best decision, but I imagine at least one of the two Hawaii cards could probably deliver the "console experience" (similar settings and FPS) at 4K in some games, or at least at 1600p.
 
Look at that Nano man.

Super interesting little proposition. It's just got a single fan cooling it as it sips half the power and will therefore generate much less heat. I wanna know price and performance.
My guess on the Fury Nano.

3584 SP
224 TMU
64 ROP
900 MHz Core
950 MHz Memory

6.4 TFlops
460 GB/sec
170W

Performance - GTX980 Territory
Price - $479
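For what it's worth, that guess is mostly internally consistent. A quick sanity check (the 4096-bit bus is assumed from Fiji's four 1024-bit HBM1 stacks; the SP count and clocks are the poster's speculation, not confirmed specs):

```python
def tflops(shaders, core_mhz):
    # 2 FLOPs per shader per clock (FMA)
    return shaders * 2 * core_mhz / 1e6

def bandwidth_gb_s(bus_bits, gbps_per_pin):
    # bytes/s = (bus width in bits / 8) x per-pin data rate
    return bus_bits / 8 * gbps_per_pin

compute = tflops(3584, 900)          # ~6.45 TFLOPs, matching the ~6.4 above
memory  = bandwidth_gb_s(4096, 0.9)  # ~460.8 GB/s at 0.9 Gbps per pin
```

One nit: 460 GB/s on a 4096-bit bus corresponds to a ~900 MHz effective memory rate; the quoted 950 MHz would work out to ~486 GB/s instead.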
 

Mhh, yeah, but then I could buy a GTX 970 Mini for $150 less and overclock it.

It would be nearly the same performance.
 
So the normal Fury doesn't come out till next month? Lame, I don't have that kind of patience.

Any idea of the UK price points yet?

£550 at a minimum for the Fury X, I imagine.

Retailers must be rubbing their hands; two chances to shoot fish in a barrel inside a month.

Wouldn't be surprised to see the Fury X available at £600.

Wonder if there will be non-reference Fury X cards available.
 