AMD Radeon Fury X Series | HBM, Small Form Factor And Water Cooling | June 16th

Meant 1080p60.

Not even Titan X or 980ti manage that, but you expect it from a $430 card?

I would sort of expect their high end 3xx series to be able to run it better than a 290x can, so the fact that neither can run it at 1080p60 is disappointing.

390x is the 290x with modifications. You're thinking of the Fury X which isn't out for another week. And besides, as in my first reply, the Titan X and the 980ti cannot maintain 60FPS, not sure what you're expecting of these cards.
 
I would sort of expect their high end 3xx series to be able to run it better than a 290x can, so the fact that neither can run it at 1080p60 is disappointing.

This is no longer their high end architecture. It was high end 2 years ago, and is beating Nvidia's former top chip that is a year newer. So I don't really get what your point is. The Fury line is the high end, and will easily do 1080p60 in virtually any title.

The graph you are posting is also at 1440p, so I don't know what you're trying to do. The 390x probably does do almost 60fps in Witcher 3 at 1080p.
 
I'm talking about with Hairworks off, since everyone knows it's shit and not worth the performance hit.

Am I reading this wrong or something?

I'm confused.
[Image: R9 390X benchmark chart]


Edit: apparently I'm blind, 1440p, duh!

Anyway, Fury X next week; we'll see how it stacks up to the Titan X. Till then, don't expect a rebrand with minor changes of a 2-year-old card to do wonders.
 
Fury X would be overkill, even Fury would probably be overkill. If you can wait a while, there's still the Fury Nano. If you can't wait that long, then I guess you could go with 390/X/970/980.

I just can't wrap my head around why people keep repeating this (clearly false) claim. My GTX 980 can't run Ryse, The Witcher 3, GTA V, Dying Light, Assassin's Creed Unity, and a few other games at a completely locked 1080p/60FPS if you want max settings and a reasonable amount of AA. That's TODAY. With TODAY'S games. What happens when tomorrow's games start to arrive? Star Wars Battlefront? Cyberpunk 2077? Rise of the Tomb Raider? Arkham Knight? Star Citizen? Etc.?

So when someone asks what card to get that will be great for years to come, I don't think I could ever say that ANY single card on the market would be overkill at this point.

Go for a Fury X or a GTX 980Ti. I'm sure you won't regret it.
 
Am I reading this wrong or something?
Edit: apparently I'm blind, 1440p, duh!

Anyway, Fury X next week; we'll see how it stacks up to the Titan X. Till then, don't expect a rebrand with minor changes of a 2-year-old card to do wonders.

Again, I'm talking about maintaining a stable 60fps at 1080p.

390x is overkill for 1080p.
 
Fury X will be a monster card for 1080p gaming. If all you want is a good card to play modern games with, you don't need to go super high end like this, though.

But seriously, next generation GPUs with a die shrink and HBM2 are going to improve massively, so anybody looking to buy a new card around this time should really look at what they have right now and whether they can wait or not. I can see how a 5770 user might not be able to wait, though.

You have to factor in that DirectX 12 is also launching this year.
I think DX12 cards will get a second speed bump along the way making them viable past 2016 without the need to upgrade to HBM2 cards just yet.
 
This is the problem I face, as I have an R9 270 and a 1440p monitor :( I mean, scaled-down 1080p looks okay for games like Witcher 3, I guess. I'll just have to wait until the die shrink. Kind of tempted to buy a sub-$100 R9 270 and Crossfire it.

I have an R9 270x and I get 30FPS in Witcher 3 with most things on high/ultra at 1440p. Looks amazeballs. (Win 8.1)

Can you not overclock that card? I'm running at 1175MHz / 1450MHz.

Mind you, I'm running an i5 4690K @ 4.4GHz.
 
I just can't wrap my head around why people keep repeating this (clearly false) claim. My GTX 980 can't run Ryse, The Witcher 3, GTA V, Dying Light, Assassin's Creed Unity, and a few other games at a completely locked 1080p/60FPS if you want max settings and a reasonable amount of AA. That's TODAY. With TODAY'S games. What happens when tomorrow's games start to arrive? Star Wars Battlefront? Cyberpunk 2077? Rise of the Tomb Raider? Arkham Knight? Star Citizen? Etc.?

So when someone asks what card to get that will be great for years to come, I don't think I could ever say that ANY single card on the market would be overkill at this point.

Go for a Fury X or a GTX 980Ti. I'm sure you won't regret it.

I echo this post. My 290 is starting to fall behind in the latest games just at the modest resolution of 1080p. As long as I can sell my current card to offset the cost of the next one, I'll always upgrade when needed instead of 'settling' for a few months/years/etc. Especially considering we have at least a year to wait for a node shrink and HBM 2.0.

I need improved tessellation and general performance increases from AMD and they are seemingly providing that gracefully with their Fury line of cards.
 
These cards seem to have good value. The R9 380 4GB is a great choice for $239, the R9 390 8GB should put up a hell of a fight against a 970, the 390X seems slightly better than the 980 (in some titles at least) whilst having a lower price point, and Fury X seems better than the 980ti. Good job AMD.

Hmmm... so as a 290x owner, upgrading to 390x would be dumb right?

I think I will wait for Arctic Islands.

Of course. Just wait for the first 14nm gen next year, or the one after that.
 
The 390x has a slight 6% overclock and 4GB more memory than the 290x. I still don't understand what kind of black magic AMD used to make the 390x perform much better than the 290x.

I mean, the 290x was struggling to keep up with the 970 in some titles, and now the 390x is giving the 980 a run for its money.
 
The 390x has a slight 6% overclock and 4GB more memory than the 290x. I still don't understand what kind of black magic AMD used to make the 390x perform much better than the 290x.

I mean, the 290x was struggling to keep up with the 970 in some titles, and now the 390x is giving the 980 a run for its money.

Do we know if they changed the architecture yet? Does it have the improvements Tonga has?
 
You have to factor in that DirectX 12 is also launching this year.
I think DX12 cards will get a second speed bump along the way making them viable past 2016 without the need to upgrade to HBM2 cards just yet.
This is going to make a minimal improvement on the GPU side, and it's mainly for games that get built with it in mind, which will be limited over the first year.
 
I just can't wrap my head around why people keep repeating this (clearly false) claim. My GTX 980 can't run Ryse, The Witcher 3, GTA V, Dying Light, Assassin's Creed Unity, and a few other games at a completely locked 1080p/60FPS if you want max settings and a reasonable amount of AA. That's TODAY. With TODAY'S games. What happens when tomorrow's games start to arrive? Star Wars Battlefront? Cyberpunk 2077? Rise of the Tomb Raider? Arkham Knight? Star Citizen? Etc.?

So when someone asks what card to get that will be great for years to come, I don't think I could ever say that ANY single card on the market would be overkill at this point.

Go for a Fury X or a GTX 980Ti. I'm sure you won't regret it.

This. Very much this. It is exactly what I am after, or as close as I can get at this point with a single card solution.
 
I just can't wrap my head around why people keep repeating this (clearly false) claim. My GTX 980 can't run Ryse, The Witcher 3, GTA V, Dying Light, Assassin's Creed Unity, and a few other games at a completely locked 1080p/60FPS if you want max settings and a reasonable amount of AA. That's TODAY. With TODAY'S games. What happens when tomorrow's games start to arrive? Star Wars Battlefront? Cyberpunk 2077? Rise of the Tomb Raider? Arkham Knight? Star Citizen? Etc.?

So when someone asks what card to get that will be great for years to come, I don't think I could ever say that ANY single card on the market would be overkill at this point.

Go for a Fury X or a GTX 980Ti. I'm sure you won't regret it.
One of the problems with PC games is that the developers don't/can't spend a lot of extra time on just the PC version, so as a result you get loads of graphics settings that, when turned to ultra/highest, don't really give you the best bang for buck in terms of GPU resources and processing. Ultra settings tend to be really, really demanding but rarely justify the sort of processing power they need.

It's weird seeing people shit on things like PhysX for taking a fair bit of processing power yet it's an instantly recognisable graphical effect in comparison to many games' differences between high and highest/ultra.

I'm starting to think of ultra settings as a bit gimmicky now, to be honest. Some games just don't justify the extra load on hardware, but that's what sells the hardware, I guess.
Some of the settings in Witcher 3 are a good example of this sort of thing.

I still want the Fury X though. I've gotten into that addiction, or habit, of feeling the need to turn games to ultra all the time or feeling like I'm missing out.
 
I don't suppose there have been any comments from AMD about a similar feature to Shadowplay?

I realise AMD have the Gaming Evolved app with AMD GVR, but I find the results to be quite terrible. Bandicam produces better results with AMD VCE.

Not a deal breaker for me, but it is for some of my friends.
 
It's kind of sad that nobody at AMD even seems to know anything about the final specs of the card. Someone else from AMD on the official forum says they have to check whether the Fury has HDMI 2.0 or not. I guess we'll have to wait until next week to find out for sure. I wouldn't be surprised if the answer is no. I wouldn't be able to use it if so.
 
Fury X would be overkill, even Fury would probably be overkill. If you can wait a while, there's still the Fury Nano. If you can't wait that long, then I guess you could go with 390/X/970/980.
lol no

I need to upgrade from the 7970 GE I have because it's not enough for 1080p 120fps in some games :c

Fury X it is!

Long live 120fps! :D

Do we know if they changed the architecture yet? Does it have the improvements Tonga has?

It is Hawaii v1.1.1
 
I don't suppose there have been any comments from AMD about a similar feature to Shadowplay?

I realise AMD have the Gaming Evolved app with AMD GVR, but I find the results to be quite terrible. Bandicam produces better results with AMD VCE.

Not a deal breaker for me, but it is for some of my friends.

Are you saying the Gaming Evolved app from http://raptr.com/amd has bad quality? Is there no bitrate option? You could try OBS from https://obsproject.com/download
 
The 390x has a slight 6% overclock and 4GB more memory than the 290x. I still don't understand what kind of black magic AMD used to make the 390x perform much better than the 290x.

I mean, the 290x was struggling to keep up with the 970 in some titles, and now the 390x is giving the 980 a run for its money.

Most likely the 290x bench results are from when it launched. So you are also seeing the effect of driver improvement over the course of ~2 years.

In the Witcher 3 slide above, you can see it is only 3 frames ahead at 1440p, which makes sense.
 
Are you saying the Gaming Evolved app from http://raptr.com/amd has bad quality? Is there no bitrate option? You could try OBS from https://obsproject.com/download

There are plenty of options in Gaming Evolved, but the results are inconsistent and choppy in my experience. My friend can't even get the app to run with an R9 290.

OBS with AMD VCE gives okay results, Bandicam better, but Bandicam doesn't have the 'replay' feature, which is the main selling point, I'd argue. I realise you can achieve this with OBS, but like I said, for me the results aren't amazing.

DXTory is where it's at but the feature set is meh.

I got all my friends to get a 970 and using Shadowplay is so damn simple in comparison.
 
It's kind of sad that nobody at AMD even seems to know anything about the final specs of the card. Someone else from AMD on the official forum says they have to check whether the Fury has HDMI 2.0 or not. I guess we'll have to wait until next week to find out for sure. I wouldn't be surprised if the answer is no. I wouldn't be able to use it if so.

Tom's Hardware said that the Fury X does not support HDMI 2.0. That seems to be the consensus at this point.
 
The 390x has a slight 6% overclock and 4GB more memory than the 290x. I still don't understand what kind of black magic AMD used to make the 390x perform much better than the 290x.

I mean, the 290x was struggling to keep up with the 970 in some titles, and now the 390x is giving the 980 a run for its money.

The 290x was throttling in launch reviews.

The 980 seems to stomp the 390X when comparing OC to OC, though. I expect that if the 390X starts eating into 900 series sales, Nvidia will just refresh the line with all-custom cards as well.
 
I just can't wrap my head around why people keep repeating this (clearly false) claim. My GTX 980 can't run Ryse, The Witcher 3, GTA V, Dying Light, Assassin's Creed Unity, and a few other games at a completely locked 1080p/60FPS if you want max settings and a reasonable amount of AA. That's TODAY. With TODAY'S games. What happens when tomorrow's games start to arrive? Star Wars Battlefront? Cyberpunk 2077? Rise of the Tomb Raider? Arkham Knight? Star Citizen? Etc.?

So when someone asks what card to get that will be great for years to come, I don't think I could ever say that ANY single card on the market would be overkill at this point.

Go for a Fury X or a GTX 980Ti. I'm sure you won't regret it.

Right, but tomorrow's games are still multiplatform titles limited by console hardware. Chasing your monitor's resolution and a locked 60fps is a terrible game to put yourself through.

My MAIN concern personally is whether the card I buy today outpaces consoles in future games the way it does in today's games (speaking strictly multiplatform).

By that measure, my 670 has been one of the greatest investments in years (majorly outclassing the 360/PS3 and still outdoing these new consoles in multiplats by a great margin at BETTER settings). I don't anticipate that changing anytime soon, if ever.

Get a 970/980 or Fiji and you'll find settings that are MORE than acceptable while maintaining really high framerates. Ultra settings vary so wildly from game to game that you'll never be happy using them as a baseline.

My two cents (which, if you're wealthy and/or need more than 60fps, you can rightly ignore and continue to chase the carrot).

Edit: added 60fps group
 
Taken from OCUK forums

Having looked at a bunch of reviews now, there is a clear trend.

Sites with a dodgy reputation that have been shown to distort tests in the past in NVIDIA's favour are using the older Catalyst 15.5Beta drivers rather than the new Catalyst 15.15Beta drivers (3xx launch drivers).

There's a large disparity in performance.

15.5 shows the 390X (and 3xx series generally) as being slower than the 970 most of the time below 4K, and still slower than the 980 at 4K.

15.15 shows the 390X smashing the 970 at 1080p in most titles, and beating the 980 badly at 1440p and above.

The editorial spin on the cards, using these obviously distorted results as justification, is equally disparate.

Best exhaustive test I've seen so far is at Hardware Info NL - http://nl.hardware.info/reviews/613...et-bestaande-chips-benchmarks-alien-isolation -- 15.15 3xx launch drivers

Compare that with the ******** at Tom's, for example ... -- older 15.5 drivers
 
I just can't wrap my head around why people keep repeating this (clearly false) claim. My GTX 980 can't run Ryse, The Witcher 3, GTA V, Dying Light, Assassin's Creed Unity, and a few other games at a completely locked 1080p/60FPS if you want max settings and a reasonable amount of AA. That's TODAY. With TODAY'S games. What happens when tomorrow's games start to arrive? Star Wars Battlefront? Cyberpunk 2077? Rise of the Tomb Raider? Arkham Knight? Star Citizen? Etc.?

So when someone asks what card to get that will be great for years to come, I don't think I could ever say that ANY single card on the market would be overkill at this point.

Go for a Fury X or a GTX 980Ti. I'm sure you won't regret it.

I guess it depends on how long you plan on keeping these cards. I personally think anyone paying $600+ for a card will probably always be on the lookout for the next best thing, so when Pascal cards hit, they'll be dropping these cards fast. I also think the resale value of the 980 and 980ti won't be that great next year. Going with the best this year may not be the most cost efficient when we know what's coming.
 
I also think the resale value of the 980 and 980ti won't be that great next year. Going with the best this year may not be the most cost efficient when we know what's coming.

It never is. In my experience, you can get back about 60% of what you originally paid if you sell it the next year. For the cards coming in 2016, the situation will probably be the same.
 
Also, for those with the budget, I think AMD will be releasing their 8GB Fury X2 (or Fury 'Maxx') probably later this year.

So, the rumored "Bermuda" card is for the most part true. The card with two Fiji XT GPUs, regardless of official product name and codename.

articles with rumored dual-Fiji Bermuda card:
http://videocardz.com/54858/amd-rad...tion-395x2-bermuda-390x-fiji-and-380x-grenada
http://www.guru3d.com/news-story/am...is-380xfiji-is-390x-and-bermuda-is-395x2.html
http://wccftech.com/nvidia-gm200-titan-2-amd-fiji-380x-bermuda-390x-benchmarked/

And now:

[Updated] AMD intros Project Quantum – Dual GPU ‘Fiji XT Powered’ 4K Powerhouse in Tiny Form Factor

http://wccftech.com/amd-intros-project-quantum-powered-dual-fiji-chip/


AMD Reveals Dual Fiji Board, World’s Fastest Graphics Card – 17 TERAFLOPS Small Form Factor Behemoth

http://wccftech.com/amd-dual-fiji-fury-graphics-card/

Amazingly, the dual Fiji graphics card is actually no longer than a single Fury X graphics card. AMD described this board as “two Fury X cards” in one. Each Fury X graphics card has a computing power of 8.6 TERAFLOPS, making the dual GPU board a 17+ TERAFLOP number-crunching supercomputer.
Project Quantum, pictured above, is a small form factor PC powered by this dual Fiji board. Project Quantum is a console-sized box, but the similarities end there, because this mini supercomputer is over 9 times more powerful than a PS4 and a whopping 13 times more powerful than an XBOX ONE. And all of this oomph is enabled by the dual Fiji board pictured below.

[Image: AMD dual Fiji board]
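For anyone wanting to sanity-check those multipliers, here's a quick back-of-the-envelope sketch in Python. The console figures are my own assumptions (the commonly cited ~1.84 TFLOPS for the PS4 GPU and ~1.31 TFLOPS for the Xbox One GPU), not numbers from the article:

# Rough FLOPS math behind the "two Fury X cards in one" claim (all figures approximate)
fury_x_tflops = 8.6                      # Fury X compute, as quoted above
dual_fiji_tflops = 2 * fury_x_tflops     # ~17.2 TFLOPS for the dual-Fiji board

ps4_tflops = 1.84                        # assumed, commonly cited PS4 GPU figure
xbox_one_tflops = 1.31                   # assumed, commonly cited Xbox One GPU figure

print(f"vs PS4:      {dual_fiji_tflops / ps4_tflops:.1f}x")       # ~9.3x
print(f"vs Xbox One: {dual_fiji_tflops / xbox_one_tflops:.1f}x")  # ~13.1x

That lines up with the "over 9 times" and "13 times" claims in the article, give or take rounding.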
 
So, the rumored "Bermuda" card is for the most part true. The card with two Fiji XT GPUs, regardless of official product name and codename.

articles with rumored dual-Fiji Bermuda card:
http://videocardz.com/54858/amd-rad...tion-395x2-bermuda-390x-fiji-and-380x-grenada
http://www.guru3d.com/news-story/am...is-380xfiji-is-390x-and-bermuda-is-395x2.html
http://wccftech.com/nvidia-gm200-titan-2-amd-fiji-380x-bermuda-390x-benchmarked/

And now:

[Updated] AMD intros Project Quantum – Dual GPU ‘Fiji XT Powered’ 4K Powerhouse in Tiny Form Factor

http://wccftech.com/amd-intros-project-quantum-powered-dual-fiji-chip/


AMD Reveals Dual Fiji Board, World’s Fastest Graphics Card – 17 TERAFLOPS Small Form Factor Behemoth

http://wccftech.com/amd-dual-fiji-fury-graphics-card/

Yep, that dual Fiji Fury card is going to be an absolute monster, the likes of which have never been seen before, if the 295X2 is anything to go by. It'll drink power like a monster too. The 295X2 is still the most powerful card on the market to this day.

Interestingly, when the Fury X2 releases, AMD will probably have the three most powerful enthusiast GPUs in the world, in order:

1. Fury X2
2. R9 295X2
3. Fury X
4. Titan X
5. 980 Ti

Pack it up Nvidia, it's all over
:P
 
I guess it depends on how long you plan on keeping these cards. I personally think anyone paying $600+ for a card will probably always be on the lookout for the next best thing, so when Pascal cards hit, they'll be dropping these cards fast. I also think the resale value of the 980 and 980ti won't be that great next year. Going with the best this year may not be the most cost efficient when we know what's coming.

Well, your resale is much better the faster you sell it. My bet would be the 980ti will still fetch ~$400-500 when the next Nvidia high end cards come out. 980s are still selling for ~$400 on eBay. If you hold on to the card for 2+ years, you aren't going to get a whole lot back. Financially it is pretty much a wash whether you hold on to stuff forever or upgrade as soon as possible. Might as well always have the newest tech instead of holding onto stuff forever. Short-term upgrades essentially amount to a product rental. People selling their 980s got to use a 980 for roughly a year, and only spent $100-$150 to do it.

So if you buy a $600 card and hold onto it for 5 years until it's worth basically nothing, you are spending the same amount annually, but for most of the time you have the card your performance will be shit compared to people on the rental program.
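To make the "rental program" point concrete, here's a rough sketch of the math, using illustrative numbers (the $600 purchase price from above and an assumed ~$450 resale after one year; actual resale values will vary):

# Annual cost of flipping a high-end GPU every year vs. holding it for 5 years.
# Prices are illustrative assumptions, not actual market data.
purchase_price = 600.0

# "Rental program": sell after one year at an assumed ~$450 and buy the new card
resale_after_one_year = 450.0
annual_cost_flipping = purchase_price - resale_after_one_year   # $150/year

# Hold the same card for 5 years until it's worth essentially nothing
annual_cost_holding = purchase_price / 5                        # $120/year

print(f"Flip yearly:  ${annual_cost_flipping:.0f}/year")
print(f"Hold 5 years: ${annual_cost_holding:.0f}/year")

Roughly comparable annual spend, but the yearly upgrader is on current-generation hardware the whole time, which is the point above.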
 
You have to factor in that DirectX 12 is also launching this year.
I think DX12 cards will get a second speed bump along the way making them viable past 2016 without the need to upgrade to HBM2 cards just yet.
GPUs won't get significantly faster in GPU-limited scenarios because of DX12.
 
Yep, that dual Fiji Fury card is going to be an absolute monster, the likes of which have never been seen before, if the 295X2 is anything to go by. It'll drink power like a monster too. The 295X2 is still the most powerful card on the market to this day.

Interestingly, when the Fury X2 releases, AMD will probably have the three most powerful enthusiast GPUs in the world, in order:

1. Fury X2
2. R9 295X2
3. Fury X
4. Titan X
5. 980 Ti

Pack it up Nvidia, it's all over
:P

They are stupid for not releasing a full Tonga 4GB chip for the mainstream market. That's where all the money is: people buying 960s and 970s. The full Tonga is a no-brainer, and they don't release it. Dumb.
 
Hopefully dual Fiji gives AMD an incentive to push developers towards unifying memory pools in DX12. All that power going to waste with a 4GB memory limit would be a shame.

I wonder if nVidia will bother to respond to this. Dual GM200 would be such a power hungry beast.
 
They are stupid for not releasing a full Tonga 4GB chip for the mainstream market. That's where all the money is: people buying 960s and 970s. The full Tonga is a no-brainer, and they don't release it. Dumb.

Yeah, I read something or other about Apple being the reason they haven't released a full Tonga chip card, because it's being used in the newest MacBook Pro or something. Or I could have got that completely wrong.

I do think the R9 Nano could definitely be their ace, even if it will be priced a little higher than cards in more mainstream price brackets like the 970 and 390. If they drop the Nano at $450, it'll sell a ton.
 
Yeah, I read something or other about Apple being the reason they haven't released a full Tonga chip card, because it's being used in the newest MacBook Pro or something. Or I could have got that completely wrong.

I do think the R9 Nano could definitely be their ace, even if it will be priced a little higher than cards in more mainstream price brackets like the 970 and 390. If they drop the Nano at $450, it'll sell a ton.

I imagine the long-term plan for the Nano will have it in the sub-$400 bracket. Maybe not the first revision, and definitely not in 2015, but just like we have re-spun Hawaii products, we will definitely have re-spun HBM 1.0 products. The Nano seems like a really solid choice to be an amazing mid-tier card in the $300-$400 range down the road.
 
I imagine the long-term plan for the Nano will have it in the sub-$400 bracket. Maybe not the first revision, and definitely not in 2015, but just like we have re-spun Hawaii products, we will definitely have re-spun HBM 1.0 products. The Nano seems like a really solid choice to be an amazing mid-tier card in the $300-$400 range down the road.

Agree totally. Low power, low noise, tiny form-factor, HBM, Fiji performance. They just need to fucking get it released!
 
I do think the R9 Nano could definitely be their ace, even if it will be priced a little higher than cards in more mainstream price brackets like the 970 and 390. If they drop the Nano at $450, it'll sell a ton.

It's going to be awesome when HBM cards eventually fill out the full lineup in the next year or so.

It will be glorious. Small HBM & 16nm cards, small cards everywhere.
 