AMD Radeon Fury X Series | HBM, Small Form Factor And Water Cooling | June 16th

sorry to keep wasting page space with this question guys, but when are we expecting these to release?

[i read a while back about "a few weeks from now", but that clearly hasn't materialized]

edit: okay, followed coloumb's first link, looks like "less than two weeks" now - fingers crossed #tothemax

Just under two weeks possibly for the 980 Ti. And between 2-4 weeks for the 390X.
 
Well, looks like these two cards are going to go head-to-head, releasing within weeks of each other and with similar price and performance. You'll just have to choose between a smaller form factor 390X (it will have a new name) with HBM or a 980 Ti with more memory, albeit 6GB GDDR5.

First real pic of 980 Ti leaked today as well:

[Image: leaked GeForce GTX 980 Ti (Maxwell GM200)]


http://wccftech.com/nvidia-geforce-...powered-cost-effective-maxwell-graphics-card/

Fight! Fight! Fight!
 
That's one fucking game, man. And if the bench was done at 1440p, the 290x would pull ahead of the 970 and close the gap with the 980.

Could you please stop with this bullshit about 290X always being faster than 970? Here's a lot of games for you. Most of the time 970 leads and 4K is the only place where 290X is ahead - both cards are providing unplayable framerates in 4K anyway so it really doesn't matter. The cards are pretty much equal on average.
 
[Benchmark chart]


It will?

Sure, that's just one game, but AMD continues to lack support when it's needed most: when the biggest games of the year come out.

Two questions about this chart: one, does it include HairWorks? Because that kills the 290x by default (you can set up a profile in AMD's Catalyst Control Center to override the tessellation settings, and that'll help out a ton). And two, AMD will be releasing the 15.5 drivers, which they stated will include optimizations for The Witcher III.
 
Kepler performance in P:Cars and W3 shows we desperately need competition in the desktop GPU market. Hopefully AMD performs, but I'm not really holding my breath.
 
Has it been confirmed whether the $849 price is true?

No, the source that leaked this isn't as solid as the other leaks, so obviously this may be wide of the mark.

Interesting though that Nvidia releasing the 980 Ti just before the 390X pits them squarely against each other and in turn pairs the two cards together as points of comparison. If indeed the 390X does only have 4GB HBM, it is a better scenario to be compared to a card with 6GB memory than one with 12 (the Titan X). Marketing-wise, this helps AMD slightly.
 
There is absolutely nothing wrong with AMD drivers; in fact, they continue to optimize older cards for much longer than Nvidia does. A 290x will outperform a 970, forever.

In the benchmarks I've seen at 1080p, the GTX 970 beats the 290x almost every time.
 
Frankly, that's what I find most appalling. The cards aren't even two years old (and cost their users a ton of money) and nVidia already dumped support for them faster than a loaded diaper.

Support and heavy driver-side optimizations are not the same thing. Kepler is supported by NV in full.

There is still no evidence of bad driver optimization in these cases. It may be that the game code itself doesn't work that well on Kepler.

Why don't you bug the guys who produced the code/games in the first place? They were obviously running the games on Kepler hardware and should've known that the performance isn't up to standard. Why is it NV's fault that some 3rd-party code performs badly on their hardware?

People are too fast at jumping to conclusions.
 
Could you please stop with this bullshit about 290X always being faster than 970? Here's a lot of games for you. Most of the time 970 leads and 4K is the only place where 290X is ahead - both cards are providing unplayable framerates in 4K anyway so it really doesn't matter.

Lol stop with the complete and utter lies about the 970 being faster than a 290x. The Witcher 3 being poorly optimized is CDPR's fault, not AMD's. http://www.gamersnexus.net/game-bench/1947-witcher-3-pc-graphics-card-fps-benchmark

We aren't even bringing up the 970's stuttering issues because of its 3.5GB of fast RAM and 0.5GB of slow RAM.

The 970 is faster in some games at 1080p, sure, but the 290x is faster in most. I'll dig up the benchmark suites and post them. The 970 is a great 1080p card. The 290x is a great 1440p+ card. And if you aren't blinded by trying to run everything at ultra at 4K like an idiot, the 290x does a great job of hitting 40-60fps with a mixture of settings.
 
No. Build it with a bigger PCB, and add more stacks. Maybe you can even fit more stacks in the reference design.

You know that the HBM design puts the memory in a completely different space than with GDDR5, right? You must know at this point because it's been told to you every time you keep bringing up that quote.

Lol stop with the complete and utter lies about the 970 being faster than a 290x.

It's not lies. Look up Anandtech's review of the 970. They tested 9 games, and the 290x at best kept up in 7 of them.
 
What kind of clocks do you run on your cards? State the base clocks if you know them too please.

Base clocks. So 1080MHz on the 290x; I forget the clocks on the 970s. I've got an ASUS Strix, which I think is my fastest 970, a Gigabyte 970, and a Gainward (the one with removable fans).

I have a lot of 970s because the 290x was out of stock for a time here in Australia.

My 290x does not overclock well at all. It artifacts like crazy with a +50MHz OC, so I just leave it. Really can't be bothered overclocking the rest. They all run with an i7 4790 that boosts to 3.8GHz.
 
You know that the HBM design puts the memory in a completely different space than with GDDR5, right? You must know at this point because it's been told to you every time you keep bringing up that quote.

Did you ever read the quote? He says of course you can build more or fewer stacks. I keep bringing up the quote because it seems like people aren't reading it.
 
Lol stop with the complete and utter lies about the 970 being faster than a 290x. The Witcher 3 being poorly optimized is CDPR's fault, not AMD's. http://www.gamersnexus.net/game-bench/1947-witcher-3-pc-graphics-card-fps-benchmark

We aren't even bringing up the 970's stuttering issues because of its 3.5GB of fast RAM and 0.5GB of slow RAM.

The 970 is faster in some games at 1080p, sure, but the 290x is faster in most. I'll dig up the benchmark suites and post them. The 970 is a great 1080p card. The 290x is a great 1440p+ card. And if you aren't blinded by trying to run everything at ultra at 4K like an idiot, the 290x does a great job of hitting 40-60fps with a mixture of settings.

Did you even try clicking on the link I've provided?
 
Lol stop with the complete and utter lies about the 970 being faster than a 290x. The Witcher 3 being poorly optimized is CDPR's fault, not AMD's. http://www.gamersnexus.net/game-bench/1947-witcher-3-pc-graphics-card-fps-benchmark

We aren't even bringing up the 970's stuttering issues because of its 3.5GB of fast RAM and 0.5GB of slow RAM.

The 970 is faster in some games at 1080p, sure, but the 290x is faster in most. I'll dig up the benchmark suites and post them. The 970 is a great 1080p card. The 290x is a great 1440p+ card. And if you aren't blinded by trying to run everything at ultra at 4K like an idiot, the 290x does a great job of hitting 40-60fps with a mixture of settings.

I did that with this review and its benches:

http://www.techpowerup.com/reviews/MSI/GTX_970_Gaming/1.html

At 1080p: 13-6 Nvidia
At 1440p: 10-9 Nvidia
At 4K: 12-7 AMD

The 290x beats the 970 only in the 4K setups, probably because the 3.5GB issue starts to kick in. So nope, the 290x doesn't perform better than the 970 "forever".
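
(For clarity, tallies like the ones above are just per-game win counts. Here's a minimal sketch of the counting, with made-up FPS numbers; the real figures are in the TechPowerUp link:)

```python
# Made-up per-game average FPS pairs: (GTX 970, R9 290X). Real numbers
# are in the TechPowerUp review linked above.
results_1080p = {
    "The Witcher 3": (54.1, 51.9),
    "GTA V": (61.3, 57.8),
    "Battlefield 4": (78.2, 80.5),
    # ... one entry per game in the review suite
}

def tally(results):
    """Count per-game wins for each card (ties count for neither)."""
    nv = sum(1 for nv_fps, amd_fps in results.values() if nv_fps > amd_fps)
    amd = sum(1 for nv_fps, amd_fps in results.values() if amd_fps > nv_fps)
    return nv, amd

nv_wins, amd_wins = tally(results_1080p)
print(f"1080p: {nv_wins}-{amd_wins}")  # "1080p: 2-1" for the sample data
```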
 
Base clocks. So 1080MHz on the 290x; I forget the clocks on the 970s. I've got an ASUS Strix, which I think is my fastest 970, a Gigabyte 970, and a Gainward (the one with removable fans).

I have a lot of 970s because the 290x was out of stock for a time here in Australia.

My 290x does not overclock well at all. It artifacts like crazy with a +50MHz OC, so I just leave it. Really can't be bothered overclocking the rest. They all run with an i7 4790 that boosts to 3.8GHz.

It would be interesting to know, since I believe a 970 at 1400MHz would likely outpace the 290X in the majority of situations. I base this on the fact that my 1500MHz 970 beats/matches a stock 980. And although that's a bit above average (I'd say 1400MHz is more typical), it's still quite a good achievement. On top of that, I run it in an ITX chassis with all fans at 7V, and my MSI Gamer 970 on the stock fan curve. It barely hits 65C; GTA V can be running passively 30% of the time.

My base clocks:
1140MHz core (boost clock: 1279MHz) (OC Mode)
1114MHz core (boost clock: 1253MHz) (Gaming Mode)
7000MHz memory
1.21V

My overclock:
1545MHz core
7800MHz memory
1.23V

Obviously, with the unique memory architecture, I fear for the future.

If AMD gets their act together on TDP, it would be game on.
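
(Putting rough numbers on that OC headroom; this is just arithmetic on the clocks quoted above:)

```python
# Arithmetic on the clocks quoted above (OC Mode base -> achieved overclock)
base_core, oc_core = 1140, 1545   # MHz
base_mem, oc_mem = 7000, 7800     # MHz effective

core_gain = (oc_core / base_core - 1) * 100
mem_gain = (oc_mem / base_mem - 1) * 100
print(f"Core: +{core_gain:.1f}%, memory: +{mem_gain:.1f}%")
# Core: +35.5%, memory: +11.4%
```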
 
No. Build it with a bigger PCB, and add more stacks. Maybe you can even fit more stacks in the reference design.

You know that the HBM design puts the memory in a completely different space than with GDDR5, right? You must know at this point because it's been told to you every time you keep bringing up that quote.


That's the difference Boombox On is talking about.
The memory doesn't seem to have anything to do with the PCB; it's placed on the GPU package instead.

[Images: HBM package/interposer shots]

Did you ever read the quote? He says of course you can build more or fewer stacks. I keep bringing up the quote because it seems like people aren't reading it.

From these pictures, adding a stack seems like a pretty advanced thing to do, as opposed to just dropping a higher-capacity GDDR5 chip into the designated spots on the PCB.

Seems like something that has to be part of the initial design.
As Nikodemos said on the previous page, HBM2 might be the solution, hence why I said "later" in my previous post.
 
It would be interesting to know, since I believe a 970 at 1400MHz would likely outpace the 290X in the majority of situations. I base this on the fact that my 1500MHz 970 beats/matches a stock 980. And although that's a bit above average (I'd say 1400MHz is more typical), it's still quite a good achievement. On top of that, I run it in an ITX chassis with all fans at 7V, and my MSI Gamer 970 on the stock fan curve. It barely hits 65C; GTA V can be running passively 30% of the time.

My base clocks:
1140MHz core (boost clock: 1279MHz) (OC Mode)
1114MHz core (boost clock: 1253MHz) (Gaming Mode)
7000MHz memory
1.21V

My overclock:
1545MHz core
7800MHz memory
1.23V

Obviously, with the unique memory architecture, I fear for the future.

If AMD gets their act together on TDP, it would be game on.

Yeah, it would. 1080MHz on a 290x is a high overclock. I've had a few Gigabyte 290xs shit themselves trying to push 1100MHz. The memory OC on a 290x makes no difference in my experience.

Edit: the cards didn't die, they just couldn't run at full load without artifacting heavily.

It's amazing how much room for OC is built into Nvidia's cards.
 
Yeah, it would. 1080MHz on a 290x is a high overclock. I've had a few Gigabyte 290xs shit themselves trying to push 1100MHz. The memory OC on a 290x makes no difference in my experience.

Edit: the cards didn't die, they just couldn't run at full load without artifacting heavily.

It's amazing how much room for OC is built into Nvidia's cards.

Of course it's luck dependent, and the cheaper brands, such as Zotac and PNY, didn't always OC as well. But the Gigabyte G1s tend to hit 1500MHz on average. The reason has to do with the phase design, the VRM design, and the VRAM location (on MSI cards the rear of the PCB had no cooling). And the lower TDP helps, as heat causes excessive current draw and spirals into instability faster. That's kind of the reason I stayed away from AMD, as I was building an ITX box.

All irrelevant if you're not overclocking, though. Then a non-reference 290X with a decent cooler would probably be just as good, or better for the future.
 
That's the difference Boombox On is talking about.
The memory doesn't seem to have anything to do with the PCB; it's placed on the GPU package instead.

[Images: HBM package/interposer shots]

From these pictures, adding a stack seems like a pretty advanced thing to do, as opposed to just dropping a higher-capacity GDDR5 chip into the designated spots on the PCB.

Seems like something that has to be part of the initial design.
As Nikodemos said on the previous page, HBM2 might be the solution, hence why I said "later" in my previous post.

http://wccftech.com/fiji-xt-limited-4gb-memory/

2.5D Interposer is the solution
 
I did that with this review and its benches:

http://www.techpowerup.com/reviews/MSI/GTX_970_Gaming/1.html

At 1080p: 13-6 Nvidia
At 1440p: 10-9 Nvidia
At 4K: 12-7 AMD

The 290x beats the 970 only in the 4K setups, probably because the 3.5GB issue starts to kick in. So nope, the 290x doesn't perform better than the 970 "forever".

It's even simpler than that: the 290X has more bandwidth. This is starting to show at 4K, where memory bandwidth, not shading, becomes the main limiting factor. The 970 is running on a 256-bit (well, 224-bit, even) bus while the 290X has a full 512-bit memory controller, more than twice as wide. It's no wonder the 290X is getting ahead at 4K. I'd say it's a wonder how well the 970 and 980 perform considering their 256-bit memory buses.
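
(If you want to sanity-check the bandwidth claim: peak theoretical bandwidth is just bus width times effective data rate. A quick sketch using the public spec-sheet rates; treat the exact figures as assumptions:)

```python
# Peak theoretical bandwidth = bus width (bits) / 8 * effective data rate (Gbps)
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Return peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(f"R9 290X (512-bit, 5Gbps):           {peak_bandwidth_gbs(512, 5.0):.0f} GB/s")  # 320
print(f"GTX 970 (256-bit, 7Gbps, full bus): {peak_bandwidth_gbs(256, 7.0):.0f} GB/s")  # 224
print(f"GTX 970 (224-bit fast partition):   {peak_bandwidth_gbs(224, 7.0):.0f} GB/s")  # 196
```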


An interposer is always 2.5D. There is no interposer in 3D stacking, because in that case the memory chips are stacked on top of the main chip itself.
 
It's even simpler than that: the 290X has more bandwidth. This is starting to show at 4K, where memory bandwidth, not shading, becomes the main limiting factor. The 970 is running on a 256-bit (well, 224-bit, even) bus while the 290X has a full 512-bit memory controller, more than twice as wide. It's no wonder the 290X is getting ahead at 4K. I'd say it's a wonder how well the 970 and 980 perform considering their 256-bit memory buses.

An interposer is always 2.5D. There is no interposer in 3D stacking, because in that case the memory chips are stacked on top of the main chip itself.

Isn't it irrelevant when neither card is really pushing anywhere close to 60fps at such resolutions, even using console-type settings? For me, that's not really a playable range. Better off waiting at least a year before these games can be played with High settings (or at least presets above the consoles') at 4K or even 1600p. These cards are still 1080p60 GPUs to me, and the 970 does fine there.
 
The fact that the 290x is keeping up with a 970 in a GameWorks game (W3) that had a day-one driver is a testament to both the card and the drivers. This is a card that, when launched, was a direct competitor to the 780 Ti, and it's now cheaper than a 970.
 
So if 4GB isn't enough which is what I'm hearing, then an 8GB 290X should BTFO the 970, 980, 980Ti, and possibly the 390X right? More VRAM than the lot of them and a bigger bus than the former 3.
 
The fact that the 290x is keeping up with a 970 in a GameWorks game (W3) that had a day-one driver is a testament to both the card and the drivers. This is a card that, when launched, was a direct competitor to the 780 Ti, and it's now cheaper than a 970.

Yep. I don't understand why people who own a 970 feel such a need to fight against the 290x.
 
So if 4GB isn't enough which is what I'm hearing, then an 8GB 290X should BTFO the 970, 980, 980Ti, and possibly the 390X right? More VRAM than the lot of them and a bigger bus than the former 3.

The truth is that many manufacturers put more capacity in their designs to circumvent bandwidth limitations. We actually don't know that 4GB isn't enough for 4K, since the bandwidth is so much higher out of the gate.
 
So if 4GB isn't enough which is what I'm hearing, then an 8GB 290X should BTFO the 970, 980, 980Ti, and possibly the 390X right? More VRAM than the lot of them and a bigger bus than the former 3.

Not really, since in playable scenarios the 4GB of VRAM isn't really a limit even today. I would define playable as 1080p60 in this case.
GTA V was my biggest worry with the 970, and it's fine; same with Witcher 3.
Batman will be the next one to look out for.
 
Nobody is fighting the 290x; it's a great card. People are just calling out your BS statements about its performance.

http://www.techpowerup.com/reviews/Gigabyte/GTX_960_OC/27.html

Here is a performance summary across 19 of the biggest games from TechPowerUp, dated April 28, 2015. The 290x and 970 are exactly equal at 1080p, at 1440p the 290x is 8% better, and at 4K a full 10% better.

It is not BS that the 290x outperforms a 970, and that's what everyone seems to have a problem with. All the benchmarks people have posted are old, and most likely were comparing the 970 to the launch benchmarks of the 290x, because most review sites are lazy: they just pull the numbers from their previous benchmarks and only actually test the card they are reviewing.

The bullshit is that people keep pushing how their precious midrange 970s are better than a 290x, and that simply isn't true on many levels. The benchmarks don't support it, its 3.5GB gimped memory is a major knock, and so is its limited bus width. The 290x is a 1.5-year-old high-end card, and the 970 is a 9-month-old mid-range card. It shouldn't be so hard to accept that the 290x is the better-performing card.
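
(For anyone wondering how a summary percentage like TPU's is built: it's typically per-game FPS normalized to one card and then averaged. A minimal sketch with made-up FPS values; TPU's exact method may differ:)

```python
# Made-up per-game FPS: {game: {card: fps}}
fps = {
    "The Witcher 3": {"GTX 970": 54.0, "R9 290X": 53.0},
    "GTA V":         {"GTX 970": 61.0, "R9 290X": 58.0},
    "Battlefield 4": {"GTX 970": 78.0, "R9 290X": 81.0},
}

def relative_perf(fps, baseline="GTX 970"):
    """Average each card's per-game FPS ratio against a baseline card."""
    cards = {card for scores in fps.values() for card in scores}
    summary = {}
    for card in cards:
        ratios = [scores[card] / scores[baseline] for scores in fps.values()]
        summary[card] = 100 * sum(ratios) / len(ratios)
    return summary

for card, score in sorted(relative_perf(fps).items()):
    print(f"{card}: {score:.1f}%")  # baseline card = 100%
```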
 
Well, looks like these two cards are going to go head-to-head, releasing within weeks of each other and with similar price and performance. You'll just have to choose between a smaller form factor 390X (it will have a new name) with HBM or a 980 Ti with more memory, albeit 6GB GDDR5.

First real pic of 980 Ti leaked today as well:

[Image: leaked GeForce GTX 980 Ti (Maxwell GM200)]


http://wccftech.com/nvidia-geforce-...powered-cost-effective-maxwell-graphics-card/

Awesome. Just what I've been holding out for. I can see them being over $1000 AUD though. If it forces 980 prices to drop significantly then I might just grab a few of those instead...decisions.
 
http://www.techpowerup.com/reviews/Gigabyte/GTX_960_OC/27.html

Here is a performance summary across 19 of the biggest games from TechPowerUp, dated April 28, 2015. The 290x and 970 are exactly equal at 1080p, at 1440p the 290x is 8% better, and at 4K a full 10% better.

It is not BS that the 290x outperforms a 970, and that's what everyone seems to have a problem with. All the benchmarks people have posted are old, and most likely were comparing the 970 to the launch benchmarks of the 290x, because most review sites are lazy: they just pull the numbers from their previous benchmarks and only actually test the card they are reviewing.

The bullshit is that people keep pushing how their precious midrange 970s are better than a 290x, and that simply isn't true on many levels. The benchmarks don't support it, its 3.5GB gimped memory is a major knock, and so is its limited bus width. The 290x is a 1.5-year-old high-end card, and the 970 is a 9-month-old mid-range card. It shouldn't be so hard to accept that the 290x is the better-performing card.

290X is a great card.

If the 380X (not the 390X with HBM) is a re-badged 290X but around 5-8% faster, runs more efficiently, and has an (almost certain) 8GB GDDR5 version, I will buy one.
 
http://www.techpowerup.com/reviews/Gigabyte/GTX_960_OC/27.html

Here is a performance summary across 19 of the biggest games from TechPowerUp, dated April 28, 2015. The 290x and 970 are exactly equal at 1080p, at 1440p the 290x is 8% better, and at 4K a full 10% better.

It is not BS that the 290x outperforms a 970, and that's what everyone seems to have a problem with. All the benchmarks people have posted are old, and most likely were comparing the 970 to the launch benchmarks of the 290x, because most review sites are lazy: they just pull the numbers from their previous benchmarks and only actually test the card they are reviewing.

The bullshit is that people keep pushing how their precious midrange 970s are better than a 290x, and that simply isn't true on many levels. The benchmarks don't support it, its 3.5GB gimped memory is a major knock, and so is its limited bus width. The 290x is a 1.5-year-old high-end card, and the 970 is a 9-month-old mid-range card. It shouldn't be so hard to accept that the 290x is the better-performing card.

Hope you realize that those benches are with a reference 970, which probably nobody owns except the ones handed to reviewers. A non-reference 970, like the one I took my sample from, performs better than the 290x at 1080p and barely at 1440p. Non-reference 290xs only pack better cooling and power consumption. Also, the 970 OCs like a mofo, which widens the difference even more; you can't say that about the 290x because there's no room left to OC.
 
It's even simpler than that: the 290X has more bandwidth. This is starting to show at 4K, where memory bandwidth, not shading, becomes the main limiting factor. The 970 is running on a 256-bit (well, 224-bit, even) bus while the 290X has a full 512-bit memory controller, more than twice as wide. It's no wonder the 290X is getting ahead at 4K. I'd say it's a wonder how well the 970 and 980 perform considering their 256-bit memory buses.

An interposer is always 2.5D. There is no interposer in 3D stacking, because in that case the memory chips are stacked on top of the main chip itself.

OK... does what you're saying invalidate what WCCFtech is saying about AMD possibly making an 8GB card?
 
As for the intent of this thread: if the makers of GDDR3 and GDDR5 now say that 4GB of their new creation, HBM, is not a bottleneck for its intended target, whether through driver optimizations or whatever, I'm going to give them the benefit of the doubt.

Can't wait for late June to roll around.
 
Hope you realize that those benches are with a reference 970, which probably nobody owns except the ones handed to reviewers.

That is how things should be benched; there are way too many variables reviewing one OC'd card versus another OC'd card. Which OC'd card? How big is the OC? On the core? On the memory? Etc. Also, not everyone wants an overclocked card. A 290x can also be OC'd, so I don't see your point. Mine is OC'd 100MHz on the core and 400MHz on the memory.

There are plenty of reference clocked 970s that have been sold as well. Every single board partner has a reference clocked card.
 
As for the intent of this thread: if the makers of GDDR3 and GDDR5 now say that 4GB of their new creation, HBM, is not a bottleneck for its intended target, whether through driver optimizations or whatever, I'm going to give them the benefit of the doubt.

Can't wait for late June to roll around.

Yeah, I mean, this is something that can be very easily tested in benchmarks. If AMD is lying then we'll know before the cards are even out.
 
Could you please stop with this bullshit about 290X always being faster than 970? Here's a lot of games for you. Most of the time 970 leads and 4K is the only place where 290X is ahead - both cards are providing unplayable framerates in 4K anyway so it really doesn't matter. The cards are pretty much equal on average.
290X and GTX 970 are essentially the same in terms of performance.

Here is TPU's performance summary from the Titan X review.

1080p: [Chart: TPU relative performance summary at 1920x1080]

1440p: [Chart: TPU relative performance summary at 2560x1440]
 
That's the difference Boombox On is talking about.
The memory doesn't seem to have anything to do with the PCB; it's placed on the GPU package instead.

You're right. Up until now, AIBs could add more RAM by either using higher-density chips or doubling them up in clamshell mode. With HBM that won't be the case, because they just get the package with the GPU, interposer, and RAM included. They can't just add more, because that'd require a completely redesigned interposer and possibly a new GPU too.

They've been vague about whether it will be four or eight stacks, but my guess is that it's going to be one or the other. I don't think there's an easy way to offer both 4GB and 8GB versions without doing a complete (and costly) overhaul.
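
(To put numbers on the four-vs-eight-stack question, here's a quick sketch using first-gen HBM's published per-stack figures, 1GB and a 1024-bit bus at 1Gbps per pin; treat these as assumptions:)

```python
# First-gen HBM, per-stack figures (published specs; treat as assumptions)
GB_PER_STACK = 1            # capacity per stack
BUS_BITS_PER_STACK = 1024   # bus width per stack
DATA_RATE_GBPS = 1.0        # 500MHz DDR -> 1Gbps per pin

def hbm_config(stacks):
    """Return (capacity in GB, peak bandwidth in GB/s) for a stack count."""
    capacity = stacks * GB_PER_STACK
    bandwidth = stacks * BUS_BITS_PER_STACK / 8 * DATA_RATE_GBPS
    return capacity, bandwidth

for stacks in (4, 8):
    cap, bw = hbm_config(stacks)
    print(f"{stacks} stacks: {cap} GB, {bw:.0f} GB/s")
# 4 stacks: 4 GB, 512 GB/s
# 8 stacks: 8 GB, 1024 GB/s
```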
 
I don't see anyone backing down from the 970 vs 290x argument. As for the main topic: AMD rolled with Hawaii XT and Hawaii Pro cards last time for the 290X and 290 respectively, so we could see a Fiji Pro 390 too, maybe one with 8GB and another with 4GB? Though given how fast HBM is, I'm not sure how much it really matters.
 
http://www.techpowerup.com/reviews/Gigabyte/GTX_960_OC/27.html

Here is a performance summary across 19 of the biggest games from TechPowerUp, dated April 28, 2015. The 290x and 970 are exactly equal at 1080p, at 1440p the 290x is 8% better, and at 4K a full 10% better.

The 290x is nowhere to be found in any of those 19 tests. They just randomly plopped it in as an average from... somewhere, presumably.

The GPUs in the Perf Summary were all tested. They only bothered to show a handful of GPUs that operate in the same ballpark when looking at each specific game.

Thanks for the clarification. Does make it kind of difficult to go into details with the comparison, though.

The newest TPU review of the 960 shows AMD's driver improvements have closed the gap at 1080p and widened it at 1440p and 4K. But AMD's drivers are terrible, so how could that be? (Sarcasm.) http://www.techpowerup.com/reviews/Gigabyte/GTX_960_OC/27.html

Both tests used the same exact drivers.
 
The 290x is nowhere to be found in any of those 19 tests. They just randomly plopped it in as an average from... somewhere, presumably.

The GPUs in the Perf Summary were all tested. They only bothered to show a handful of GPUs that operate in the same ballpark when looking at each specific game.
 
Hope you realize that those benches are with a reference 970, which probably nobody owns except the ones handed to reviewers. A non-reference 970, like the one I took my sample from, performs better than the 290x at 1080p and barely at 1440p. Non-reference 290xs only pack better cooling and power consumption. Also, the 970 OCs like a mofo, which widens the difference even more; you can't say that about the 290x because there's no room left to OC.

Without sounding harsh, this is what I was getting at earlier. The 970 is still the better buy.
 