AMD Radeon Fury X Series | HBM, Small Form Factor And Water Cooling | June 16th

Ah, ok cool. Thanks again clever.

I am definitely leaning toward the 980ti at this point. I'd still like to see what AMD has, though, since they really need to make an aggressive move here. By then the custom cooler 980tis will be out, too.
 
AMD would have the leg up if they weren't held back by that first-generation HBM limitation.
Rushing to market with new tech isn't really all that effective.

Except they aren't 'rushing' to the market at all. All new technologies have limitations at first.

Why do you think we have been on 28 nm for so long? It is primarily because the next node sizes have limitations and the cons outweigh the pros.

The opposite is true for HBM: the pros of first-gen HBM outweigh the cons, and they decided to release cards with it this year.
 
Oh I know what it says, but what it says vs. actual use is a different story. It's a bit cramped with a GTX 780 in there. But for shits and giggles I'll try and see if I can get an H55 / H60 to fit in there.
I have an Antec 650 blowing out the front of mine. (I have an APU and wanted to overclock the iGPU.) No issues with it. I still had room to mount a 3.5" HDD on the side panel (removed in that photo) and an SSD on the bottom mounting point directly behind the radiator.
 
[Photos of the case interior with the radiator mounted]


Beginning, the HBM wars are.
 
28nm: The Last Node of Moore's Law

There is a consensus that making this type of product (discrete desktop GPUs) on processes beyond 28nm is going to be more expensive. Up to this point, costs have eventually settled at similar levels on each successive node shrink. The increased difficulty of production beyond 28nm means (if these predictions are accurate) that GPUs are going to get more expensive going forward.

There are very smart people working on what comes next, but just because we have enjoyed a trend this far does not mean that trend will continue.

Ah, I had read articles mentioning that idea before, but that one you linked is particularly great, thanks. Hmmmm, we really do need to push on beyond silicon wafers.

I guess it's also worth making the point that the cards that now occupy the relative price points of the sublime 6950, like the GTX 960 and the R9 280, are pretty great cards when not viewed through the lens of 'they aren't 970/290s'. Still, difficult to reconcile the value proposition there. Ah well, I see my 6950 still fetches a reasonable sale price on eBay. That'll help offset the next upgrade.
 
Oh I know what it says, but what it says vs. actual use is a different story. It's a bit cramped with a GTX 780 in there. But for shits and giggles I'll try and see if I can get an H55 / H60 to fit in there.

I have an H60 in a 130 Elite, so I know that at least fits...
 
28nm: The Last Node of Moore's Law

There is a consensus that making this type of product (discrete desktop GPUs) on processes beyond 28nm is going to be more expensive. Up to this point, costs have eventually settled at similar levels on each successive node shrink. The increased difficulty of production beyond 28nm means (if these predictions are accurate) that GPUs are going to get more expensive going forward.

There are very smart people working on what comes next, but just because we have enjoyed a trend this far does not mean that trend will continue.

Intel has been seeing a continual drop in cost per transistor, which by their own reports continues down to 14nm. That being said, Intel also makes sure each manufacturing process has a long tail so the amortization runs for many years. The PCHs are typically on process nodes a generation behind their CPU brethren. Value-oriented Atom chips are usually a process node behind as well. SSDs are about to give the older process nodes a giant boost in longevity as SSD makers turn to larger-process 3D NAND to develop the next generation of SSDs.

Maybe Nvidia will be looking to Intel's foundries for Volta or even late Pascal, since Intel has been opening up its process nodes (which are far ahead of TSMC, GlobalFoundries and Samsung) to certain outside customers in the past year. I doubt that leaves AMD in a good spot if it happens.
 
Yeah, I got a 290X; I'm just shocked that we're going to start seeing cards without DVI.

AMD hates DVI and HDMI in general I guess. Their loss, I guess they don't want the rather large market of PC gamers who game on monitors which don't have a DisplayPort input, not to mention PC gamers who game on HDTVs which usually only have HDMI.

Speaking of HDTVs, only Nvidia implemented the 4K/60 mode with 4:2:0 chroma subsampling that is required for the early 2013/14 4K TVs to display 4K/60; on these TVs, all available AMD cards are limited to 4K/30. Because I have a first-generation Sony 65X900A 4K TV, only Nvidia makes a card that works with my TV and offers 4K/60 output to it.

But AMD hates money I guess and won't implement this mode in their drivers, forcing all early adopters of 4K TVs to buy Nvidia instead. Never mind that enthusiast early adopters like me are the ones with disposable income to spend on expensive video cards. If AMD doesn't want my money I certainly won't give it to them.
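
For anyone curious why that 4:2:0 mode exists at all: HDMI 1.4 simply doesn't have the bandwidth for 4K/60 at full 4:4:4, so halving the chroma data is the only way to squeeze 60Hz through. Here's a quick back-of-the-envelope sketch (the ~12% blanking overhead is my own rough assumption, not exact spec timing):

# Back-of-the-envelope: can 4K/60 fit through HDMI 1.4's ~8.16 Gbit/s of
# usable video bandwidth (10.2 Gbit/s TMDS minus 8b/10b encoding overhead)?
# The ~12% blanking overhead is a rough assumption, not exact spec timing.

HDMI_14_GBPS = 8.16
BLANKING = 1.12

def gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel * BLANKING / 1e9

full_444 = gbps(3840, 2160, 60, 24)  # RGB / 4:4:4 -> 24 bits per pixel
sub_420 = gbps(3840, 2160, 60, 12)   # 4:2:0 -> 12 bits per pixel on average

print(f"4K/60 4:4:4 needs ~{full_444:.1f} Gbit/s, fits HDMI 1.4: {full_444 <= HDMI_14_GBPS}")
print(f"4K/60 4:2:0 needs ~{sub_420:.1f} Gbit/s, fits HDMI 1.4: {sub_420 <= HDMI_14_GBPS}")

Roughly 13 Gbit/s vs. roughly 7 Gbit/s, so 4:2:0 at 60Hz is the only thing those HDMI 1.4 TVs can physically accept, and only Nvidia bothered to expose that mode.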
 
I'm sure these new cards will at least have HDMI output.

Edit: Oh I see you mean HDMI features. I don't have a 4K TV so that part isn't on my radar. I'd go with Nvidia in your situation.
 
AMD hates DVI and HDMI in general I guess. Their loss, I guess they don't want the rather large market of PC gamers who game on monitors which don't have a DisplayPort input, not to mention PC gamers who game on HDTVs which usually only have HDMI.

Speaking of HDTVs, only Nvidia implemented the 4K/60 mode with 4:2:0 chroma subsampling that is required for the early 2013/14 4K TVs to display 4K/60; on these TVs, all available AMD cards are limited to 4K/30. Because I have a first-generation Sony 65X900A 4K TV, only Nvidia makes a card that works with my TV and offers 4K/60 output to it.

But AMD hates money I guess and won't implement this mode in their drivers, forcing all early adopters of 4K TVs to buy Nvidia instead. Never mind that early adopters like that are the ones with disposable income to spend on expensive video cards. If AMD doesn't want my money I certainly won't give it to them.

Nvidia for the longest time wouldn't allow proper full RGB over HDMI without a 3rd-party patch. Just wanna point out neither of them is perfect. Early adopters are gambling on support for their first-gen devices.
 
AMD hates DVI and HDMI in general I guess. Their loss, I guess they don't want the rather large market of PC gamers who game on monitors which don't have a DisplayPort input, not to mention PC gamers who game on HDTVs which usually only have HDMI.

Speaking of HDTVs, only Nvidia implemented the 4K/60 mode with 4:2:0 chroma subsampling that is required for the early 2013/14 4K TVs to display 4K/60; on these TVs, all available AMD cards are limited to 4K/30. Because I have a first-generation Sony 65X900A 4K TV, only Nvidia makes a card that works with my TV and offers 4K/60 output to it.

But AMD hates money I guess and won't implement this mode in their drivers, forcing all early adopters of 4K TVs to buy Nvidia instead. Never mind that enthusiast early adopters like me are the ones with disposable income to spend on expensive video cards. If AMD doesn't want my money I certainly won't give it to them.
Yup, nailed it. Plenty of people out there who still need DVI, myself included.
 
You would think a person who's considering buying one of these top-end GPUs would have the means to make the switch from their ancient DVI-only monitor to something a bit more modern. Besides, for legacy purposes there are adapters available if one really needs them.
 
You would think a person who's considering buying one of these top-end GPUs would have the means to make the switch from their ancient DVI-only monitor to something a bit more modern. Besides, for legacy purposes there are adapters available if one really needs them.

The DVI Korean monitors are still great bang for your buck. I got a 1440p IPS Crossover and have no plans to upgrade till a 21:9 120Hz G-Sync monitor comes out. The DVI-only part of it is what helps keep input lag down.

Active DP to DVI adapters work perfectly.

Thought about it but they seem to be hit and miss with these monitors.
 
Yup, nailed it. Plenty of people out there who still need DVI, myself included.

You can always get an HDMI to DVI converter. I don't see AMD putting out a card without HDMI 2.0 output at this point. If they don't offer HDMI 2.0 support I'll have to go with Nvidia, as I have a 48" 4K display that can do 4:4:4 over HDMI 2.0.
 
The DVI Korean monitors are still great bang for your buck. I got a 1440p IPS Crossover and have no plans to upgrade till a 21:9 120Hz G-Sync monitor comes out. The DVI-only part of it is what helps keep input lag down.



Thought about it but they seem to be hit and miss with these monitors.

Google your monitor model and look at converters on Amazon for it. I'm ordering one for the R9 Fury and my Qnix 2710.
 
AMD hates DVI and HDMI in general I guess. Their loss, I guess they don't want the rather large market of PC gamers who game on monitors which don't have a DisplayPort input, not to mention PC gamers who game on HDTVs which usually only have HDMI.

I wonder why AMD wants to push FreeSync onto the HDMI standard, then. Where are your G-Sync-compatible HDMI devices?

The quality of your posts is truly a wonder.

Edit: Converters are an amazing thing.
 
I love the attempt at misdirection here. I don't even know how whatever the hell you're talking about relates to my personal situation where the Nvidia cards work with my TV and the AMD cards don't.

Your particular situation is pretty unique though. 4:2:0 on a PC looks like garbage, thus very few people are going to plug a gaming PC into a 4k display that does not do 4:4:4.
 
I love the attempt at misdirection here. I don't even know how whatever the hell you're talking about relates to my personal situation where the Nvidia cards work with my TV and the AMD cards don't.

Yes, your personal TV problem is certainly an issue. But when you conflate that issue into a gross generalization, saying AMD hates HDMI or DVI in general, it is FUD, which is evident in your other posts. They're spending resources working to improve that very standard, which defies your conclusion.
 
Yes, your personal TV problem is certainly an issue. But when you conflate that issue into a gross generalization, saying AMD hates HDMI or DVI in general, it is FUD, which is evident in your other posts. They're spending resources working to improve that very standard, which defies your conclusion.

Right, he's basically saying, "Why does AMD not support my gimped 4K TV color format?"
There are first-gen 4:4:4 4K sets; he just didn't do his homework. It was nice of Nvidia to support 4:2:0 at 60Hz, but AMD isn't obligated to do everything Nvidia does.
 
Your particular situation is pretty unique though. 4:2:0 on a PC looks like garbage, thus very few people are going to plug a gaming PC into a 4k display that does not do 4:4:4.

Yeah. Right now I am doing 4:2:0 4K@60Hz and it looks like shit. That's why I need a new card. My 2015 4K Samsung TV supports 4:4:4 in PC mode and 4:2:2 in Game mode, which is just fine for my purposes.
 
You can always get an HDMI to DVI converter. I don't see AMD putting out a card without HDMI 2.0 output at this point. If they don't offer HDMI 2.0 support I'll have to go with Nvidia, as I have a 48" 4K display that can do 4:4:4 over HDMI 2.0.
If you need to drive 2560x1600 at 60Hz you have to buy a pricey active converter, and I've heard horror stories about those.
 
AMD hates DVI and HDMI in general I guess. Their loss, I guess they don't want the rather large market of PC gamers who game on monitors which don't have a DisplayPort input, not to mention PC gamers who game on HDTVs which usually only have HDMI.

Speaking of HDTVs, only Nvidia implemented the 4K/60 mode with 4:2:0 chroma subsampling that is required for the early 2013/14 4K TVs to display 4K/60; on these TVs, all available AMD cards are limited to 4K/30. Because I have a first-generation Sony 65X900A 4K TV, only Nvidia makes a card that works with my TV and offers 4K/60 output to it.

But AMD hates money I guess and won't implement this mode in their drivers, forcing all early adopters of 4K TVs to buy Nvidia instead. Never mind that enthusiast early adopters like me are the ones with disposable income to spend on expensive video cards. If AMD doesn't want my money I certainly won't give it to them.

You're passive-aggressive like no other.
 
Word is that the 390X is a rebadge job. I hope not =(

It might not be. If they take Hawaii and add Tonga features like color compression, better tessellation, etc., I see no reason it can't be a 980 killer thanks to the extra VRAM. The 290X is only around 8% slower than the 980 at UHD, so it shouldn't take much to match it. Priced right, it might fill the huge gap between 290/970 and 980Ti pricing that the 980 currently struggles to justify.
 
Why would you want a G-Sync monitor to use an inferior signaling scheme?

The only "amazing" things about high-bandwidth DP<->DVI converters are their prices and the level of annoyance they cause.

HDMI 1.3a is capable of sRGB 4:4:4 at 60Hz at 1080p. That's good enough for exactly what VRR is best at: smoothing out fluctuating framerates below 60fps. The wide appeal and use of that is a no-brainer.

With HDMI 2.0, this is extended to 4K. It may still be inferior to DP, but it will no doubt be more widely adopted than DP in the near future, unfortunately. It's a case of good enough.

You're also splitting hairs saying high-bandwidth DP<->DVI. Converters suitable for any shit-tier monitor will work flawlessly and cost $20 max. Any monitor worth its salt these days will have a DisplayPort or HDMI input. With that, if anyone is in the market for a $500+ video card... I sure as hell hope you're not on a shit-tier monitor, because a good monitor will last anyone longer than the life of their video card's capabilities.
 
HDMI 1.3a is capable of sRGB 4:4:4 at 60Hz at 1080p. That's good enough for exactly what VRR is best at: smoothing out fluctuating framerates below 60fps. The wide appeal and use of that is a no-brainer.

With HDMI 2.0, this is extended to 4K. It may still be inferior to DP, but it will no doubt be more widely adopted than DP in the near future, unfortunately. It's a case of good enough.

You're also splitting hairs saying high-bandwidth DP<->DVI. Converters suitable for any shit-tier monitor will work flawlessly and cost $20 max. Any monitor worth its salt these days will have a DisplayPort or HDMI input. With that, if anyone is in the market for a $500+ video card... I sure as hell hope you're not on a shit-tier monitor, because a good monitor will last anyone longer than the life of their video card's capabilities.
Most high-end monitors don't use HDMI because it isn't capable of delivering the bandwidth required. You can't do 1080p@144Hz on HDMI 1.3.

Someone please do the math and correct me if I'm wrong, but you can't do 3440x1440 at 120-144Hz either, and those monitors are on the horizon.

G-Sync doesn't exist to smooth things out below 60Hz. It exists to smooth things out by matching the refresh with a new frame. Doing that above 60Hz is just as important.

HDMI is decidedly not good enough for PC enthusiasts.
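
Here's a rough attempt at that math, with my own ballpark figures for blanking and link overhead, so treat it as a sketch rather than spec-exact numbers:

# Approximate data rate needed for 4:4:4 RGB at 24bpp vs. usable link bandwidth.
# Link figures are post-8b/10b video data rates; the ~12% blanking overhead is
# a rough assumption, not exact CVT-R2 timing.

LINKS_GBPS = {
    "HDMI 1.3/1.4": 8.16,   # 10.2 Gbit/s TMDS
    "HDMI 2.0": 14.4,       # 18.0 Gbit/s TMDS
    "DP 1.2": 17.28,        # 21.6 Gbit/s HBR2
}

def needed_gbps(width, height, hz, bpp=24, blanking=1.12):
    return width * height * hz * bpp * blanking / 1e9

for w, h, hz in [(1920, 1080, 144), (3440, 1440, 120), (3840, 2160, 60)]:
    need = needed_gbps(w, h, hz)
    fits = ", ".join(name for name, cap in LINKS_GBPS.items() if need <= cap)
    print(f"{w}x{h}@{hz}Hz: ~{need:.1f} Gbit/s (fits: {fits or 'none of these'})")

By these rough numbers, 1080p@144Hz sits right at the HDMI 1.3/1.4 ceiling (with anything other than reduced blanking it goes over, which is why it isn't workable in practice), 3440x1440 at 120Hz is beyond even HDMI 2.0, and 4K/60 4:4:4 needs HDMI 2.0 or DisplayPort.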
 
Word is that the 390X is a rebadge job. I hope not =(
At this point, it's more than a "word". We know the 390X is Grenada (possibly a respin of Hawaii (290X)) and the real flagship is going to come under new branding, currently rumored to be the FURY series.

Crisium, I don't think they will add color compression or change the tessellation setup, as those would require some overhaul. Most likely it's just a respin/binned part so they can reach higher clocks. Add in more memory (double Nvidia's equivalent) and AMD will try to position it as the value proposition. About the same speed (-5%) as the 980, twice the VRAM (so it lasts longer!), etc., for about $75-100 less.

GTX 980 4GB - $499 (105% performance)
R9 390X 8GB - $399-429 (100% performance)
 
Most high-end monitors don't use HDMI because it isn't capable of delivering the bandwidth required. You can't do 1080p@144Hz on HDMI 1.3.

Someone please do the math and correct me if I'm wrong, but you can't do 3440x1440 at 120-144Hz either, and those monitors are on the horizon.

G-Sync doesn't exist to smooth things out below 60Hz. It exists to smooth things out by matching the refresh with a new frame. Doing that above 60Hz is just as important.

HDMI is decidedly not good enough for PC enthusiasts.

These are two different issues you're confusing as one.

VRR over HDMI is aimed at broad adoption in mainstream television sets, and it is good enough for that. I know HDMI 2.0 can't do 4K above 60Hz. I do agree smoothing out above 60Hz and removing tearing at any rate is important, but the effect is greatest and would benefit a lot more people below 60Hz. And I do agree HDMI is not good enough for PC enthusiasts, but for the rest of the world?

DP and DVI were a separate issue entirely. The contention is that the Fury line of cards has only DP ports and one HDMI. The person who was dismissing the Fury was doing so because his monitors don't have DP or HDMI. How old does your enthusiast monitor have to be to not match your enthusiast video card? My 5-year-old HP ZR24w has a DisplayPort.
 
Rumor now from Hwbattle is that the Radeon Fury line will have three cards: the Fury PRO, the Fury XT, and the Fury Nano.
 
Rumor now from Hwbattle is that the Radeon Fury line will have three cards: the Fury PRO, the Fury XT, and the Fury Nano.
If true, would the XT and Pro SKUs be normal air-cooled blowers? Or do they artificially elongate the PCB of the XT and Pro and spread out the components more?

If they beat the Titan X in 4K, they can get away with pricing like below

Fury Nano - 105% Titan X - $799
Fury XT - 100% Titan X - $699
Fury Pro - 90% Titan X - $599
 
If true, would the XT and Pro SKUs be normal air-cooled blowers? Or do they artificially elongate the PCB of the XT and Pro and spread out the components more?

If they beat the Titan X in 4K, they can get away with pricing like below

Fury Nano - 105% Titan X - $799
Fury XT - 100% Titan X - $699
Fury Pro - 90% Titan X - $599
Wouldn't the nano be the little brother?
 