AMD Radeon Fury X Series | HBM, Small Form Factor And Water Cooling | June 16th

How do you know that before seeing even a single review?

Maxwell cards easily reach a 17-20% overclock; something like a Gigabyte G1 can run 25%+ higher than a reference Nvidia card.

At the same time, GCN cards hardly ever exceed a 1200MHz clock, so you are looking at 10-15%, which is unlikely to change on an even bigger die.
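Quick back-of-the-envelope check of that claim (the reference clock below is an assumption for illustration, roughly where recent GCN flagships ship, not an official spec):

# rough sanity check of the GCN headroom claim above (reference clock is assumed)
gcn_reference_mhz = 1050
gcn_typical_ceiling_mhz = 1200
headroom_pct = (gcn_typical_ceiling_mhz / gcn_reference_mhz - 1) * 100
print(round(headroom_pct))  # ~14%, in line with the 10-15% figure above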

Just talking about the Ti? Base or with boost? Or just a safe overclock in general for the 900 series? I think I did okay in the "silicon lottery" with my 970, and while I haven't really played a ton with overclocking beyond the factory OC of my EVGA SSC, hitting 1500MHz was achieved with no issue, though it's not a big jump since my card boosts to 1417/1430.
 
Unless the 2 fans on the 980Ti hybrid are shit, generally, 2 fans will cool better than 1, and thus, they should be running at lower rpms => quieter. Did I use my brain...

Still, we need to wait for the reviews before making any blatant statements as facts.

Breh, you just said 2 fans, one of which is a blower will be quieter than a Nidec produced 120mm fan. Please breh, please.
 
Unless the 2 fans on the 980Ti hybrid are shit, generally, 2 fans will cool better than 1, and thus, they should be running at lower rpms => quieter. Did I use my brain...

Still, we need to wait for the reviews before making any blatant statements as facts.
¯\_(ツ)_/¯
 
Breh, you just said 2 fans, one of which is a blower will be quieter than a Nidec produced 120mm fan. Please breh, please.

Do you understand how RPM affects loudness? Nvidia's blower fans run at high RPM under heavy load => loudness. With an added 120mm fan on the radiator, the card should run much cooler, and thus the blower fan should be running at a much lower RPM => the card being much quieter.

So, yeah, it's possible for 2 fans, blower + another, to be quieter than 1 fan. That's if the 2 fans are able to cool the card while maintaining low RPMs, while the single fan has to run at higher RPMs.

I never said it is or it isn't. I was just responding to someone claiming it was a given fact without waiting for the reviews / hard numbers.
 
Do any of the TV manufacturers include displayport?

EDIT: sorry thought I was in the I need a PC thread

You're in the right place if you're talking about Fury, the $650 flagship 2015 GPU which doesn't support HDMI 2.0.

For consumer models, the only TVs which have DisplayPort are the 2014 Panasonic 4K TVs. I hope you enjoy spending money, those are $2,000+ top-shelf 4K TVs.
 
A standard GTX 660 (not even the Ti version) matched a 580.

So? A 680 was 40-something percent faster than a 580. If you compare it to the Maxwell 2 generation, you'll see that the 980 Ti is the same 40-something percent faster than the 780 Ti. Then if you compare the 780 Ti to the 680, you'll see that it's some 40% faster as well.

Huge performance jumps with new processes are a thing of the past. The last two updates on 28nm provided the exact same jumps as the switch from 40 to 28nm did - and I expect it to be even worse with 16nm, as the initial cost of wafers will be even higher.
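Just to illustrate how those steps compound (the flat ~40% per step is the figure quoted above, not a measured benchmark):

# compounding the ~40%-per-step figure above (illustrative only)
step = 1.40
gtx_580 = 1.0
gtx_680 = gtx_580 * step      # 40nm -> 28nm shrink
gtx_780ti = gtx_680 * step    # first big-chip step on 28nm
gtx_980ti = gtx_780ti * step  # second step, still on 28nm
print(round(gtx_980ti, 2))    # ~2.74x a 580 after three similar-sized jumps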
 
So? A 680 was 40-something percent faster than a 580. If you compare it to the Maxwell 2 generation, you'll see that the 980 Ti is the same 40-something percent faster than the 780 Ti. Then if you compare the 780 Ti to the 680, you'll see that it's some 40% faster as well.

Huge performance jumps with new processes are a thing of the past. The last two updates on 28nm provided the exact same jumps as the switch from 40 to 28nm did - and I expect it to be even worse with 16nm, as the initial cost of wafers will be even higher.
That was accomplished largely by holding back the bigger, more powerful chips for later, not because the performance potential of the new process hadn't actually increased dramatically.

The 980 was not a 40% leap over the Titan/780 Ti, either. It took 2 years before Nvidia released something that was 40% faster than those cards. Here, we're looking at about a year passing, and it's entirely possible that we'll have 1st-generation midrange cards that can beat these *true* top-end ones right now. That's something to look forward to. And that's if they don't hold back the big chips, which they may not if the next process jump isn't as far away.

I suppose I'm looking at this from the perspective of somebody who doesn't buy $500-600+ GPUs.
 
You're in the right place if you're talking about Fury, the $650 flagship 2015 GPU which doesn't support HDMI 2.0.

For consumer models, the only TVs which have DisplayPort are the 2014 Panasonic 4K TVs. I hope you enjoy spending money, those are $2,000+ top-shelf 4K TVs.

Criticism of no HDMI 2.0 is slightly OTT, as what hardcore gamer is going to be gaming on a 4K TELEVISION with its 60Hz refresh rate?

The vast majority of hardcore gamers, which this ultra-enthusiast card is marketed to, are going to be gaming on gaming monitors, and nearly all of these support DisplayPort, which is superior to HDMI 2.0 anyway. Not sure about this, but haven't virtually all 4K monitors got support for at least DisplayPort 1.2, which again has higher bandwidth (i.e. is superior) compared even to HDMI 2.0?
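For what it's worth, the bandwidth comparison is easy to sanity-check. A rough sketch (effective rates assume the usual 8b/10b encoding; blanking intervals are ignored, so the 4K60 figure is a lower bound):

# effective link bandwidth after 8b/10b encoding, in Gbit/s
dp_1_2_hbr2 = 4 * 5.4 * 8 / 10   # 4 lanes x 5.4 Gbit/s -> 17.28
hdmi_2_0 = 3 * 6.0 * 8 / 10      # 3 channels x 6 Gbit/s -> 14.4
# raw pixel data for 4K60 RGB 8-bit, ignoring blanking
uhd_60_rgb = 3840 * 2160 * 60 * 24 / 1e9   # ~11.9
print(dp_1_2_hbr2, hdmi_2_0, uhd_60_rgb)

Both links can carry 4K60, but DisplayPort 1.2 does have the extra headroom.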
 
The lack of HDMI 2.0 is a non-issue for most but a huge issue for some. It's like multi-GPU support. It's a non-factor for most, but for some it is one of the biggest factors.
 
Criticism of no HDMI 2.0 is slightly OTT, as what hardcore gamer is going to be gaming on a 4K TELEVISION with its 60Hz refresh rate?

The vast majority of hardcore gamers, which this ultra-enthusiast card is marketed to, are going to be gaming on gaming monitors, and nearly all of these support DisplayPort, which is superior to HDMI 2.0 anyway. Not sure about this, but haven't virtually all 4K monitors got support for at least DisplayPort 1.2, which again has higher bandwidth (i.e. is superior) compared even to HDMI 2.0?

And 80ms lag.
 
The lack of HDMI 2.0 is a non-issue for most but a huge issue for some. It's like multi-GPU support. It's a non-factor for most, but for some it is one of the biggest factors.

True, but if you are going to be using one of these uber GPUs for a PC hooked up to a 4K living room TV with a standard refresh rate, you're doing something wrong. These are surely designed for gaming monitors. Just get a gaming monitor, which most of the card's target audience will own.
 
Criticism of no HDMI 2.0 is slightly OTT, as what hardcore gamer is going to be gaming on a 4K TELEVISION with its 60Hz refresh rate?

The vast majority of hardcore gamers, which this ultra-enthusiast card is marketed to, are going to be gaming on gaming monitors, and nearly all of these support DisplayPort, which is superior to HDMI 2.0 anyway. Not sure about this, but haven't virtually all 4K monitors got support for at least DisplayPort 1.2, which again has higher bandwidth (i.e. is superior) compared even to HDMI 2.0?
The answer is ZERO, and the point is that the option is gone. Regardless of how many people it affects, the point is that this is an enthusiast part designed for 4K (if you believe AMD), and on top of that their 'Quantum' PC designed for 4K in the living room makes absolutely ZERO sense considering this.

It really makes no sense. The standard has been there for 2 years... if you're going to have an HDMI port, it should be 2.0 at this point.
 
The answer is ZERO, and the point is that the option is gone. Regardless of how many people it affects, the point is that this is an enthusiast part designed for 4K (if you believe AMD), and on top of that their 'Quantum' PC designed for 4K in the living room makes absolutely ZERO sense considering this.

It really makes no sense. The standard has been there for 2 years... if you're going to have an HDMI port, it should be 2.0 at this point.

The Quantum hasn't got a Fury X, or even just two Fury X cards, inside it. It has an unnamed dual-Fiji GPU inside that may have support for HDMI 2.0 if it is indeed truly aimed at the living room space, but I even question that - have they said this box is designed specifically for the living room? I thought it was just supreme power in a small form factor for hardcore gamers.
 
True, but if you are going to be using one of these uber GPUs for a PC hooked up to a 4K living room TV with a standard refresh rate, you're doing something wrong. These are surely designed for gaming monitors. Just get a gaming monitor, which most of the card's target audience will own.

ok so couch gaming = nvidia....
IMO amd is missing a turning point in pc gaming here.
 
The Quantum hasn't got a Fury X, or even just two Fury X cards, inside it. It has an unnamed dual-Fiji GPU inside that may have support for HDMI 2.0 if it is indeed truly aimed at the living room space, but I even question that - have they said this box is designed specifically for the living room? I thought it was just supreme power in a small form factor for hardcore gamers.

Yeah, they did. Lisa Su mentioned 4K in the living room multiple times. And it's HIGHLY unlikely that the dual-Fiji graphics card has HDMI 2.0 support if the Nano, Fury and Fury X don't.


I know the Quantum PC they showed is not going on sale, but is rather meant to show vendors what is possible with the new, smaller form factor of these GPUs (which AMD is all too keen to mention is perfect for small form factor PCs like HTPCs).
 
Criticism of no HDMI 2.0 is slightly OTT, as what hardcore gamer is going to be gaming on a 4K TELEVISION with its 60Hz refresh rate?

The vast majority of hardcore gamers, which this ultra-enthusiast card is marketed to, are going to be gaming on gaming monitors, and nearly all of these support DisplayPort, which is superior to HDMI 2.0 anyway. Not sure about this, but haven't virtually all 4K monitors got support for at least DisplayPort 1.2, which again has higher bandwidth (i.e. is superior) compared even to HDMI 2.0?

Just take a look at their post history.

Anyway, I was planning on getting a Fury or Fury X, but I might wait now until the Nano is released to see where things stand. I'm gaming on a 1080p monitor and I doubt I'll be upgrading to 4K for a few years yet. Some may call a Fury card overboard for a 1080p system, but to me it'll be the perfect card for that setup. I currently have an overclocked 7970 GHz Edition and it's struggling with games like The Witcher 3. I imagine a Fury card will have enough power to play that at maximum settings at that resolution. Plus, obviously, I'm not bothered by the 4GB-only memory, so it seems one of the Fury cards is for me.

Just have to wait and see how things pan out with Fury and Nano...
 
Criticism of no HDMI 2.0 is slightly OTT as what hardcore gamer is going to be gaming on a 4K TELEVISION with their 60Hz refresh rates?

Vast majority of hardcore gamers, which this ultra-enthusiast card is marketed to, are going to be gaming on gaming monitors, and nearly all of these have support for DisplayPort, which is superior to HDMI 2.0 too. Not sure about this but haven't virtually all 4K monitors got support for DisplayPort 1,2 at least, which again, has higher bandwidth (i.e superior) compared to even HDMI 2.0?

People on the internet these days are compelled to try to ruin anything that AMD does, to constantly find flaws and spread them through as many channels as they can. The fact that HDMI 2.0 support is being brought up constantly is consistent with that. This kind of stance will ruin the market for all PC gamers.

ok so couch gaming = nvidia....
IMO amd is missing a turning point in pc gaming here.

Unless you have a 4K TV, no.
 
That was accomplished largely by holding back the bigger, more powerful chips for later, not because the performance potential of the new process hadn't actually increased dramatically.

The 980 was not a 40% leap over the Titan/780 Ti, either. It took 2 years before Nvidia released something that was 40% faster than those cards. Here, we're looking at about a year passing, and it's entirely possible that we'll have 1st-generation midrange cards that can beat these *true* top-end ones right now. That's something to look forward to. And that's if they don't hold back the big chips, which they may not if the next process jump isn't as far away.

I suppose I'm looking at this from the perspective of somebody who doesn't buy $500-600+ GPUs.

Nobody held anything back "for later". The price of larger GPUs was such that it wasn't feasible to introduce any mass-market products on them. Should I remind you that the first card on this bigger chip, even though it came out almost a year later, was priced at $999?
 
ok so couch gaming = nvidia....
IMO amd is missing a turning point in pc gaming here.

No, the turning point in PC gaming, if we're talking display devices, is the new breed of high-quality G-Sync and FreeSync monitors from the likes of Acer and ASUS, all of which support DisplayPort. Like the ROG Swift and the Predator. These are game-changing, although I don't own one myself yet.

The turning point in PC gaming is not everyone thinking "let's plant my gaming rig next to my living room 4K TV so I can enjoy 60Hz refresh rates, awful tearing, higher latency, and increased stuttering because V-sync is so 2015". I mean really. Ok I am exaggerating here, and of course it would be nice if you had the option for perfect 4k television gaming via a Fury X, but it's not a momentous cock-up by AMD if you think about it, and are also cognizant of the fact that AMD are part of VESA, who helped develop the DisplayPort standard. So naturally they are going to want to push the standard in some meaningful way.

Yeah, they did. Lisa Su mentioned 4K in the living room multiple times. And it's HIGHLY unlikely that the dual-Fiji graphics card has HDMI 2.0 support if the Nano, Fury and Fury X don't.


I know the Quantum PC they showed is not going on sale, but is rather meant to show vendors what is possible with the new, smaller form factor of these GPUs (which AMD is all too keen to mention is perfect for small form factor PCs like HTPCs).

Dunno man, I mean we don't know for sure yet that the Nano and Fury won't have support for HDMI 2.0, although it's unlikely they will.

I also need more clarification on what exactly that Quantum thing is, tbh - like who it's aimed at, or what its purpose is if it's just an AMD tech project.
 
I'm waiting for Fury's benchmarks. If it's very close to the 980 Ti, I'll go with Nvidia. I have my PC hooked up to my TV (1080p) and I'm planning on getting a 4K TV soon, so HDMI 2.0 is a concern.
Dumb question: would a DisplayPort-to-HDMI cable carry the bandwidth?
 
Fury X will be a good OCer, so expect +30% performance.

That would surprise me. But it would be awesome. I'm expecting more in the way of 15-20% on average, but with all of this new technology on board there's really no telling how much headroom these cards will have.

Since AMD specifically called out the card's ability to overclock, made sure it could handle wattage well over what the dual 8-pin connectors nominally provide, and added the AIO water cooling, it would seem like they are serious about cranking up the core clock.

30%+ OCs on these would be a big win, but I'm holding on to my skepticism for now.
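For context, a rough power-budget sketch using the nominal PCIe spec numbers (the connectors and VRMs can physically deliver more than spec, which is presumably what AMD is leaning on):

pcie_slot_w = 75     # nominal PCIe x16 slot power
eight_pin_w = 150    # nominal power per 8-pin PEG connector
nominal_budget_w = pcie_slot_w + 2 * eight_pin_w
print(nominal_budget_w)  # 375 W nominal for a dual 8-pin card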
 
True, but if you are going to be using one of these uber GPUs for a PC hooked up to a 4K living room TV with a standard refresh rate, you're doing something wrong. These are surely designed for gaming monitors. Just get a gaming monitor, which most of the card's target audience will own.
There are people out there who do both. You can have your PC in an office with a real monitor and run an HDMI cable and a powered USB hub to your living room so you can game on your TV. I do that, and it's nice to have the option; some games you prefer to play on the couch (especially with a friend).
 
No, the turning point in PC gaming, if we're talking display devices, is the new breed of high-quality G-Sync and FreeSync monitors from the likes of Acer and ASUS, all of which support DisplayPort. Like the ROG Swift and the Predator. These are game-changing, although I don't own one myself yet.

The turning point in PC gaming is not everyone thinking "let's plant my gaming rig next to my living room 4K TV so I can enjoy 60Hz refresh rates, awful tearing, higher latency, and increased stuttering because V-sync is so 2015". I mean really. Ok I am exaggerating here, and of course it would be nice if you had the option for perfect 4k television gaming via a Fury X, but it's not a momentous cock-up by AMD if you think about it, and are also cognizant of the fact that AMD are part of VESA, who helped develop the DisplayPort standard. So naturally they are going to want to push the standard in some meaningful way.



Dunno man, I mean we don't know for sure yet that the Nano and Fury won't have support for HDMI 2.0, although it's unlikely they will.

I also need more clarification on what exactly that Quantum thing is, tbh - like who it's aimed at, or what its purpose is if it's just an AMD tech project.

Some folks prefer sitting on a couch and playing on a 65 inch 4k versus sitting in a chair with a 24-32 inch monitor. Myself included.
 
Accomplished largely through holding back the bigger, more powerful chips for later, not because the jump in performance potential hadn't actually increased dramatically.
Hmm, holding back implies they were deliberately withholding the part with no technical constraints otherwise. I'd like to think it took them a learning curve and some maturation to reach the Maxwell-type stage.
 
No, the turning point in PC gaming, if we're talking display devices, is the new breed of high-quality G-Sync and FreeSync monitors from the likes of Acer and ASUS, all of which support DisplayPort. Like the ROG Swift and the Predator. These are game-changing, although I don't own one myself yet.

The turning point in PC gaming is not everyone thinking "let's plant my gaming rig next to my living room 4K TV so I can enjoy 60Hz refresh rates, awful tearing, higher latency, and increased stuttering because V-sync is so 2015". I mean really. Ok I am exaggerating here, and of course it would be nice if you had the option for perfect 4k television gaming via a Fury X, but it's not a momentous cock-up by AMD if you think about it, and are also cognizant of the fact that AMD are part of VESA, who helped develop the DisplayPort standard. So naturally they are going to want to push the standard in some meaningful way.



Dunno man, I mean we don't know for sure yet that the Nano and Fury won't have support for HDMI 2.0, although it's unlikely they will.

I also need more clarification on what exactly that Quantum thing is, tbh - like who it's aimed at, or what its purpose is if it's just an AMD tech project.
Because some people do not want to game sat at a desk in their bedroom? Your comment that TVs are inherently inferior to monitors makes no sense at all. For a start, any non-strobed LCD at 4K is more than likely resolving sub-HD levels of detail in motion anyway, monitor or TV, G-Sync or fixed sync.
 
Some folks prefer sitting on a couch and playing on a 65 inch 4k versus sitting in a chair with a 24-32 inch monitor. Myself included.

I like both; in any multiplayer game I want that close-up PC-chair experience, while with single-player I want to sit back on my arse on a comfy soft sofa/bed and game on a massive screen.
 
Anyone with a 4K HDMI-only display will skip the Fury. Simple. No need to poo-poo people who choose this, but also don't pretend it affects more than a tiny part of the market.
 
No, the turning point in PC gaming, if we're talking display devices, is the new breed of high-quality G-Sync and FreeSync monitors from the likes of Acer and ASUS, all of which support DisplayPort. Like the ROG Swift and the Predator. These are game-changing, although I don't own one myself yet.

The turning point in PC gaming is not everyone thinking "let's plant my gaming rig next to my living room 4K TV so I can enjoy 60Hz refresh rates, awful tearing, higher latency, and increased stuttering because V-sync is so 2015". I mean really. Ok I am exaggerating here, and of course it would be nice if you had the option for perfect 4k television gaming via a Fury X, but it's not a momentous cock-up by AMD if you think about it, and are also cognizant of the fact that AMD are part of VESA, who helped develop the DisplayPort standard. So naturally they are going to want to push the standard in some meaningful way.

Sure OK, if AMD thinks with their dwindling market share they can just throw away customers, then more power to them. Normally the underdog wants more customers, not less. AMD is clearly not your typical underdog.

Nvidia supplies a solution for me to game on my Sony 65" 4K TV though so no harm no foul. I wonder how many customers AMD can throw away until they are left without any though.
 
Have there been any R7 360 benchmarks yet? Some of the 370 ones put its performance around that of the 270, despite it having fewer shaders, so I was wondering if the 360 seemingly punches above its weight a bit too, leaning towards 260X performance.
 
Some folks prefer sitting on a couch and playing on a 65 inch 4k versus sitting in a chair with a 24-32 inch monitor. Myself included.

Sure OK, if AMD thinks with their dwindling market share they can just throw away customers, then more power to them. Normally the underdog wants more customers, not less. AMD is clearly not your typical underdog.

Nvidia supplies a solution for me to game on my Sony 65" 4K TV though so no harm no foul. I wonder how many customers AMD can throw away until they are left without any though.

Yeah, I'm not saying there aren't people who want to game on 4K TVs (obviously) or that AMD isn't in fact shutting out potential customers. They patently will be. They have just made the decision that the number of people in their target audience who do so is small enough to forgo HDMI 2.0 support and put their weight behind the DisplayPort outputs on the GPU.

My point in the post you quoted was that there are better and more forward-looking alternatives to 4K TV gaming after the assumption was made that that was the 'turning point' in our hobby that AMD will be missing.
 
Anyone with a 4K HDMI-only display will skip the Fury. Simple. No need to poo-poo people who choose this, but also don't pretend it affects more than a tiny part of the market.
If the Steam survey is an accurate indication, that is an extremely tiny market.
 
Sure OK, if AMD thinks with their dwindling market share they can just throw away customers, then more power to them. Normally the underdog wants more customers, not less. AMD is clearly not your typical underdog.

Nvidia supplies a solution for me to game on my Sony 65" 4K TV though so no harm no foul. I wonder how many customers AMD can throw away until they are left without any though.

There are dozens of us! Dozens!
 
If the Steam survey is an accurate indication, that is an extremely tiny market.

There are 125 million Steam users lol, when you see any percentage at all greater than zero that's a real and significant number of people.

The point is Nvidia offers this and has offered it since Kepler. So whether there are 10, 100, 100,000, or 1 million people, Nvidia has 100% of that market and AMD has 0%. You can't compete if you don't even show up.
 
Anyone with a 4K HDMI-only display will skip the Fury. Simple. No need to poo-poo people who choose this, but also don't pretend it affects more than a tiny part of the market.

But wouldn't a DisplayPort-to-HDMI cable work? DisplayPort on the GPU to the HDMI port on the TV.
 
Does not support 4K streaming. And it isn't even on the market. If it were to support 4K streaming, you would need ridiculous network speeds to achieve it. Gigabit fiber, I'd imagine.

A standard gigabit Cat5e connection would be more than enough for 4K streaming inside the home.
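Rough numbers, assuming an in-home stream encoded at a few tens of Mbit/s (the 80 Mbit/s bitrate below is an assumption, not a spec):

gigabit_lan_mbps = 1000       # nominal gigabit Ethernet
stream_bitrate_mbps = 80      # assumed H.264 bitrate for a 4K in-home stream
print(gigabit_lan_mbps / stream_bitrate_mbps)  # ~12x headroom on the local link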
 
There are 125 million Steam users lol, when you see any percentage at all greater than zero that's a real and significant number of people.

The point is Nvidia offers this and has offered it since Kepler. So whether there are 10, 100, 100,000, or 1 million people, Nvidia has 100% of that market and AMD has 0%. You can't compete if you don't even show up.
It's some subset of people running 4K. If we assume 10% of the Steam survey's 4K displays are HDMI-only, then that's only about 7,500 people. To put things in perspective, the GTX 680 series sold over 10 million.
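Working backwards from that figure (the ~0.06% 4K share is an assumption roughly consistent with the Steam survey at the time; the 10% HDMI-only split is the guess from the post above):

steam_users = 125_000_000
four_k_share = 0.0006        # assumed share of survey respondents at 3840x2160
hdmi_only_share = 0.10       # the 10% guess above
print(int(steam_users * four_k_share * hdmi_only_share))  # ~7,500 people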
 
A standard gigabit Cat5e connection would be more than enough for 4K streaming inside the home.

No it isn't; 1080p is still somewhat choppy with 30 meg down and 5 up, using Ethernet on both ends. That's the fastest option I have available to me with CenturyLink.
 
It's some subset of people running 4K. If we assume 10% of the Steam survey's 4K displays are HDMI-only, then that's only about 7,500 people. To put things in perspective, the GTX 680 series sold over 10 million.

I don't think you can really make such far-reaching assertions based on the Steam survey. I'm not a participant, and neither are a lot of other people, regardless of their display type. It could be that people who game on TVs are less likely to be participants. And I'd say a much larger percentage of 4K displays are HDMI-only, not just 10%.
 