AMD Radeon Fury X Series | HBM, Small Form Factor And Water Cooling | June 16th

I don't think you can really make such far-reaching assertions based on the Steam survey. I'm not a participant, and neither are a lot of other people, regardless of their display type. It could be that people who game on TVs are less likely to participate. And I'd say a much larger percentage of 4K displays are HDMI-only, not just 10%.

Yes, I would guess half or maybe even slightly more than half of all 4K displays are TVs. 4K hasn't really taken off in monitors yet from what I've seen... I see a lot more people using TVs on forums when 4K comes up.

Is this bad news for AMD? It might eat into sales a bit, but overall 4K is still VERY niche. It's not as big of a deal as people are making it out to be. A lot more people will be buying Fury for high-refresh-rate 1080p and 1440p panels.
 
Yes, I would guess half or maybe even slightly more than half of all 4K displays are TVs. 4K hasn't really taken off in monitors yet from what I've seen... I see a lot more people using TVs on forums when 4K comes up.

Is this bad news for AMD? It might eat into sales a bit, but overall 4K is still VERY niche. It's not as big of a deal as people are making it out to be. A lot more people will be buying Fury for high-refresh-rate 1080p and 1440p panels.

In general, a $650 video card sells to a very niche audience. A $650 card with built-in water cooling that requires a case with a free spot for a radiator, even more so. So if you start excluding people from what is already a small niche, what are you left with?

And about those high-refresh 1080p/1440p panels: a lot of people bought the Korean IPS panels when they were the only game in town for years, and those only have DVI inputs. It looks like AMD is excluding those people too.

I've never seen a company more anxious to exclude customers while losing the market-share battle 75% to 25% to a competitor that is not excluding anybody who wants to buy its products.
 
No it isn't; 1080p is still somewhat choppy with 30 Mbps down and 5 up, using Ethernet on both ends. That's the fastest option I have available to me with CenturyLink.

I don't think 4K is viable, simply because the encoding that needs to take place on the server PC and the decoding taking place on the living-room PC would produce too much latency.

That said, your ISP bandwidth caps shouldn't affect your home network speeds. Steam In-Home Streaming doesn't use your internet connection to stream; it uses your LAN.
If 1080p is choppy and you're using Cat5 on both ends, then it's probably due to a slow router or an older HTPC that can't handle the decoding task.
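
A quick back-of-the-envelope sketch of that point; the ~30 Mbps figure for a 1080p60 in-home stream is an assumption on my part, not a measured number:

[code]
# Rough link-budget sketch (assumed figures, not measurements).
# Steam In-Home Streaming sends the encoded video over the LAN,
# so the local link speed matters, not the ISP plan.

STREAM_BITRATE_MBPS = 30      # assumed 1080p60 in-home stream bitrate
links_mbps = {
    "100 Mbit Ethernet": 100,
    "Gigabit Ethernet": 1000,
    "ISP downlink (quoted above)": 30,
}

for name, speed in links_mbps.items():
    print(f"{name}: {speed / STREAM_BITRATE_MBPS:.1f}x the stream bitrate")
# Even plain 100 Mbit Ethernet has ~3x headroom, and the ISP link never
# carries the stream at all -- choppiness points at encode/decode speed.
[/code]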
 
In general, a $650 video card sells to a very niche audience. A $650 card with built-in water cooling that requires a case with a free spot for a radiator, even more so. So if you start excluding people from what is already a small niche, what are you left with?

And about those high-refresh 1080p/1440p panels: a lot of people bought the Korean IPS panels when they were the only game in town for years, and those only have DVI inputs. It looks like AMD is excluding those people too.

I've never seen a company more anxious to exclude customers while losing the market-share battle 75% to 25% to a competitor that is not excluding anybody who wants to buy its products.
I think the idea is to have a "halo" product to attract attention to the whole line, such as the air-cooled Fury and the Fury Nano.

We have also yet to see what ports AMD partners will include on the boards.
 
In general, a $650 video card sells to a very niche audience. A $650 card with built-in water cooling that requires a case with a free spot for a radiator, even more so. So if you start excluding people from what is already a small niche, what are you left with?

And about those high-refresh 1080p/1440p panels: a lot of people bought the Korean IPS panels when they were the only game in town for years, and those only have DVI inputs. It looks like AMD is excluding those people too.

I've never seen a company more anxious to exclude customers while losing the market-share battle 75% to 25% to a competitor that is not excluding anybody who wants to buy its products.

Oh noes, those people will have to buy (if it isn't included with their card) a DisplayPort-to-DVI cable, which costs $10.

The horror, I tell you.
 
I don't think 4K is viable, simply because the encoding that needs to take place on the server PC and the decoding taking place on the living-room PC would produce too much latency.

That said, your ISP bandwidth caps shouldn't affect your home network speeds. Steam In-Home Streaming doesn't use your internet connection to stream; it uses your LAN.
If 1080p is choppy and you're using Cat5 on both ends, then it's probably due to a slow router or an older HTPC that can't handle the decoding task.

It's a $200 Asus router and a brand-new Nvidia Shield console, so no. I'm pretty sure most people's streaming experiences are not perfect at 1080p.
 
I don't think you can really make such far-reaching assertions based on the Steam survey. I'm not a participant, and neither are a lot of other people, regardless of their display type. It could be that people who game on TVs are less likely to participate. And I'd say a much larger percentage of 4K displays are HDMI-only, not just 10%.
You can't get an exact count of affected people, but you can get a ballpark figure. If it's 100% instead of 10%, and there are twice as many people off Steam as on, it's still a small drop in the bucket. These types of GPUs potentially move millions. I imagine they have to, to recoup R&D costs.
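
For concreteness, here's that ballpark pushed to its worst case as a quick sketch; every number below is illustrative, not sourced:

[code]
# Worst-case sketch of the "drop in the bucket" arithmetic above.
steam_users        = 100_000_000  # order-of-magnitude guess, not a sourced figure
fourk_share        = 0.01         # hypothetical share of users on 4K displays
hdmi_only_frac     = 1.0          # worst case from the post: ALL 4K displays HDMI-only
off_steam_multiple = 2            # post's assumption: twice as many gamers off Steam

total_gamers = steam_users * (1 + off_steam_multiple)
affected = total_gamers * fourk_share * hdmi_only_frac
print(f"Potentially affected: {affected:,.0f} gamers "
      f"({affected / total_gamers:.0%} of the total)")
# Even with every assumption stacked against AMD, the affected slice stays
# around 1% of the addressable market.
[/code]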

I honestly don't know how targeting 4K gamers is profitable at all, unless that market grows rapidly in the near future. Or maybe the profit margins are insanely high for the premium parts. It's such a niche within a niche.
 
Is there any data on how something like an adapter actually impacts customer purchases? We live in an accessory world, after all.

DP to DVI at 1080p works fine with the adapters I have at home (my old monitor, which I gave to my GF, only had DVI and HDMI). Since his scenario is those 1080p Korean monitor users, the lack of a DVI port on the card is as much a non-issue as it can be. But hey, that's Universal Soldier for you in an AMD thread.
 
You can't get an exact count of affected people, but you can get a ballpark figure. If it's 100% instead of 10%, and there are twice as many people off Steam as on, it's still a small drop in the bucket. These types of GPUs potentially move millions. I imagine they have to, to recoup R&D costs.

I honestly don't know how targeting 4K gamers is profitable at all, unless that market grows rapidly in the near future. Or maybe the profit margins are insanely high for the premium parts. It's such a niche within a niche.

This high end of a GPU is niche in itself. 144 Hz, G-Sync, FreeSync, 1440p, and 4K monitors are all very niche in comparison to a 1080p monitor. If you only have a 1080p monitor, you have no business buying this card.
 
This high end of a GPU is niche in itself. 144 Hz, G-Sync, FreeSync, 1440p, and 4K monitors are all very niche in comparison to a 1080p monitor. If you only have a 1080p monitor, you have no business buying this card.

1440p isn't niche at all nowadays. While I have no intention of moving to a 4K monitor for at least 2-3 more years (so that GPUs have enough time to mature), such a card is very handy for 1440p gaming, and I may go Fury to replace my 970 and move it to the living-room PC.
 
OK, if you have a 1080p 60 Hz monitor, you have no business buying this card. High-refresh-rate monitors are not commonplace.

No one buying a $650 card is using a 60 Hz, $100 1080p monitor.

Stop moving goalposts.

Hell, (almost) no one buying a $400 card is either.
 
No one buying a $650 card is using a 60 Hz, $100 1080p monitor.

Stop moving goalposts.

Hell, (almost) no one buying a $400 card is either.

The only thing I'm trying to say is that using a 4K TV as a monitor is not really any less niche than a 1440p G-Sync monitor. The display is going to be niche in most scenarios at this high end of the market segment.
 
What some of you seem to be overlooking is the fact that these newer GPUs are supposed to be pushing monitor/television technology forward, not the other way around.

You might say 'eh, who cares, it's a fraction of a fraction of people', but the truth is that right now PCs are the only things truly pushing monitor technology forward, and the same goes for the 4K adoption rate, whether it's on a monitor or a TV.

I think it's pretty silly to market the card as a 4K powerhouse yet still use dated display interfaces.
 
This high end of a GPU is niche in itself. 144 Hz, G-Sync, FreeSync, 1440p, and 4K monitors are all very niche in comparison to a 1080p monitor. If you only have a 1080p monitor, you have no business buying this card.

Nonsense. With VSR, 1080p displays can still shine. Certainly 1440p is better, since you can do native 1440p or supersample from higher as well, but I still like to use a 1080p display: it has better compatibility with consoles, and you get the benefit of being able to run insanely demanding games at 1080p if you choose. With 1440p, you have to start sacrificing texture and effect settings if a game is absurdly demanding, since 1080p interpolated on a 1440p display looks really bad.
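
As a sketch of why the downsampling half of that works so well while the upscaling half doesn't: 4K to 1080p is an exact 2x2 ratio, so every output pixel averages a whole block of rendered pixels, whereas 1080p on a 1440p panel is a non-integer 1.33x stretch. A minimal numpy illustration (a plain box filter, not AMD's actual VSR scaler):

[code]
import numpy as np

def box_downsample_2x(img: np.ndarray) -> np.ndarray:
    # Average each 2x2 block: exactly what an integer-ratio downscale can do.
    h, w = img.shape[:2]
    return img.reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

frame_4k = np.random.rand(2160, 3840, 3).astype(np.float32)  # stand-in render
print(box_downsample_2x(frame_4k).shape)  # (1080, 1920, 3)

# Upscaling 1920x1080 onto a 2560x1440 panel is a 1.33x ratio per axis:
print(2560 / 1920, 1440 / 1080)  # 1.333... -- output pixels straddle input
# pixels, forcing interpolation, hence the soft look described above.
[/code]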

I'm planning a new build in August and will stick with a 1080p 60 Hz display, but I may go Fury/Fury X if the performance sells itself. I also have a 1080p 120 Hz display and can totally appreciate high framerates, but I prefer the 60 Hz one since it is IPS. I want to go 1440p 144 Hz eventually, but I'm waiting to see more competition for the ASUS MG279Q.
 
DP to DVI at 1080p works fine with the adapters I have at home (my old monitor, which I gave to my GF, only had DVI and HDMI). Since his scenario is those 1080p Korean monitor users, the lack of a DVI port on the card is as much a non-issue as it can be. But hey, that's Universal Soldier for you in an AMD thread.

The best part of this post is you literally have no idea what you're talking about.

If a simple adapter were all that was needed to make the high-refresh Korean IPS monitors work with DP, it wouldn't be a big deal.

People still try to make it personal with me over video cards. It's hilarious to witness, I feel like I'm at HardOCP sometimes.
 
What some of you seem to be overlooking is the fact that these newer GPUs are supposed to be pushing monitor/television technology forward, not the other way around.

You might say 'eh, who cares, it's a fraction of a fraction of people', but the truth is that right now PCs are the only things truly pushing monitor technology forward, and the same goes for the 4K adoption rate, whether it's on a monitor or a TV.

I think it's pretty silly to market the card as a 4K powerhouse yet still use dated display interfaces.

You should do some research on DisplayPort. Its aim is precisely to push monitor tech forward. It's several years newer than, and superior to, the latest HDMI 2.0 even in its older 1.2 revision, so it's just about the opposite of dated too.
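
To put rough numbers on that (the effective payload rates after 8b/10b coding come from the published specs; the ~20% blanking margin is a simplifying assumption rather than exact CVT-R2 timing):

[code]
# Approximate link budget for 4K60 8-bit RGB.
links_gbps = {
    "HDMI 1.4 (~8.16 Gbps effective)": 8.16,
    "HDMI 2.0 (~14.4 Gbps effective)": 14.4,
    "DP 1.2 HBR2 x4 (~17.28 Gbps effective)": 17.28,
}

def video_gbps(w, h, hz, bpp=24, blanking=1.2):
    # blanking=1.2 is a rough stand-in for real blanking intervals
    return w * h * hz * bpp * blanking / 1e9

need = video_gbps(3840, 2160, 60)
print(f"4K60 8-bit RGB needs roughly {need:.1f} Gbps")
for name, cap in links_gbps.items():
    print(f"  {name}: {'OK' if cap >= need else 'too slow'}")
# -> HDMI 1.4 falls well short of 4K60 RGB; HDMI 2.0 and DP 1.2 both clear it,
#    with DP 1.2 carrying the most headroom despite predating HDMI 2.0.
[/code]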

The debate isn't whether AMD is pushing tech forward or not. It's whether they are justified in omitting HDMI 2.0, given the size of the market that decision affects.
 
You should do some research on DisplayPort. Its aim is precisely to push monitor tech forward. It's several years newer than, and superior to, the latest HDMI 2.0 even in its older 1.2 revision, so it's just about the opposite of dated too.

The debate isn't whether AMD is pushing tech forward or not. It's whether they are justified in omitting HDMI 2.0, given the size of the market that decision affects.

Yes, which is why the new GPUs should support BOTH for more options. Do some research into the number of 4K TVs that have DP inputs, and you'll see the reason for including HDMI 2.0.

I mean, can you really defend the inclusion of HDMI 1.4? They've already got the port... it should be to the latest specification.
 
No one buying a $650 card is using a 60 Hz, $100 1080p monitor.

Stop moving goalposts.

Hell, (almost) no one buying a $400 card is either.
I got a 780 Ti for 720p 3D (the max for 3DTV Play at the moment) and I'm still not close to happy, as I can't get 60 fps with any decent settings in modern AAA games. Even in 1080p 2D, I can't run Watch Dogs, Unity, or Dead Rising 3 at a solid 60 fps.
 
People still try to make it personal with me over video cards. It's hilarious to witness, I feel like I'm at HardOCP sometimes.

Because you obviously are in every single AMD thread, and you're hardcore spinning in Nvidia ones, especially with the 3.5 GB fiasco.

And yes, most DP-to-DVI adapters at 1080p work perfectly fine with Korean monitors, provided they didn't buy dogshit ones like some did, which is all on them.
 
Yes, which is why the new GPUs should support BOTH for more options. Do some research into the number of 4K TVs that have DP inputs, and you'll see the reason for including HDMI 2.0.

I mean, can you really defend the inclusion of HDMI 1.4? They've already got the port... it should be to the latest specification.

My contention is that you said it will use 'dated display interfaces', which is completely wrong.

It uses the latest display interface but doesn't include another, separate display interface standard. Yes, it should probably have included HDMI 2.0, but I don't think that is a massive issue.
 
My contention is that you said it will use 'dated display interfaces', which is completely wrong.

It uses the latest display interface but doesn't include another, separate display interface standard. Yes, it should probably have included HDMI 2.0, but I don't think that is a massive issue.

HDMI 1.4 is dated. Actually, so is DP 1.2a. However, I was referring specifically to HDMI 1.4. When your competitor has 2.0 support, it seems like an oversight not to include it, imo. That's all.
 
DP to DVI at 1080p works fine with the adapters I have at home (my old monitor, which I gave to my GF, only had DVI and HDMI). Since his scenario is those 1080p Korean monitor users, the lack of a DVI port on the card is as much a non-issue as it can be. But hey, that's Universal Soldier for you in an AMD thread.

Don't forget he claims AMD is abandoning HDMI; clearly they're starting a war against it, despite recently showing off FreeSync working through it.

*Posts almost nothing but negative things in AMD threads while posting in high, high volumes

*Wonders why people are calling him out
 
Ionic said:
People still try to make it personal with me over video cards. It's hilarious to witness, I feel like I'm at HardOCP sometimes.
Because you obviously are in every single AMD thread, and you're hardcore spinning in Nvidia ones, especially with the 3.5 GB fiasco.

And yes, most DP-to-DVI adapters at 1080p work perfectly fine with Korean monitors, provided they didn't buy dogshit ones like some did, which is all on them.

That's not what I typed ):
 
The best part of this post is you literally have no idea what you're talking about.

If a simple adapter were all that was needed to make the high-refresh Korean IPS monitors work with DP, it wouldn't be a big deal.

People still try to make it personal with me over video cards. It's hilarious to witness, I feel like I'm at HardOCP sometimes.
I don't see why a DisplayPort to dual-link DVI adapter wouldn't work.
 
Because you obviously are in every single AMD thread, and you're hardcore spinning in Nvidia ones, especially with the 3.5 GB fiasco.

And yes, most DP-to-DVI adapters at 1080p work perfectly fine with Korean monitors, provided they didn't buy dogshit ones like some did, which is all on them.
I think you are greatly missing the point.

No one ever bought 1080p Korean monitors. Many enthusiasts (including me!) bought 1440p Korean IPS monitors, because for a long time their price/performance was absolutely unbeatable.

I don't see why a DisplayPort to dual-link DVI adapter wouldn't work.
It would. Those aren't quite $10, though.
 
I don't see why a DisplayPort to dual-link DVI adapter wouldn't work.

Those are the $100 adapters, and the people who have tried them say they crap out at high refresh rates and are buggy regardless. They also reportedly introduce significant input lag because of the conversion process. I have no first-hand experience with them, though.
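
The pixel-clock math suggests why (the DVI link limits below are from the spec; the ~12% blanking factor is an approximation of reduced-blanking timings):

[code]
# Why a cheap passive DP->DVI dongle can't drive these panels, and why the
# active dual-link converters struggle at overclocked refresh rates.
SINGLE_LINK_DVI_MHZ = 165.0   # spec limit
DUAL_LINK_DVI_MHZ   = 330.0   # spec limit (two links)

def pixel_clock_mhz(w, h, hz, blanking=1.12):  # ~CVT reduced blanking, assumed
    return w * h * hz * blanking / 1e6

for hz in (60, 96, 120):
    clk = pixel_clock_mhz(2560, 1440, hz)
    verdict = ("single-link OK" if clk <= SINGLE_LINK_DVI_MHZ else
               "needs dual-link" if clk <= DUAL_LINK_DVI_MHZ else
               "beyond dual-link spec")
    print(f"2560x1440 @ {hz} Hz: ~{clk:.0f} MHz -> {verdict}")
# 1440p60 already exceeds single-link DVI (hence the ~$100 active adapters),
# and the popular 96-120 Hz panel overclocks land past the dual-link spec
# limit, which lines up with the reports of adapters crapping out there.
[/code]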
 
It's a $200 Asus router and a brand-new Nvidia Shield console, so no. I'm pretty sure most people's streaming experiences are not perfect at 1080p.

1. You're using Nvidia's proprietary streaming service on the Shield console, not Steam In-Home Streaming. That's like saying a burger tastes bad based on your experience with pizza.

2. Are you talking about streaming from your PC or from Nvidia's servers? Those are two completely different things.

3. You were saying that your internet speed would keep you from Steam streaming. If you didn't realize that internet bandwidth plays no part in the Steam streaming equation, that makes it sound like your only experience is streaming from Nvidia's servers.


I stream from my gaming PC to an AMD APU-powered HTPC with few problems. Obviously there's latency, but the image quality is fine. I use a Netgear Nighthawk router.
 
No one buying a $650 card is using a 60 Hz, $100 1080p monitor.

Stop moving goalposts.

Hell, (almost) no one buying a $400 card is either.

If you want to play games like The Witcher 3 or GTA V at max settings, 1080p@60fps, you are going to need a card like this, or at least a 980.
 
Nonsense. With VSR, 1080p displays can still shine. Certainly 1440p is better, since you can do native 1440p or supersample from higher as well, but I still like to use a 1080p display: it has better compatibility with consoles, and you get the benefit of being able to run insanely demanding games at 1080p if you choose. With 1440p, you have to start sacrificing texture and effect settings if a game is absurdly demanding, since 1080p interpolated on a 1440p display looks really bad.

Not sure why people are shitting on 1080p/60. I'd much rather play demanding games to their fullest at 1080p than sub-60 at 1440p... or even have to severely compromise settings to achieve 60 FPS @ 1440p.

Perhaps when the people playing at 45 FPS @ 1440p (for demanding games) switch over to 4K in a few years to play demanding games at 45 FPS once again, I'll consider making the jump from 1080p to 1440p.
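
For what it's worth, the raw pixel-rate arithmetic behind that tradeoff (assuming GPU load scales roughly with pixels per second, which is only a first-order approximation):

[code]
# Pixel throughput relative to 1080p60.
modes = {
    "1080p @ 60": (1920, 1080, 60),
    "1440p @ 45": (2560, 1440, 45),
    "4K    @ 45": (3840, 2160, 45),
}
base = 1920 * 1080 * 60
for name, (w, h, hz) in modes.items():
    print(f"{name}: {w * h * hz / base:.2f}x the pixel rate of 1080p60")
# 1440p45 pushes ~1.33x and 4K45 ~3x the pixels of 1080p60 -- staying at
# 1080p trades resolution for exactly that settings/framerate headroom.
[/code]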


I took a long-winded way of saying I agree with you in case you thought I was debating / replying to the wrong person.
 
Not sure why people are shitting on 1080p/60. I'd much rather play demanding games to their fullest at 1080p than sub-60 at 1440p... or even have to severely compromise settings to achieve 60 FPS @ 1440p.

Perhaps when the people playing at 45 FPS @ 1440p (for demanding games) switch over to 4K in a few years to play demanding games at 45 FPS once again, I'll consider making the jump from 1080p to 1440p.


I took a long-winded way of saying I agree with you in case you thought I was debating / replying to the wrong person.

But you can play 45 fps @ 4K right now on a single card. Both the Fury and 980 Ti can do it.
 
This high end of a GPU is niche in itself. 144 Hz, G-Sync, FreeSync, 1440p, and 4K monitors are all very niche in comparison to a 1080p monitor. If you only have a 1080p monitor, you have no business buying this card.

No one buying a $650 card is using a 60 Hz, $100 1080p monitor.

Stop moving goalposts.

Hell, (almost) no one buying a $400 card is either.


1080p 60 Hz IPS. Intend to purchase Fury X (pending reviews).
 
Count me in the 1080p/60 camp as well :) Well, more like 1200p.

I don't like that 1440p/4K raises the GPU cost for the rest of your life, so you have to spend/sacrifice a lot more to maintain 60 fps. It's like a trophy spouse or something.

No one ever bought 1080p Korean monitors. Many enthusiasts (including me!) bought 1440p Korean IPS monitors, because for a long time their price/performance was absolutely unbeatable.
Ahh, reminds me of the good old days when enthusiasts would get refurbished Sun or Apple monitors.
 
Count me in the 1080p/60 camp as well :) Well, more like 1200p.

I don't like that 1440p/4K raises the GPU cost for the rest of your life, so you have to spend/sacrifice a lot more to maintain 60 fps. It's like a trophy spouse or something.

But it also improves your desktop experience immensely for the rest of your life.
 
Do you guys really think that Fury X will be able to overclock 15-30%?

I have seen some people say this, but it seems more like wishful thinking for several reasons. First, I think Fury X is already running at its maximum, and with its 4096 SPs it runs really hot, hence the water cooling. Likewise, that's the reason we will only see a watercooled Fury X and not a Fury X on air (there is of course the Fury Pro, which will run on air, but it's a cut-down chip). It's also AMD's biggest chip, and it is very power hungry; I think that increasing the voltage would make it throttle. Sure, it runs cool, but looking at Hawaii chips, they're not really known as overclockers.

Hope I'm wrong, but I really don't think Fury X will be able to overclock as well as Nvidia's Maxwell chips.
 
Do you guys really think that Fury X will be able to overclock 15-30%?

I have seen some people say this, but it seems more like wishful thinking for several reasons. First, I think Fury X is already running at its maximum, and with its 4096 SPs it runs really hot, hence the water cooling. Likewise, that's the reason we will only see a watercooled Fury X and not a Fury X on air (there is of course the Fury Pro, which will run on air, but it's a cut-down chip). It's also AMD's biggest chip, and it is very power hungry; I think that increasing the voltage would make it throttle. Sure, it runs cool, but looking at Hawaii chips, they're not really known as overclockers.

Hope I'm wrong, but I really don't think Fury X will be able to overclock as well as Nvidia's Maxwell chips.

AMD has stated that it runs at 50°C under full load. They stated the board comes with 6-phase power rated for 400 amps. It runs at 275 W but comes with two 8-pin power connectors, and the cooler is rated to dissipate 500 W. So yeah, I do think it's going to be a killer overclocker.
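
The in-spec power arithmetic behind that optimism (the PCIe connector and slot limits are from the spec; the 275 W figure is the quoted board power):

[code]
# Power budget per the PCIe spec vs. quoted board power.
SLOT_W      = 75    # PCIe x16 slot limit, per spec
EIGHT_PIN_W = 150   # per 8-pin connector, per spec
BOARD_W     = 275   # quoted typical board power

available = SLOT_W + 2 * EIGHT_PIN_W
print(f"In-spec input: {available} W, headroom: {available - BOARD_W} W "
      f"({(available - BOARD_W) / BOARD_W:.0%})")
# -> 375 W available vs 275 W stock: ~100 W (~36%) of in-spec headroom --
#    real overclocks still depend on whether the chip scales with voltage.
[/code]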
 
Can HDMI 1.4 or whatever handle 1440p at 60 Hz? What does 1440p look like on your TV?

Yes. I use 1440p on my 4K display at times when I don't have the performance. It scales just fine and is a nice improvement over 1080p for sure.
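
A quick sanity check on the bandwidth side of that question (HDMI 1.4's 340 MHz TMDS ceiling is the spec limit; the ~12% blanking factor is an assumption):

[code]
# Does 1440p60 fit in HDMI 1.4?
HDMI_14_MAX_MHZ = 340.0  # spec TMDS clock ceiling

clk = 2560 * 1440 * 60 * 1.12 / 1e6  # assumed reduced-blanking timing
print(f"2560x1440 @ 60 Hz: ~{clk:.0f} MHz, fits: {clk < HDMI_14_MAX_MHZ}")
# -> ~248 MHz, comfortably under the ceiling: 1440p60 is fine over
#    HDMI 1.4; it's 4K60 that isn't.
[/code]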

AMD has stated that it runs at 50°C under full load. They stated the board comes with 6-phase power rated for 400 amps. It runs at 275 W but comes with two 8-pin power connectors, and the cooler is rated to dissipate 500 W. So yeah, I do think it's going to be a killer overclocker.

Apparently they have a 75°C throttle limit on the card, so that might limit the amount of voltage OC you can do. Then again, maybe that's something that can be ignored with a BIOS switch.
 
Apparently they have a 75°C throttle limit on the card, so that might limit the amount of voltage OC you can do. Then again, maybe that's something that can be ignored with a BIOS switch.

I doubt it. There are two reasons it has that 75°C limit.

1) Water cooling is not effective once the temperature rises above that level. It was the reason they gave for the 295X2 throttling at that temperature.
2) The GPU or RAM, or even both, is sensitive to high temperatures.
 
I doubt it. There are two reasons it has that 75°C limit.

1) Water cooling is not effective once the temperature rises above that level. It was the reason they gave for the 295X2 throttling at that temperature.
2) The GPU or RAM, or even both, is sensitive to high temperatures.

What about the air-cooled Fury, though?
 