What is a logical reason they would keep the voltage locked? They had to know how the card was going to perform. Is there something we don't know? Does the Fury just go to shit at higher voltages?
So if we take into account that:
- The card is sitting at its absolute minimum voltage right now to reduce its power draw.
- The card is getting errors when trying to overclock at stock voltage.
- The temps at stock are incredibly low.
Could this card unleash all of its Fury once the voltage gets unlocked?
Let's say that's the case, you work at AMD, and the NDA on Fury X is going to be lifted in a month.
How in the holy fuck do you not unlock the voltage before the NDA lifts? They have to know the 980 Ti's performance and they have to know they can't beat it. Their only possible saving grace is to out-overclock it, but they won't let you do that at review time?
I will be shocked if this thing overclocks beyond a 980 Ti when the voltage is unlocked.
But that's the thing! They didn't know! Remember this?
What is a logical reason they would keep the voltage locked? They had to know how the card was going to perform. Is there something we don't know? Does the Fury just go to shit at higher voltages?
Disagree. The Fury X benches put it in between the 980 and the 980 Ti, but the 980's price is $499 ($150 lower than the Fury X).
Are you high? Releasing a hyped-up product that loses to its primary competitor at the exact same price, months after the other one came out, is "doing rather well"?
You can blame AMD for making the 980 relevant again.
Are you remembering time backwards? I thought that was the opposite of how it was.
One potential explanation (note that this is just speculation) is that upping the voltage doesn't increase the clock potential significantly, but greatly increases power consumption. So they would rather have the good press about drawing less power than the 290X while performing far better (which is a good thing) than slightly higher clocks.
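A rough sanity check on that: the standard first-order model for CMOS dynamic power (a generic textbook relation, not anything AMD has confirmed about Fiji specifically) is

P_dynamic ≈ C × V² × f

so power draw scales with the square of the voltage but only linearly with clock speed. A voltage bump big enough to stabilize, say, a 10% overclock can easily cost over 30% extra power (1.1² × 1.1 ≈ 1.33).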
After thinking about it, give this thing a $100 price cut and it'd be very competitive. But as it is, it's a hard sell and hard to recommend. Guess we'll see what happens.
CrossFire had issues and AMD actually worked on it, and shit got rather good. Nvidia didn't put as much effort into SLI. Maybe I remember wrong; that's just what I remember from way back then.
How likely is that they'll release a driver update in a month or so that will solve most of these problems?
But that's the thing! They didn't know! Remember this?
Haha. Unfortunately I doubt that's the case. I'd love to think their incompetence is so high they honestly didn't know, and someone at AMD right now is in a meeting going "you showed me the benchmarks! You said it was faster! We look like idiots now!"
That 12GB of VRAM will always make it a viable option when gaming at higher resolutions (even some of today's reviews point this out). By no means is the Titan X an obsolete card.
One potential explanation (note that this is just speculation) is that upping the voltage doesn't increase the clock potential significantly, but greatly increases power consumption. So they would rather have the good press about drawing less power than the 290X while performing far better (which is a good thing) than slightly higher clocks.
I had the exact same thought, was going to post this as my response as well. Total speculation though.
Fury X is in the 4K club with the 980 Ti, even if people seem to want to think it doesn't have the performance for it.
I'm not sure how you can come to this conclusion with the data presented.
So if we take into account that:
- The card is sitting at its absolute minimum voltage right now to reduce its power draw.
- The card is getting errors when trying to overclock at stock voltage.
- The temps at stock are incredibly low.
Could this card unleash all of its Fury once the voltage gets unlocked?
By the sounds of it, this card isn't going to be an overclocker even with voltage unlocked. The fact that it has problems overclocking at current voltages is worrying. It wouldn't be if it had super-low power draw, but it's already higher than the 980 Ti/Titan X.
I haven't read through the reviews, but if you are in the market for a GPU, which do you buy - GTX 980 TI or Fury X?
Yeah, I feel like this is the GPU equivalent of the Bulldozer release. I was so hyped for it. I have been an ATI/AMD fan for a long time, always came to their defense, but this release is almost a joke. I was hyped. I was let down. I will be going Nvidia unless they really come through next year (looks doubtful now).
The cards trade blows from game to game at 1440p and especially 4K, which is the kind of usage people are buying cards in this price range for.
I don't think that's completely fair. Fury X has the potential to be a decent competitor with some driver fixes and price adjustments. Bulldozer was never ever going to be competitive.
Lol. 980Ti.
Welcome to the thread.
980ti
980 Ti no questions asked, unless the Fury X gets a significant price cut.
Read through the reviews. The recommendations are there.
Many have said the Ti
http://www.pcper.com/reviews/Graphi...Review-Fiji-Finally-Tested/Grand-Theft-Auto-V
Fury X has some bad frame times in GTA V at 2160p and gets well beaten at 1440p by the 980 Ti in this test.
Overall not a bad alternative; it just needs some better drivers. At this point I'd plump for a 980 Ti, but the Fury X is a cool product for sure.
They already had to give it a price cut after the 980 Ti forced them to. The issue with Fiji is that both the GPU die and the HBM stacks are very expensive. Mostly the latter, I'd assume, as the die itself isn't that different from GM200.
Well, that's what the non-X Fury is.
Yeah, people are mistaking the current benchmarks as meaning the 980 Ti is the same as the Titan X but cheaper.
That isn't true. The Titan X has architectural advantages over the 980 Ti, but in pretty much all existing games they perform the same.
Whether this is the same 2 years from now is another question. The Titan X has more compute and more VRAM. Right now there doesn't seem to be anything that really needs beyond 6GB.
Therefore right now the 980 Ti appears to be the better choice because of similar performance and a much lower price.
We can't say that will be the case 1-2 years from now.
That being said, unless someone can trivially afford a Titan X (or several), I would recommend the 980 Ti as the better value. But it's not necessarily true that current performance stays the same over time.
The only "architectural" advantage the Titan X has over the 980 Ti is its additional 6GB of RAM. Well, that and the price, which is bigger as well.
That 1440p result is terrible... 33% slower :|
Fury X only really makes sense at 4K, but even then it gets beat by the 980 Ti in most cases. Even if they priced it at $550 it would be in a weird spot, anyone that is gaming at 4K is on the bleeding edge and can surely afford to fork out another $100.
4K monitors and TVs are getting pretty affordable nowadays, actually.
AMD... drivers... In a month... Hmmmm.... *holds back laughter*
They already had to give it a price cut after the 980 Ti forced them to. The issue with Fiji is that both the GPU die and the HBM stacks are very expensive. Mostly the latter, I'd assume, as the die itself isn't that different from GM200.
The non-X Fury will be quite a lot slower than the Fury X, I think.
4K monitors and TVs are getting pretty affordable nowadays, actually.
Titan X is not.
No single card currently makes sense at 4K.
Yup. My Titan X gets its ass kicked at 4K in both The Witcher 3 and (lolol) Batman: Arkham Knight. It can barely maintain 30 FPS in those games. And I have to kill HairWorks in The Witcher 3 to do it.
But it's freaking glorious in Battlefield 4. Seems to hover around 70 FPS in 64-player battles.
Actually, the Titan X has more cores/TMUs as well.
So? That's a purely performance difference, and it's clearly not enough to justify paying $350 more for the Titan X than the 980 Ti.
Based on what, exactly? Its price? Precedent has shown that it should be only 5-10% slower than the top card. Remember, the Fury will be air-cooled, so they'll save money right there on BOM over the Fury X. There will also be non-reference designs for the Fury; that could be the one to watch. But how many times have we said that when it comes to AMD...
Älg said:
Damn, seems to me like there's absolutely no reason to get this over a 980 Ti...
What's the use of slapping an expensive AIO on it if it overclocks like shite? I'd rather pay 5% more for a custom 980 Ti and overclock the hell outta it.
If the card isn't a good overclocker, then what purpose does the watercooler serve? Was that needed for the card to even run properly at stock?
ATM there's really no reason to get the Fury X over the 980 Ti unless you prefer cooler temps. AMD dropped the ball IMO.
I needed a new monitor too, and now that I think of it... the package price on a FreeSync monitor plus the Fury X is about 250 dollars less than a 980 Ti plus a G-Sync monitor, so essentially it's quite a bit cheaper... Hmmmm
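Rough numbers to illustrate the package math (the monitor prices here are hypothetical placeholders, not figures from the thread; both cards launched at $649, so the gap comes from the displays):

Fury X $649 + FreeSync monitor ~$450 ≈ $1,100
980 Ti $649 + comparable G-Sync monitor ~$700 ≈ $1,350

Most of that difference is the G-Sync module premium, which is how the Fury X package can come out a couple hundred dollars cheaper even though the cards themselves cost the same.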
I couldn't be more disappointed. Nvidia is probably laughing their asses off reading these reviews.
1. The 980 Ti outperforms it, was released before it, has more features, and has the general reassurance of better-performing Nvidia drivers.
2. Without HDMI 2.0 this is pretty much useless to anyone with a 4K television and serious PC gaming ambitions. Congratulations, AMD.
3. The cost/performance ratio is just not there, and the water-cooled model actually rules it out for a lot of cases.
I think if it had 6 or 8 gigs of memory, it would have been more of a showstopper IMO.
No single card currently makes sense at 4K. If you want to play any of the latest graphically intense games at anything above 30 fps, you are going to need an SLI setup. Even then, games like GTA V and Witcher 3 can't maintain 60 fps. Given how much these cards cost, I'm sure you aren't going to settle for 30. So advertising a card to be "ideal" for 4K gaming is actually pretty misleading for both Nvidia and AMD to do. We just aren't there yet.