Unknown Soldier
It's amusing that you don't even recognize how your retort here defeats your own earlier hysterics as well.
Really? Do tell. Because you can OC a 980 Ti and a Titan X. The Fury X on the other hand
All PC games that started development 2 years ago should be targeting DX11 and OpenGL 4. There are exceptions, such as Deus Ex: MD offering DirectX 12 support (it comes out early next year), but it should also support DX11.
As helpful as always, tuxfool. Many thanks.
Developers should be glad they are moving away from 11, since, from what I hear and have experienced in the games that supported it, it was a very resource-intensive API.
Wasn't 10 worse?
And 9 before it. They're merely products of their time. A couple of years ago, a thin API like DX12 on PC wouldn't have been possible.
Glide?
Unknown Soldier said: The Fury X gets humiliated by an overclocked 980 Ti
Brad Grenz said: Wow, the Titan X is getting HUMILIATED by an overclocked 980 Ti. nVidia must be SO EMBARRASSED right now!
Unknown Soldier said: Yep, in the fantasy world where you can't overclock the Titan X too. Nvidia is clearly fantasy embarrassed in that fantasy world!
Brad Grenz said: It's amusing that you don't even recognize how your retort here defeats your own earlier hysterics as well.
Only worked on 3dfx.
Guys, stop feeding the you-know-who.
That's a GTX 980 Ti in that picture.
This is what baffles me: how in the world does a new card not have HDMI 2.0? It's marketed as a 4K card, and a lot of people (me being one of them) have 4K TVs instead of a monitor. I guess I'm going team green this time. AMD may lose some sales due to that.
mhm. Everything seems to point to the card being supply constrained; they'll empty out inventory. They'll only keep this up for the next 6/7 months, and then everybody will be focused on the upcoming new things.
Yeah, I bet AMD never intended it to be a huge seller. I could see it as a trial run for their new memory architecture. They might not be making a profit on it, perhaps even a loss; is that a thing in this market? Really, with this process node, all things point to HBM2 and the die shrink as the real story.
Yeah, despite the ways people want to deflect it, it's a glaring omission for a GPU purportedly aimed at 4K. I don't really know what they were thinking with both this and the dual-link DVI issue. I can't believe cutting costs by a few cents is worth that.
Here in Austria, it's available in stock already. Non-reference 980 Tis are harder to get.
"dual-link DVI issue."
Holy shit. What the fuck, AMD?
Hopefully new 3rd party revisions are better with those issues. Almost grabbed one yesterday, but held off because of those.
Why are you considering it over the 980Ti?
You're not going to get non-reference Fury X cards.
So the Fury X is generally around 12% worse than the 980 Ti in most games.
What's the pricing between the two on average?
Robert Hallock tweeted that "It was a feature they built specifically for the 300 series."
Twitter Conversation
So does that mean they modded the chips and are doing something at the chip level, or is this more PR bullshit?
PR bullshit. Confirmed so, in fact.
People have been frame limiting with external tools for a decade, and people have already run even AMD's implementation on 200-series cards with hacked drivers.
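For what it's worth, an external frame limiter is conceptually simple: time how long the frame took, then sleep off whatever is left of the per-frame budget. A minimal sketch in Python (the function and names here are illustrative, not any actual tool's or driver's API):

```python
import time

def run_frame_limited(render_frame, target_fps=60, num_frames=10):
    """Run render_frame num_frames times, capping the rate at target_fps
    by sleeping off the unused part of each frame's time budget."""
    frame_budget = 1.0 / target_fps  # seconds allotted per frame
    durations = []
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()  # the actual per-frame work
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            # Frame finished early: burn the leftover budget so the
            # effective frame rate never exceeds target_fps.
            time.sleep(frame_budget - elapsed)
        durations.append(time.perf_counter() - start)
    return durations
```

A driver-level implementation does the same kind of pacing below the game instead of inside its loop, which is presumably why the hacked drivers could enable AMD's version on 200-series cards.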
Have people flashed a 390X BIOS onto a 290X yet? If they did that, they could use the 300/Fury driver branch instead of the older cards' branch.
If anybody wants to know more about the noise coming from the Fury X pump, PCPer.com did a nice article with sound recordings and comparisons (uncompressed):
http://www.pcper.com/reviews/Graphics-Cards/Retail-AMD-Fury-X-Sound-Testing-Pump-Whine-Investigation
At the bottom they link to a new article describing newer CM parts that apparently eliminate/reduce the issue. Sucks to have a part lottery but good that they're making fast changes.
Can you flash a 290 with a 390 BIOS too?
Why are you considering it over the 980Ti?
Nvidia cards are massively price-hiked here in the UK. On top of that, if history repeats itself, the AMD card may fare better long term than its Nvidia counterpart. AMD cards generally seem to close, or even flip, the gap over time.
Well, that's just bullshit :/ As a 290X owner, it'd be awesome to have frame targeting built into CCC.
It means they're limiting features on older cards so you'll buy new ones, but then they'll bitch about Nvidia being anti-consumer.