
AMD/NVIDIA market-share graph. Spoiler Alert: it ain't pretty.

The graphics card market has been getting stupid for a long time.

Prices have continued to rise but performance jumps haven't. I mean, no card can do 4K particularly well. My last AMD card was an HD 5850, a great card. I held onto it for a lot longer than I thought I would, as nothing was beating it by much for a good two years.
Back then there was a clear two-tier card structure, whereas Nvidia have been pulling some utter bullshit for a few years now.

Right when Kepler launched, it was clear that the top-end card was not a true top-end card; it was neutered, but still so much better than anything AMD could come up with. This allowed them to set a big launch price, then whip out the Titan and other shit. Both companies are guilty of rebranding old stuff as new too. The 5850 cost me £200; second-tier cards cost a lot more than that now. I eventually got a GTX 670 for £250 or so, and I'm still using it.
 
Cheapest 290X available in Canada is $500 post-rebate, and the 980 is $700 frickin' dollars. I cannot figure out for the life of me why the 970 is $400 here while the 980 costs 75% more.

970 was always much better value than 980. When they came out the consensus from reviews was the same - "980 is meh, 970 is amazing". Part of the reason the 290/x are so cheap now is because of the value proposition of 970. Prices dropped in response to it. Can you import for cheaper from Amazon US?
 

diaspora

Member
970 was always much better value than 980. When they came out the consensus from reviews was the same - "980 is meh, 970 is amazing". Part of the reason the 290/x are so cheap now is because of the value proposition of 970. Prices dropped in response to it. Can you import for cheaper from Amazon US?

Yes.
No. Customs.

edit: To clarify, I've already got a 7970 and a GTX 970, from 2013 and 2014 respectively. The new GPUs are for new machines. I planned on selling the AMD one and passing the 970 to my brother, but there's nothing affordable to replace it bar the 290X.
 
The graphics card market has been getting stupid for a long time.

Prices have continued to rise but performance jumps haven't. I mean, no card can do 4K particularly well. My last AMD card was an HD 5850, a great card. I held onto it for a lot longer than I thought I would, as nothing was beating it by much for a good two years.
Back then there was a clear two-tier card structure, whereas Nvidia have been pulling some utter bullshit for a few years now.

Right when Kepler launched, it was clear that the top-end card was not a true top-end card; it was neutered, but still so much better than anything AMD could come up with. This allowed them to set a big launch price, then whip out the Titan and other shit. Both companies are guilty of rebranding old stuff as new too. The 5850 cost me £200; second-tier cards cost a lot more than that now. I eventually got a GTX 670 for £250 or so, and I'm still using it.
I agree; prices have skyrocketed since then with very little to show for it. Inflation since 2009 is nowhere near enough to warrant such a massive price increase.
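The inflation point is easy to sanity-check. A quick sketch, where the ~2.5% average annual inflation rate is my own assumption for illustration, not an official figure:

```python
# Compound a 2009 price forward to 2015 at an assumed average inflation rate.

def inflation_adjust(price, annual_rate, years):
    """Return the price compounded forward by a constant annual rate."""
    return price * (1 + annual_rate) ** years

adjusted = inflation_adjust(200, 0.025, 6)  # £200 card, 2009 -> 2015
print(f"£200 in 2009 is roughly £{adjusted:.0f} in 2015 money")
```

Even at that rate a £200 card only "becomes" about £232, nowhere near what second-tier cards are charging now.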
 

IceIpor

Member
Yes.
No. Customs.

edit: To clarify, I've already got a 7970 and a GTX 970, from 2013 and 2014 respectively. The new GPUs are for new machines. I planned on selling the AMD one and passing the 970 to my brother, but there's nothing affordable to replace it bar the 290X.

Well, to be fair, the 290x was around $390 at Christmas time... Before the Canadian dollar went further into the crapper.

The 290 could be had for less than $270 at the same time.
 

diaspora

Member
Well, to be fair, the 290x was around $390 at Christmas time... Before the Canadian dollar went further into the crapper.

The 290 could be had for less than $270 at the same time.

you're killing me here
hoo doggy, 290x 8GB is $550
 

diaspora

Member
Can you drive to a micro center in the US? Should be able to get a 290x for about $300.

The cost difference would be eaten up by gas; I'm not exactly close to the border. Best case scenario, the 390X drives 290X prices lower.

edit: Ideally 980 too.
 
For anyone wanting a comparison of drivers between AMD and Nvidia, have a look at this piece on Hexus I read today.

In terms of which company's drivers did more to improve performance in this test, the two were very close last year.
 
This was posted earlier on that regard:

Yeah, that was in response to Nvidia preparing some documentation and offering some broad assistance to the nouveau (open-source driver) developers. Since then, my understanding is that they've cooled off their participation somewhat and still haven't provided some of the things that would be necessary to get nouveau to a more functional level.
 
For anyone wanting a comparison of drivers between AMD and Nvidia, have a look at this piece on Hexus I read today.

In terms of which company's drivers did more to improve performance in this test, the two were very close last year.

This isn't all that useful information. It's simply showing how GPU performance has scaled over time with new driver releases in GPU bottlenecked scenarios.

It does nothing to measure the underlying driver overhead.
 
Happily!

SNIP

Thanks for the response, but I still don't buy it; I have two major objections to your proof:

1. CPU overhead / draw call benchmark numbers are meaningless wrt framerate performance and are certainly not comparable across GPUs and architectures.
2. The minimum FPS numbers cannot reliably be traced back to deficiencies in overhead; framerate performance can take a hit due to a million things.
3. Ergo, bigger overhead numbers cannot be reliably linked to lower framerates.

Hence, to my mind, the whole "AMD drivers have terrible overhead which causes terrible performance" claim is your conjecture, one that cannot be reliably measured or proven. So yeah, I'm still not buying it, sorry.
 
Thanks for the response, but I still don't buy it; I have two major objections to your proof:

1. CPU overhead / draw call benchmark numbers are meaningless wrt framerate performance and are certainly not comparable across GPUs and architectures.
2. The minimum FPS numbers cannot reliably be traced back to deficiencies in overhead; framerate performance can take a hit due to a million things.
3. Ergo, bigger overhead numbers cannot be reliably linked to lower framerates.

Hence, to my mind, the whole "AMD drivers have terrible overhead which causes terrible performance" claim is your conjecture, one that cannot be reliably measured or proven. So yeah, I'm still not buying it, sorry.

Could you please provide a reason why AMD's cards always perform worse than Nvidia's in CPU limited scenarios then? If it isn't driver overhead causing this issue then what is it instead? Even if it was something else, in what way is that a positive for AMD? Either way you're still getting a CPU performance downgrade moving from an Nvidia GPU to an AMD GPU.


Oh and p.s. If it comes to technology and you're arguing against Durante, you're better off simply saving yourself the time and assuming you're wrong. ;)
 
Could you please provide a reason why AMD's cards always perform worse than Nvidia's in CPU limited scenarios then? If it isn't driver overhead causing this issue then what is it instead? Even if it was something else, in what way is that a positive for AMD? Either way you're still getting a CPU performance downgrade moving from an Nvidia GPU to an AMD GPU.


Oh and p.s. If it comes to technology and you're arguing against Durante, you're better off simply saving yourself the time and assuming you're wrong. ;)

I can't give you a satisfactory reason. Neither am I trying to tell you that AMD's overhead is less or multithreaded performance is better. Both of these could very well be right, but regardless, you can't reliably argue causality between overhead and performance, or link these deficiencies to drivers as they may very well be architectural. I'm just saying that the AMD driver team are taking flak here for things they probably shouldn't.

As for arguing with Durante, why not? Even if I'm wrong, I will have learned something.
 

hodgy100

Member
AMD cards are great. So are Nvidia cards.

Want a lower-budget high-end GPU -> AMD R9 290/290X
Want a high-end GPU with all the frills and don't mind spending a bit more -> Nvidia GTX 970/980
Want the currently highest-performing single-chip GPU -> GTX Titan X
Want the most powerful single GPU card on the market -> R9 295
Want the most powerful gaming computer -> Titan X SLI
Have a low-end system but want to play games -> GTX 750 Ti
Game on Linux -> Nvidia

Both manufacturers trade blows. It's no surprise Nvidia comes out on top in market share, as they really are on the ball for workstations and the like; they pretty much have the business sector to themselves. But those claiming AMD cards are shit are making a ridiculous statement :/
 
I've never understood why AMD trails so far behind Nvidia. For a long time, they've been providing cards that benchmark higher than the equivalently-priced GeForce card. Are people really just avoiding AMD because of advertising or FUD about unstable drivers (which hasn't been true for a very long time)?

I don't know about anyone else, but when I'm buying a card, I usually* just look at what I can afford and pick the one with the highest FPS benchmarks. I'll read some reviews to make sure there are no obvious problems (noise, overheating, whatever), but otherwise, bang/buck is king.
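That buying rule is simple enough to write down. A tiny sketch, where the card names, prices, and benchmark numbers are all made up for illustration:

```python
# Pick the affordable card with the highest benchmark fps, per the rule above.
# All cards, prices, and fps figures below are hypothetical.

cards = [
    {"name": "card A", "price": 330, "avg_fps": 58},
    {"name": "card B", "price": 550, "avg_fps": 72},
    {"name": "card C", "price": 240, "avg_fps": 45},
]

def best_value(cards, budget):
    """Among cards within budget, return the one with the highest avg fps."""
    affordable = [c for c in cards if c["price"] <= budget]
    return max(affordable, key=lambda c: c["avg_fps"]) if affordable else None

print(best_value(cards, 400)["name"])  # card A: highest fps under $400
```

The review pass (noise, heat, driver complaints) then acts as a veto on whatever this picks.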

The only downsides to AMD are more CPU-intensive drivers (not unstable!) and more heat/power. CPU overheads are rarely important unless you've really min-maxed for graphics, with your processor as the dump stat.

Currently, GeForce cards have a huge power-consumption advantage, which is the only thing I see as a legitimate reason for their dominance in the mainstream market (290X/980 buyers are a very small segment). The 750 Ti is an amazing card because it'll turn a basic Dell office PC into a half-decent gaming machine, without needing a gaming PSU or any other difficult/scary upgrades.

*I have just got a GTX 970, because the 'cheaper' 290 would require a new PSU, and dumping an extra 150W of heat into a small room means it turns into a sauna if I'm not careful.
 

Durante

Member
Thanks for the response but I still don't buy it; I have two major objections to your proof:

1. CPU overhead / draw call benchmark numbers are meaningless wrt framerate performance and are certainly not comparable across GPUs and architectures
Of course they are comparable. The overhead benchmark simply measures the number of draw calls that can be achieved on a given HW/driver stack in a given API. When one HW/driver stack performs much worse with one API even though another proves that the actual hardware is on par, then the only valid conclusion is that this given driver stack implements the other API far less effectively than its competitor.

2. The minimum FPS numbers cannot reliably be traced back to deficiencies in overhead; framerate performance can take a hit due to a million things.
This would be true in a general case. In this specific case, we see the same GPU HW tested across different CPUs, and the HW that performs far better on the faster CPU performs worse on the slower one. Again, the only valid conclusion is that it is more limited by CPU performance than its competitor, and since the game's GPU load is vendor-agnostic this clearly points to driver API overheads.
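This argument can be made concrete with a toy frame-time model: take frame time as whichever of the CPU side (game work plus driver overhead) or the GPU side finishes last. Every number below is hypothetical, chosen only to illustrate the shape of the effect:

```python
# Toy model: frame time = max(CPU-side time, GPU-side time).
# Driver overhead is extra CPU work per frame; a faster CPU divides
# all CPU-side work. Numbers are illustrative, not measurements.

def fps(game_cpu_ms, driver_ms, gpu_ms, cpu_speed):
    """cpu_speed > 1 models a faster CPU (CPU work takes less time)."""
    cpu_time = (game_cpu_ms + driver_ms) / cpu_speed
    return 1000.0 / max(cpu_time, gpu_ms)

for cpu_speed in (1.0, 2.0):
    lean = fps(8, 2, 10, cpu_speed)   # lean driver: 2 ms overhead
    heavy = fps(8, 8, 10, cpu_speed)  # heavy driver: 8 ms overhead
    print(f"CPU x{cpu_speed}: lean {lean:.1f} fps, heavy {heavy:.1f} fps")
```

On the fast CPU both drivers hit the 100 fps GPU limit; on the slow CPU the heavy driver drops to 62.5 fps while the lean one stays at 100. Only the CPU-limited case exposes the overhead, which is exactly the pattern described above.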

Hence, to my mind, the whole "AMD drivers have terrible overhead which causes terrible performance" claim is your conjecture, one that cannot be reliably measured or proven. So yeah, I'm still not buying it, sorry.
Your arguments seem to boil down to "I don't want to believe this so I won't", even in the face of conclusive evidence. I can't help you with that.

Also, no one said "AMD drivers have terrible overhead which causes terrible performance" in general. What I said is that AMD drivers suffer from higher CPU overheads in DX9 and DX11 workloads, and particularly so in multithreaded DX11 workloads. And the references I've provided show exactly that.
 

riflen

Member
I've never understood why AMD trails so far behind Nvidia. For a long time, they've been providing cards that benchmark higher than the equivalently-priced GeForce card. Are people really just avoiding AMD because of advertising or FUD about unstable drivers (which hasn't been true for a very long time)?

I don't know about anyone else, but when I'm buying a card, I usually* just look at what I can afford and pick the one with the highest FPS benchmarks. I'll read some reviews to make sure there are no obvious problems (noise, overheating, whatever), but otherwise, bang/buck is king.

The only downsides to AMD are more CPU-intensive drivers (not unstable!) and more heat/power. CPU overheads are rarely important unless you've really min-maxed for graphics, with your processor as the dump stat.

Currently, GeForce cards have a huge power-consumption advantage, which is the only thing I see as a legitimate reason for their dominance in the mainstream market (290X/980 buyers are a very small segment). The 750 Ti is an amazing card because it'll turn a basic Dell office PC into a half-decent gaming machine, without needing a gaming PSU or any other difficult/scary upgrades.

*I have just got a GTX 970, because the 'cheaper' 290 would require a new PSU, and dumping an extra 150W of heat into a small room means it turns into a sauna if I'm not careful.

You make no mention of features at all. Perhaps this goes some way to explaining why you don't understand the trend in sales, as some of these features add real value for people. Some examples:

ShadowPlay - This was a really big deal when it was released. You can record your gameplay easily, reliably and efficiently with a few button presses. Again, it took a long time for an alternative to arrive from AMD. All the while Nvidia had been aggressively adding features at a swift rate; different resolutions and bitrates, desktop/windowed recording, broadcasting support, etc.

Downsampling - Until they introduced VSR extremely recently, downsampling was practically impossible on AMD hardware. Nvidia had custom resolution support for a very long time. An excellent way to replay older games, or take advantage of extra GPU power you might have.

G-Sync - Again, an alternative has been a long time coming and this is a really massive improvement to existing display synchronisation options.

Compatibility bits for SLI and Anti-Aliasing - This is more for the enthusiast crowd, but some people will not give up the ability to force SGSSAA in their favourite game. SLI compatibility bits are sometimes needed if you want to play a new game in SLI and the profile is not available or not optimal. These are edge cases, but it's making these power tweaks available that can retain customers.

There are certainly people for whom none of these things matter at all and I can understand why Nvidia GPUs can seem unnecessarily expensive to them. AMD do often develop equivalent features in time, but they're not always comparable in quality. Even so, immediacy is valuable to some, as their leisure time can be limited.
 
AMD cards benefit much more from DX12 than Nvidia cards. This was something that most reviewers did not expect at all.

This sort of thing is exactly what those of us in the "we don't like AMD's drivers" camp have been complaining about. The hardware is fine, it's the firmware/driver that is shit.

Put it this way: when a graphics API abstraction from an OS vendor performs what is equivalent to the GPU version of the second coming of Christ, you can't deny that the GPU maker itself has screwed up hugely when an outsider provides such a better service that the product outclasses its direct competitor using the same tool. And the kicker is that the tool provided simply ignores/bypasses the factory-provided ones.

Well, I think there are at least a couple ways to interpret this knowledge. The fact that DX12 (& presumably Vulkan) has such a dramatically positive effect on AMD performance has to be recognized as the direct result of AMD pushing Mantle so effectively. At this point Mantle is far & away the single most important factor as to why MS finally got off their ass with DX12. From what I understand, once the low level benefits of Mantle were verified AMD has since been working hand in hand with MS to essentially implement Mantle techniques into DX12, which is going to benefit everyone...whether you have an AMD or nVidia GPU.

Now, in regards to the way you chose to interpret this information, I'd argue that while you're correct that it shows AMD's drivers are not as optimized or as low-level as nVidia's already are, it also shows that AMD is aware of this fact, and that is precisely why they began developing Mantle. Whether the efficiency disparity between AMD & nVidia GPUs is due to AMD having fewer resources to put towards the design of better drivers, or architectural differences that require a different approach to wring out the best performance from their GPUs, or some combination thereof, is open for debate. But what is eminently clear in any case is that DX12 is going to help everyone, especially AMD GPUs, to maximize performance. And that is because AMD pushed Mantle. Short on resources or not, that was a smart decision which is going to benefit the entire industry & all gamers. I think we owe AMD a tad more credit than your slightly derisive interpretation implies.

As for AMD/ATI, I've always felt that their problems are entirely organizational, more on the executive side of things than the engineering side of things.

I agree with this 110%. With that in mind, Mantle absolutely had to be a bottom-up (or engineering-up) directive. In the bigger picture I do think AMD may finally have the right person in the CEO chair, since she at least seems wise enough to let smart people execute. The question is whether it's too little, too late. Then again, I do think all the doom & gloom about AMD is overstated for one big reason: regardless of what happens in the GPU space, Intel isn't going to let AMD go out of business, for the same reason MS didn't let Apple go under: anti-trust. Intel needs AMD in the market. AMD may be the red-headed stepchild who gets taken behind the woodshed every now and then, but Intel doesn't want to kill AMD off. Quite the opposite.
 
AMD has been able to close the gap before; they can do it again. Their slow release of the 390X will hurt them because the majority of sales are in the $100-$250 range. The architectural changes in the 390X should be made available in a 370X, but to my understanding most of their middle-tier cards are just last year's cards with a different name. It's impossible to support AMD if they can't get past this slow rollout of advanced GPU technology. When they finally do, they'll close the gap. If they don't, Nvidia will get bigger.

There has been some conflicting information out there on this topic. From what I understand, the reason AMD decided to release the entire Islands GPU line at the same time this summer was so they could implement architecture improvements/changes through the entire line, not just at the top end. The original road map had them starting a trickle release of GPUs in April and going through June/July, but under that plan only the 390X & 360X would have had a new arch while all the others would be re-badges. Supposedly all of the cards will be new arch at every price point now, including the all-important 370X & 380X where most of the sales are made. At this point it's all speculation, but one interesting document I've seen shows the 370X having a lower TDP than the GTX 960. If it matches the 280X in performance (which would put it about 10-15% faster than the 960) while having a similar TDP and price point, I think AMD will have a winner on their hands.

If all the info I have is wrong & they release only one new GPU (the 390X) with a bunch of re-badges at all the other price points, it's going to be a tough year for AMD. I'm hopeful they understand this fact & the delay to June really does mean we are getting a full lineup of new silicon.
 

ZOONAMI

Junior Member
God, this whole TDP thing is ridiculous. A Titan X pulls 250W, and will pull 300W if OC'd. We're not even talking about 50 watts. My guess would be that while gaming they average close to the same draw, in the 200-275 watt range. At idle and in non-gaming scenarios they will pull the same. In fact, AMD's cards have been pulling less than Nvidia's at idle lately, and I would think that would actually save more power over time than saving 25-50 watts at load here and there. I have no idea how Nvidia has convinced you all that TDP is such a big deal. And then the board partners release 300-watt special edition Titan Xs, and you all go and buy those. Lol.
 

IMACOMPUTA

Member
AMD cards are great. So are Nvidia cards.

Want a lower-budget high-end GPU -> AMD R9 290/290X
Want a high-end GPU with all the frills and don't mind spending a bit more -> Nvidia GTX 970/980
Want the currently highest-performing single-chip GPU -> GTX Titan X
Want the most powerful single GPU card on the market -> R9 295
Want the most powerful gaming computer -> Titan X SLI
Have a low-end system but want to play games -> GTX 750 Ti
Game on Linux -> Nvidia

Both manufacturers trade blows. It's no surprise Nvidia comes out on top in market share, as they really are on the ball for workstations and the like; they pretty much have the business sector to themselves. But those claiming AMD cards are shit are making a ridiculous statement :/

That's not actually a single GPU card. So basically you said that AMD is only a low budget option.
 

Marlenus

Member
I can't give you a satisfactory reason. Neither am I trying to tell you that AMD's overhead is less or multithreaded performance is better. Both of these could very well be right, but regardless, you can't reliably argue causality between overhead and performance, or link these deficiencies to drivers as they may very well be architectural. I'm just saying that the AMD driver team are taking flak here for things they probably shouldn't.

As for arguing with Durante, why not? Even if I'm wrong, I will have learned something.

To be honest, it seems rather obvious that the AMD drivers have more CPU overhead. You can use performance charts in CPU-intensive games to see this, but another way is resolution scaling. At low resolutions like 900p or 1080p, the 290X is a good chunk behind the 980. If you scale up to 4K, though, that performance deficit shrinks by quite a large margin as the settings become more and more GPU-bound.

I know you could argue that is architectural, but it has been the same story for multiple GPU generations now, and I do not think AMD would design architecture after architecture that exhibits the same resolution-scaling weakness.
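The resolution-scaling observation falls out of a simple model: treat frame time as a fixed CPU/driver cost plus GPU work that grows with pixel count (a simplification, since real pipelines overlap the two). A constant CPU-side gap then shrinks as a percentage as resolution rises. All numbers below are hypothetical:

```python
# Toy model: frame time = CPU-side cost + GPU cost, with GPU cost
# scaling linearly with pixel count. Illustrative numbers only.

PIXELS_1080P = 1920 * 1080

def pct_deficit(cpu_ms_slow, cpu_ms_fast, gpu_ms_1080p, pixels):
    """Percentage fps deficit of the higher-CPU-cost card vs the other."""
    gpu_ms = gpu_ms_1080p * pixels / PIXELS_1080P
    fps_slow = 1000 / (cpu_ms_slow + gpu_ms)
    fps_fast = 1000 / (cpu_ms_fast + gpu_ms)
    return 100 * (fps_fast - fps_slow) / fps_fast

for name, px in [("1080p", 1920 * 1080), ("1440p", 2560 * 1440), ("4K", 3840 * 2160)]:
    print(f"{name}: trails by {pct_deficit(6, 2, 10, px):.0f}%")
```

With these made-up numbers the gap goes from 25% at 1080p to about 9% at 4K, the same qualitative pattern as the 290X/980 comparison above.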

I do not think the issue is as serious as some make it out to be, though, because for the most part it only results in a small performance differential, and when DX12 becomes the default API the issue will be pretty much gone anyway. I do think that AMD need to work harder on getting drivers out for newly released games, though, even though GameWorks can make it difficult.

There has been some conflicting information out there on this topic. From what I understand, the reason AMD decided to release the entire Islands GPU line at the same time this summer was so they could implement architecture improvements/changes through the entire line, not just at the top end. The original road map had them starting a trickle release of GPUs in April and going through June/July, but under that plan only the 390X & 360X would have had a new arch while all the others would be re-badges. Supposedly all of the cards will be new arch at every price point now, including the all-important 370X & 380X where most of the sales are made. At this point it's all speculation, but one interesting document I've seen shows the 370X having a lower TDP than the GTX 960. If it matches the 280X in performance (which would put it about 10-15% faster than the 960) while having a similar TDP and price point, I think AMD will have a winner on their hands.

If all the info I have is wrong & they release only one new GPU (the 390X) with a bunch of re-badges at all the other price points, it's going to be a tough year for AMD. I'm hopeful they understand this fact & the delay to June really does mean we are getting a full lineup of new silicon.

I do not think rebadges would take this long to bring to market. I also think that if AMD is serious about FreeSync, it needs to be a feature across the entire range of cards. Being able to say that all 300-series GPUs fully support FreeSync is going to be important, because once G-Sync is more widely supported by NV cards, especially cheaper ones, people will be locked in and AMD will not be able to do much about it.
 

ZOONAMI

Junior Member
If the 380X is a more efficient full Hawaii with higher clocks and lower TDP at around $300-400, I think that's a winner, as it would outperform a GTX 980 for much less money. There is nothing wrong with a rebadge of a 290X, as long as there is some performance tweaking done to improve it. Then the 390X with its HBM and new architecture would be high end, the 380X midrange, and the 370X low-mid. The 370X should be a full Tonga (R9 285) with a 4GB GDDR5 option. Tonga wouldn't really be a rebadge, as that is a pretty new architecture, from Q4 last year. You'd then have AMD with a better lineup than Nvidia's.
 
God, this whole TDP thing is ridiculous. A Titan X pulls 250W, and will pull 300W if OC'd. We're not even talking about 50 watts. My guess would be that while gaming they average close to the same draw, in the 200-275 watt range. At idle and in non-gaming scenarios they will pull the same. In fact, AMD's cards have been pulling less than Nvidia's at idle lately, and I would think that would actually save more power over time than saving 25-50 watts at load here and there. I have no idea how Nvidia has convinced you all that TDP is such a big deal. And then the board partners release 300-watt special edition Titan Xs, and you all go and buy those. Lol.

I agree for the most part that TDP at the *high end* is more often than not a manufactured/over-hyped problem since most people who are going to pop for a GPU that costs anywhere from $300 - $1000 will presumably have 500 - 1000 watt PSUs which can easily handle the requirements. They also must have large cases with generally good airflow to run a monstrous GPU, so heat isn't an issue either. And the notion that I'm going to pay $500 or $1000 for a GPU but am somehow concerned about an extra $10 - $40 per year in electricity costs is pretty ridiculous.
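The electricity figure is easy to sanity-check. A quick sketch, where the extra wattage, hours per day, and the $0.12/kWh rate are all assumptions of mine for illustration:

```python
# Annual cost of a GPU drawing extra power under load.
# extra_watts, hours_per_day, and the electricity rate are assumptions.

def annual_cost(extra_watts, hours_per_day, usd_per_kwh=0.12):
    """Dollars per year for the extra draw during gaming sessions."""
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * usd_per_kwh

for watts in (50, 100):
    print(f"{watts} W extra at 3 h/day: ${annual_cost(watts, 3):.2f}/year")
```

That works out to roughly $7-$13 a year under these assumptions, comfortably inside the "rounding error on a $500 GPU" argument above.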

With that said, I do think TDP efficiency (performance/watt & heat generation) does legitimately come into play in certain low & mid-range applications. I've been building HTPCs & compact gaming PCs for the past, oh, 12 years or so & trying to cram components into various small cases for myself & clients/family/friends. I've been doing this far longer than it's been fashionable. So for several years I had to deal with generally under-performing CPU/GPU combos that get hot enough to fry an egg on the case, fans that sound like jet engines on take-off to try to keep it cool, etc...

So, to me at least, Maxwell is something of a godsend, basically because it allows me to build a really small system with a low-profile 750 Ti or a relatively small system with a 960. The whole thing can be powered by a 450W SFX PSU, will run silent & cool inside a small case, and will play modern games at 1080p with settings turned up. When I was recently looking to add a new GPU to my main HTPC (I wanted to upgrade the 750 Ti in that system) the 960 just made the most sense. I considered the R9 285 & tried one out. While the wattage usage at idle is about the same, the 285 does consume about 100W more when gaming (it's rated at 70W more TDP, but the 960 stays at about 120W while the 285 gets up to about 220W). But energy use isn't really my main concern; heat generation is. Still, I might have kept the 285 if it matched the 960 in performance, but it was consistently 10-15% slower at 1080p. Since I run AMD processors in my HTPCs (760K and 860Ks at the moment), the nVidia GPU gives me a little better performance (probably due to less CPU overhead, as others have discussed).

So, I agree 100% that for high-end GPUs the TDP thing is a non-issue. But, having a more efficient/cooler running GPU at the mid and low end that can play modern games at 30-60fps/1080P is kind of the Holy Grail for people like myself who prefer to build small systems. I still have 1 smallish mid-tower case ready to go for my next big gaming system upgrade (probably the R9 380X so I can do some 1440P gaming), but for the most part I've switched over to small, space-saving systems in my household. As much as I prefer to support AMD, right now nVidia makes a product that better fits my HTPC needs. Hopefully AMD's Fiji GPUs will bring something comparable because the mid-range $150-$200 price-point is where they will claw back market-share.
 

ZOONAMI

Junior Member
I agree for the most part that TDP at the *high end* is more often than not a manufactured/over-hyped problem since most people who are going to pop for a GPU that costs anywhere from $300 - $1000 will presumably have 500 - 1000 watt PSUs which can easily handle the requirements. They also must have large cases with generally good airflow to run a monstrous GPU, so heat isn't an issue either. And the notion that I'm going to pay $500 or $1000 for a GPU but am somehow concerned about an extra $10 - $40 per year in electricity costs is pretty ridiculous.

With that said, I do think TDP efficiency (performance/watt & heat generation) does legitimately come into play in certain low & mid-range applications. I've been building HTPCs & compact gaming PCs for the past, oh, 12 years or so & trying to cram components into various small cases for myself & clients/family/friends. I've been doing this far longer than it's been fashionable. So for several years I had to deal with generally under-performing CPU/GPU combos that get hot enough to fry an egg on the case, fans that sound like jet engines on take-off to try to keep it cool, etc...

So, to me at least, Maxwell is something of a godsend, basically because it allows me to build a really small system with a low-profile 750 Ti or a relatively small system with a 960. The whole thing can be powered by a 450W SFX PSU, will run silent & cool inside a small case, and will play modern games at 1080p with settings turned up. When I was recently looking to add a new GPU to my main HTPC (I wanted to upgrade the 750 Ti in that system) the 960 just made the most sense. I considered the R9 285 & tried one out. While the wattage usage at idle is about the same, the 285 does consume about 100W more when gaming (it's rated at 70W more TDP, but the 960 stays at about 120W while the 285 gets up to about 220W). But energy use isn't really my main concern; heat generation is. Still, I might have kept the 285 if it matched the 960 in performance, but it was consistently 10-15% slower at 1080p. Since I run AMD processors in my HTPCs (760K and 860Ks at the moment), the nVidia GPU gives me a little better performance (probably due to less CPU overhead, as others have discussed).

So, I agree 100% that for high-end GPUs the TDP thing is a non-issue. But, having a more efficient/cooler running GPU at the mid and low end that can play modern games at 30-60fps/1080P is kind of the Holy Grail for people like myself who prefer to build small systems. I still have 1 smallish mid-tower case ready to go for my next big gaming system upgrade (probably the R9 380X so I can do some 1440P gaming), but for the most part I've switched over to small, space-saving systems in my household. As much as I prefer to support AMD, right now nVidia makes a product that better fits my HTPC needs. Hopefully AMD's Fiji GPUs will bring something comparable because the mid-range $150-$200 price-point is where they will claw back market-share.

Several of us run a 290X inside a Hadron Air with the stock 500W PSU and haven't had any problems. I even have a 4.5GHz OC on my 4690K, and use Afterburner to push the 290X up about 75MHz on the core and a bit more on the memory before I start up a game.
 