
AMD/NVIDIA market-share graph. Spoiler Alert: it ain't pretty.

bathsalts

Member
Slightly surprised there wasn't a bump for AMD during the bitcoin mining craze. The 300 series cards are shaping up to be pivotal for AMD, I'd hate to see them be bought out by Sammy to refocus mostly on mobile chips.
 
Slightly surprised there wasn't a bump for AMD during the bitcoin mining craze. The 300 series cards are shaping up to be pivotal for AMD, I'd hate to see them be bought out by Sammy to refocus mostly on mobile chips.

That's because there were so many people using ASICs for it.
 

tuxfool

Banned
Slightly surprised there wasn't a bump for AMD during the bitcoin mining craze. The 300 series cards are shaping up to be pivotal for AMD, I'd hate to see them be bought out by Sammy to refocus mostly on mobile chips.

They couldn't meet demand, which is why prices rose. The retailers are the ones who benefited from mining.

Also afterward, the market was flooded with cheap graphics cards.
 

nib95

Banned
Even though I'm using an Nvidia card now, the company strikes me as unscrupulous and shady, and I doubt I'll be buying another GPU from them. Instead I'll be sticking with AMD for the foreseeable future. I used to switch between AMD and Nvidia all the time, and honestly AMD cards strike me as better bang for the buck, and I never had any more driver or software issues with them than I do with Nvidia. It would be awful if Nvidia ever got a monopoly in this market.
 
The only real company I have any brand loyalty to in the GPU making/selling space is EVGA. I will pretty much only buy a GPU from them to use in my system. If they ever started selling AMD GPUs I would definitely consider those products.

I rate EVGA as highly as I rate Logitech when it comes to mice.


AMD and Nvidia both make great cards, and I don't really see any reason to buy one over the other except for certain trade-offs. Have a cheap case that might well have bad airflow? Pay attention to the TDP of the card you are buying and how much heat it produces; you might not want that much hot air blowing around in your case. A card with the same or better performance may be cheaper up front, but if it costs more to run and you keep it for two or more years, a more expensive card that uses less power and produces less heat can make up that price difference. When you pay your own power bill, that stuff adds up.


These are just my opinions, but it's how I think about buying/building a new system.
 
There are trade-offs though. A 290x has a much higher TDP and gets much hotter than any 970, with or without overclocks.



I kind of disagree. For instance, I bought my 780 Ti the day it launched; my version cost $730. The GTX 980, which is a faster card (by about 10-15% overall) and has 1GB more VRAM, launched at $550. The 970 launched at $330, I believe, and is faster than the GTX 780, which launched at $550, while having more VRAM. A 10-15% jump each generation is just fine, and most people will probably upgrade every 2-3 years.

It depends on whether they can offer that same kind of boost AND the solid price-performance ratio the way that the 900 series managed. NVIDIA's stinginess with VRAM in the 700 series for all but the most expensive cards totally put me off of those, because I figured that more would be necessary once next-gen titles started coming out. And that's exactly what happened.
 

Itachi87

Member
As someone who's jumped back and forth between AMD and nVidia over the years w/ no problems, this really sucks. Hopefully, AMD regains some market share with their R9 300 series.
 
I vowed to never buy AMD again after 3 strikes. First was my Phenom II processor which had a faulty third core. Second and third were multiple 4670s from different manufacturers that ran so hot the thermal paste just disappeared, even after multiple reapplications. The bonus round is my old HP laptop video card that never got new drivers, but that was more HP's fault for having some weird proprietary discrete/dedicated card.

I'm strictly Intel/Nvidia now, and I don't anticipate leaving anytime soon now that I have 3DVision and GSync.
 
It depends on whether they can offer that same kind of boost AND the solid price-performance ratio the way that the 900 series managed. NVIDIA's stinginess with VRAM in the 700 series for all but the most expensive cards totally put me off of those, because I figured that more would be necessary once next-gen titles started coming out. And that's exactly what happened.

I can almost feel the limits of my 780Ti due to the 3GB framebuffer. It's fine for now and I don't really have any buyer's regret but I really wish they would have made it a 4GB card. I will probably be getting a card from the next Nvidia series 1000 or whatever they go with. I'm willing to bet the top card (1080?) will have a 5 or 6GB framebuffer.
 

jwhit28

Member
I vowed to never buy AMD again after 3 strikes. First was my Phenom II processor which had a faulty third core. Second and third were multiple 4670s from different manufacturers that ran so hot the thermal paste just disappeared, even after multiple reapplications. The bonus round is my old HP laptop video card that never got new drivers, but that was more HP's fault for having some weird proprietary discrete/dedicated card.

I'm strictly Intel/Nvidia now, and I don't anticipate leaving anytime soon now that I have 3DVision and GSync.

I had almost the opposite experience as you. My first build had a Phenom II x3 that had a free working 4th core and a 4670 that my brother is still using.
 
No, you don't. Nobody that's had an AMD card in the last 3 or 4 years should have any complaints.

The fact that you're referring to them as "ATI" in 2015 tells me all I need to know about your current knowledge and experience with their products.

Oh come on. You don't speak for other people. Fucking Gabe Newell called them ATI in a recent interview he did at GDC. Clearly that guy has no idea what he's talking about in computer hardware and software.
 
Oh come on. You don't speak for other people. Fucking Gabe Newell called them ATI in a recent interview he did at GDC. Clearly that guy has no idea what he's talking about in computer hardware and software.

I mean, for a long while I was doing the same thing. If you grew up with ATI/Nvidia for 10-15+ years, it's probably even harder to stop calling AMD cards ATI cards. I wouldn't fault someone for calling them ATI cards.
 

LiquidMetal14

hide your water-based mammals
Shame. I've owned GPUs from both and want healthy market competition. It serves us, the consumers, better.
 

Crisco

Banned
This isn't really NV's doing, it's Intel's. AMD has always been pretty competitive on the GPU side, but their CPU business has bled them dry trying to compete with Intel. Now the overall drain on the company is bringing down their GPU business.
 
Shame. I've owned GPUs from both and want healthy market competition. It serves us, the consumers, better.

Ditto.

My current card is a 6950 (one of the ones that could basically become ~6970s) and I've had a good experience with it thus far. When it comes time to replace it I am open to either make but it would be a shame if we are only left with one choice in the marketplace.
 

espher

Member
The fact that you're referring to them as "ATI" in 2015 tells me all I need to know about your current knowledge and experience with their products.

Not the quoted poster, but for me, it could tell you that I had a run of lemon ATI cards, had to deal with shitty vendor and driver support, and have been on the nVidia train since I hopped to an 8800 GTS, with nothing but positive experiences since. Or it could just tell you that I "grew up" with ATI, much like I still call the Ottawa Senators' home rink the Corel Centre from time to time. ;)

I keep looking at AMD but that perception is hard to offset. I ended up buying a 970 w/ my build at the beginning of the year and have had zero issues with the card itself.
 

Kazdane

Member
It's not just consumers (or at least it doesn't look like it). It's also reviewers and so on. I recently saw some site label the R9 290 as a mid-range graphics card, while the 970 (which is pretty much in the same range) was labeled a high-end graphics card.

If we're not honest on all fronts, it's no wonder AMD will perform terribly. I've recently switched from a GTX 660 to an R9 290 because, in Spain, it's literally 90€ cheaper (that's over $100) than the 970s, and the card runs great with a 650 watt PSU. I haven't experienced any issues with drivers, and the only thing I miss (and not that much) is Shadowplay, but GVR, and OBS with AMD VCE support for the games that don't support GVR, work just as well.

There's a lot of false information being spread around on both fronts, and that doesn't help...
 

Sou Da

Member
It's not just consumers (or at least it doesn't look like it). It's also reviewers and so on. I recently saw some site label the R9 290 as a mid-range graphics card, while the 970 (which is pretty much in the same range) was labeled a high-end graphics card.

If we're not honest on all fronts, it's no wonder AMD will perform terribly. I've recently switched from a GTX 660 to an R9 290 because, in Spain, it's literally 90€ cheaper (that's over $100) than the 970s, and the card runs great with a 650 watt PSU. I haven't experienced any issues with drivers, and the only thing I miss (and not that much) is Shadowplay, but GVR, and OBS with AMD VCE support for the games that don't support GVR, work just as well.

There's a lot of false information being spread around on both fronts, and that doesn't help...

Heh, reminds me of that one Escapist debacle with the obvious bias.
 

Easy_D

never left the stone age
nvidia cards aren't any quieter than their amd counterparts if you ignore reference coolers (almost nobody has a reference 290/290X, and nobody is making/selling them anymore). they draw more power, true, but nvidia's quoted TDP figures are far, far too conservative, and the cards often exceed them under load, whereas AMD cards rarely hit their TDP figures (unless overclocked). it's just another way nvidia is lying to people, really. I have a 290X in a silverstone RVZ01B case, heat isn't an issue at all, so stop spreading fud please.


[Noise comparison chart]


would you look at that, aftermarket 290X is quieter than a GTX 970? you don't say?

[Power draw chart]


what's that? 290X only uses 30 - 60 watts more than GTX 970? not the 150+ more watts people are constantly claiming? come on get real with that nonsense about heat/power and noise.

[Temperature chart]


what's this? the aftermarket 290X runs cooler than the GTX 970 (and thus 980 as well)? Gee.......

Yeah. My new 280X is actually noticeably quieter than my old 5870 was. Shit's improved a lot. Runs cooler too, while being a rather beastly card for the price. Haven't had driver issues in the last 5 years either. The bias is real. Same goes for their CPUs. I read the FX line couldn't do Dolphin properly, yet every game I tried runs full speed :lol
 

inherendo

Member
In regards to TDP, I think toms or anandtech mentioned in an article that nvidia uses a different definition that isn't what is conventional, whereas AMD uses the standard, which is why nvidia can draw more than what they state and AMD doesn't.
 
In regards to TDP, I think toms or anandtech mentioned in an article that nvidia uses a different definition that isn't what is conventional, whereas AMD uses the standard, which is why nvidia can draw more than what they state and AMD doesn't.

When anandtech and other reviewers review the card, they still measure power consumption. It doesn't really matter if Nvidia calls it something else; the reviews still get the proper power consumption readings.

A 30-40+ watt difference over two years is a HUGE difference in energy usage and money savings, which are things I consider when choosing between similar cards.

Not trying to argue one brand over another, just throwing facts out there.
 
i mean, for as much as people complain about nvidia's prices, it's pretty clear a lot of that revenue gets recycled into R&D to keep pushing tech forward

AMD's innovations, on the other hand... it's pretty clear their earnings mostly just go to keeping the status quo
 
Some thoughts:

a) I do think re-branding ATi to AMD was a mistake, especially considering that AMD's desktop processors have been a complete joke for the last half-decade.
b) I also think that AMD's software package is a bit pants. Their driver support has always been sub-par, and their attempt to replicate GeForce Experience with Raptr sunk my opinion of Raptr. While in terms of pure pixel-pumping power AMD's graphics cards are competitive with Nvidia's, I can quite easily see why people are prepared to pay the premium for team green, and it is the software.
c) Ultimately, I do think AMD has to work on improving their reputation with regards to drivers and other software, and considering they haven't even started on fixing that, I can't be too surprised that their fortunes are continuing to decline.
 
When anandtech and other reviews review the card, they still measure power consumption. It doesn't really matter if Nvidia calls it something else the reviews still get the proper power consumption readings.

30-40+ watts over two years is a HUGE difference in energy usage and money savings which are things I consider when choosing similar cards.

Not trying to argue one brand over another just throwing facts out there.

A 40 watt difference under load for 8 hours a day, every single day, is, based on the US average, $14.016 per year. When the AMD equivalent to the GTX 980 costs $260 less than it, $14/yr isn't even relevant to this discussion (unless you plan to use these cards for 18 and a half years, which is how long it would take for the added energy cost to make their prices equal).
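For what it's worth, a rough sketch of that arithmetic in Python (the 8 hours/day, ~$0.12/kWh US-average rate, and $260 price gap are the assumptions from the post above, not measured figures):

# Rough sketch of the energy-cost argument above; all inputs are the
# post's assumptions, not measurements.
extra_watts = 40            # extra load power draw vs. the comparison card
hours_per_day = 8           # assumed daily gaming time
usd_per_kwh = 0.12          # approximate US-average electricity rate
price_gap_usd = 260         # quoted price difference vs. the GTX 980

extra_kwh_per_year = extra_watts * hours_per_day * 365 / 1000   # 116.8 kWh
extra_usd_per_year = extra_kwh_per_year * usd_per_kwh           # ~$14.02
years_to_break_even = price_gap_usd / extra_usd_per_year        # ~18.5 years

print(f"~${extra_usd_per_year:.2f}/yr extra, break-even after "
      f"~{years_to_break_even:.1f} years")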
 
A 40 watt difference under load for 8 hours a day, every single day, is, based on the US average, $14.016 per year. When the AMD equivalent to the GTX 980 costs $260 less than it, $14/yr isn't even relevant to this discussion (unless you plan to use these cards for 18 and a half years, which is how long it would take for the added energy cost to make their prices equal).

The AMD equivalent to the 980 doesn't yet exist. I fail to see your point. I'm talking about when people are looking towards similar performing cards with smaller price differences.
 
The AMD equivalent to the 980 doesn't yet exist. I fail to see your point. I'm talking about when people are looking towards similar performing cards with smaller price differences.

the 290X is 8% slower than the GTX980 at 1440p/4k (you don't buy cards of this class for 1080p, or you shouldn't rather, 970/290 is the 1080p card IMO). 290X is absolutely in the same class as the 980, it isn't amd's fault that nvidia priced the 980 nearly twice what it should've been.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/29.html


even the reference model 290X (which has throttling issues) is within 7% of the GTX 980 at 1080p, 1440p, and 4k on average. a non reference model will average 2 - 3% better without the throttling. so yes, yes they are in the same fucking class, despite the enormous price difference.
 
The funny thing is, I don't even consider buying AMD hardware ever any longer. The last time I considered it was around the R600 era, but we know how that turned out.

Absolutely how I feel. I switched to Nvidia at the end of 2010 and haven't looked back.

At first it was SGSSAA + FAR better downsampling support. Now we have so many AAA games including nvidia specific features that I want. Their drivers are typically pretty solid..... and then there is Andrew Burnes.

I mean just look at this Far Cry 4 graphics, performance, and tweaks guide. That is so comprehensive. He's very dedicated. Can't wait for the GTA5 guide.

These days, I don't even care if the top AMD card is faster. Just feels I'd be missing out buying something other than Nvidia. And that is the perception that AMD needs to change.

That said, I'm not going to cheer against AMD. That's silly. Honestly, I want their next card to knock it out of the park so maybe Nvidia will consider dropping their prices a bit.
 
nvidia cards aren't any quieter than their amd counterparts if you ignore reference coolers (almost nobody has a reference 290/290X, and nobody is making/selling them anymore). they draw more power, true, but nvidia's quoted TDP figures are far, far too conservative, and the cards often exceed them under load, whereas AMD cards rarely hit their TDP figures (unless overclocked). it's just another way nvidia is lying to people, really. I have a 290X in a silverstone RVZ01B case, heat isn't an issue at all, so stop spreading fud please.


[Noise comparison chart]


would you look at that, aftermarket 290X is quieter than a GTX 970? you don't say?

[Power draw chart]


what's that? 290X only uses 30 - 60 watts more than GTX 970? not the 150+ more watts people are constantly claiming? come on get real with that nonsense about heat/power and noise.

[Temperature chart]


what's this? the aftermarket 290X runs cooler than the GTX 970 (and thus 980 as well)? Gee.......

Nice try.

Reference 970 boost clock: 1178 MHz
970 Superclocked boost clock: 1317 MHz

Keep talking about spreading fud.
 

wildfire

Banned
i mean, for as much as people complain about nvidia's prices, it's pretty clear a lot of that revenue gets recycled into R&D to keep pushing tech forward

AMD's innovations, on the other hand... it's pretty clear their earnings mostly just go to keeping the status quo

No, that's not it. AMD's mistake was pushing tech that didn't matter as much to people.

Eyefinity was a great idea but most people don't have the real estate for 3 displays, only 2. Surround and Eyefinity gamers are too small of an audience even among single display multigpu users.

TressFX is also great because of how garbage hair has been in games but in the end it still is just hair. A lot of people can live without that type of upgrade.

It looks like AMD has finally developed technology that will be relevant for the masses with Mantle and Async Shaders, though it is likely Nvidia will be able to match them due to DirectX changes.


Nice try.

Reference 970 boost clock: 1178 MHz
970 Superclocked boost clock: 1317 MHz

Keep talking about spreading fud.


I'm sure it took you a while to gather that information, but the person you were responding to made an unedited post (before you finished your response) where they believe a 40 watt difference is amazing for energy cost savings.

I see this as less fud and more some very liberal standards.
 
I haven't had bad experiences with the ATI Radeon card I had in the past. However, AMD should not focus too much on the "value for money" aspect. There is a market for it, but Apple has proven people are willing to pay premium prices for premium products.

If AMD can't produce what people perceive to be powerhouse products, people will default to Nvidia. And then Nvidia vs AMD will play out exactly like Apple vs Nokia did.
 
Nice try.

Reference 970 boost clock: 1178 MHz
970 Superclocked boost clock: 1317 MHz

Keep talking about spreading fud.

[Sapphire Radeon R9 290X 8GB Vapor-X OC review charts: temperature, power, and noise]


it isn't fud, all of these cards are overclocked, and they are also all similar in terms of thermals, power draw, and noise. kindly get the fuck out of here with your nonsense.

I'm sure it took you a while to gather that information, but the person you were responding to made an unedited post (before you finished your response) where they believe a 40 watt difference is amazing for energy cost savings.

I see this as less fud and more some very liberal standards.



were you referring to me? i never said using ~40 more watts was a cost savings, i said using 40 more watts costs $14/yr assuming 8 hours per day of gaming. it would take 18 years for the 290X to cost the same as the 980 under those conditions. i simply said a 40 watt difference is irrelevant in terms of energy cost differences (it's like $1.10 a month)
 
Nope, that HIS R9 290X 4GB works at reference clocks, meanwhile both Nvidias are overclocked.

It isn't that difficult to find reference 970 reviews.

then compare the sapphire to the others. you seem to throw out data that doesn't fit your pro-nvidia agenda. the sapphire runs at LOWER temperatures than the 970/980, and only uses 20 watts more than the GTX 980, and both are overclocked.

I never said that nvidia didn't have a tdp advantage, it simply isn't some enormous gap like most people want to believe/claim. a 20 - 40 watt difference between the 290X and its GTX 980 equivalent. it's roughly the same difference between the 290 and the 970. it certainly isn't 100 - 150 watts as is commonly claimed, under actual usage scenarios.

i already provided proof a few minutes ago that the 290X is within 7% of the GTX 980 at 1080p, 1440p and 4k. so they are absolutely direct competitors performance wise. it isn't my fault that nvidia's R9 290X equivalent is 90% more expensive.
 

Varvor

Member
Yeah it's a terrible time to buy a new video card in Canada atm

My old gal GTX460 died on me two weeks ago... so I set out to buy a GTX960 2gig for 260 beavers in the GTA, and ended up paying 310 beavers for the 4gig version.

Wonder if throwing in 100 more beavers and buying a GTX970 (4gig) would have been a better idea?
 
My old gal GTX460 died on me two weeks ago... so I set out to buy a GTX960 2gig for 260 beavers in the GTA, and ended up paying 310 beavers for the 4gig version.

Wonder if throwing in 100 more beavers and buying a GTX970 (4gig) would have been a better idea?

holy shit you guys pay way too much for stuff. you could've gotten an R9 290 for $330CAD, which would've been ~40% faster than that 960. I'm just sayin.

EDIT: also, is it true that you guys have money made of plastic?
 

Durante

Member
Some thoughts:

a) I do think re-branding ATi to AMD was a mistake, especially considering that AMD's desktop processors have been a complete joke for the last half-decade.
b) I also think that AMD's software package is a bit pants. Their driver support has always been sub-par, and their attempt to replicate GeForce Experience with Raptr sunk my opinion of Raptr. While in terms of pure pixel-pumping power AMD's graphics cards are competitive with Nvidia's, I can quite easily see why people are prepared to pay the premium for team green, and it is the software.
c) Ultimately, I do think AMD has to work on improving their reputation with regards to drivers and other software, and considering they haven't even started on fixing that, I can't be too surprised that their fortunes are continuing to decline.
I mostly agree, though regarding your point a), when the rebranding happened AMD wasn't quite in such a terrible position CPU-wise.
 

Varvor

Member
holy shit you guys pay way too much for stuff. you could've gotten an R9 290 for $330CAD, which would've been ~40% faster than that 960. I'm just sayin.

EDIT: also, is it true that you guys have money made of plastic?

yes... and dipped in maple syrup so it smells like sweets for a looong time.

I am kind of done with AMD stuff for a while.

EDIT: Based on this link's info, there is no 40% gain for the R9 290 vs the GTX960 (but it does have some significant boosts in some aspects). Furthermore, they seem to be testing the 2gig version.

http://gpuboss.com/gpus/Radeon-R9-290-vs-GeForce-GTX-960
 
the 290X is 8% slower than the GTX980 at 1440p/4k (you don't buy cards of this class for 1080p, or you shouldn't rather, 970/290 is the 1080p card IMO). 290X is absolutely in the same class as the 980, it isn't amd's fault that nvidia priced the 980 nearly twice what it should've been.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/29.html


even the reference model 290X (which has throttling issues) is within 7% of the GTX 980 at 1080p, 1440p, and 4k on average. a non reference model will average 2 - 3% better without the throttling. so yes, yes they are in the same fucking class, despite the enormous price difference.

Percentages don't work like that. It's a 7% difference relative to the Titan, which doesn't mean the 290X is within 7% of a GTX 980; the gap relative to the 980 itself is higher.

Besides that, the GTX 970 is much closer than that and more within the same price range, so those two are in the same class, not the GTX 980.

The GTX 980 is known to not be a great value proposition, but it isn't really competing with much, because people can just go for the GTX 970 to compete with the AMD offering.
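To illustrate that baseline point, here's a tiny sketch (the scores below are made up purely to show how the choice of baseline changes the percentage, not actual benchmark numbers):

# Hypothetical relative-performance scores, normalized to Titan X = 100,
# to show why "within 7%" depends on which card is the baseline.
titan_x = 100.0
gtx_980 = 85.0    # made-up score
r9_290x = 78.0    # made-up score

gap_on_titan_scale = gtx_980 - r9_290x              # 7.0 points on the Titan X scale
gap_vs_980 = (gtx_980 - r9_290x) / gtx_980 * 100    # ~8.2% slower than the 980 itself

print(f"{gap_on_titan_scale:.1f} points apart on the Titan X scale, "
      f"but {gap_vs_980:.1f}% slower relative to the GTX 980")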
 
yes... and dipped in maple syrup so it smells like sweets for a looong time.

I am kind of done with AMD stuff for a while.

fair enough, i bounce back and forth whenever one offers a significant performance/dollar advantage. i have owned 7 nvidia and 6 amd cards over the years, never had any major issues with any of them. I suspect that eventually i will grab an nvidia card again, but this go around and the last, AMD won the perf/dollar war.
 
Percentages don't work like that. It's a 7% difference relative to the Titan, which doesn't mean the 290X is within 7% of a GTX 980; the gap relative to the 980 itself is higher.

Besides that, the GTX 970 is much closer than that and more within the same price range, so those two are in the same class, not the GTX 980.

The GTX 980 is known to not be a great value proposition, but it isn't really competing with much, because people can just go for the GTX 970 to compete with the AMD offering.

i cannot and will not in good faith recommend the GTX 970 to anyone, if for no other reason than the deliberate lies coming from nvidia about it for many months until they got caught, and even continuing afterwards. also the design flaw. I will NEVER recommend a 970 for any reason because of this. If it didn't have the issues it has, i would have no problem recommending it. as it is, the GTX 980 is the only upper end nvidia card (in the sane price segments) that doesn't have a significant hardware design issue, as well as false advertising. so the GTX 980 is my point of comparison at the upper midrange/high end. at the midrange, compare the $250 R9 290 to the $240 4gb GTX 960, cards that are absolutely in the same price class.
 

Varvor

Member
fair enough, i bounce back and forth whenever one offers a significant performance/dollar advantage. i have owned 7 nvidia and 6 amd cards over the years, never had any major issues with any of them. I suspect that eventually i will grab an nvidia card again, but this go around and the last, AMD won the perf/dollar war.

It may have come off odd on my end... I went through a few NVidia cards and found myself happy with them, and as a result I never felt the need to mix it up by switching to AMD cards. Besides, this constant criticism of their cards does make one stay away from them even further.
 

Fantasmo

Member
I like buying stuff and not having to wonder whether it will work right or not. I don't like waiting a month to have something work.

And if AMD truly is better than it used to be judging by some of these posts, it needs to go on a marketing and interview blitz. It's not my job to change the bad perceptions.
 