
AMD/NVIDIA market-share graph. Spoiler Alert: it ain't pretty.

Ovek

Member
Nothing wrong with ATI hardware; they have the best price/performance ratio on the market. The biggest problem they have is driver related: slow updates, nonexistent Crossfire profiles for new high-profile games for sometimes months after release, and the drivers just generally being a bit shit.

I bet GTA5 will be broken in some way when that's released, ATI will blame Rockstar and vice versa, and the happy merry-go-round starts again.
 
i will be glad to go team green again (my last nvidia card was a 560 Ti SLI setup) as soon as they offer the top performance in their price class, which they haven't for a while. in the ~$600 price class the GTX 980 is stomped into the dirt by the 295X2, and at the $300 price class the 290X is faster than the 970, and without an engineering blunder and deliberate lies surrounding it, so nvidia really has nothing to offer me this go around.
 
As long as their driver performance remains diabolical, I won't even consider an AMD GPU. Unless you're on a modern Intel quad core, AMD aren't even an option due to the huge increase in driver overhead that drags down your CPU performance.

I couldn't afford a quad core Intel system when I put my rig together a year ago, so I'm on an FX 8320. @4.4ghz and I don't fancy downgrading my CPU performance for a GPU upgrade. It makes no sense.
 

Zemm

Member
I hear nothing but bad stuff about ATI and driver support.

Once had an ATI card and had endless compatibility problems. Was so excited for Command and Conquer 3 and the fucking thing never worked for more than 20 minutes before crashing.

Meanwhile I've never had a problem with nvidia, both mobile and desktop GPUs.

ATI can get fucked.

Eh, I've had AMD cards since 2011 and had zero problems with drivers. Maybe in the past they were bad, but these days they're fine. I don't care for the fanboy war (AMD v Nvidia) but the driver stuff is easily overblown or just outdated.

It'll be a shame if nvidia ends up being the only card manufacturer in the future; the prices are already too high and I don't trust them after the 3.5GB RAM lies.
 

Kezen

Banned
I think those who see AMD exiting the discrete GPU space are overreacting, as their marketshare is far from irrelevant.
Hopefully for them the R9 300 series is a smashing success.

I have my preferences that I'm not even remotely trying to hide, but I have nothing to gain from an Nvidia monopoly.

Still, I'll stick to the best combo (Nvidia-Intel).
 
To be fair to the 300 series, the chart clearly shows the exact same market share trend vs. Nvidia that AMD has had vs. Intel for many years. AMD must release an amazing product at the exact same time Intel or Nvidia stumble, for example the Athlon 64 at the same time as Pentium 4, or the Radeon 9800 Pro at the same time as the GeForce FX 5800, in order to temporarily gain significant market share ground. And then once Intel and Nvidia recover with their next generation of products, AMD's market share immediately drops to historical levels.

Since the GeForce GTX 900 series are very good cards, even if the Radeon 300 series are really amazing, they are unlikely to pull that much market share, if any, from Nvidia. Especially since Nvidia already has the GTX 980 Ti ready to go; the prototype version, called Titan X, is already available right now.

AMD has HBM, nvidia does not until pascal comes out.
500+GB/sec memory bandwidth will serve their new high end gpus really well at 4k (and just now that single gpus are finally becoming powerful enough to actually do 4k at a respectable framerate)

It should give them a real edge at 4k compared to maxwell.

I'm looking forward to HBM being on all cards on the entire gpu range of both manufacturers, no longer having memory bandwidth be a bottleneck (no more bullshit cards like the gtx 960 or 660 with crippled bandwidth)
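For anyone wondering where those bandwidth numbers come from, it's just bus width times per-pin data rate. A quick back-of-the-envelope sketch (the card specs below are approximate launch figures from memory, not official numbers):

[CODE]
# Memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
# Card figures below are approximate launch specs, quoted from memory.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(128, 7.0))    # GTX 960, 128-bit GDDR5 @ 7 Gbps   -> ~112 GB/s
print(bandwidth_gb_s(192, 6.0))    # GTX 660, 192-bit GDDR5 @ 6 Gbps   -> ~144 GB/s
print(bandwidth_gb_s(512, 5.0))    # R9 290X, 512-bit GDDR5 @ 5 Gbps   -> ~320 GB/s
print(bandwidth_gb_s(4096, 1.0))   # first-gen HBM, 4096-bit @ 1 Gbps  -> ~512 GB/s
[/CODE]

That enormous 4096-bit interface is how HBM clears 500 GB/s even at a much lower clock than GDDR5.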

I think those who see AMD exiting the discrete GPU space are overreacting, as their marketshare is far from irrelevant.
Hopefully for them the R9 300 series is a smashing success.

I have my preferences that I'm not even remotely trying to hide, but I have nothing to gain from an Nvidia monopoly.

Still, I'll stick to the best combo (Nvidia-Intel).
With the rumored 700 dollar price for the 390x I'm hoping for an initial megabomb (if they're going to price shit as if there's already a monopoly then there's no use keeping competition going as a charity), forcing a large price drop, followed by sustained success at a competitive price.
Their history with GCN suggests they're going to milk being first to HBM (easy to market too) as hard as they can while they can.
 

Cipherr

Member
I feel like I can see my two Nvidia and one AMD purchases on those graphs lol. I ebb and flow to whoever has the best performance/dollar ratio around ~$500 whenever I build.


It is interesting though, over the decades I have used AMD/Intel CPUs and 3DFX and all the others for GPUs. But I notice with GPUs, if someone has a bad experience with one, they let it fuck them up FOREVER.

If I let the times I have had an intel chip fuck me over prevent me from ever considering them in the future, I would have a much less powerful rig right now. It's really confusing to me why people let one Nvidia/AMD screwup put them on the other brand for life. That shit is really crazy.
 

foamdino

Member
I have had a mixed bag of GPUs including the Kyro (remember those?).

Last AMD card I had was god-awful - both on the hardware side (my fault, I bought one with a 128-bit bus) and on the software side (not my fault - AMD claimed that they would work really hard to be compatible with Linux and open-source software). Their Linux drivers never worked properly and every update was a complete shit shower - I ended up using the true open source drivers developed by the community because they were stable.

After that I decided nVidia would be my next card, but the 280x and 290x looked so damn good until the 970 came out.

Next time I upgrade I'll see what's around for the price/performance I'm willing to spend, but AMD has to improve the driver situation as my experience of their drivers (for linux in particular) has been awful.
 
Ok enough with the driver talk, honestly.

Has anybody ever done any stats, at all? Stuff like crossfire/SLI driver support is a matter of public record, so we could in principle measure that if somebody had no life and cared enough. But stuff like "number of people who have problems with drivers"? I'm not sure how you'd even attempt to measure it. Maybe you'd have to do huge Google searches for "problem with -insert graphics card" for basically every major video card release going back 15 years, then try to divide that by approximate market share.

Unless anybody has done some kind of stats work, we're just going to shout anecdotes past each other.
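If anyone ever did want to attempt it, the crude normalization being described would look something like this (purely illustrative; the search counts and share figures below are made-up placeholders, not real data):

[CODE]
# Hand-wavy version of the idea above: complaint volume divided by install base.
# All numbers are hypothetical placeholders for illustration only.
hypothetical_search_hits = {"vendor_a": 120_000, "vendor_b": 95_000}   # "problem with <card>" hits
hypothetical_market_share = {"vendor_a": 0.70, "vendor_b": 0.30}       # rough install-base share

for vendor, hits in hypothetical_search_hits.items():
    rate = hits / hypothetical_market_share[vendor]
    print(f"{vendor}: ~{rate:,.0f} complaint-hits per unit of market share")
[/CODE]

Even then you'd mostly be measuring how loudly people complain, not how often drivers actually break, which is sort of the point.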
 
That's kind of depressing

Lolling at Kelli's posts tho.

i backed up my posts with actual proof, look at the benchmarks for yourself or shut the hell up about it. the 295X2 is in the same price class as the 980 and obliterates it, and the 290X is faster than the 970, the card that is in its price class. period, fact, not debatable in any way.
 

industrian

will gently cradle you as time slowly ticks away.
Is this really surprising? Nvidia have more money to spend, have a stable leadership with a good vision, and generally provide a better product that's in-line with the expectations of their customers. AMD have been a mess for the last 3-5 years, with the only thing going for them being that they're perceived to be cheaper and are considered to be the poor underdog fighting a losing battle against Intel and Nvidia's evil empires.

I'm keeping an eye out for the 390x and HBM, but I have the feeling that the next few series of graphics cards and processors released by AMD will make or break the company.
 
Ok enough with the driver talk, honestly.

Has anybody ever done any stats, at all? Stuff like crossfire/SLI driver support is a matter of public record, so we could in principle measure that if somebody had no life and cared enough. But stuff like "number of people who have problems with drivers"? I'm not sure how you'd even attempt to measure it. Maybe you'd have to do huge Google searches for "problem with -insert graphics card" for basically every major video card release going back 15 years, then try to divide that by approximate market share.

Unless anybody has done some kind of stats work, we're just going to shout anecdotes past each other.

Anecdotes still mean the problems exist you know.
Different people are playing different games.
I happen to be one of the people who bought a lot of the games that had problems on amd...

If I played other stuff I might have been completely spared like some others in this thread.

It has been better in the last year (but who knows, maybe I'm just dodging all the bullets now while I was catching them all before); I think they've started trying a little harder since the frametime fiasco.

But hey I remember people complaining about amd drivers during the radeon 9800 pro era and back then I had literally zero issues in the games I played on that gpu.
I wasn't gonna be a dick and dismiss their issues though.

All I know is that between 2009 and 2013 I had a 4870 then a 6870, and a friend of mine had a gtx 260 then a 560ti, we played a lot of the same games and every time he'd be fine while I'd be in stutter hell waiting 2 months for a patch or a driver update, to the point where it became a running joke between us.

Lately I've been fine (though I've not bought many new games) and even CS:GO stopped stuttering and is now smooth (no idea how, that game was borked for me for ages). But I'm not forgetting all the grief my 4870 and 6870 caused me next time I decide to buy a new GPU.
 

gimmmick

Member
radeon1.jpg


never forget.
 

aravuus

Member
i backed up my posts with actual proof, look at the benchmarks for yourself or shut the hell up about it. the 295X2 is in the same price class as the 980 and obliterates it, and the 290X is faster than the 970, the card that is in its price class. period, fact, not debatable in any way.

And I said absolutely nothing about the facts or anything related to the GPUs, yet there you go

I'm sure you now see why I find your posts hilarious
 
And I said absolutely nothing about the facts or anything related to the GPUs, yet there you go

I'm sure you now see why I find your posts hilarious

I suppose you think I'm an AMD fangirl. I've owned more nvidia gpus than amd gpus over the years. my first real gpu was a geforce 2 MX 400, then i got a GF3 ti 200, then came the AMD 9800pro, the nvidia 6800GT, the 8800GT, the nvidia 260GTX, AMD 5850, nvidia 560Ti SLi, AMD 7950 CF, AMD R9 290, AMD R9 290X 8GB.

so that is 7 nvidia cards to 6 amd cards. nvidia simply doesn't offer leading performance at any price class at the moment, all the way down to the $100 pricepoint, and up to the $600 price class. even the titan-x isn't faster than the 295x2, and it's $350 more expensive.

and you couldn't pay me to buy an AMD cpu at any price segment, they are too far behind and won't ever have access to the necessary fabs to match intel, much less beat them.

hell, now that i think about it, if you are especially out of your mind, you could quadfire R9 290s for the cost of a titan-x and obliterate that too. of course you would need your own private nuclear reactor at that point, and access to liquid nitrogen lol. dat thousand watts of gpu awesomeness lol.
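For what it's worth, the back-of-the-envelope math behind that quadfire remark looks roughly like this (the street price and board power per card are rough 2015 ballparks I'm assuming, not exact figures):

[CODE]
# Rough price/power arithmetic behind the quadfire comment.
# Street price and board power per R9 290 are ballpark 2015 figures, not exact.
r9_290_price_usd = 250      # ~$250 street price per card (approximate)
r9_290_power_w = 250        # ~250 W board power per card under load (approximate)
titan_x_price_usd = 999     # Titan X launch MSRP

print("4x R9 290 cost:  $", 4 * r9_290_price_usd)    # ~$1000, roughly one Titan X
print("4x R9 290 power: ", 4 * r9_290_power_w, "W")  # ~1000 W of GPU alone
[/CODE]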
 
Ok enough with the driver talk, honestly.

Has anybody ever done any stats, at all? Stuff like crossfire/SLI driver support is a matter of public record, so we could in principle measure that if somebody had no life and cared enough. But stuff like "number of people who have problems with drivers"? I'm not sure how you'd even attempt to measure it. Maybe you'd have to do huge Google searches for "problem with -insert graphics card" for basically every major video card release going back 15 years, then try to divide that by approximate market share.

Unless anybody has done some kind of stats work, we're just going to shout anecdotes past each other.

While it is true that the plural of anecdote is not data, it is also true that over the years there has always been a steady stream of anecdotes from people who have had problems with AMD (and previously ATI) products in many games. Meanwhile there have been fewer, and often no, similar anecdotes regarding Nvidia products with the same games. Eventually you gain a reputation for having driver problems with many games, as AMD and previously ATI have, and you either improve significantly and the anecdotes go away, or you don't. With AMD, the anecdotes have not gone away over the decades. You can take from that reality whatever knowledge or lack thereof you care to.
 

DOA

Member
but what about the Steam stats? although they show AMD at 28.54%, Nvidia is at 51.37% (which isn't the 70% that i'm seeing here).

the rest is mostly Intel, at 19.71%

am i missing something here?
 

Ensirius

Member
but what about the Steam stats? although they show AMD at 28.54%, Nvidia is at 51.37% (which isn't the 70% that i'm seeing here).

the rest is mostly Intel, at 19.71%

am i missing something here?
Integrated graphics card?
 

OtisInf

Member
That chart looks pretty odd, as if the share number from one vendor is used to calculate the market share of the other, which isn't correct: there are more players in the field, namely intel. It can very well be that both lose market share at some point (and intel gains it) or both gain market share at some point (and intel loses it).

It's also useless (IMHO) for determining whether AMD has a future or not, considering NVidia is completely absent from the console market and low-cost SoC PCs (AMD is too, to a lesser extent; Intel has that market).
 

martino

Member
but what about the Steam stats? although they show AMD at 28.54%, Nvidia is at 51.37% (which isn't the 70% that i'm seeing here).

the rest is mostly Intel, at 19.71%

am i missing something here?

Using Steam applies a filter this graph doesn't use.
 
but what about the Steam stats? although they show AMD at 28.54%, Nvidia is at 51.37% (which isn't the 70% that i'm seeing here).

the rest is mostly Intel, at 19.71%

am i missing something here?

Either it's counting quarterly sales rather than pure market share, or it's counting stuff like workstation GPUs as well, or the Steam stats are just very different. Not sure really.
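A quick sanity check on the discrete-only theory: if you drop Intel from the quoted Steam numbers and renormalize, you get closer to the chart, though still not all the way (this assumes the Steam figures quoted above are accurate):

[CODE]
# Renormalize the quoted Steam survey shares to discrete-only by dropping Intel.
steam_share = {"nvidia": 51.37, "amd": 28.54, "intel": 19.71}
discrete_total = steam_share["nvidia"] + steam_share["amd"]

print(f"Nvidia: {steam_share['nvidia'] / discrete_total:.1%}")  # ~64.3%
print(f"AMD:    {steam_share['amd'] / discrete_total:.1%}")     # ~35.7%
[/CODE]

That gets Nvidia to roughly 64%, still short of the ~70% in the chart, which fits the guess that the chart tracks quarterly shipments rather than the installed base Steam sees.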
 

bj00rn_

Banned
nvidia cards aren't any quieter than their amd counterparts if you ignore reference coolers (almost nobody has a reference 290/290X, and nobody is making/selling them anymore). they draw more power, true, but nvidia's quoted TDP figures are far, far too conservative, the cards often exceed them under load, where AMD cards rarely hit their TDP figures (unless overclocked). it's just another way nvidia is lying to people really. I have a 290X in a silverstone RVZ01B case, heat isn't an issue, at all, so stop spreading fud please.


[IMG]http://media.bestofmicro.com/6/K/462908/original/Noise.png[/IMG]

would you look at that, aftermarket 290X is quieter than a GTX 970? you don't say?

[IMG]http://media.bestofmicro.com/6/J/462907/original/Power.png[/IMG]

what's that? 290X only uses 30 - 60 watts more than GTX 970? not the 150+ more watts people are constantly claiming? come on get real with that nonsense about heat/power and noise.

[IMG]http://media.bestofmicro.com/6/L/462909/original/Temp.png[/IMG]

what's this? the aftermarket 290X runs cooler than the GTX 970 (and thus 980 as well)? Gee.......

This takes the cake as one of the most disingenuous posts of the thread. Awful.

The Nvidia tdp advantage is real, you can't just smoke-n-mirror it away like that..
 

reckless

Member
That chart looks pretty odd, as if the share number from one vendor is used to calculate the market share of the other, which isn't correct: there are more players in the field, namely intel. It can very well be that both lose market share at some point (and intel gains it) or both gain market share at some point (and intel loses it).

It's also useless (IMHO) for determining whether AMD has a future or not, considering NVidia is completely absent from the console market and low-cost SoC PCs (AMD is too, to a lesser extent; Intel has that market).

It's a graph of discrete cards, so Intel isn't counted.
 

JaseC

gave away the keys to the kingdom.
Really? I haven't been paying attention haha. Is it broken on ATI cards though? ;)

I've poked my head into the PC perf. thread a handful of times and haven't noticed a higher-than-usual number of complaints from those on AMD hardware.
 
This takes the cake as one of the most disingenuous posts of the thread. Awful.

The Nvidia tdp advantage is real, you can't just smoke-n-mirror it away like that..


6818_44_sapphire_radeon_r9_290x_8gb_vapor_x_oc_video_card_review.png


6818_45_sapphire_radeon_r9_290x_8gb_vapor_x_oc_video_card_review.png


6818_46_sapphire_radeon_r9_290x_8gb_vapor_x_oc_video_card_review.png



fine, here's one that shows overclocked non reference cards across the board. have fun ;) you still want to try to argue this point? because you will lose sir. this one is even more in favor of my point, despite using non reference, overclocked designs for all parts involved. i'm just saying.....

So, where's the so called smoke and mirrors now? hmm?

yes, nvidia has an advantage when it comes to TDP, no, it isn't anywhere remotely close to what is claimed.
 

industrian

will gently cradle you as time slowly ticks away.
I feel like I can see my two Nvidia and one AMD purchases on those graphs lol. I ebb and flow to whoever has the best performance/dollar ratio around ~$500 whenever I build.

It is interesting though, over the decades I have used AMD/Intel CPUs and 3DFX and all the others for GPUs. But I notice with GPUs, if someone has a bad experience with one, they let it fuck them up FOREVER.

If I let the times I have had an intel chip fuck me over prevent me from ever considering them in the future, I would have a much less powerful rig right now. It's really confusing to me why people let one Nvidia/AMD screwup put them on the other brand for life. That shit is really crazy.

Outside of laptops (which have all been Intel, for obvious reasons) my roadmap so far has been:

Pentium 3 + Rage Pro
2400+ XP + MX440
Pentium 4 + FX5950 Ultra
Pentium 4 + 6800 GT
3800+ X2 + 6800 GT
3800+ X2 + 1900XT
Q9450 + 1900XT
Q9450 + 550 Ti
Q9450 + 750 Ti

(The P3 build was a PC my parents bought for the family that essentially became mine because I was the only dude using it. The 2400+ XP build was the first PC I ever built myself so it was cheap as hell. Since then I've gone with whoever was deemed to have provided the best performance for the budgets that I set.)

That said, I remember hating my Pentium 4 to the point where after using the 3800+ X2 I doubted that I would ever buy another Intel chip again. Then they knocked it out of the park with the C2Ds and C2Qs and AMD have never been able to keep up.
 
AMD has HBM, nvidia does not until pascal comes out.
500+GB/sec memory bandwidth will serve their new high end gpus really well at 4k (and just now that single gpus are finally becoming powerful enough to actually do 4k at a respectable framerate)

It should give them a real edge at 4k compared to maxwell.

I'm looking forward to HBM being on all cards on the entire gpu range of both manufacturers, no longer having memory bandwidth be a bottleneck (no more bullshit cards like the gtx 960 or 660 with crippled bandwidth)

I just wonder how much silicon AMD can pack into a GPU still stuck at the 28nm process node to take advantage of the increased memory bandwidth of HBM. Nvidia has done a lot of work with Maxwell to optimize performance per watt because they're using Tiny Maxwell in their ARM tablet/microconsole Tegra SoCs; AMD has not done this at all because they aren't in that market.

Even if we assume AMD has reached rough parity with Nvidia in a GPU's ability to deliver this much graphical gruntwork with that much power usage and heat dissipation, they will still not be able to make full use of 500+ GB/s of memory bandwidth. I guess we'll see how the 300 series does while GPUs are unable to advance to smaller semiconductor manufacturing processes.
 
I almost bought a 290. It's insanely good value.
But this being my first desktop, I swayed towards the 970 since I wanted as little potential headache as possible.

I want AMD to be much more successful though. They need an image overhaul. The "bad" image they have is hurting them much more than the actual tech they produce deserves.
 

OtisInf

Member
It's a graph of discrete cards, so Intel isn't counted.
Ok, in that light it makes a bit more sense, but overall, IMHO, the graph has little meaning, as it also assumes the market for discrete gfx cards stays stable. I mean, what does it mean that a vendor is no. 1 in the 'discrete gfx card market'? The graph suggests that vendor rules the gfx market, but that's a fallacy. Maybe I read too much into it though. ;)
 

martino

Member
I've poked my head into the PC perf. thread a handful of times and haven't noticed a higher-than-usual number of complaints from those on AMD hardware.

Are there a lot of AMD users on GAF? Look at the thread.
(I have AMD for now, but I will never show love to a GPU company; I've switched each time because context matters.
I need good price/perf and real, not gimmick, features when gaming, and it seems G-Sync's superiority over FreeSync is one reason for my next change.)
 
Outside of laptops (which have all been Intel, for obvious reasons) my roadmap so far has been:

Pentium 3 + Rage Pro
2400+ XP + MX440
Pentium 4 + FX5950 Ultra
Pentium 4 + 6800 GT
3800+ X2 + 6800 GT
3800+ X2 + 1900XT
Q9450 + 1900XT
Q9450 + 550 Ti
Q9450 + 750 Ti

(The P3 build was a PC my parents bought for the family that essentially became mine because I was the only dude using it. The 2400+ XP build was the first PC I ever built myself so it was cheap as hell. Since then I've gone with whoever was deemed to have provided the best performance for the budgets that I set.)

That said, I remember hating my Pentium 4 to the point where after using the 3800+ X2 I doubted that I would ever buy another Intel chip again. Then they knocked it out of the park with the C2Ds and C2Qs and AMD have never been able to keep up.

Ouch, you bought a GeForce 4 MX, that must have stung.

Why did you go from an athlon xp to a pentium 4 btw? seems like a downgrade or at best a sidegrade.

I went from a 2ghz p4 to an athlon xp 2800+ and it was a huge upgrade
 

fertygo

Member
I still remember AMD seeming to turn things around with the 4xxx and 5xxx series; enthusiasts seemed to like their price/performance ratio. What happened?
 
I still remember AMD seeming to turn things around with the 4xxx and 5xxx series; enthusiasts seemed to like their price/performance ratio. What happened?

ATI being bought out by AMD is one possibility

Another is arrogance with GCN (first to market on 28nm when people were thirsty to get away from 300W 40nm gpus)

Or just plain old fashioned corporate fuckery.
They started with their renaming schemes (moving their model numbers down a notch starting with the 6000 series, which confused and misdirected people, even people that tended to follow gpu info on gaf, for 2 years and still makes people underestimate the 5870 and overestimate the 7770 and r7 cards today) so the tone was already set for the price hikes.
 

industrian

will gently cradle you as time slowly ticks away.
Ouch, you bought a GeForce 4 MX, that must have stung.

It was the first PC I built, and I was still researching stuff when I put it together. Or to put it in correct terms that will make people laugh at me and make me run home crying: as my first PC had integrated graphics I just thought that this was standard on every motherboard. It wasn't, and I was left with a functioning PC without a graphics card. Not wanting to wait to use my shiny new PC, I ran out to my nearest store, asked about graphics cards and just bought the cheapest they had: the MX440 SE.

If you will give me ten minutes to wipe away my tears and compose myself after outing myself as once being a noob like that: I didn't really mind it. It was a step up from what I was used to and it played everything I needed (Max Payne 2, ePSXe, etc) but when games started requiring Shader Models then it was literally worthless.

Why did you go from an athlon xp to a pentium 4 btw? seems like a downgrade or at best a sidegrade.

I went from a 2ghz p4 to an athlon xp 2800+ and it was a huge upgrade

To be honest I can't really remember. It was 11 years ago now. Maybe I bought into the hype. It was the socket 478 P4 3GHz with Hyper Threading.
 

KHlover

Banned
nvidiots.
1. Careful with that, people got banned for less.

FAQ said:
- Console War. Obviously arguments about consoles, platforms, and publishers are going to get heated. Don't make things worse by calling out "xbots" or "ntards" or "fanboys" or "sdf" or "xdf" or insulting someone else by calling them one of those things.

2. Higher is better for temperature and power consumption? What's the logic behind that? Wouldn't "lower is better" usually be used for those values?
 

DSix

Banned
I'm doing my part, me and all my family/friends are sporting AMD cards.

I'm waiting for the 3xx series to upgrade, but they're way late. Big games are coming out and nvidia is the only new offer here. I can understand the plunge.
 

orborborb

Member
I was on team red for almost 19 years because I had no complaints with their products and I only ever heard bad things about Nvidia's business practices and image quality, but I recently bought a 970 because the latest AMD cards no longer had analog video output to support my CRT monitors.
 
I don't know about you but I have serious doubts regarding the professionalism of anyone who thinks that HIGHER TEMPERATURES are better for video cards.

I do not want my video card to set itself and my computer on fire.

I never said higher is better, and the sapphire card runs cooler than those 970 and 980 models. my new post was to show that the TDP difference is NOT what is claimed, nor was the noise/temp difference. the overclocked non-reference models for both amd and nvidia are very similar in terms of heat, power draw, and noise, and I proved it.

1. Careful with that, people got banned for less.



2. Higher is better for temperature and power consumption? What's the logic behind that? Wouldn't "lower is better" usually be used for those values?


i thought backseat modding and shitting up threads with backseat modding was also against the rules.... hmmm my mistake?
 

Tenebrous

Member
I'm not surprised at all, sadly. I said in another thread that nVidia just feels like a more premium product for the price... Driver/software support is a lot better, they've much more interesting proprietary tech (PhysX, 3D vision, GSync, Hairworks/TressFX are essentially the same), and get more deals on games than AMD.

I want AMD to survive, but only to keep nVidia prices somewhat reasonable. If they go under, I know some will view my attitude as part of the problem, but in my opinion, their problem is putting out subpar hardware - Especially on the CPU side.

I'd love to see an AMD vs Intel CPU graph.
 