
AMD/NVIDIA market-share graph. Spoiler Alert: it ain't pretty.

In that future scenario, even if Sony/MS attempt a non-Nvidia SoC for their next console, Nvidia will sue them for major patent infringement.

Imagine if a Sony/MS/Ninty or a hypothetical Amazon console is powered by an ARM processor in the future.

But it's likely that will only occur the gen after the successor to this one, as no doubt all companies are already quite a way into planning for the PS5/XB2 with Intel CPUs or possibly a secret, custom next-gen AMD APU.

I just want to ask one question:

I have a friend who is trying to buy a new GPU (mid-high range). He is not the biggest PC gamer, and doesn't really want to spend a ton of money.

I personally have always been an ATi/AMD fan, and on Windows have never had issues with the drivers and have always found RadeonPro to meet all my needs that CCC doesn't.

But is it not ok to recommend an AMD card any more? Is the 9xx series a better "bang for your buck" at this point?

Just looking for a concise answer, if one is possible. Sorry if this isn't an appropriate place to ask.

Ha ha this should be the crux of the back and forth on here really.

The best bang for your buck is definitely an R9 290 if your budget can stretch to it. And I think it's common knowledge that the price/performance offered by the 290X is better than anything Nvidia offers, as it's close to a 980 in general and definitely when gaming in 2k/4k whilst being hugely cheaper. When the 290/290x get reduced in price after the release of the new 3XX series, they will be my next cards.

The R9 270X and 280 also offer great bang for your buck. Don't believe the hype - I personally own a 270X ITX and it runs near silent and cool. No issues with drivers either.

EDIT: Read through the I Need a New PC! thread where you'll find the 9XX series, especially the 290/290x, recommended as the best for price/performance, no question. It's whether you are willing to pay a significant premium for better software on the Nvidia side.
 

Serandur

Member
Happily!

Let's start with some theoretical observations.
[chart: 73019.png — D3D11 single-threaded vs multi-threaded driver overhead results]

What you should look at here is the "D3D11ST" and "D3D11MT" values. As you can see, AMD achieves no parallel scaling at all, and their sequential DX11 is also far less CPU-efficient than NV's.



Now, let's see if this has an impact in actual games.
As you can see above, in this heavily CPU-limited scenario NV performs significantly better, and reaches a GPU limit (at around 230 FPS min) much sooner.
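The mechanics behind these results can be sketched with a toy frame-time model (all numbers below are hypothetical, chosen only to illustrate the shape of the argument, not taken from the benchmarks): CPU-side work per frame is game logic plus driver overhead, the GPU does its own work in parallel, and whichever side is slower sets the frame rate.

```python
# Toy model: a frame costs CPU-side work (game logic + driver overhead)
# overlapped with GPU work; the slower of the two limits the frame rate.
def fps(cpu_game_ms, driver_ms, gpu_ms):
    frame_ms = max(cpu_game_ms + driver_ms, gpu_ms)
    return 1000.0 / frame_ms

# Hypothetical numbers: identical game logic (3 ms) and GPU (4.3 ms,
# i.e. a ~233 FPS ceiling), but a lean driver (0.5 ms) vs a heavy one (2.5 ms).
lean  = fps(3.0, 0.5, 4.3)   # GPU-limited: ~233 FPS
heavy = fps(3.0, 2.5, 4.3)   # CPU-limited: ~182 FPS, GPU sits partly idle
print(round(lean), round(heavy))
```

The point of the sketch: with a heavier driver, the exact same GPU never reaches its own limit, which is why the overhead disparity shows up most on slower CPUs.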

In a more recent test, Eurogamer investigated GTA 5 performance.
[chart: cpuperf2nu2u.png — Eurogamer GTA 5 CPU performance results]

This is perhaps the most direct test of this kind ever performed, and shows the true difference in CPU overhead. The much faster AMD GPU performs quite a bit worse in terms of min FPS on the slower CPU.

--

The light at the end of the tunnel for AMD with all this is that it will be far less important with lower-level APIs, but there are still thousands of DX9 and DX11 games out there, and there will likely be a lot more released over the coming years. This issue makes suggesting AMD GPUs for low-cost builds much less straightforward than it would be otherwise.
This right here is the only reason left I have any real preference for Nvidia with equal hardware. CPU-intensive games are the bane of my PC, I prefer driver overhead to be minimal.

If AMD corrected this deficiency, I would be leaning more towards a 390X than the "1080/980Ti". I doubt most people are aware of this overhead disparity, but it's my sole gripe with Radeons.
 

tuxfool

Banned
"But the drivers!" No one can seem to objectively state *why* AMD's drivers are somehow worse than Nvidia's. It just gets parroted around the internet echo chamber. If they're making games perform worse, that'll show up in benchmarks. I'd eat my hat if either manufacturer had any common issues with their drivers that would cause crashes and the like. Oddly enough, I'm actually having some issues with the drivers/software on my laptop right now - and it's an Nvidia chip. It happens, but unless someone can pull some actual numbers out of their ass, there's no reason to think it happens any more with AMD than Nvidia.

It would be great if people could qualify, categorise or list what is wrong with the drivers, beyond the CPU overhead issue. Something a bit more in-depth than "it's shit". What are the problems people are actually having with them?

64-bit ARM architecture isn't something to sneeze at these days you know. They're putting those things in servers now.

As ARM has become more powerful, its power consumption has also grown. Meanwhile x86 has moved towards lower power consumption as Intel has been trying to catch up in mobile. Eventually we will cross the streams and then things will become super interesting.

High-end Tegra chips aren't using stock ARM cores. ATM they're using Denver cores, which run the ARM ISA through dynamic binary translation to a native internal instruction set. Obviously, if a future console were to use Nvidia's architecture, they would probably use native instructions.
 

Theonik

Member
No way, NVIDIA has long had the edge in Linux and OpenGL, for at least a decade if I remember right. They've been the standard GPU in Linux workstations at ILM and Pixar as long as I can recall at the very least.
Right. For companies like ILM and Pixar it is irrelevant whether the drivers are FOSS, as long as they work and their hardware is supported, though they'd benefit if they were regardless. This is about the Linux and FOSS community at large, who are annoyed by this, especially Linux developers who need to reverse engineer them to make Linux drivers.

The FOSS AMD driver doesn't give away all the performance secrets, it just makes it possible to use the card without completely reverse-engineering it. NVidia could do the same thing if they wanted to without losing any competitive advantage.
Yes. In fact they should.

At what price though? 600? 700?

One of the main reasons the OG Xbox lost tons of money, and the reason MS went with IBM, was the Intel CPU and Nvidia video chip. Their licensing doesn't come with huge volume discounts, I heard.
And unlike previous gens, console makers no longer support the idea of 'losing a shit ton of money in the beginning for the sake of marketshare' anymore. We might end up getting an Atom processor from Intel =P. Also, when AMD goes under, Intel and Nvidia no longer have any reason to negotiate the price,
because they are the only ones left in the market. =P Sony, MS and Nintendo are at their mercy.
The problem Microsoft ran into with the Xbox had little to do with going nVidia/Intel, but rather that they DIDN'T license the technology at all. They bought the parts straight from Intel and nVidia and were unable to negotiate real cost cuts later in the gen, while losing big from the start.
They realised this with the 360 and licensed and co-developed the IBM/AMD solution. This enabled them to scale cost as the gen progressed and lose far less money on the 360 than they did with the Xbox. Better still, the IBM CPU in the 360 was derived from Cell, which IBM co-developed with Sony and Toshiba, leaving MS exploiting the fruits of Sony's labours.

You do realize that the PS3 was $600 for reasons that had no bearing on NVIDIA, right? And the Xbox was reasonably priced at $300 for a console that also included a built-in hard drive and Ethernet adapter. Both of which were separate add-ons for the PS2.

[chart: ps3_cost.jpg — console component cost reductions over time]
The Xbox ended up losing MS some $2-5bn iirc, because they couldn't get the price re-negotiation they had hoped for during that gen. Now, in a scenario where the competition dies out, they might end up being screwed once more.

You don't understand what I'm trying to say.

Nvidia is notorious for not discounting their tech licensing even after volume orders in the millions. Your chart shows that. IBM's chip saw a hefty discount after 3 years, over 80%. Nvidia? Barely even 30%.
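The gap compounds over a console's lifetime. As a back-of-envelope illustration (every number here is hypothetical, not taken from the chart):

```python
# Hypothetical numbers to illustrate the point: the same starting per-unit
# chip cost, shrunk by very different discount curves over a console's life.
start_cost = 100.0                    # assumed initial cost per unit ($)
ibm_unit   = start_cost * 0.20        # paying 20% of original after a >80% cut
nv_unit    = start_cost * 0.70        # paying 70% after a barely-30% cut
units      = 20_000_000               # assumed lifetime console sales
gap        = (nv_unit - ibm_unit) * units
print(ibm_unit, nv_unit, gap)         # 20.0 70.0 1000000000.0
```

At 20 million units, a $50 per-unit difference is a billion dollars, which is why licensing terms matter more than the launch-day bill of materials.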

Microsoft and Sony, after AMD goes bankrupt and no longer exists on the face of the earth, have only two companies to turn to: Nvidia and Intel. There is no other alternative on the market right now.

If you think Nvidia and Intel will both agree to heavy volume licensing discounts and let their engineers work on an Intel/Nvidia hybrid SoC, you are dreaming. That research and development isn't free, and neither MS nor Sony wants to spend a lot of money on a future console.
Again, it wasn't licensing in this case, though in that scenario MS/Sony are pretty much screwed. They both CAN afford to drop a ton of money on their next systems; the question is whether they will, and in the case of the Xbox Juan Dos, whether it's even happening at all, at least by MS at this point.
 

Renekton

Member
Imagine if a Sony/MS/Ninty or a hypothetical Amazon console is powered by an ARM processor in the future.

But it's likely that will only occur the gen after the successor to this one, as no doubt all companies are already quite a way into planning for the PS5/XB2 with Intel CPUs or possibly a secret, custom next-gen AMD APU.
Possibly no more next-gen consoles if AMD goes bankrupt.

Intel or Nvidia console involvement is extremely unlikely, as they want to keep their super high margins, or in Nvidia's case to promote its own SHIELD.

If Sony/MS engineer a custom GPU with another vendor, they will get sued by Nvidia.
 
I like Nvidia products, but hope AMD can regain a higher market share.

Same with AMD vs Intel. Healthy competition is always a good thing for consumers.
 

Irobot82

Member
That isn't a reference cooler. That is the XFX version of it. Unfortunately both their DD coolers and this version were an absolute pile of crap.

It sure does look like one, and it looks like garbage. Maybe I should change my rule to never buy XFX, or never buy a GPU with one fan.
 

Izcarielo

Banned
Possibly no more next-gen consoles if AMD goes bankrupt.

Intel or Nvidia console involvement is extremely unlikely, as they want to keep their super high margins, or in Nvidia's case to promote its own SHIELD.

If Sony/MS engineer a custom GPU with another vendor, they will get sued by Nvidia.

How can Nvidia sue them for that???
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
Right. For companies like ILM and Pixar it is irrelevant whether the drivers are FOSS, as long as they work and their hardware is supported, though they'd benefit if they were regardless. This is about the Linux and FOSS community at large, who are annoyed by this, especially Linux developers who need to reverse engineer them to make Linux drivers.

This was posted earlier on that regard:

Nvidia heard Linus loud and clear and responded quickly. He's quite happy with Nvidia now.

Source: http://youtu.be/5PmHRSeA2c8?t=1h4m24s
 

Pooya

Member
How can Nvidia sue them for that???

They just sued Qualcomm and Samsung, seemingly successfully. They have patents on pretty much everything involving modern GPU design. Intel, AMD and Nvidia have cross-licensing agreements, so it works out between them, but for a new player? It's almost impossible to be competitive right now. Needless to say, AMD is a patent goldmine and could prevent this kind of behavior; if it comes to it, they will be bought out very quickly, hopefully not by Intel or Nvidia though lol.
 
This right here is the only reason left I have any real preference for Nvidia with equal hardware. CPU-intensive games are the bane of my PC, I prefer driver overhead to be minimal.

If AMD corrected this deficiency, I would be leaning more towards a 390X than the "1080/980Ti". I doubt most people are aware of this overhead disparity, but it's my sole gripe with Radeons.

It's far too easily dismissed when people recommend parts. If you're not on a modern quad core Intel CPU then AMD simply isn't a sensible option.

R9 290/X should only be considered by those on an overclocked Intel quad core.

This isn't a small difference either: in the Alien benchmarks, AMD's driver overhead is so vast that a two-generation-old, low-end Core i3 2100 performs better with Nvidia hardware than a current-generation i5 4670K does on AMD hardware.

Why buy a high end quad core Intel CPU when choosing an AMD GPU can mean it performs worse than a low end dual core CPU?
 
No, you don't. Nobody that's had an AMD card in the last 3 or 4 years should have any complaints.

The fact that you're referring to them as "ATI" in 2015 tells me all I need to know about your current knowledge and experience with their products.

^ this

Have been running AMD CPU (phenom x4) and GPU (6850) since 2011 w/ 0 issues.
 
At what price though? 600? 700?

One of the main reasons the OG Xbox lost tons of money, and the reason MS went with IBM, was the Intel CPU and Nvidia video chip. Their licensing doesn't come with huge volume discounts, I heard.

And unlike previous gens, console makers no longer support the idea of 'losing a shit ton of money in the beginning for the sake of marketshare' anymore. We might end up getting an Atom processor from Intel =P. Also, when AMD goes under, Intel and Nvidia no longer have any reason to negotiate the price,


because they are the only ones left in the market. =P Sony, MS and Nintendo are at their mercy.

thank you
 

arevin01

Member
The last AMD card I bought was a 7850. Couldn't even play POE or LoL without crashing. Never gone back since. AMD burned many bridges with their inferior products in the past and lost those customers forever.
 

SURGEdude

Member
AMD tends to have inefficient cards, and their drivers, while much improved recently, are still behind Nvidia's. Also, OpenGL support is shit.

Nvidia is a pack of weasels and tends to overprice their cards.

Both of them suck. That said, shitty Nvidia mobile GPUs destroyed 2 Macbooks of mine a few years ago, and they pretty much tried to cover it up for years. The 8600M GT was a massive fucking pile of shit that affected tons of notebook brands built around 2007-2008.
 

ZOONAMI

Junior Member
Jesus christ, how the hell does the 290X BTFO the gtx980 on DX12? The 290X is old as balls.

Because the 980 is a midrange card that Nvidia and its board partners are charging ~$600 for. The 290X performs only 8% behind a 980 on average across 20 of the bigger games at 4K. The 290X was a beast when it was released, and it still is.
 

IMACOMPUTA

Member
Because the 980 is a midrange card that Nvidia and its board partners are charging ~$600 for. The 290X performs only 8% behind a 980 on average across 20 of the bigger games at 4K. The 290X was a beast when it was released, and it still is.

Didn't the 290x launch at ~$600?
 

ZOONAMI

Junior Member
What is the data you're working from?

http://jonpeddie.com/press-releases/details/gpu-market-upintel-and-nvidia-graphics-winners-in-q4-amd-down/

Jon Peddie. If the total graphics market was roughly stagnant 2014 vs. 2013, with AMD and Intel sitting at roughly the same shares, I don't see how the graph in the OP can be accurate. The info I just linked actually says AMD's discrete GPU shipments increased 1.8%. It says the 10.4% decrease in total graphics shipments was mainly due to AMD being late with parts for notebooks.
 

ZOONAMI

Junior Member
Didn't the 290x launch at ~$600?

Yeah, but when it launched it was actually a high-end chip, as it demonstrates by hanging with Nvidia's 980 two years later. When the 390X launches it will hang with a Titan X, for $600, instead of whatever ridiculous sum the Titan X costs right now ($1100). So the point is, Nvidia massively overcharges for little to no performance gain, all while exaggerating their power-efficiency specs. AMD is clearly the performance-per-dollar leader, but people don't seem to see any value in that.
 
AMD tends to have inefficient cards, and their drivers, while much improved recently, are still behind Nvidia's. Also, OpenGL support is shit.

The HD 4000, 5000 and 6000 series were more efficient than Nvidia's line at the time. Hell, the HD 5870 averaged 100W less than the GeForce 480 during gaming.

The 7000 series was more or less equal with Kepler.

I don't know why there's this pervasive thought that because the R9 200 series isn't very efficient, everything AMD ever made wasn't efficient.
 

AJLma

Member
I think AMD stopped chasing efficiency in their released cards to stay competitive. They have clearly been working on it though; we've seen the earliest iteration of their improved-efficiency tech in the R9 285.

Now we just have to see if that PR gamble paid off with their next big card.
 

wachie

Member
The HD 4000, 5000 and 6000 series were more efficient than Nvidia's line at the time. Hell, the HD 5870 averaged 100W less than the GeForce 480 during gaming.

The 7000 series was more or less equal with Kepler.

I don't know why there's this pervasive thought that because the R9 200 series isn't very efficient, everything AMD ever made wasn't efficient.
If you repeat it enough, it becomes true. Making things worse is when people with "credibility" start joining in this.
 
I have zero loyalty to any brand.

Whoever has the best CPU/GPU for the money, at the time I decide to do an upgrade, is the brand that gets my money.

I've gone back and forth between AMD/ATI/Intel/Nvidia for over a decade and I'm not stopping now.

It's been a tug-of-war, but I'm not rooting for anyone. I'm just hoping it continues so we don't have a monopoly screwing us all out of great values.
 
The high-end doesn't really matter as far as marketshare goes.

What they need to nail is the $200-300 price bracket, because that's where the majority of sales are.

If they can release an updated, more efficient version of the 290X that performs close to the 980 at $299, that'll do wonders.
 

ZOONAMI

Junior Member
The high-end doesn't really matter as far as marketshare goes.

What they need to nail is the $200-300 price bracket, because that's where the majority of sales are.

If they can release an updated, more efficient version of the 290X that performs close to the 980 at $299, that'll do wonders.

That is exactly what the 380x needs to be.
 

EatMyFace

Banned
So let's say in a few years AMD is out of the market (or not able to provide Sony/MS with chips) and Nvidia is still charging a premium for theirs... what could Microsoft/Sony do? Mobile chips would be too weak. So maybe develop their own (aka Cell 2.0)?
 
Isn't that worse? Nvidia is probably holding back DX12 drivers. I refuse to believe the 290X outclasses a card a whole year newer on a new API.

Isn't what worse? The API overhead graphs on the previous page? Who knows what the final release drivers will look like, but API overhead isn't something that is necessarily going to change drastically from generation to generation, as you can see from the DX11 numbers.
 

wachie

Member
So let's say in a few years AMD is out of the market (or not able to provide Sony/MS with chips) and Nvidia is still charging a premium for theirs... what could Microsoft/Sony do? Mobile chips would be too weak. So maybe develop their own (aka Cell 2.0)?
Most likely they will go with Nvidia and release at $599.
 

diaspora

Member
Isn't what worse? The API overhead graphs on the previous page? Who knows what the final release drivers will look like, but API overhead isn't something that is going to change drastically from generation to generation, as you can see from the DX11 numbers.
For an older GPU to post better performance metrics on DX12 than a superior GPU?
 
Because the 980 is a midrange card that Nvidia and its board partners are charging ~$600 for. The 290X performs only 8% behind a 980 on average across 20 of the bigger games at 4K. The 290X was a beast when it was released, and it still is.

Very true.

I hate how Nvidia overcharges for all its products. If AMD had been more competitive with Nvidia in recent years, they never would've been able to get away with this.
 

jwhit28

Member
The high-end doesn't really matter as far as marketshare goes.

What they need to nail is the $200-300 price bracket, because that's where the majority of sales are.

If they can release an updated, more efficient version of the 290X that performs close to the 980 at $299, that'll do wonders.

The 280X handily beats the 960 around the $200-$225 range. The 290 can be had for $250-$260 and hangs with the 970 performance-wise. Whatever hook Nvidia has on customers who refuse to look at AMD goes deeper than performance/dollar. Maybe it's that the products don't seem NEW enough. They will have to go back to releasing a new line each year, even if nearly the whole thing is rebadges or power-savers that don't necessarily perform better, like the 285.
 

IMACOMPUTA

Member
Very true.

I hate how Nvidia overcharges for all its products. If AMD had been more competitive with Nvidia in recent years, they never would've been able to get away with this.

Exactly.
AMD being the cheap option is a result of their failings, not some service to the consumer. It is true that they have the price/performance king in the 290x, but that's only because it can't hang at its original MSRP.
Is that anything to be proud of?
 

diaspora

Member
The 280X handily beats the 960 around the $200-$225 range. The 290 can be had for $250-$260 and hangs with the 970 performance-wise. Whatever hook Nvidia has on customers who refuse to look at AMD goes deeper than performance/dollar. Maybe it's that the products don't seem NEW enough. They will have to go back to releasing a new line each year, even if nearly the whole thing is rebadges or power-savers that don't necessarily perform better, like the 285.
They aren't new enough. These GPUs are 18 months old.
 
For an older GPU to post better performance metrics on DX12 than a superior GPU?

It's not general performance numbers, so while API overhead may be lower, who knows what that difference means in real-world performance. I wouldn't be surprised if AMD does a bang-up job with DX12 efficiency, though. They were ahead of the game with Mantle, but the API and the final drivers won't be available for months.
 

diaspora

Member
It's not general performance numbers, so while API overhead may be lower, who knows what that difference means in real-world performance. I wouldn't be surprised if AMD does a bang-up job with DX12 efficiency, though. They were ahead of the game with Mantle, but the API and the final drivers won't be available for months.
Yeah, it's why I said NV might be holding out right now =P
 

ZOONAMI

Junior Member
For an older GPU to post better performance metrics on DX12 than a superior GPU?

It isn't really that superior though. They pretty much trade blows. In this case, newer doesn't necessarily mean superior. It really is an 18-month-old high-end GPU vs a 6-month-old midrange GPU, so it makes sense that the 290X and 980 compete with each other. The 290X can be had for half the cost, though.
 
Exactly.
AMD being the cheap option is a result of their failings, not some service to the consumer. It is true that they have the price/performance king in the 290x, but that's only because it can't hang at its original MSRP.
Is that anything to be proud of?

I still cannot believe AMD hasn't gotten the 390X out by now. They are sitting LITERALLY on a golden opportunity to launch a new, powerful GPU line before Nvidia has a chance to properly counter. The longer they wait, the more time they give Nvidia to respond. Not sound business strategy.
 

AJLma

Member
The R9 290X is a MUCH bigger card with many more shader units, so it makes complete sense that when overhead is lifted off of it with DX12 drivers, it keeps up with the 980. AMD cards also do something better than nVidia's compute-wise; maybe someone more technically knowledgeable could explain, but it's the same reason people were buying AMD cards over nVidia during the bitcoin craze in 2013, I believe.

I also feel this exact reason is why the R9 390x is such an exciting card.

The 980 is more efficient and runs at higher clock speeds, but it's still down nearly 800 cores. The Titan X is nVidia's real-deal big card, and it's only barely larger than a 290X; that speaks volumes about nVidia's leaps in efficiency. But the 390X is rumored to have a shader count of over 4000. It might actually end up being the card to have by the end of the year.
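The shader-count argument can be made concrete with a back-of-envelope FP32 throughput estimate: cores times clock times 2 (one fused multiply-add per core per clock). The shader counts and reference clocks below are from memory and ignore boost behavior, so treat them as approximations.

```python
# Rough peak FP32 throughput: shader cores * clock (MHz) * 2 FLOPs per clock
# (a fused multiply-add counts as two floating-point operations).
def tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 2 / 1e6

print(round(tflops(2816, 1000), 2))   # R9 290X  (2816 SPs @ ~1000 MHz) -> 5.63
print(round(tflops(2048, 1126), 2))   # GTX 980  (2048 cores @ 1126 MHz) -> 4.61
print(round(tflops(3072, 1000), 2))   # Titan X  (3072 cores @ ~1000 MHz) -> 6.14
```

On raw peak numbers, the 290X actually out-muscles the 980; it's DX11 overhead and efficiency, not arithmetic throughput, where it falls behind.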
 