
AMD/NVIDIA market-share graph. Spoiler Alert: it ain't pretty.

I would love to see the APU sales statistics and see if that strategy has paid off.

If AMD had an APU that combined mid-level GPU and mid-level/i5 level CPU performance, they would resurge in a big way, IMO.
 

kuroshiki

Member
So, if AMD goes under, then who's going to design the next gen console? Nvidia? (lol)

A next-gen console with an Intel CPU and an Nvidia graphics card would be crazy expensive.
 

Rur0ni

Member
I was exclusively AMD/ATI for quite some time, back in the K6 days up to Thunderbirds for CPUs, and the 9800 Pro to X1950 Pro (AGP!). They were competitive back then. I remember dearly wanting a completely unified platform top to bottom once AMD bought ATI, and to reap any benefits that would come from that. Year by year, they disappointed.

I made the switch to Intel from the Wolfdale generation, and never looked back on the motherboard/CPU side of things. I dabbled with some Phenoms for budget/fun. Sold them. I switched to nVidia exclusively with the 8800 GTX. AMD hit back with the 4000 and 5000 series at some point, and I think that's the only generation of note to speak of. nVidia had some power-hungry cards when AMD introduced the 5000 series, so that helped with perception a bit, but it's been all nVidia since then.

Now that I look back at the chart, I think that aligns with where the market went as well as far as GPUs are concerned. Once you slip far enough, it's hard to ever really reclaim it. As the slides posted above noted, nVidia is in it for more than just pure graphics, and they are dominating. They have more competitors than just AMD to be concerned about. That's a war largely won already.
 
So, if AMD goes under, then who's going to design the next gen console? Nvidia? (lol)

A next-gen console with an Intel CPU and an Nvidia graphics card would be crazy expensive.

Licensing costs will be higher, but it's not like it would be totally infeasible. Intel+Nvidia was the exact setup the OG Xbox had. Not sure how them collaborating to build a SoC would work, but I guess it should be possible since we saw IBM+ATI SoCs in the X360 revisions.

The thorough spanking that Intel tech gives to AMD CPUs means we might actually end up with console CPUs that aren't pieces of total shit!

:D
 

Iacobellis

Junior Member
So, if AMD goes under, then who's going to design the next gen console? Nvidia? (lol)

A next-gen console with an Intel CPU and an Nvidia graphics card would be crazy expensive.

No it wouldn't. It would be a better leap in visuals than what we got after waiting for eight years.
 
My thoughts exactly. We saw the same argument for the Xbox One back when things were looking super bleak. Competitive competition is great. Competition for competition's sake is not.

There is still competition in the GPU market. They have staggered launches now (I guess they both figured out this helps both their sales, as it avoids direct competition within those 6 months or so each time), so they beat each other in turns every time, but they both deliver similar hardware. It's features where AMD has been seriously behind (mainly no downsampling support, which was awful), but they've caught up pretty well on features.


The CPU market has zero competition though; it's a monopoly already, and it shows in what Intel is doing.


I also agree that there is no use in buying an inferior product just to life-support a duopoly; it's not a charity, and it's definitely not the consumer's responsibility to keep AMD going for the sake of it.
But when they both have products that are pretty close it's definitely in your best interest to buy the AMD one, because if/when amd goes under you can look forward to nvidia acting exactly like intel.



Licensing costs will be higher, but it's not like it would be totally infeasible. Intel+Nvidia was the exact setup the OG Xbox had. Not sure how them collaborating to build a SoC would work, but I guess it should be possible since we saw IBM+ATI SoCs in the X360 revisions.

The thorough spanking that Intel tech gives to AMD CPUs means we might actually end up with console CPUs that aren't pieces of total shit!

:D
I doubt it.
I don't know where you expect the cpu market (desktop and console one, not mobile) to be by 2020, but I reckon it'll be pretty much exactly where it's at today and where it was at 4 years ago.
And then the generation after that (if they even do one) will still be at that level give or take 20 percent.

So you'll get what, a 65-watt (or 35-watt) quad-core i5 equivalent in 2020 (this is going to happen regardless of whether AMD goes under or not; the CPU market is already done, and if the PS5 has anything more than a 3.5 GHz 4-core Haswell equivalent you can take my tag from me).
And without AMD to keep Nvidia motivated to one-up each other every year, you'll get the same with the GPU: it'll be stuck at whatever the midrange performance was when AMD went out of business or gave up competing.

Nvidia will be selling people new gpus based on software features alone (like you'd have to buy a new gpu to get a new feature like shadowplay but other than that it'll be the same as the one you already have)
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
There is still competition in the GPU market. They have staggered launches now (I guess they both figured out this helps both their sales, as it avoids direct competition within those 6 months or so each time), so they beat each other in turns every time, but they both deliver similar hardware. It's features where AMD has been seriously behind (mainly no downsampling support, which was awful), but they've caught up pretty well on features.

The consumer GPU market, you mean; AMD is still well behind in the higher-end professional market. NVIDIA has consistently had something like 85-90% of that market, and while sales in that area are a very small percentage of overall unit sales, that small slice generates something like over 25% of total revenue.
 

Sir_Vival

"I don't remember PEWPEWPEWPEWPEW being a part of the Star Wars movies."
I find graphics card fanboyism to be the most insane form of it that I'm personally aware of. We can test these things; every videocard has oodles of benchmarks performed on it. You know your budget - look up the benchmarks, find the card that has the best performance for that price. Unless there's some feature that you desperately need from the competitor, that's all that really matters. That usually seems to be AMD/ATI for me over the years, but I've had my fair share of dedicated and integrated (in the case of laptops) Nvidia cards.

"But the drivers!" Noone can seem to objectively state *why* AMD's drivers are somehow worse than Nvidia's. It just gets parroted around the internet echo chamber. If they're making games perform worse, that'll show up in benchmarks. I'd eat my hat if either manufacturer has any common issues with their drivers that would cause crashes and the like. Oddly enough, I'm actually having some issues with the drivers/software on my laptop right now - and it's a Nvidia chip. It happens, but unless someone can pull some actual numbers out of their ass, there's no reason to think that it happens any more with AMD than Nvidia.

It's just... insane. Buy the card that gives you the best performance for the price. This isn't a car or laptop manufacturer where you can literally touch the product and have preferences about the way it's built - they do the same damn thing. This fanboyism is going to run AMD into the ground and create a monopoly in both the standard and graphics processing sectors that'll stagnate development for years.
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
I find graphics card fanboyism to be the most insane form of it that I'm personally aware of. We can test these things; every videocard has oodles of benchmarks performed on it. You know your budget - look up the benchmarks, find the card that has the best performance for that price. Unless there's some feature that you desperately need from the competitor, that's all that really matters. That usually seems to be AMD/ATI for me over the years, but I've had my fair share of dedicated and integrated (in the case of laptops) Nvidia cards.

"But the drivers!" Noone can seem to objectively state *why* AMD's drivers are somehow worse than Nvidia's. It just gets parroted around the internet echo chamber. If they're making games perform worse, that'll show up in benchmarks. I'd eat my hat if either manufacturer has any common issues with their drivers that would cause crashes and the like. Oddly enough, I'm actually having some issues with the drivers/software on my laptop right now - and it's a Nvidia chip. It happens, but unless someone can pull some actual numbers out of their ass, there's no reason to think that it happens any more with AMD than Nvidia.

In some cases, it's very much the drivers. As several people have posted earlier with their own experiences, AMD does not hold up if you need something on a Linux setup.
 
I knew they were doing badly, but sheesh, AMD. I am still rocking this R9 270 I bought in December of 2013 when I built my new computer. I don't max everything, so that's why I am pretty good with this, but I think I'll buy an NVIDIA card if I do get a new one...
 

kuroshiki

Member
Licensing costs will be higher, but it's not like it would be totally infeasible. Intel+Nvidia was the exact setup the OG Xbox had. Not sure how them collaborating to build a SoC would work, but I guess it should be possible since we saw IBM+ATI SoCs in the X360 revisions.

The thorough spanking that Intel tech gives to AMD CPUs means we might actually end up with console CPUs that aren't pieces of total shit!

:D

No it wouldn't. It would be a better leap in visuals than what we got after waiting for eight years.

At what price though? 600? 700?

One of the main reasons the OG Xbox lost tons of money, and the reason MS went with IBM, was the Intel chip and the Nvidia video chip. Their licensing doesn't come with huge volume discounts, I heard.

And unlike previous gens, console makers no longer support the idea of 'losing a shit ton of money in the beginning for the sake of market share'. We might end up getting an Atom processor with Intel =P. Also, when AMD goes under, Intel and Nvidia no longer have any reason to negotiate on price.


because they are the only ones left in the market. =P Sony, MS, and Nintendo are at their mercy.
 

Iacobellis

Junior Member
At what price though? 600? 700?

One of the main reasons the OG Xbox lost tons of money, and the reason MS went with IBM, was the Intel chip and the Nvidia video chip. Their licensing doesn't come with huge volume discounts, I heard.

And unlike previous gens, console makers no longer support the idea of 'losing a shit ton of money in the beginning for the sake of market share'. We might end up getting an Atom processor with Intel =P. Also, when AMD goes under, Intel and Nvidia no longer have any reason to negotiate on price.


because they are the only ones in the market. =P Sony, MS, and Nintendo are at their mercy.

You do realize that the PS3 was $600 for reasons that had no bearing on NVIDIA, right? And the Xbox was reasonably priced at $300 for a console that also included a built-in hard drive and Ethernet adapter. Both of which were separate add-ons for the PS2.

[image: PS3 cost breakdown chart]
 

kuroshiki

Member
You do realize that the PS3 was $600 for reasons that had no bearing on NVIDIA, right? And the Xbox was reasonably priced at $300 for a console that also included a built-in hard drive and Ethernet adapter. Both of which were separate add-ons for the PS2.

RIGHTTT, which has nothing to do with my point.
 

kuroshiki

Member
Your point doesn't make sense. Microsoft's deal fell through with NVIDIA, not Intel. Which didn't matter much because the 360 was already nearing release.

You don't understand what I'm trying to say.

Nvidia is notorious for not discounting their tech licensing even after volume orders in the millions. Your chart says that: the IBM chip saw a hefty discount after 3 years, over 80%. Nvidia? Barely even 30%.

Microsoft and Sony, after AMD goes bankrupt and no longer exists on the face of the earth, would have only two companies to turn to: Nvidia and Intel. There is no other alternative on the market right now.

If you think Nvidia and Intel will both agree to heavy per-volume licensing discounts and agree to let their engineers work on an Intel/Nvidia hybrid SoC, you are dreaming. That research and development isn't free, and neither MS nor Sony will spend a lot of money on a future console.
 

Iacobellis

Junior Member
You don't understand what I'm trying to say.

Nvidia is notorious for not discounting their tech licensing even after volume orders in the millions. Your chart says that: the IBM chip saw a hefty discount after 3 years, over 80%. Nvidia? Barely even 30%.

Microsoft and Sony, after AMD goes bankrupt and no longer exists on the face of the earth, would have only two companies to turn to: Nvidia and Intel. There is no other alternative on the market right now.

If you think Nvidia and Intel will both agree to heavy per-volume licensing discounts and agree to let their engineers work on an Intel/Nvidia hybrid SoC, you are dreaming. That research and development isn't free, and neither MS nor Sony will spend a lot of money on a future console.

Why would Sony not invest a good amount into their next console? The PS4 sales are Wii status right now.
 

Sandfox

Member
Why would Sony not invest a good amount into their next console? The PS4 sales are Wii status right now.

I think you're overestimating the PS4 and how much Sony would be willing to lose on sales. I don't think they want another PS3. They'll spend money, but I don't think it's going to translate into giant leaps.
 

kuroshiki

Member
Why would Sony not invest a good amount into their next console? The PS4 sales are Wii status right now.

Because the PS4 proved that little R&D can still sell a shit ton of consoles, and even though it is the number one console right now, the profit margin on the hardware itself is very slim.

If anything, all future consoles from MS and Sony, if there ever are any, will have inherently the same philosophy as the Xbone and PS4: an SoC, easily manufactured parts, and no heavy special customization like the PS3.
 

Iacobellis

Junior Member
I think you're overestimating the PS4 and how much Sony would be willing to lose on sales. I don't think they want another PS3.

Not another PS3, but another PS2 in terms of hardware and costs. This generation has already been notorious for "resolutiongate" and the comparisons to low-end PC's that can match or beat the performance of either console in the same price range.
 

kuroshiki

Member
Not another PS3, but another PS2 in terms of hardware and costs. This generation has already been notorious for "resolutiongate" and the comparisons to low-end PC's that can match or beat the performance of either console in the same price range.

The age of 'selling console at loss in the beginning for the sake of market share' is essentially over.

The PS2 also bled money for Sony in the beginning. Sony recuperated somewhat through software sales, but Sony no longer has an R&D team led by Crazy Ken to make custom hardware like the PS2 again, and no will to do so either.

MS also no longer has the luxury of pouring money into the Xbox division.

 

Durante

Member
"But the drivers!" Noone can seem to objectively state *why* AMD's drivers are somehow worse than Nvidia's.
Oh, easily.
  • AMD's CPU overhead per operation in DX9 and DX11 is ~2 times higher than NV's.
  • AMD's DX11 drivers scale very badly in multithreaded scenarios compared to NV's.
  • NV's implementation of secondary features is far more solid -- e.g. Shadowplay supports the background capture of desktop content and a wider range of options than AMD's equivalent.
  • NV's OpenGL support is much better, with higher performance and a more conformant implementation overall.
  • Driver-level overrides are more extensive with NV than with AMD, featuring broader support for AA modes such as SGSSAA and the ability to inject HBAO+ in some games.
  • In the case of non-Windows OSes NV generally offer much higher performance and stability as opposed to AMD.

The true "myth" is that "average FPS in a handful of AAA games on WIndows with a fast CPU" is the only valid way to judge a graphics card, and that everyone who applies a more involved thought process when determining their purchase must be a fanboy.
 

Sandfox

Member
Not another PS3, but another PS2 in terms of hardware and costs. This generation has already been notorious for "resolutiongate" and the comparisons to low-end PC's that can match or beat the performance of either console in the same price range.

I don't think Sony cares about that when the PS4 is beating their expectations regardless. If anything, Sony will probably try to duplicate this generation from now on and use a lot of their money in other ways to push HW.
 

kuroshiki

Member
Hell, I think to avoid a huge money loss we might end up getting one of those Nvidia SoCs, aka Tegra, for the PS5. lol.

When that happens try to tell if PS4 has shitty CPU =P.
 

Foaloal

Member
I haven't been following computer hardware very closely for a few years, only just enough to know what the latest series are and other basic stuff like that.

So I can't tell how much exaggeration is or is not going on in this thread.

I just want to ask one question;

I have a friend who is trying to buy a new GPU (mid-high range). He is not the biggest PC gamer, and doesn't really want to spend a ton of money.

I personally have always been an ATi/AMD fan, and on Windows have never had issues with the drivers and have always found RadeonPro to meet all my needs that CCC doesn't.

But is it not ok to recommend an AMD card any more? Is the 9xx series a better "bang for your buck" at this point?

Just looking for a concise answer, if one is possible. Sorry if this isn't an appropriate place to ask.
 

Sir_Vival

"I don't remember PEWPEWPEWPEWPEW being a part of the Star Wars movies."
Oh, easily.
...

The true "myth" is that "average FPS in a handful of AAA games on WIndows with a fast CPU" is the only valid way to judge a graphics card, and that everyone who applies a more involved thought process when determining their purchase must be a fanboy.

While it may be true that there isn't just one way to judge a graphics card, average FPS on Windows is the way that matters for the great majority of consumers. Most people who buy a video card just want it to play games well on a Windows PC. For all of those people, AMD is just as good a choice as Nvidia - yet the chart shows the market doesn't reflect that.
 
Hell, I think to avoid a huge money loss we might end up getting one of those Nvidia SoCs, aka Tegra, for the PS5. lol.

When that happens try to tell if PS4 has shitty CPU =P.

64-bit ARM architecture isn't something to sneeze at these days you know. They're putting those things in servers now.

As ARM has become more powerful, its power consumption has also grown. Meanwhile x86 has moved towards lower power consumption as Intel has been trying to catch up in mobile. Eventually we will cross the streams and then things will become super interesting.
 

Oublieux

Member
But is it not ok to recommend an AMD card any more? Is the 9xx series a better "bang for your buck" at this point?

Just looking for a concise answer, if one is possible. Sorry if this isn't an appropriate place to ask.

This is such a loaded question to ask in this thread that has become Nvidia vs. AMD.

Personally, I believe that Nvidia holds the performance crown but at the cost of a higher markup; this is worthwhile if your friend is into value-added features like PhysX or ShadowPlay.

AMD holds the value-per-dollar crown but doesn't have as robust a set of additional features in comparison to Nvidia. The drivers are fine from the perspective of the average consumer; I think that issue is being overblown.

That's my two cents.
 

Muzicfreq

Banned
How is this so? I thought AMD were the cards to buy?

Because Nvidia is always ready for the next card. When the 300s drop, expect Nvidia to have something up their sleeve to combat it: either aggressive advertising, or moneyhatting a new game that has a lot of attention so it runs far better and has features "only with PhysX" to hold them over till Nvidia is ready to reveal their new cards.

Oh and my joke about them saying VR optimized
How Maxwell’s VR Direct Brings Virtual Reality Gaming Closer to Reality
-_- I knew they would do it. Just expect more marketing behind that as well.
 

Renekton

Member
Nvidia is notorious for not discounting their tech licensing even after volume orders in the millions. Your chart says that: the IBM chip saw a hefty discount after 3 years, over 80%. Nvidia? Barely even 30%.

Microsoft and Sony, after AMD goes bankrupt and no longer exists on the face of the earth, would have only two companies to turn to: Nvidia and Intel. There is no other alternative on the market right now.

If you think Nvidia and Intel will both agree to heavy per-volume licensing discounts and agree to let their engineers work on an Intel/Nvidia hybrid SoC, you are dreaming. That research and development isn't free, and neither MS nor Sony will spend a lot of money on a future console.
In that future scenario, even if Sony/MS attempt a non-Nvidia SoC for the next console, Nvidia will sue them for major patent infringement.
 

Foaloal

Member
This is such a loaded question to ask in this thread that has become Nvidia vs. AMD.

Yeah, after writing the post I took a look at it and realized that...

Personally, I believe that Nvidia holds the performance crown but at the cost of a higher markup; this is worthwhile if your friend is into value-added features like PhysX or ShadowPlay.

AMD holds the value-per-dollar crown but doesn't have as robust a set of additional features in comparison to Nvidia. The drivers are fine from the perspective of the average consumer; I think that issue is being overblown.

That's my two cents.

This sounds like the status quo to me. Thanks for the reply. I was just worried based off how people were talking in this thread that somehow AMD had totally fallen off, but I'm guessing that's mostly just the usual sensationalism that goes along with people discussing their preferences in $300+ computer hardware.
 
Oh, easily.

  • AMD's CPU overhead per operation in DX9 and DX11 is ~2 times higher than NV's.
  • AMD's DX11 drivers scale very badly in multithreaded scenarios compared to NV's.
  • NV's implementation of secondary features is far more solid -- e.g. Shadowplay supports the background capture of desktop content and a wider range of options than AMD's equivalent.
  • NV's OpenGL support is much better, with higher performance and a more conformant implementation overall.
  • Driver-level overrides are more extensive with NV than with AMD, featuring broader support for AA modes such as SGSSAA and the ability to inject HBAO+ in some games.
  • In the case of non-Windows OSes NV generally offer much higher performance and stability as opposed to AMD.

The true "myth" is that "average FPS in a handful of AAA games on WIndows with a fast CPU" is the only valid way to judge a graphics card, and that everyone who applies a more involved thought process when determining their purchase must be a fanboy.

Could you please provide benchmarks for the bolded two? No offense but that sounds like a complete pile of bullshit to me.
 

undu

Member
Nvidia heard Linus loud and clear and quickly made him happy. Linus is quite happy with Nvidia now.

Source: http://youtu.be/5PmHRSeA2c8?t=1h4m24s

That's before they started to shit on the nouveau drivers by not providing the signed firmware blobs needed to make the drivers work on Maxwell cards, like AMD and Intel do.

Or they remove driver features because "fuck you, that's why", like multi-monitor setups, or even actively try to hinder performance in some situations, like using the GPU in a virtual machine, just in case you wanted to play in Windows using a virtual machine. (see question 10)

Nvidia drivers work better than AMD's, but let's not kid ourselves, they aren't in linux for the community or openness, they're in linux because of the professional sector and it shows. They aren't willing to provide professional-oriented features to consumer cards, even if this means cutting features from drivers and blocking other drivers that might provide them.
 

KePoW

Banned
Could you please provide benchmarks for the bolded two? No offense but that sounds like a complete pile of bullshit to me.

Personally, I do not know if those statements are accurate

But are you aware that Durante is a well-known and pretty famous coder here on GAF??
 
Personally, I do not know if those statements are accurate

But are you aware that Durante is a well-known and pretty famous coder here on GAF??

Yeah, I'm pretty well aware, but those assertions still sound like bullshit.

First of all, here's the benchmark that kicked the numbers around, and have a look at the warning: http://www.gamersnexus.net/guides/1885-dx12-v-mantle-v-dx11-benchmark

Note that results cannot be compared between GPUs. This is not like a standard FPS or frame-time benchmark. This is purely an API test, and so any delta between nVidia and AMD hardware should not be regarded as superiority or inferiority. We tested using a Titan X – just because it's new and we thought it'd be interesting to see how many draw calls it can pull off – and the 290X for Mantle testing.

The fact that the overhead is 2x is meaningless; the only valid argument to be made from the bench is that DX12 reduces the overhead for both. The scores are not comparable.

Regardless, the bench has its own validity problems, as people have had very varying scores: http://www.overclock.net/t/1495236/amd-vs-nvidia-cpu-overhead



And that is on a CPU-bottlenecked FX 8350 system...

So yeah..
 

KePoW

Banned
Yeah, I'm pretty well aware, but those assertions still sound like bullshit.

Based on what, your own personal expertise in that particular field? I mean I guess I just don't understand what causes you to think that from a general user standpoint, unless you actually do graphics coding yourself

Doesn't seem far-fetched to me
 
Based on what, your own personal expertise in that particular field? I mean I guess I just don't understand what causes you to think that from a general user standpoint, unless you actually do graphics coding yourself

Doesn't seem far-fetched to me

Updated my post, please check above. You don't have to be a rocket scientist to know apples can't be compared with oranges. I'd very much like to hear from Durante though..
 

marmoka

Banned
Are Nvidia cards better than AMD cards for laptops? I'm considering buying a new laptop with Nvidia, because I haven't had a good experience with the AMD drivers on the laptop I have now.
 

Marlenus

Member
Ultimately the fall of AMD can be summarized in a single sentence: They bought ATI and sold their fabs.

They should never have bought ATI, and it was the enormous debt they took on from the ATI acquisition combined with falling profits that forced them to sell all physical assets to remain solvent. That was the beginning of the end for them.

It's funny now to read that article and realize that if AMD had merged with Nvidia and let Jen-Hsun Huang run the combined company as he demanded, the situation would be completely the opposite of what it is now for AMD. Hindsight is always 20/20, but in business it's like an electron microscope and orders of magnitude more painful.

A bit more complicated than that. Back with the A64 vs P4 Intel were doing everything they could to stop AMD from gaining market share and in turn revenue. Heck AMD gave away a pallet full of Opteron CPUs to HP and HP still would not sell very many as the financial impact Intel would have had on them was greater than the benefit of getting free stock. The loss of revenue at that time really harmed AMD and while they got a $1 billion settlement and a favourable cross licencing agreement it was too late.

If you remember back to the Phenom II it was between the Q6 and the Q9 in performance. By then though the first Nehalem i series had launched so AMD were miles behind but if they could have gotten that architecture out around the same time as the Conroe Q6 they would have been faster than Intel still. I think the money they lost due to Intel's tactics and abuse of their monopoly is what set back those architectures. That in turn caused AMD to look elsewhere so they purchased ATi to try and get fusion off the ground and they went with a radical design in bulldozer that failed.

As far as AMD GPU market share goes, it is a shame because they make good products and no matter what anybody says the drivers are generally good in both camps. Both companies cock them up now and again but NV has had the larger cock ups. I do not recall any AMD driver causing the fan to stop spinning and in turn killing the GPU but that happened to NV.

I just hope the 3xx series is a new line from top to bottom so it can fully support freesync and stop this feature segmentation within a GPU series. If the 390X is as fast as rumoured and priced reasonably I think it will be a success.
 

wildfire

Banned
I understand Linus loves opening up source code but it boggles my mind that Linus would think that Nvidia would ever consider doing that.


Well, for-profit companies have opened themselves up, and continue to do so, with open-source variations of their software. He's expecting Nvidia to see it as beneficial both to themselves and to their customers. He can't handle that Nvidia doesn't like the trade-offs. *shrugs*
 
A bit more complicated than that. Back with the A64 vs P4 Intel were doing everything they could to stop AMD from gaining market share and in turn revenue. Heck AMD gave away a pallet full of Opteron CPUs to HP and HP still would not sell very many as the financial impact Intel would have had on them was greater than the benefit of getting free stock. The loss of revenue at that time really harmed AMD and while they got a $1 billion settlement and a favourable cross licencing agreement it was too late.

If you remember back to the Phenom II it was between the Q6 and the Q9 in performance. By then though the first Nehalem i series had launched so AMD were miles behind but if they could have gotten that architecture out around the same time as the Conroe Q6 they would have been faster than Intel still. I think the money they lost due to Intel's tactics and abuse of their monopoly is what set back those architectures. That in turn caused AMD to look elsewhere so they purchased ATi to try and get fusion off the ground and they went with a radical design in bulldozer that failed.

As far as AMD GPU market share goes, it is a shame because they make good products and no matter what anybody says the drivers are generally good in both camps. Both companies cock them up now and again but NV has had the larger cock ups. I do not recall any AMD driver causing the fan to stop spinning and in turn killing the GPU but that happened to NV.

I just hope the 3xx series is a new line from top to bottom so it can fully support freesync and stop this feature segmentation within a GPU series. If the 390X is as fast as rumoured and priced reasonably I think it will be a success.

This is a correct assessment of the AMD situation AFAIC; they realized they couldn't beat a monopoly by simply delivering the superior product, because the channels to the especially profitable markets were completely corrupted by Intel.

Unfortunately, ATi has been suffering the same problem against nVidia for a while now. In essence, two weaker competitors did not add up to a single strong competitor.

AMD's bets are in the Fusion basket now. If that really turns out to be the future, AMD is in a great place to exploit it..
 
Why would Sony not invest a good amount into their next console? The PS4 sales are Wii status right now.
Because they didn't with the ps4 and it's still selling well.
They were given a finger, now they will be taking an arm.


Hell, I think to avoid a huge money loss we might end up getting one of those Nvidia SoCs, aka Tegra, for the PS5. lol.

When that happens try to tell if PS4 has shitty CPU =P.

A part of me wants to see this happen just in a 'want to watch the world burn' kind of way :p
But mostly I shuddered


While it may be true that there isn't just one way to judge a graphics card, average FPS on Windows is the way that matters for the great majority of consumers. Most people who buy a video card just want it to play games well on a Windows PC. For all of those people, AMD is just as good a choice as Nvidia - yet the chart shows the market doesn't reflect that.
You didn't read his post at all.
He just told you that on lower-end CPUs (and there are a lot of people with AMD FX CPUs or older Core 2 Quads or i3s or Pentiums) the AMD drivers will cause a bigger CPU bottleneck than the Nvidia ones.

That's why he also said that
-benchmarks only measuring average fps (by which he is probably alluding to the fact that any respectable benchmark measures performance in frametimes and not a meaningless metric like average fps, and there are not many respectable benchmarks out there; see the small sketch after this list)
-and only testing a select few AAA games (these are usually the ones that get the most attention on the driver side and will not tell you about the other 99 percent of games and the problems they have)
does not tell you what your actual experience using a certain GPU will be.
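
To make the frametime point concrete, here's a tiny sketch (the frame times are made up for illustration, not taken from any benchmark): two captures can have the same average FPS while one of them stutters badly, and only the high-percentile frame times show it.

Code:
#include <algorithm>
#include <cstdio>
#include <vector>

// Report average FPS and 99th-percentile frame time for a frame-time capture (ms).
static void report(const char* label, std::vector<double> ms) {
    double total = 0.0;
    for (double t : ms) total += t;
    double avg_fps = 1000.0 * ms.size() / total;

    std::sort(ms.begin(), ms.end());
    double p99 = ms[static_cast<size_t>((ms.size() - 1) * 0.99)];

    std::printf("%s: avg %.1f FPS, 99th percentile frame time %.1f ms\n",
                label, avg_fps, p99);
}

int main() {
    // Hypothetical captures: same total time, very different smoothness.
    std::vector<double> smooth(100, 20.0);               // steady 20 ms frames
    std::vector<double> stutter(100, 15.0);
    for (size_t i = 0; i < stutter.size(); i += 10)
        stutter[i] = 65.0;                               // periodic 65 ms spikes

    report("smooth ", smooth);
    report("stutter", stutter);
}

Both captures average 50 FPS, but one has a 99th-percentile frame time of 20 ms and the other 65 ms; an "average FPS" bar chart would call them identical.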


For a while, when AMD's drivers REALLY were in the gutter, there was not a single review that would warn you about the issues with games like Prototype 1-2, From Dust, Red Orchestra 2, Homefront, etc. that had hopelessly broken performance at launch and for months after before finally being fixed.

People who are pissed at amd drivers will generally have a good reason based on their experience with the gpus, especially if they happened to own a 4-5-6000 series amd gpu.
And as much as things have improved recently, people are not going to forget about them so fast and will not trust amd to provide solid drivers and solid performance on launch day for new games. (and it's going to take time and consistency from amd for people to forget about it)

Personally, if you asked me in 2013 if I'd ever buy another AMD GPU, I'd have said HELL NO.
Now that they did the frametime improvement drivers, added downsampling support, and now that I haven't had any real issues for a while with my 6870, I've come around a bit (as someone who hasn't bought anything Intel or Nvidia since 2002, they still managed to alienate me). Nvidia's recent false advertising with the 970 sure evened things out a lot as well...

But seeing people like you dismiss people's issues (including mine) with AMD as some kind of fanboy-loyalty-to-Nvidia-driven FUD really annoys me.
This kind of stifling-criticism, silencing-feedback brand-wars shit is not good for anyone. I see it in console wars threads and I see it in all the equally pathetic team green vs. team red threads. For shame.
 

Durante

Member
Could you please provide benchmarks for the bolded two? No offense but that sounds like a complete pile of bullshit to me.
Happily!

Let's start out with theoretical observations.
[image: API overhead benchmark chart showing D3D11ST and D3D11MT draw-call throughput for AMD and NVIDIA]

What you should look at here is the "D3D11ST" and "D3D11MT" values. As you can see, AMD achieves no parallel scaling at all, and their sequential DX11 is also far less CPU-efficient than NV's.
Anandtech said:
At 1.9M draw calls per second in DX11ST and 2.2M draw calls per second in DX11MT, NVIDIA starts out in a much better position than AMD does; in the latter they essentially can double AMD’s DX11MT throughput (or alternatively have half the API overhead).


Now, let's see if this has an impact in actual games.
As you can see above, in this heavily CPU-limited scenario NV performs significantly better, and reaches a GPU limit (at around 230 FPS min) much sooner.

In a more recent test, Eurogamer investigated GTA 5 performance.
[image: Eurogamer GTA 5 CPU scaling benchmark results]

This is perhaps the most direct test of this kind ever performed, and shows the true difference in CPU overhead. The much faster AMD GPU performs quite a bit worse in terms of min FPS on the slower CPU.

--

The light at the end of the tunnel for AMD with all this is that it will be far less important with lower-level APIs, but there are still thousands of DX9 and DX11 games out there, and there will likely be a lot more released over the coming years. This issue makes suggesting AMD GPUs for low-cost builds much less straightforward than it would be otherwise.
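
To put draw-call throughput figures like the ones quoted above into perspective, here's a rough back-of-the-envelope sketch; the per-frame draw-call count is a hypothetical number picked for illustration, and the two throughput values just mirror the roughly 2:1 DX11MT gap from the Anandtech quote.

Code:
#include <cstdio>

int main() {
    // Hypothetical workload: a scene submitting this many draw calls per frame.
    const double draws_per_frame = 20000.0;

    // Illustrative driver throughputs (draw calls submitted per second of CPU time);
    // ~2.2M mirrors the DX11MT figure quoted above, ~1.1M is half of that.
    const double throughputs[] = { 2.2e6, 1.1e6 };

    for (double throughput : throughputs) {
        // CPU time spent purely on submission each frame, and the frame rate
        // that this overhead alone would cap the game at.
        double ms_per_frame = 1000.0 * draws_per_frame / throughput;
        std::printf("%.1fM calls/s -> %.1f ms of submission per frame (caps at ~%.0f FPS)\n",
                    throughput / 1e6, ms_per_frame, 1000.0 / ms_per_frame);
    }
}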
 