
Nvidia vs. AMD: A well-known retailer currently sees a clear winner in terms of graphics cards.

Spyxos

Member

A surprise: AMD is clearly ahead of Nvidia in the latest graphics card sales figures from the German retailer Mindfactory. What could be the reason?

Mindfactory, one of the biggest electronics retailers in Germany, provides detailed insight into its sales figures. Twitter user TechEpiphany has evaluated the corresponding data for current graphics cards sold in the 13th calendar week of 2023.

The result: Mindfactory sold a total of 2,375 graphics cards, 905 of them from Nvidia and 1,470 from AMD. Nvidia therefore accounts for only about 38 percent of total sales.

The majority of the cards sold, just under 62 percent, come from AMD. Intel does not appear in the statistics because its cards were not included in the evaluation. But what explains AMD's somewhat unexpected lead in this snapshot?
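For the record, those percentages follow directly from the unit counts quoted above; here is a minimal Python sketch (using only the figures reported for calendar week 13 of 2023) that reproduces the split:

nvidia_units = 905   # Nvidia cards sold at Mindfactory in week 13/2023
amd_units = 1470     # AMD cards sold in the same week

total = nvidia_units + amd_units           # 2,375 cards in total
nvidia_share = nvidia_units / total * 100  # ~38.1 percent
amd_share = amd_units / total * 100        # ~61.9 percent

print(f"Total cards sold: {total}")
print(f"Nvidia share: {nvidia_share:.1f}%")
print(f"AMD share: {amd_share:.1f}%")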

The most popular graphics cards at a glance
Among the five best-selling graphics cards, there is only one from Nvidia: the RTX 3060. All the others are from AMD. Let's first take a look at the top 5, including approximate prices based on Mindfactory's current listings:

Placement   Graphics card   Units sold   Price
1           RTX 3060        260          320 Euro
2           RX 6700 XT      240          399 Euro
3           RX 7900 XT      230          848 Euro
4           RX 7900 XTX     220          1,099 Euro
5           RX 6800         200          529 Euro
Table of the five best-selling graphics cards at Mindfactory.
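As a rough cross-check of how concentrated the week's sales were, here is a small Python sketch that uses only the figures from the table above; the five listed models alone account for just under half of the 2,375 cards sold:

# Top five cards at Mindfactory in week 13/2023: (units sold, approximate price in Euro)
top5 = {
    "RTX 3060":    (260, 320),
    "RX 6700 XT":  (240, 399),
    "RX 7900 XT":  (230, 848),
    "RX 7900 XTX": (220, 1099),
    "RX 6800":     (200, 529),
}

week_total = 2375  # all graphics cards Mindfactory sold that week

top5_units = sum(units for units, _ in top5.values())  # 1,150 cards
print(f"Top-5 share of the week's sales: {top5_units / week_total:.1%}")  # ~48.4%

for card, (units, price) in top5.items():
    print(f"{card:12} {units} units at ~{price} Euro ({units / week_total:.1%} of the week)")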

The RTX 3060 is probably one of the Nvidia graphics cards with the best price-performance ratio of the last generation. So it's not that surprising that it takes first place among the cards sold at Mindfactory.

Also not surprising is the RX 6700 XT in second place. This is another good card from the last generation which, like the RTX 3060, is available for well under 500 Euros. It offers even better gaming performance, but is also more expensive than the RTX 3060.

In reviews, the RX 7900 XT ranks below the RTX 4080 but above the RTX 3080. With a price starting at 848 Euros at Mindfactory, however, it is much cheaper than an RTX 4080, which starts there at 1,239 Euros. The similarly fast RTX 4070 Ti also costs more, at around 890 Euros.

The situation is similar for the RX 7900 XTX. In terms of performance it is nowhere near Nvidia's RTX 4090, but it delivers frame rates comparable to the RTX 4080 at a price around 200 Euros lower.

The RX 6800 brings up the rear among the best-selling graphics cards of the 13th week of 2023. It competes directly with the RTX 3070 and the 2080 Ti. At Mindfactory, an RTX 3070 costs around 530 Euros, which makes it roughly the same price as the RX 6800. Nevertheless, many customers seem to opt for the AMD card.

In the current economic situation, many Mindfactory customers probably focus more on inexpensive cards that perform well for a longer period of time.

Nvidia's RTX 3060 and AMD's RX 6700 XT can both still produce more than 60 fps in current gaming titles. Both graphics cards will likely be able to render new games in 1080p at medium to high settings in the coming years as well.

In the higher price segment, AMD's graphics cards are currently cheaper than Nvidia's. The RX 7900 XT is about as fast as the RTX 4070 Ti but costs roughly 50 Euros less. The situation is similar for the RX 7900 XTX: it is about as powerful as the RTX 4080, yet around 200 Euros cheaper.

Accordingly, it is understandable that customers choose AMD for high-performance graphics cards. At the same time, you have to keep in mind that this is only a weekly snapshot from a single German store. However, it is still interesting.

Source: https://www.gamestar.de/artikel/nvidia-vs-amd-grafikkarten-verkaufszahlen,3392375.html
 

Gaiff

SBI’s Resident Gaslighter
Y'all remember when 2080 Ti owners got laughed at because NVIDIA would sell 2080 Ti performance for "$500" with the 3070? Then the cryptominers snapped them all up, effectively making the card sell for $1,000+. Now that it's actually useful, people realize the 3070 is effectively going to be reduced to a 1080p GPU.

lol
 

Hot5pur

Member
Low VRAM from Nvidia. MLID has some interviews with devs where they explained it. Something like 12 GB is the minimum for 1080p.
Wait, 12 GB is the minimum for upcoming games? By what date?

Because I can comfortably play games at 4K60 on a 10GB 3080, albeit some need DLSS to maintain 60 fps, and DLSS looks as good as native most of the time.
 

Kataploom

Gold Member
Europe suffered a lot from price increases in the GPU segment; they lasted much longer there than in the Americas, so buyers are very price-driven right now. Many people were more than ready to upgrade when the crypto fiasco hit... AMD is way cheaper than Nvidia for cards of the same performance, and VRAM limitations are a hot topic, so it's no wonder more people would opt for AMD. I don't think it will be enough to flip market share, but it should at least give them a good boost... Also, mid-range cards are about to release.
 

Buggy Loop

Member
It is one of the biggest.



It’s the place for AMD in Germany since they have lower prices than the others because of some deal with AMD.

But one supplier.. it’s meaningless

(Charts: Mindfactory GPU sales for the last month, the last two months, and 2022.)


In the end, there’s 4000 series entering steam hardware survey that totals roughly 67% of all RDNA 2 sales, not a single 7000 series in the list (too low %). So it’s really just a supplier in the grand scheme of things

What the fuck kind of article, from the depths of clickbait shit, pulls data from this one supplier and extrapolates a winner from a week of sales?

What the hell
 

Kataploom

Gold Member
Wait, 12 GB is the minimum for upcoming games? By what date?

Because I can comfortably play games at 4K60 on a 10GB 3080, albeit some need DLSS to maintain 60 fps, and DLSS looks as good as native most of the time.
The OP's estimations are all over the place, like comparing the 3060 with the 6700 XT, or saying the 6700 XT is good for "new games at 1080p and medium settings". Lol, that card can literally play A Plague Tale at 1440p on High at 60+ fps, and even the unoptimized, buggy mess that is TLOU at 1440p with mid-to-high settings at around 60 fps too.
 

SolidQ

Member
Wait, 12 GB is the minimum for upcoming games? By what date?
Yep, that's what they said: things will get worse for PC users once devs start using more of the PS5's power. TLOU is like a light game for PC :messenger_fearful:

DLSS looks as good as native most of the time.
Once FSR comes to phones with some fixes, and consoles start using FSR more next year, DLSS will be like G-Sync, almost dead. But that was obvious; open almost always wins.
 

Crayon

Member
Wait, 12 GB is the minimum for upcoming games? By what date?

Because I can comfortably play games at 4K60 on a 10GB 3080, albeit some need DLSS to maintain 60 fps, and DLSS looks as good as native most of the time.

I was just listening to the one where this is said. The explanation was that the more games lean into the I/O systems of the consoles, the more they will be shoveling stuff into VRAM on PC to make up the gap. Just repeating what I heard.
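To make that argument concrete, here is a purely illustrative, hypothetical Python sketch; the vram_working_set_gb model and every number in it are my own assumptions for illustration, not figures from MLID or any developer. The idea is simply that whatever the streaming path cannot deliver in time has to sit in VRAM ahead of time:

# Hypothetical back-of-the-envelope model of the "I/O gap" argument above.
# Assumption: data the streamer cannot deliver in time must already be resident in VRAM.

def vram_working_set_gb(base_assets_gb, demand_gbps, stream_gbps, safety_window_s=2.0):
    """Resident assets plus the prefetch needed to cover the streaming shortfall."""
    shortfall_gbps = max(demand_gbps - stream_gbps, 0.0)  # demand the streaming path can't meet
    prefetch_gb = shortfall_gbps * safety_window_s        # kept resident in VRAM instead
    return base_assets_gb + prefetch_gb

# The same hypothetical scene on two setups (all figures invented for illustration):
fast_io = vram_working_set_gb(base_assets_gb=6.0, demand_gbps=3.0, stream_gbps=5.0)  # fast console-style I/O
slow_io = vram_working_set_gb(base_assets_gb=6.0, demand_gbps=3.0, stream_gbps=1.0)  # slower effective PC streaming

print(f"Fast-streaming working set: {fast_io:.1f} GB")  # 6.0 GB
print(f"Slow-streaming working set: {slow_io:.1f} GB")  # 10.0 GB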
 

Miyazaki’s Slave

Gold Member
Nvidia is CRUSHING the AI race with their GPUs. Gamer adoption, while not an afterthought for them, in all likelihood doesn't mean what it used to for Nvidia.
 

Puscifer

Member
Nvidia is CRUSHING the AI race with their GPUs. Gamer adoption, while not an afterthought for them, in all likelihood doesn't mean what it used to for Nvidia.

This is pretty much true; the thing everyone needs to realize at this point is that enterprise customers are their concern. Buy Nvidia and Intel, that's the path forward for PC gaming, unless they create a card that can decouple all that AI processing, which just isn't likely.
 

Hot5pur

Member
From what I can tell Nvidia is still the desirable GPU, and AMD is there to ride the high margins while undercutting them on price just a little.

I don't know that AMD can pull off a Ryzen, though last gen AMD cards were kinda impressive. They really traded blows on the high end, save for RTX, mostly because Nvidia was on a less than optimal 8nm process. Once they came onto the same node, Nvidia pulled ahead; I mean, the 4090 just blows away anything.

The best hope is that when the mid range AMD cards come out there are inexpensive alternatives, but I don't see that happening. I don't know how much AMD cares about market share in the GPU space, especially since (I think) the margins are good and for them Nvidia jacking up prices is desirable.
 

SatansReverence

Hipster Princess
Starting to really wish I went with an AMD card instead of a 3070 with all these recent reports.
I enjoyed my time with my 3070 but am glad I made the switch. 8gb of vram was a problem almost from the get go.

Don't think I'm gonna be struggling any time soon on the 7900xtx.
 

Hot5pur

Member
I enjoyed my time with my 3070 but am glad I made the switch. 8gb of vram was a problem almost from the get go.

Don't think I'm gonna be struggling any time soon on the 7900xtx.
It's still going to be a good 1440p card for a while. I wouldn't worry. By the time you'd need the extra RAM you'd probably need to upgrade anyway.


Can I touch your ding ding dong?
 

Joramun

Member
I wouldn't recommend doing that. Those things are tortured by most amateur shitcoin miners.

Also Nvidia still has 80% marketshare so....the needle has hardly moved.
This.

You'd be an idiot to buy a card used for mining.

They are on 24/7 and are pushed to the limit.
 

MikeM

Member
The 7900 XT is posting some strong sales numbers despite the review hate.

AMD is where the smart money goes if you only game and do part-time productivity, unless you need AI.
 

Hoddi

Member
I enjoyed my time with my 3070 but am glad I made the switch. 8gb of vram was a problem almost from the get go.

Don't think I'm gonna be struggling any time soon on the 7900xtx.
That's how I feel about it. My 2080 Ti is still pretty okay but if it fails then I'll 100% go for one of those AMD cards.

Those 8GB 3070 cards were already a joke in 2020. The sole reason I bought the Ti card was to have 11GB because these new consoles were already on the horizon. The only hope these 8GB cards have left is that the Series S will save them.
 

Thebonehead

Gold Member
Low VRAM from Nvidia. MLID has some interviews with devs where they explained it. Something like 12 GB is the minimum for 1080p.
Two stupid things here, starting with the claim that 12 GB of VRAM is required for 1080p:

  1. You referenced Moore's law is dead
  2. You referenced Moore's law is dead

Now I know technically that's one thing, but you were so stupid there that it had to be mentioned twice.


Yep, that's what they said: things will get worse for PC users once devs start using more of the PS5's power. TLOU is like a light game for PC :messenger_fearful:


Once FSR comes to phones with some fixes, and consoles start using FSR more next year, DLSS will be like G-Sync, almost dead. But that was obvious; open almost always wins.

I don't even know where to start with your rabid fanboy delusions, but let's give it a go.

First off, you're holding up the TLOU port on PC as some mythical powerhouse example of how the PS5 is an all-conquering supercomputer compared to a PC.

For that, I can simply load up TLOU2, with its larger levels, better graphics and animations, on a 1.8 TF PS4.

Naughty Dog's utter incompetence in shitting the bed with their PC release should not be held up as any kind of standard.

Now onto your second paragraph of comedy gold.

FSR for phones with some fix, whatever this magical fix is, and FSR on consoles, which is already there by the way, is going to kill off DLSS?

I would say stay off the crack pipe before posting next time, but I've seen your utterly nonsensical previous posts so will leave you with this nugget of wisdom.

 
Wait, 12 GB is the minimum for upcoming games? By what date?

Because I can comfortably play games at 4K60 on a 10GB 3080, albeit some need DLSS to maintain 60 fps, and DLSS looks as good as native most of the time.
You need the memory if you want to keep playing converted PS5, and maybe even Xbox, games for the rest of the generation. They do their memory management very differently, so you need more RAM on your PC to be able to balance things out.

For native PC games you're right, though, since those games are built for PC architecture.
 

bbeach123

Member
Last year it took me a week to decide to go with the 3070 (freaking mistake) over the 6700 XT (because they were the same price).

I wouldn't even look at the 3070 today; the 6700 XT is so much cheaper and better.

Now I'm stuck with 8 GB of VRAM.
 

The Skull

Member
Picked up a 6800 XT for £500 to replace my 5700 XT. There are trade-offs versus the Nvidia counterpart for sure, but I'm extremely happy with this card.
 

SolidQ

Member
Now I know technically that's one thing, but you were so stupid there that it had to be mentioned twice.
You won't even read it; it's an interview.

First off, you're holding up the TLOU port on PC as some mythical powerhouse example of how the PS5 is an all-conquering supercomputer compared to a PC.
Again, you're missing the PS5's decompression block and many other things.

I would say stay off the crack pipe before posting next time, but I've seen your utterly nonsensical previous posts so will leave you with this nugget of wisdom.
I understand, you just slept badly today.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Point
  1. Referencing MLiD.
  2. 12GB VRAM for 1080p.
  3. AMD is winning.
Counterpoint
  1. Never.
  2. Not this generation (on average; outliers will always exist, if A Plague Tale: Requiem can look that good and UE5 doesn't need 12GB of VRAM for 1080p).
  3. Hahahaha
 

winjer

Gold Member
AMD usually does well in the DIY market.
But Nvidia is dominating everywhere else.
Unfortunately for AMD, the DIY market is rather small.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
We're not far from what the devs say. By 2024-2025 that will be normal.
(VRAM usage chart)
If someone quotes The Last of Us to me one more time I'm going to blow a gasket.


A shitty outlier does not a trend make.
The Last of Us is one of the worst PC ports on a technical level.....don't use it as an example of anything other than how NOT to port a game.
A game with similar texture quality that's next-gen only, but had a competent port team behind it:
(VRAM usage chart)


P.S. That's at the highest settings too......do PC games only have one setting?
 

bbeach123

Member
Things are only getting worse.

Did people forget the start of the PS2, PS3, and PS4 or what?

Start of PS3: 512 MB of VRAM was fine.
End of PS3: even 1 GB sucked dick.

Start of PS4: 2 GB of VRAM was fine, even 1 GB was still playable.
End of PS4: 4 GB or GTFO.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
AMD usually does well in the DIY market.
But Nvidia is dominating everywhere else.
Unfortunately for AMD, the DIY market is rather small.
That's not true.
Even DIY builders don't want to buy AMD GPUs.
I used to work at a PC building shop for most of my college life, and have since started building PCs for friends, friends of friends, and family on the side, just to keep me entertained, abreast of trends, and for a little pocket change.

Very very very very few people ever want an AMD GPU in their system.
Nvidia has massive massive mindshare.
How the fuck do you think the RTX 3050 is on the Steam charts......people would rather buy a shitty ~350 dollar Nvidia GPU than get an actually competent GPU from AMD......note the RTX 3050's MSRP was 250 dollars.
 

winjer

Gold Member
That's not true.
Even DIY builders don't want to buy AMD GPUs.
I used to work at a PC building shop for most of my college life, and have since started building PCs for friends, friends of friends, and family on the side, just to keep me entertained, abreast of trends, and for a little pocket change.

Very very very very few people ever want an AMD GPU in their system.
Nvidia has massive massive mindshare.
How the fuck do you think the RTX 3050 is on the Steam charts......people would rather buy a shitty ~350 dollar Nvidia GPU than get an actually competent GPU from AMD......note the RTX 3050's MSRP was 250 dollars.

DIY is people who buy their own GPUs to build a PC themselves.
What you are describing are pre-built systems.

The DIY market is much better informed about value and performance.
The people that buy pre-built systems have much less knowledge about hardware. So they just choose the brand.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
DIY is people who buy their own GPUs to build a PC themselves.
What you are describing are pre-built systems.

The DIY market is much better informed about value and performance.
The people that buy pre-built systems have much less knowledge about hardware. So they just choose the brand.
Mate, you could poll our PC building thread, Era's PC building thread, Reddit's PC building subreddit, and any number of people who actually build their own machines and/or pick their own parts.......Nvidia will vastly vastly outnumber AMD.
That's why retail stores have had Nvidia leading.....those are people walking into stores and buying GPUs....prebuilders don't order from retailers.

Tell me the places where you even anecdotally managed to come up with the theory that AMD is more popular amongst PC builders?
 

winjer

Gold Member
Mate, you could poll our PC building thread, Era's PC building thread, Reddit's PC building subreddit, and any number of people who actually build their own machines and/or pick their own parts.......Nvidia will vastly vastly outnumber AMD.
That's why retail stores have had Nvidia leading.....those are people walking into stores and buying GPUs....prebuilders don't order from retailers.

Tell me the places where you even anecdotally managed to come up with the theory that AMD is more popular amongst PC builders?

You are correct.
But I didn't say that AMD beats Nvidia in the DIY market. I only said they do well.
It's in the pre-built market that AMD does terribly. Like you said, because of brand awareness.
If the pre-built market were like the DIY market, AMD would have something like 30% market share.
 

SolidQ

Member
That's at the highest settings too......do PC games only have one setting?
People are always talking about max settings.

The Last of Us is one of the worst PC ports on a technical level.....don't use it as an example of anything other than how NOT to port a game.
We're not talking about only TLOU; there was also Forsh*t, Hogwarts, etc.

A game with similar texture quality that's next-gen only, but had a competent port team behind it:
We don't know what resolution, effects, textures, etc. are used in Plague Tale and TLOU, but that's a different story and a long discussion. Forget about it.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
People are always talking about max settings.


We're not talking about only TLOU; there was also Forsh*t, Hogwarts, etc.


We don't know what resolution, effects, textures, etc. are used in Plague Tale and TLOU, but that's a different story and a long discussion. Forget about it.
So three games.....all three being games that have been widely panned for being wonky.

Then for the game I bring up, we have to play the "we don't know the resolution" card?


And why do we only talk about Ultra settings.........games have other settings, why should we ignore those settings?
 

SolidQ

Member
And why do we only talk about Ultra settings.........games have other settings, why should we ignore those settings?
We can talk about lower settings, but I've seen and know a lot of people who only care about Ultra settings...

Then for the game I bring up, we have to play the "we don't know the resolution" card?
Different devs, different engines, but the problem is that UE5 is becoming the mainstream engine.

So three games.....all three being games that have been widely panned for being wonky.
The year isn't finished yet; let's see how it goes from here.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
We can talk about lower settings, but I've seen and know a lot of people who only care about Ultra settings...
People only care about Ultra settings but play at 1080p?
Sure sure.
Different devs, different engines, but the problem is that UE5 is becoming the mainstream engine.
But you are claiming games are going to need 12GB to run at 1080p this generation.
Your anecdotal evidence was 3 games, 2 of which are utterly horrible ports..........I brought up a game that's visually in that realm, if not beyond, and also next-gen only.....you dismissed it.

So can I also dismiss TLoU, Hogwarts and Forspoken?

The year isn't finished yet; let's see how it goes from here.
I'll avatar bet you that more AAA games will work at 1080p with a sub-12GB card this year than not.
Noting that the RTX 3080 10G is a sub-12GB card.
 