
Report: Nvidia Has Practically Stopped Production of Its 40-Series GPUs to Shift to AI Products

Bernoulli

M2 slut

Gaiff

SBI’s Resident Gaslighter
As always, take this information with a grain of salt, as it comes from notorious hardware leaker Moore's Law is Dead on YouTube (MLID), but some of his scoops have become a reality in the past.

So they know the source is an idiot who doesn't actually have any inside info but they write an article with him as a source anyway?
 

winjer

Gold Member
Most GPU SKUs are still in stock and selling slowly.
And with the AI boom replacing the crypto boom, Nvidia has to follow the money.
 

Spyxos

Member


Source 3: from his video. But I need to remind you that they're estimated to have already produced a year's worth of Lovelace anyway.

The report's gist is that one large retailer told MLID Nvidia is throttling supply to keep prices of its high-end cards in the stratosphere. This source said even though they do have people walking into the store looking for an RTX 4090—indicating it's probably Micro Center—they don't always have a GPU available. It's alleged the store's distributor is only sending it a handful of cards each week to keep prices inflated, even though warehouses are reportedly full of high-end 40-series GPUs.

The cards are too expensive and hardly sell anymore. Nvidia is trying to counter this by artificially reducing the supply.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
This isn't gonna affect gamers at all.
Don't even worry about it; chances are it's weird fearmongering to get people to actually buy RTX 40-series GPUs.
Don't fall for it. Remember the fearmongering towards the end of the crypto craze, when everyone was saying buy now, buy now, cuz everything will be out of stock anyway.......then people who waited got 600 dollar 3090s and 6950s......6950 XTs are actually still available for ~$650.

AMD and Intel need to get their shit together if they are going to be the only options for gamers.
AMD????
They abandoned you a long time ago.




 

Sanepar

Member
Strange things are afoot at the production lines for Nvidia's 40-series GPUs.
According to a new leak, the company has ground production of its Ada Lovelace GPUs to a halt and is also limiting the supply of its high-end cards to distributors. It's posited that Nvidia is trying to shift production from its gaming cards to AI products.





Nvidia doesn't care about GPUs. They want you on their cloud service. That is it.
 

Fess

Member
Time to step out or stay in the stock?
Up 205% this year, but it's been fluctuating lately.
 

Tams

Member
Time to step out or stay in the stock?
Up 205% this year, but it's been fluctuating lately.
I sold mine once I thought the 'AI' mania had reached its peak, but it's just gone up and up. It'll have to come down eventually, though.
 

ReBurn

Gold Member
Might as well. The 40 series isn't a compelling upgrade over the 30 series for many people, given the price premium for the marginal performance gain.
 

Warnen

Don't pass gaas, it is your Destiny!
Guess it's Intel or AMD next gen. Funny if AMD still lands in second with Nvidia out.
 

Kilau

Member
How big is the AI training market expected to get? Anyone could crypto mine, but don't you need a reason to train AI? It's not a money maker on its own, right?
 

Tarin02543

Member
I haven't even come into contact with AI services, has anyone in here? To me it appears to be smoke and mirrors, just like cryptocoins.
 
I haven't even come into contact with AI services, has anyone in here? To me it appears to be smoke and mirrors, just like cryptocoins.
AI is the genuine article. It's not nearly as sexy in practice as depicted in popular media, but its impact is unrivaled.

DLSS is evidence enough. What's most terrifying is that it's improving at an exponential rate: the improvements aren't gradual, they get more significant every day.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I haven't even come into contact with AI services, has anyone in here? To me it appears to be smoke and mirrors, just like cryptocoins.
Not even in:
  • Microsoft Word
  • Photoshop
  • Nvidia DLSS
  • Trint - Ohh man I wish I had Trint in college.
  • FaceApp
  • Waze
  • Google Recorder
  • Siri
  • Bing
  • Facetune
  • Alexa
 
Last edited:

Haint

Member
How big is the AI training market expected to get? Anyone could crypto mine, but don't you need a reason to train AI? It's not a money maker on its own, right?

GPUs are used for both the training AND the questions/tasks people give it. So both ends are run entirely on GPUs.
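For a concrete (if toy-scale) picture of what "both ends on GPUs" means, here's a minimal PyTorch sketch, assuming a CUDA-capable card; the same device handles a training step and then an inference step:

```python
import torch
import torch.nn as nn

# Toy stand-in model; assumes a CUDA-capable GPU is present.
device = torch.device("cuda")
model = nn.Linear(16, 2).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# "Training end": forward pass, loss, backward pass, weight update, all on the GPU.
x = torch.randn(8, 16, device=device)
y = torch.randint(0, 2, (8,), device=device)
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()

# "Question/answer end" (inference): the same GPU runs just the forward pass.
with torch.no_grad():
    prediction = model(torch.randn(1, 16, device=device)).argmax(dim=1)
print(prediction.item())
```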
 
Last edited:

RoboFu

One of the green rats
They probably have a surplus… they could start back up later or switch to the 50xx series.
 

Sleepwalker

Member
I haven't even come into contact with AI services, has anyone in here? To me it appears to be smoke and mirrors, just like cryptocoins.
AI is the real deal, man. You've probably come into contact with it and not realized. I use AI features/programs all the time nowadays, and no, it's not just ChatGPT; there are plenty of use cases already :)
 

Tarin02543

Member
Not even in:
  • Microsoft Word
  • Photoshop
  • Nvidia DLSS
  • Trint - Ohh man I wish I had Trint in college.
  • FaceApp
  • Waze
  • Google Recorder
  • Siri
  • Bing
  • Facetune
  • Alexa

Oh, I didn't realize that DLSS is also AI, just a different application. Yes, I do have an RTX card, so I guess AI has entered my home.

I was talking about chat services, which is what the general public's perception of AI is.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Oh, I didn't realize that DLSS is also AI, just a different application. Yes, I do have an RTX card, so I guess AI has entered my home.

I was talking about chat services, which is what the general public's perception of AI is.
The DL in DLSS stands for Deep Learning....i.e. AI.
AI is a lot more prevalent than people realize, because most only have a passing knowledge of what it already can and does do.
People keep thinking of chat services or making hentai; it's a lot more powerful than that and already absolutely massive.

Siri, Alexa, and Google Assistant also use AI.
So effectively, if you have a smartphone, chances are it uses AI for natural language processing.

Nvidia's gaming revenue was down 38% while their overall revenue was up 14%, showing gaming isn't anywhere near mission critical.
The reason AMD is selling GPUs directly to AI companies right now is partly because Nvidia can't make enough Hopper GPUs to go around.
 
Last edited:

iHaunter

Member
Also because no one is buying them. They tried to get away with overpricing their GPUs year after year, so many people just skipped this gen. Crypto is falling as well, so most of their customers are gone. You can get the 4000 series cheaper than MSRP right now.
 

mhirano

Member
no, when they train the AI to learn, it uses GPUs

when you ask it questions, it responds based on what it was trained on
ATCHUALLY…
It uses GPU processing for training and also for running the algorithm.
See Stable Diffusion: you can download trained models, but you need a (Nvidia) GPU to run the Python scripts and render stuff.
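Roughly what that local workflow looks like, as a minimal sketch with Hugging Face's diffusers library (the model ID and prompt are just illustrative, and you need a CUDA-capable Nvidia card with enough VRAM):

```python
import torch
from diffusers import StableDiffusionPipeline

# Download already-trained weights; the image generation itself runs on the GPU.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

image = pipe("a moody photo of a graphics card factory").images[0]
image.save("output.png")
```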
 

lmimmfn

Member
GPUs are used for both the training AND the questions/tasks people give it. So both ends are run entirely on GPUs.
Once the model is created, the computational needs for a query are hugely reduced; it can be done on a CPU or a local GPU (assuming enough VRAM for the model). So it does not necessarily need a GPU, but it is more efficient with one and, as I said, can be local.

It's all of course dependent on the AI tech and whether they allow/provide CPU processing of the model.

Many AI use cases burn huge GPU compute to generate a model, then deploy it to edge/near-edge low-compute processing units (not talking visual AI here, but general decision-making AI).
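To make the "can be done on a CPU or local GPU" point concrete, here's a minimal sketch with Hugging Face transformers; the gpt2 checkpoint is just a small stand-in model:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Use the local GPU if one is available; otherwise the query runs on the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").to(device)

inputs = tokenizer("Once the model is trained,", return_tensors="pt").to(device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```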
 
Last edited:

Haint

Member
Not sure I understand. If I ask ChatGPT a question, it's using two GPUs at once?

The large outfits like OpenAI are always training AND processing questions simultaneously, so yes, ideally they want two GPUs for that.
 

dave_d

Member
How big is the AI training market expected to get? Anyone could crypto mine, but don't you need a reason to train AI? It's not a money maker on its own, right?
I'd expect it to be huge. The last company I worked for made body scanners for airports. The device would scan an image of you through your clothes and send that data to a plug-in we got from the algorithm team. That part would look at the image for threats, i.e. weapons. The guy in charge of that team told me they used Nvidia GPUs and CUDA since their stuff was better than anybody else's. (Also, CUDA supported multiple GPUs, and he was under the impression other technologies didn't at the time.) So if you ever went through a body scanner at an airport, there was machine learning processing the image.

I also interviewed at a company using similar technology to look at pathology slides for cancers. The idea was that it would help pathologists find cancer cells when they took a biopsy. (After the biopsy you put the cells on a slide and look at it through a microscope. That's slow for someone as highly paid as a pathologist, so first run it through the algorithm and have the pathologist double-check.) So yeah, there are quite a few uses just in the image-processing field.
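The general shape of that kind of GPU-backed image screening is something like this minimal sketch; a pretrained torchvision classifier stands in for the proprietary threat/cancer models, and the file name is a placeholder:

```python
import torch
from PIL import Image
from torchvision import models

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Pretrained ImageNet classifier as a stand-in for a purpose-built model.
weights = models.ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights).to(device).eval()
preprocess = weights.transforms()

# "scan.png" is a made-up placeholder for whatever image the scanner produces.
image = preprocess(Image.open("scan.png").convert("RGB")).unsqueeze(0).to(device)

with torch.no_grad():
    scores = model(image)
print(weights.meta["categories"][scores.argmax(dim=1).item()])
```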
 

Kilau

Member
The large outfits like OpenAI are always training AND processing questions simultaneously, so yes, ideally they want two GPUs for that.
Ok I get what you mean.

I'd expect it to be huge. The last company I worked for made body scanners for airports. The device would scan an image of you through your clothes and send that data to a plug-in we got from the algorithm team. That part would look at the image for threats, i.e. weapons. The guy in charge of that team told me they used Nvidia GPUs and CUDA since their stuff was better than anybody else's. (Also, CUDA supported multiple GPUs, and he was under the impression other technologies didn't at the time.) So if you ever went through a body scanner at an airport, there was machine learning processing the image.

I also interviewed at a company using similar technology to look at pathology slides for cancers. The idea was that it would help pathologists find cancer cells when they took a biopsy. (After the biopsy you put the cells on a slide and look at it through a microscope. That's slow for someone as highly paid as a pathologist, so first run it through the algorithm and have the pathologist double-check.) So yeah, there are quite a few uses just in the image-processing field.
Hopefully those purchases will be more direct and less disruptive than the mining craze, even if it means the manufacturers are making fewer consumer GPUs overall.
 

willothedog

Member
Also because no one is buying them. They tried to get away with overpricing their GPUs year after year, so many people just skipped this gen. Crypto is falling as well, so most of their customers are gone. You can get the 4000 series cheaper than MSRP right now.

If they are not selling well, why is so much of the 40-series stack showing up in the Steam hardware survey?
 
Last edited: