DRAM prices have increased 170% due to AI demand

 
But think of all the good things AI will bring into your life! You'll get CEO-approved movies made with AI so they don't have to pay anyone, CEO-approved streaming shows so they don't have to pay anyone, and you'll get to spend time on the internet with social media flooded with AI that's there to keep you in your bubble. Why even play games? Let the AI play games for you and hand you a brief bullet-point summary of the best and worst moments for you to praise and be outraged by. Then you can video call your family for Christmas and hear valuable tales from your dead grandparents, thanks to the gift of AI. Then you can focus on your work in the factory: go to work, watch AI-generated content during your break, then get back to work. Enjoy your new life enhanced by AI! You won't ever need to talk to another human being again; AI will chat with other AIs on your behalf, leaving you with even more time to consume content generated just for you!
Maybe a bit of accelerationism is exactly what we need in order to get back to God, family and country.
 
Good thing I decided to pull the trigger in September and bought my 9950X3D, 128 GB of RAM and a 5090. Now I see the RAM price went from under 500 to almost 1500 euro. It basically tripled.
 
I've posted this in another thread, but due to all this AI race bullshit that sent DRAM prices (especially DDR5) skyrocketing over the past few weeks, I've decided to grab a new PC now instead of waiting until spring (I was banking on Nvidia's supposedly refreshing the 5000 series with Super variants).

Got a nice BF discount on the system I bought yesterday, so I'm very content with this decision of getting a new PC sooner rather than later. Specs are 9800X3D, 32GB DDR5 and 5070ti.
Congrats on your new PC.
 
Congrats on your new PC.
Thanks, my bro! Really happy with it since it's a big upgrade over what I had before (an old i7‑6700K, 16GB DDR4, and a 1080 Ti). It's also been stable in everything I've tested so far: WuWa with everything maxed out including RT, DNA, ZZZ, 3DMark Time Spy (ran a bunch of loops), etc. The only thing I still don't like is Windows 11, but it is what it is. :messenger_grinning_sweat:
 
I hate how all these trends and "new advancements" are completely screwing up the PC ecosystem. Crypto mining and GPUs, and now AI and DRAM. The absolute worst.
 
Thanks, my bro! Really happy with it since it's a big upgrade over what I had before (an old i7‑6700K, 16GB DDR4, and a 1080 Ti). It's also been stable in everything I've tested so far: WuWa with everything maxed out including RT, DNA, ZZZ, 3DMark Time Spy (ran a bunch of loops), etc. The only thing I still don't like is Windows 11, but it is what it is. :messenger_grinning_sweat:
Oh wow, that's a huge upgrade. You won't need another upgrade for several years now. I also recently switched to 11, and I'm not really warming up to it either.
 
The models get better within their limitations, but the limitations are still there. Like it or not, these AIs are just statistical models at the end of the day, and statistics can only achieve so much.

Can you tell the class all about the limitations?

There are some AI researchers on X that I'd like to share your findings with.
 
Can you tell the class all about the limitations?

There are some AI researchers on X that I'd like to share your findings with.
>AIs rely mainly on statistical models, which makes them really bad at continuity and long-term thinking. This is evident in things like video generation, where they often can't produce clips longer than five seconds without beginning to hallucinate. Even generating something like a comic strip often comes with shifting backgrounds and other inconsistent elements. It's a very fundamental flaw in the way these AIs operate. It's possible to improve on at the cost of larger data pools for training, but...

>The available data pool these models need to train on is limited. As much data as we have now, there's an exponentially growing need for more in order to improve the models, and it might simply not be available. Worse yet, these pools are starting to get contaminated with AI-generated content itself, causing inbreeding.

>These models' learning capabilities are very limited outside of the training phase. Among other limitations, this makes goals like AGI practically impossible to achieve.

Of course, that's all without getting into the economics of the thing.
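To illustrate the "inbreeding" point (usually called model collapse in the literature): here's a toy sketch of the feedback loop. This is definitely not how real models train; it just uses a trivial Gaussian "model" that underweights rare samples, an assumption standing in for a generative model missing the tails of its training data:

```python
import random
import statistics

def fit(samples, keep=0.9):
    # Toy "training": estimate a Gaussian, but only from the central
    # `keep` fraction of the data -- a crude stand-in for a generative
    # model underweighting rare events in its training set.
    s = sorted(samples)
    cut = int(len(s) * (1 - keep) / 2)
    core = s[cut:len(s) - cut]
    return statistics.mean(core), statistics.stdev(core)

def generate(mu, sigma, n, rng):
    # Toy "generation": sample synthetic data from the fitted model.
    return [rng.gauss(mu, sigma) for _ in range(n)]

rng = random.Random(42)
data = [rng.gauss(0.0, 1.0) for _ in range(1000)]  # the "real" data

spreads = []
for generation in range(10):
    mu, sigma = fit(data)
    spreads.append(sigma)
    data = generate(mu, sigma, 1000, rng)  # next gen sees only synthetic data

# Each generation loses more of the original distribution's tails,
# so the estimated spread shrinks toward zero.
print([round(s, 2) for s in spreads])
```

After a few generations of training only on its own output, the toy model's idea of the data's spread has collapsed to a fraction of the real value, which is the gist of the contamination worry.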
 
>AIs rely mainly on statistical models, which makes them really bad at continuity and long-term thinking. This is evident in things like video generation, where they often can't produce clips longer than five seconds without beginning to hallucinate. Even generating something like a comic strip often comes with shifting backgrounds and other inconsistent elements. It's a very fundamental flaw in the way these AIs operate. It's possible to improve on at the cost of larger data pools for training, but...

>The available data pool these models need to train on is limited. As much data as we have now, there's an exponentially growing need for more in order to improve the models, and it might simply not be available. Worse yet, these pools are starting to get contaminated with AI-generated content itself, causing inbreeding.

>These models' learning capabilities are very limited outside of the training phase. Among other limitations, this makes goals like AGI practically impossible to achieve.

Of course, that's all without getting into the economics of the thing.

LOL ok. Well, suffice it to say I don't have any new research insights into the fundamental limitations of LLMs to share with said AI researchers on X.
 
Well, suffice it to say I don't have any new research insights into the fundamental limitations of LLMs
The vast majority of people barely understand the basics, which is why they treat AI like some code wizardry that can do anything. It's kind of how people in the '50s and '60s thought we'd have fully human-like robots by the early 2000s, because "electronics" was essentially magic to them.
 
Can confirm, the RAM kit I bought a few months back is now 70% more expensive at the same retailer.
Just updating my post: that same RAM kit is now more than double. That was fast. I think everyone involved in the supply chain is being a greedy cunt again.
 
>AIs rely mainly on statistical models, which makes them really bad at continuity and long-term thinking. This is evident in things like video generation, where they often can't produce clips longer than five seconds without beginning to hallucinate. Even generating something like a comic strip often comes with shifting backgrounds and other inconsistent elements. It's a very fundamental flaw in the way these AIs operate. It's possible to improve on at the cost of larger data pools for training, but...

>The available data pool these models need to train on is limited. As much data as we have now, there's an exponentially growing need for more in order to improve the models, and it might simply not be available. Worse yet, these pools are starting to get contaminated with AI-generated content itself, causing inbreeding.

>These models' learning capabilities are very limited outside of the training phase. Among other limitations, this makes goals like AGI practically impossible to achieve.

Of course, that's all without getting into the economics of the thing.

The coherency issue over time you mentioned just doesn't seem to be a fundamental problem at all. Each new model improves on that. Google's Genie 2 world model had an 'interaction horizon' of 10-20 seconds. Their Genie 3 model got that up to several minutes.

Insofar as there is a data limit in the form of readily accessible basic text data, there are already multiple solutions and mitigations, like better curation, multimodal data and synthetic data.

The models are already using a significant amount of synthetic data and, no, it isn't harming the models or their progress.

Continual learning is being worked on and there has been some progress. Nobody doubts it is a hard problem to crack but there isn't any evidence it's an impossible challenge to overcome.
 
The coherency issue over time you mentioned just doesn't seem to be a fundamental problem at all. Each new model improves on that. Google's Genie 2 world model had an 'interaction horizon' of 10-20 seconds. Their Genie 3 model got that up to several minutes.
Like I said, it's possible to improve on that, but it gets exponentially harder. Also, in the Genie example you gave, remembering an artificial environment (as in saving to memory what it previously generated) is different from coherence and continuity. What Genie is doing there is different from creating a video.

Insofar as there is a data limit in the form of readily accessible basic text data, there are already multiple solutions and mitigations, like better curation, multimodal data and synthetic data.
Those aren't solutions, they're mitigations. Synthetic data, for example, goes back to the inbreeding issue I mentioned.

The models are already using a significant amount of synthetic data and, no, it isn't harming the models or their progress.
Over time, repeated use of synthetic data will cause assumption loops and biases that need explicit correction. There's even a term for its overuse: Model Autophagy Disorder (MAD). At best it can complement real-world data; it cannot replace it.

Continual learning is being worked on and there has been some progress. Nobody doubts it is a hard problem to crack but there isn't any evidence it's an impossible challenge to overcome.
There are bags of tricks for specific situations that can mitigate the problem or move the frontier in certain cases, like the one I mentioned above: saving the details of an artificial environment to memory, so that turning away from a rock and back to it doesn't make the rock change or disappear. But again, these do not solve the fundamental problems with these models.
 

CyberPowerPC cites 500% RAM surge as MAINGEAR warns of deeper memory shortages


CyberPowerPC has announced that it will raise prices on all gaming PCs starting December 7th, 2025. In a statement to customers, the company links the move to sharp component cost increases and frames the change as a necessary adjustment rather than a permanent shift in its pricing model.

According to the notice, RAM prices have jumped by 500% and SSD prices have doubled since October 1st, 2025. CyberPowerPC says these swings have had a direct impact on the cost of building systems and that it can no longer absorb the difference. The update is aimed at giving customers advance warning, with the company stressing transparency as the holiday shopping period begins.
 

NVIDIA rumored to stop bundling memory with GPUs

We often remind you that NVIDIA does not sell only GPUs. All the AI and data center business aside, what gamers need to know is that NVIDIA sells memory and GPUs together as a bundle. This has been the model for years, and it is an easy way for board makers to obtain the most important components for graphics cards. Everything else can be sourced through normal channels, and everything else is what makes custom graphics cards 'special'.

However, we are now facing a memory shortage, and apparently NVIDIA is changing how it works with board partners. According to hardware leaker "Golden Pig Upgrade Pack", while the company previously supplied both GPU cores and the matching memory chips to board makers, it now only ships the GPU die, while AICs must secure their own GDDR memory.
Under the earlier model, AICs bought a complete package that paired GPU cores with preselected memory. This simplified purchases and kept specifications tightly controlled. If the rumor is real, partners now have to negotiate directly with DRAM vendors for GDDR6X or GDDR7 chips, manage fast-moving prices on their own, and ensure those parts line up with NVIDIA reference designs. That shifts more supply-chain risk from NVIDIA to its partners.

Golden Pig says this change hits smaller brands hardest. Companies without long-standing links to memory suppliers are described as being ignored when they try to arrange allocations, making it harder for them to stay in the graphics card business at all. I imagine that if EVGA were still around, it would have found a way to source the memory, but each time these changes are reported, I immediately recall how EVGA described the growing tension in the situation for NVIDIA board partners. It definitely doesn't get any easier.

DRAM prices must be really bad if NVIDIA prefers to abandon it all and leave the sourcing to AIBs.
We can expect AIBs to increase GPU prices soon enough to cover the costs.

PS: Fuck this AI boom.
 

NVIDIA rumored to stop bundling memory with GPUs




DRAM prices must be really bad if NVIDIA prefers to abandon it all and leave the sourcing to AIBs.
We can expect AIBs to increase GPU prices soon enough to cover the costs.

PS: Fuck this AI boom.
Back to eDRAM, much bigger on-die SRAM pools, and slower VRAM that data centres do not want… that could be a way out of this.

Sure, it also increases complexity, whether through techniques like Sampler Feedback Streaming / PRT or manual dev-managed streaming (say hi to PS2 ;)), but you just adapt to changing markets.
 
Damn, this shitshow almost makes me feel like my 64GB of RAM is not enough and that I should upgrade to 128GB before it gets even worse.
Time to snap out of it, I guess! By the time 64GB isn't enough, these prices will probably be back to somewhat normal.
 
What a shitshow. I feel bad for anyone looking to buy a prebuilt PC, or even trying to put together their own, with all of the bullshit involving DRAM prices right now.
Anyone looking for a PC should grab one right now from old stock. You can still get pretty good pricing. If you wait even a couple of weeks, that will change.
 
The future is being able to generate whatever slop-filled videogame you can imagine but not having the hardware to be able to run it.
 
This really is crazy. I got 32GB for a new PC at around double the old price, and if I'd waited a little longer it would've been triple. I literally ordered at a price that was already pretty wack, and the next moment I refreshed the store page it was 50% more expensive; lucky, perhaps, lol? Owning RAM is almost like owning gold now, as some say. Fuck AI, we did well without it. Fuck western "progress", it's more like a rush towards ruin. More yet, fuck GPU manufacturers, FUCK AMD, FUCK NVIDIA. They have become pathetically, unashamedly and blatantly anti-consumer, anti-end-user, anti-gaming, anti-art.
 
Made a build last weekend. Got 32GB of RAM for $220. Not great, but not as bad as it's looking now. And a 9600X and a 9070 XT.

Installed Linux too and it's running great. Bye windows :)
 
This really is crazy. I got 32GB for a new PC at around double the old price, and if I'd waited a little longer it would've been triple. I literally ordered at a price that was already pretty wack, and the next moment I refreshed the store page it was 50% more expensive; lucky, perhaps, lol? Owning RAM is almost like owning gold now, as some say. Fuck AI, we did well without it. Fuck western "progress", it's more like a rush towards ruin. More yet, fuck GPU manufacturers, FUCK AMD, FUCK NVIDIA. They have become pathetically, unashamedly and blatantly anti-consumer, anti-end-user, anti-gaming, anti-art.

Western lol. Nobody is going harder on AI than the Chinese. It's being integrated into far more over there and at a far higher speed than over here.
 
Anyone looking for a PC should grab one right now from old stock. You can still get pretty good pricing. If you wait even a couple of weeks, that will change.
I've been thinking about going all in and buying a high-end rig over the past couple of days. It feels surreal that I look at this potential investment (4,000 to 4,500 EUR) and feel some kind of urgency now, because the way things are going, I may end up regretting not buying before prices spiral even more out of control next year.
 
Western lol. Nobody is going harder on AI than the Chinese. It's being integrated into far more over there and at a far higher speed than over here.
And guess who envies them, copies them, and tries to be "better" than them? Right..
But I meant that in a more specific way anyhow. The illusion that things are getting better and can keep getting better forever obviously isn't true wherever you look.
 
Western lol. Nobody is going harder on AI than the Chinese. It's being integrated into far more over there and at a far higher speed than over here.
The Chinese are balls-deep into AI across the board, sure, but over here almost every Google result contains an AI response (albeit cached for most), and Windows is now plastered with this crap.
"Calm the fuck down" is the only way I can put it. I know they need to recoup their investments quickly, but this is just absurd.
 
And guess who envies them, copies them, and tries to be "better" than them? Right..
But I meant that in a more specific way anyhow. The illusion that things are getting better and can keep getting better forever obviously isn't true wherever you look.

DM me when the models stop getting better.

In a generation*, this type of hysterical anti-AI sentiment will look completely absurd; people will have to tell their children that folks had daily meltdowns over the next computing and software paradigm.

I would just settle in for it, because you're not going to be able to keep up this level of agitation for the rest of your life.

*well, assuming we make it to another generation
 
I don't think you'd see a difference with 96GB vs 64GB, unless you play something like Skyrim with every mod known to man activated.
If you do some AI stuff or run containers and VMs, it helps. Although doing inference on the CPU is a pain 😅.

I was going to get more RAM for home lab but it ain't happening now.
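For anyone sizing a box for local "AI stuff": a rough back-of-the-envelope estimate is weight memory (parameter count × bytes per weight) plus some headroom for activations and the KV cache. The 1.2× overhead factor and the byte counts below are assumptions for illustration, not measurements:

```python
def model_ram_gb(params_billions, bytes_per_weight, overhead=1.2):
    """Very rough resident-memory estimate (in GB) for local LLM
    inference: weights plus an assumed ~20% overhead for activations
    and the KV cache. Billions of params x bytes per weight = GB."""
    return params_billions * bytes_per_weight * overhead

for name, params in [("7B", 7), ("13B", 13), ("70B", 70)]:
    fp16 = model_ram_gb(params, 2.0)   # 16-bit weights
    q4 = model_ram_gb(params, 0.5)     # ~4-bit quantized
    print(f"{name}: ~{fp16:.0f} GB at fp16, ~{q4:.0f} GB at 4-bit")
```

By this crude math a 70B model at fp16 wants well over 128GB, which lines up with why private-server and home-lab folks in this thread are the ones sweating the RAM prices.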
 
I love playing MMORPGs on my own private server. Depending on the game, up to 200GB of RAM is needed.
Yeah, that sounds like a RAM-heavy thing to do, indeed! But for playing games the "regular" way, 64GB of RAM is more than enough for a long time. It's not like upcoming big games such as The Witcher 4 or GTA 6 will require more than 64GB.
 