Maybe a bit of accelerationism is exactly what we need in order to get back to God, family and country. But think of all the good things AI will bring into your life! You'll get CEO-approved movies made with AI so they don't have to pay anyone, CEO-approved TV streaming shows so they don't have to pay anyone. You'll get to spend time on the internet with social media flooded with AI that's there to keep you in your bubble. Why even play games? Let the AI play games for you and give you a brief bullet-point summary of the best and worst moments for you to praise and be outraged by. Then you can video call your family for Christmas and hear valuable tales from your dead grandparents thanks to the gift of AI. Then you can focus on your work in the factory, go to work in the factory, make sure to watch AI-generated content during your break, then get back to work. Enjoy your new life enhanced by AI! You won't ever need to talk to another human being again; AI will chat with other AIs on your behalf, leaving you with even more time to consume content generated just for you!
I've posted this in another thread, but due to all this AI race bullshit that sent DRAM prices (especially DDR5) skyrocketing over the past few weeks, I've decided to grab a new PC now instead of waiting until spring (I was banking on Nvidia supposedly refreshing the 5000 series with Super variants).
Got a nice BF discount on the system I bought yesterday, so I'm very content with the decision to get a new PC sooner rather than later. Specs are 9800X3D, 32GB DDR5 and a 5070 Ti.
Congrats on your new PC.
>Congrats on your new PC.
Thanks, my bro! Really happy with it since it's a big upgrade over what I had before (old i7‑6700K, 16GB DDR4, and a 1080 Ti). It's also been stable in everything I've tested so far: WuWa with everything maxed out including RT, DNA, ZZZ, 3DMark Time Spy (ran a bunch of loops), etc. The only thing I still don't like is Windows 11, but it is what it is.
>Thanks, my bro! Really happy with it since it's a big upgrade over what I had before (old i7‑6700K, 16GB DDR4, and a 1080 Ti). It's also been stable in everything I've tested so far: WuWa with everything maxed out including RT, DNA, ZZZ, 3DMark Time Spy (ran a bunch of loops), etc. The only thing I still don't like is Windows 11, but it is what it is.
Oh wow, that's a huge upgrade. You won't need any upgrades for several years now. I also recently switched to 11, and I'm not really warming up to it either.
Fuck AI.
The models get better within their limitations, but the limitations are still there. Like it or not, these AIs are just statistical models at the end of the day, and statistics can only achieve so much.
>Can you tell the class all about the limitations? There are some AI researchers on X that I'd like to share your findings with.
>AIs rely mainly on statistical models, which makes them really bad at continuity and long-term thinking. This is evident in things like video generation, where they often cannot produce videos longer than 5-second clips without beginning to hallucinate. Even generating something like a comic strip often comes with shifting backgrounds and other inconsistent elements. It's a very fundamental flaw in the way these AIs operate. It's possible to improve on at the cost of larger datapools for training, but...
>The available datapool these models need to train on is limited. As much data as we have now, there's a growing, exponential need for more in order to improve the models, and that data might simply not be available. Worse yet, these pools are starting to get contaminated with AI-generated content itself, causing inbreeding.
>These models' learning capabilities are very limited outside of the training phase. This makes goals like AGI practically impossible to achieve, among other limitations.
Of course, that's all without getting into the economics of the thing.
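To make the first point concrete, here's a toy sketch (pure illustration I wrote for this post, not any real model's code) of why purely statistical next-step generation drifts over long horizons: every step re-ingests the model's own slightly-wrong output, so small errors compound with the length of the rollout.

```python
import random

def rollout(steps, noise=0.05, seed=0):
    """Toy autoregressive generator: each 'frame' is predicted from the
    previous frame plus a small random error. Because the generator keeps
    feeding on its own output, the error compounds over the rollout."""
    rng = random.Random(seed)
    true_value = 1.0   # the ground truth never changes
    predicted = 1.0    # the model starts out correct
    drift = []
    for _ in range(steps):
        predicted += rng.gauss(0, noise)  # re-ingest own slightly-wrong output
        drift.append(abs(predicted - true_value))
    return drift

drift = rollout(300)
# Short horizons stay close to the truth; long rollouts wander off,
# which is the "5 second clip" problem in miniature.
print(f"drift at step 10: {drift[9]:.3f}, at step 300: {drift[299]:.3f}")
```

Real video models are obviously far more complex than a one-line random walk, but the compounding-error dynamic is the same reason coherence gets harder the longer the clip runs.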
>Well, suffice to say I don't have any new research insights into the fundamental limitations of LLMs
The vast majority of people barely understand the basics, which is why they treat AI like some code wizardry that can do anything. Kind of like how people in the 50s and 60s thought we'd have fully human-like robots by the early 2000s, because "electronics" was essentially magic to them.
>Can confirm, the RAM kit I bought a few months back is now 70% more expensive at the same retailer.
Just updating my post. That same RAM kit is now more than double. That was fast. I think everyone involved in the supply chain is being a greedy cunt again.
>The coherency issue over time you mentioned just doesn't seem to be a fundamental problem at all. Each new model improves on that. Google's Genie 2 world model had an 'interaction horizon' of 10-20 seconds. Their Genie 3 model got that up to several minutes.
Like I said, it's possible to improve on that, but it gets exponentially harder. Also, on the Genie example you gave, remembering an artificial environment (as in saving what it previously generated to memory) is different from coherence and continuity. What Genie is doing there is different from creating a video.
>In so much as there is a data limit in the form of readily accessible basic text data there are already multiple solutions and mitigations like better curation, multimodal data and synthetic data.
Those aren't solutions, they're mitigations. Synthetic data, for example, goes back to the inbreeding issue I mentioned.
>The models are already using a significant amount of synthetic data and, no, it isn't harming the models or their progress.
Over time, repeated use of synthetic data causes assumption loops and biases that need explicit correction. There's even a term for its overuse: Model Autophagy Disorder, or MAD. Synthetic data can at best complement real-world data; it cannot replace it.
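The inbreeding effect is easy to demonstrate with a toy experiment (my own sketch for this thread, not code from any MAD paper): fit a simple "model" to data, then train each new generation only on samples from the previous generation's model. Information in the tails gets lost every round, and the distribution collapses.

```python
import random
import statistics

def refit_generations(generations=200, n_samples=20, seed=0):
    """Toy 'model collapse' loop: generation 0 is real data (mean 0, stdev 1).
    Every later generation fits a mean/stdev only to samples drawn from the
    previous generation's fit, never touching real data again. The fitted
    spread decays towards zero as tail information is lost each round."""
    rng = random.Random(seed)
    mu, sigma = 0.0, 1.0
    history = [sigma]
    for _ in range(generations):
        samples = [rng.gauss(mu, sigma) for _ in range(n_samples)]
        mu = statistics.fmean(samples)    # 'train' on purely synthetic data
        sigma = statistics.pstdev(samples)
        history.append(sigma)
    return history

history = refit_generations()
print(f"fitted stdev: gen 0 = {history[0]:.3f}, "
      f"gen {len(history) - 1} = {history[-1]:.6f}")
```

Mixing real data back in each round slows or stops the collapse, which is why synthetic data can complement the real thing but not replace it.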
>Continual learning is being worked on and there has been some progress. Nobody doubts it is a hard problem to crack but there isn't any evidence it's an impossible challenge to overcome.
There are bags of tricks for specific situations that can help mitigate the problem or move the frontier in certain cases, like the one I mentioned above: saving details of the artificial environment to memory, so that turning away from a rock and back again doesn't make the rock change or disappear. But again, these do not solve the fundamental problems with these models.
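That "remember the rock" trick is essentially a persistence cache bolted onto the generator. A minimal sketch (hypothetical, nothing like Genie's actual architecture): key generated content by location, and on revisit return the cached result instead of sampling again.

```python
import random

class CachedWorld:
    """Toy world generator with a persistence cache: the first visit to a
    tile samples its content from the 'model'; revisits return the cached
    tile, so looking away and back cannot change what was there."""
    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.cache = {}  # (x, y) -> generated tile

    def look_at(self, x, y):
        if (x, y) not in self.cache:
            # "generate" a tile; stands in for a sampling step in a real model
            self.cache[(x, y)] = self.rng.choice(["rock", "tree", "grass"])
        return self.cache[(x, y)]

world = CachedWorld()
first = world.look_at(3, 4)          # generate the tile once
world.look_at(0, 0)                  # look somewhere else
assert world.look_at(3, 4) == first  # the rock is still a rock
```

Note that the consistency comes from the cache, not from the model itself, which is the point above: it mitigates the symptom without changing the underlying statistical generator.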
CyberPowerPC has announced that it will raise prices on all gaming PCs starting December 7th, 2025. In a statement to customers, the company links the move to sharp component cost increases and frames the change as a necessary adjustment rather than a permanent shift in its pricing model.
According to the notice, RAM prices have jumped by 500% and SSD prices have doubled since October 1st, 2025. CyberPowerPC says these swings have had a direct impact on the cost of building systems and that it can no longer absorb the difference. The update is aimed at giving customers advance warning, with the company stressing transparency as the holiday shopping period begins.
What a shitshow. I feel bad for anyone looking to buy a prebuilt PC, or even trying to put together their own, with all of the bullshit involving DRAM prices right now.
We often remind you that NVIDIA does not sell only GPUs. AI and data center business aside, what gamers need to know is that NVIDIA sells memory and GPUs together as bundles. This has been the model for years, and it is an easy way for board makers to obtain the most important components of a graphics card. Everything else can be sourced through normal channels, and that "everything else" is what makes custom graphics cards special.
However, we are now facing a memory shortage, and apparently NVIDIA is changing how it works with board partners. According to hardware leaker "Golden Pig Upgrade Pack", while the company previously supplied both GPU cores and the matching memory chips to board makers, it now only ships the GPU die, while AICs must secure their own GDDR memory.
Under the earlier model, AICs bought a complete package that paired GPU cores with preselected memory. This simplified purchases and kept specifications tightly controlled. If the rumor is real, partners now have to negotiate directly with DRAM vendors for GDDR6X or GDDR7 chips, manage fast-moving prices on their own, and ensure those parts line up with NVIDIA reference designs. That shifts more supply-chain risk from NVIDIA to its partners.
Golden Pig says this change hits smaller brands hardest. Companies without long-standing links to memory suppliers are described as being ignored when they try to arrange allocations, making it harder for them to stay in the graphics card business at all. I imagine that if EVGA were still around, it would have found a way to source the memory, but each time changes like these are reported, I immediately recall how EVGA described the situation for NVIDIA board partners as one of growing tension. It definitely doesn't get any easier.
>NVIDIA rumored to stop bundling memory with GPUs
Going back to eDRAM, much bigger on-die SRAM pools, and slower VRAM that data centres don't want… that could be a way out of this.
DRAM prices must be really bad if Nvidia prefers to abandon it all and leave the sourcing to AIBs.
We can expect AIBs to increase GPU prices soon enough to cover the costs.
PS: Fuck this AI boom.
>What a shitshow. I feel bad for anyone looking to buy a prebuilt PC, or even trying to put together their own, with all of the bullshit involving DRAM prices right now.
Anyone looking for a PC should grab one right now from old stock. You can still get pretty good pricing. If you wait even a couple of weeks, that will change.
>NVIDIA rumored to stop bundling memory with GPUs
I am tempted to grab a backup GPU and an additional NVMe drive before the shit really hits the fan. I don't need either, though.
This really is crazy. I got 32GB for my new PC at around double the price it used to be, and if I had waited a little while longer it would've been triple. I literally ordered at a price that was already pretty wack, and the next moment I refreshed the store page it was 50% more expensive; luck, perhaps, lol? Owning RAM is almost like owning gold now, as some say. Fuck AI, we did well without it. Fuck western "progress"; it's more like a rush towards ruin. More yet: fuck GPU manufacturers, FUCK AMD, FUCK NVIDIA. They have become pathetically, unashamedly and blatantly anti-consumer, anti-end-user, anti-gaming, anti-art.
>Anyone looking for a PC should grab one right now from old stock. You can still get pretty good pricing. If you wait even a couple of weeks, that will change.
I've been thinking about going all in and buying a high-end rig over the past couple of days. It feels surreal to look at this potential investment (4,000 to 4,500 EUR) and feel some kind of urgency now, because the way things are going, I may end up regretting not buying when prices spiral even more out of control next year.
>Western lol. Nobody is going harder on AI than the Chinese. It's being integrated into far more over there and at a far higher speed than over here.
And guess who envies them and copies them and tries to be "better" than them? Right..
>Western lol. Nobody is going harder on AI than the Chinese. It's being integrated into far more over there and at a far higher speed than over here.
The Chinese are balls-deep into AI across the board, but over here almost every Google result contains an AI response (albeit cached for most), and Windows is now plastered with this crap.
>And guess who envies them and copies them and tries to be "better" than them? Right..
But I meant that in a more specific way anyhow. The illusion that things are getting, and can keep getting, better forever obviously isn't true wherever you look.
I wonder how long it will take until we build Skynet.
I don't think you'd see a difference with 96GB vs 64GB, unless you play something like Skyrim with every mod known to man activated.
Now I regret that I only bought 64gb ram when building my pc. I should have gone for 96gb.
>I don't think you'd see a difference with 96GB vs 64GB, unless you play something like Skyrim with every mod known to man activated.
If you do some AI stuff or run containers and VMs, it helps. Doing inference on the CPU is a pain, though.
>I don't think you'd see a difference with 96GB vs 64GB, unless you play something like Skyrim with every mod known to man activated.
Even this insane modlist for Skyrim doesn't have a specific system RAM requirement, because of how Skyrim loads data.
>I don't think you'd see a difference with 96GB vs 64GB, unless you play something like Skyrim with every mod known to man activated.
I love playing MMORPGs on my own private server. Depending on the game, up to 200GB of RAM is needed.
>I love playing MMORPGs on my own private server. Depending on the game, up to 200GB of RAM is needed.
Yeah, that sounds like a RAM-heavy thing to do, indeed! But for playing games the "regular" way, 64GB of RAM is more than enough for a long time. It's not like upcoming big games such as Witcher 4 or GTA 6 will require more than 64GB.