Report: Nvidia is cancelling RTX 50 Super Series because of AI demand

I have a 4070 Ti, but I'm getting massive FOMO and I kinda want to bite on a 5080 due to price increases, before it's too late. The 5090 is too much for my budget. Help :( What do I do?
 
I have a 4070 Ti, but I'm getting massive FOMO and I kinda want to bite on a 5080 due to price increases, before it's too late. The 5090 is too much for my budget. Help :( What do I do?
Get the 5080. It's a good card, and the somewhat limited VRAM is probably less of an issue when using DLSS.

I have a 5070 Ti in my secondary PC in the living room, and I haven't had any VRAM issues with it when playing on a 4K TV with DLSS.
 
You don't know what "an entire Super refresh" is, and at this point I doubt that even Nvidia knows.
Given that, anyone speculating on timings, or on how it could affect a full generational ramp in the form of a 60 series, is just speculating.
And I've seen enough cringeworthy takes in such speculation to know when someone is doing it without actually having any knowledge whatsoever.

Nvidia definitely knows what the Super cards are. You aren't talking about the 50-series Super, are you?

I can guarantee you that they were pencilled in for Q1 2026 and moved to Q3 2026. Then......

Nvidia will give the classic line of "You can't delay / cancel something that's never been announced."
 
Get the 5080. It's a good card, and the somewhat limited VRAM is probably less of an issue when using DLSS.

I have a 5070 Ti in my secondary PC in the living room, and I haven't had any VRAM issues with it when playing on a 4K TV with DLSS.
I've been seeing the 5080 trade blows with the 4090 (the 4090 is still better overall) and kick the asses of AMD's offerings, but it's a case-by-case thing, so, yeah. The card is good, though. I was checking it out versus the 5090. I am probably just going to undervolt my own GPU; I want it to last. After the 5xxx series there's no telling wtf can happen next.
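
For anyone wanting to do something similar without fiddling with voltage/frequency curves in Afterburner, a rough stand-in is simply capping the board power limit from the command line. A minimal sketch in Python, assuming the NVIDIA driver's nvidia-smi tool is on your PATH and you run it with admin rights; the 280 W number is purely an example, not a recommendation:

# Rough alternative to an undervolt: cap board power with nvidia-smi.
# Assumes the NVIDIA driver is installed and nvidia-smi is on PATH.
import subprocess

POWER_LIMIT_WATTS = 280  # example value only, tune for your own card

# Show the current draw and limit before changing anything.
print(subprocess.run(
    ["nvidia-smi", "--query-gpu=power.draw,power.limit", "--format=csv"],
    capture_output=True, text=True, check=True).stdout)

# Apply the new cap (needs admin/root rights; does not persist across reboots).
subprocess.run(["nvidia-smi", "-pl", str(POWER_LIMIT_WATTS)], check=True)

It's not the same as a proper undervolt (you give up a bit more performance per watt saved), but it's reversible and hard to get wrong.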
 
You don't know what "an entire Super refresh" is, and at this point I doubt that even Nvidia knows.
Given that, anyone speculating on timings, or on how it could affect a full generational ramp in the form of a 60 series, is just speculating.
And I've seen enough cringeworthy takes in such speculation to know when someone is doing it without actually having any knowledge whatsoever.
Of course I'm speculating; I'm not an insider. But with the way DRAM prices are going, as well as Nvidia's focus shifting, I wouldn't be surprised if the Super series is cancelled or priced higher than it should be, and the 6000-series gaming GPUs get pushed out. Or everything might be just fine, and nothing gets delayed or pushed out.
 
More Nvidia factories when?

 
I've been seeing the 5080 trade blows with the 4090 (the 4090 is still better overall) and kick the asses of AMD's offerings, but it's a case-by-case thing, so, yeah. The card is good, though. I was checking it out versus the 5090. I am probably just going to undervolt my own GPU; I want it to last. After the 5xxx series there's no telling wtf can happen next.
I've got the standard $999 PNY 5080, and I can't believe how cool and quiet this thing runs coming from a 3080. Granted, it's a big-ass card, but the thermals and noise level have left me pleasantly surprised, and the card runs anything you can throw at it with zero issues. Anyone thinking about getting one at MSRP needs to pull the trigger soon.
 
I've got the standard $999 PNY 5080, and I can't believe how cool and quiet this thing runs coming from a 3080. Granted, it's a big-ass card, but the thermals and noise level have left me pleasantly surprised, and the card runs anything you can throw at it with zero issues. Anyone thinking about getting one at MSRP needs to pull the trigger soon.
Yeah, surprisingly, in terms of temperature, power draw and noise, my base MSI 5080 has done better than my EVGA FTW 3080 Ti.

It's not bad at all. I didn't bother overclocking since performance is good without it and I don't want extra noise/heat.
 
Yeah, surprisingly, in terms of temperature, power draw and noise, my base MSI 5080 has done better than my EVGA FTW 3080 Ti.

It's not bad at all. I didn't bother overclocking since performance is good without it and I don't want extra noise/heat.
For sure! I had an EVGA 3080 FTW3. I think those cards had a bit higher TGP than the reference design, though, so that may explain our higher thermals. Either way, the new cooling system they devised works damn well for how much power these things can suck down.
 
Get the 5080. It's a good card, and the somewhat limited VRAM is probably less of an issue when using DLSS.

I have a 5070 Ti in my secondary PC in the living room, and I haven't had any VRAM issues with it when playing on a 4K TV with DLSS.
A few games can allocate more than 16 GB at 4K native with maxed-out settings and RT/PT, but right now it's not an issue from a practical point of view. My RTX 4080S is too slow to run Alan Wake 2 at 4K native with PT anyway, and when you turn on DLSS Balanced, VRAM allocation drops to 10-12 GB.

[Alan Wake 2 screenshots showing VRAM allocation at 4K native with PT and with DLSS Balanced]

Indiana Jones with PT and Warhammer 40,000: Space Marine 2 (with the DLC texture pack) are the only games that were VRAM-limited on my RTX 4080 Super at 4K, even with DLSS. I had to set the textures to 'Ultra' in these games to stay within the VRAM budget, but I couldn't see any difference in texture resolution or detail. Therefore, I can't say that having 16 GB of VRAM affected my experience with these two games. The vast majority of 4K games use 9–12 GB of VRAM on my card, so I'm not worried about VRAM allocation for now. However, I'm sure 16 GB will not be enough for PS6/Xbox Two X ports. My card will be done by the time the next generation of consoles launches, which is why I'm planning on replacing it with the RTX 6080.
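
If anyone wants to sanity-check their own games the same way, total VRAM allocation can be polled while the game runs. A minimal sketch in Python, assuming an NVIDIA card with the driver's nvidia-smi tool on PATH (the 2-second interval is arbitrary):

# Poll total VRAM allocation every couple of seconds via nvidia-smi.
# Note: this is driver-level allocation for the whole GPU, in MiB.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True).stdout
    # Take the first line in case there is more than one GPU installed.
    used, total = (int(x) for x in out.splitlines()[0].split(", "))
    print(f"VRAM: {used} / {total} MiB")
    time.sleep(2)

Bear in mind this shows what the driver has allocated, which is usually higher than what the game actually needs at any given moment.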
 