
NVIDIA GeForce RTX 50 “Blackwell” GPU Configs Revealed

Jigsaah

Member
I think I'm gonna have to bow out of this gen. A $2000-2500 GPU is nonsense when my 4090 crushes all games that I play.

Gonna chill out and get the next X3D AMD CPU, new mobo and RAM. Only way I buy a 5090 is if I can sell the 4090 at damn near MSRP.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Again with the stupid 128-bit bus.

Expect another XX60-tier failure that isn't worth its cost.

Cache plus general gains from the architecture and GDDR7 will mean the bus width won't be anywhere near like-for-like gen on gen.

The 4060 (Ti) used GDDR6... not 6X, base 6, on a 128-bit bus.
Yeah, that thing was choking for dear life.

A 128-bit bus with GDDR7 and more cache will be much, much less of an issue.

It's still 2 GB chips so I wouldn't expect any vram increases except for the 5090, which might be 28 GB.
I'll fuck a duck if any cards are 8GB (outside of those oddball launches like a 5050 or something)
I expect the baseline to be 10-12GB now.
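For scale, the raw bus math is a quick back-of-the-envelope sketch; the 18 and 28 Gbps per-pin data rates below are illustrative assumptions, not confirmed specs:

```python
# Peak theoretical bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(128, 18))  # 288.0 GB/s -- 128-bit GDDR6 (4060 Ti class)
print(bandwidth_gbs(128, 28))  # 448.0 GB/s -- 128-bit GDDR7 at an assumed 28 Gbps
```

So on the same 128-bit bus, GDDR7 at those rates buys roughly 50% more raw bandwidth before cache is even counted.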
 
I'll fuck a duck if any cards are 8GB (outside of those oddball launches like a 5050 or something)
I expect the baseline to be 10-12GB now.

The 128 bit cards are going to be 8 GB. Maybe they will do 16 again for the 5060 Ti, but I'd hate to see what the premium would be.

Edit: The mobile lineup got leaked earlier, and it's 8-8-8-12-16-16 instead of 6-8-8-12-16, although it appears the products have moved up a tier in pricing.
 
 

Dorfdad

Gold Member
$2500 MSRP, because some will pay. Imagine: you could get a Switch 2, a PS5 Pro, and a new Series X, plus a year of Game Pass and PS Now, for about $1500 and have a shit ton of games to play, or pay $2500 for a GPU that zero games will push for 2-3 years.

That said, I want one, but I think I might have to slip into GeForce Now Ultimate once they support these cards in the cloud. It's becoming harder to stay on the front lines. PC, GPU, and monitor all need to be top of the line to support this stuff, and we still have an OS that sucks for gaming!
 

//DEVIL//

Member
I love to see this. Suddenly it's too expensive? As if the 4090 wasn't overpriced as is? But hey, I warned people about this back when the OG Titan hit, and people bought it up anyway. Nvidia will continue to raise prices because people keep saying OK to it.

And no, this isn't an anti-Nvidia thing; if AMD were the market leader, they'd do it too. That said, I refuse to support it even though I can afford it. I used to have a hard $500 cap on GPUs, but with inflation these days it's bumped up. Spending $1,000 or more on a card is insane to me, though.
You warning people won't change anything. It's not like you and I have a million followers, or are big YouTubers or influencers who matter a bit (even if we were, it would mean nothing). God knows how many times Hardware Unboxed tried that, and people still bought the 4090 in droves, and I am one of them.

Yes, it's expensive for a video game hobby, but each person's limit is different. Mine is probably about $2k max. Others', $3k. Half of the peeps here wouldn't spend over $500.

So save your breath. We are not the center of the world.
 
I have just upgraded my old CPU (3770K to a 7800X3D), but I'm also planning to upgrade my GPU in August because the GTX 1080 can no longer give me 50/60fps at 1440p in the latest games, even at console-like settings.

The problem is that even something like the RTX4080 is not fast enough to run the most demanding games (RT/PT games like Alan Wake 2) at 1440p 60fps, at least not without upscaling, so I would spend $1000 and still not be 100% happy with the results.

I can buy something like a 4070tiS / 4080 in August and play the most demanding games with compromises (DLSS3 and FG), or wait for the launch of the RTX5080 and hope that it will offer a substantial RT performance improvement. Tough decision :(.
 

OmegaSupreme

advanced basic bitch
I have just upgraded my old CPU (3770K to a 7800X3D), but I'm also planning to upgrade my GPU in August because the GTX 1080 can no longer give me 50/60fps at 1440p in the latest games, even at console-like settings.

The problem is that even something like the RTX4080 is not fast enough to run the most demanding games (UE5 or games with RT) at 1440p 60fps, at least not without upscaling. I can buy something like a 4070tiS / 4080 in August and play the most demanding games with compromises (DLSS3 and FG), or wait for the launch of the RTX5080 and hope that it will offer a substantial RT performance improvement.
DLSS isn't much of a compromise, I don't think. It's damn good. I've got a 3080 and can pretty much max out most things at 1440p, even with RT.
 
DLSS isn't much of a compromise, I don't think. It's damn good. I've got a 3080 and can pretty much max out most things at 1440p, even with RT.
I have only seen DLSS in screenshot comparisons, but in games like The Witcher 3 or Ghost of Tsushima I have used FSR2 quality and I did not like the results, so I would definitely want to play at native. How much better does DLSS look compared to FSR2?

I don't like how FSR2 looks, but FSR2 native looks great in Ghost of Tsushima; it has similar sharpness to SMAA, but no shimmering. If DLSS looked like this, I would be happy with the image quality.
 

hinch7

Member
5070 Ti @ 192 bit….. I dunno guys. Not ideal.
GDDR7 and potentially larger cache will mitigate bandwidth worries. I'd be more concerned if they stuck with 12GB. In any case, I doubt Nvidia cares; they mostly just want to sell GB202s to gamers.
 

Sentenza

Member
Based on this, the 5080 should be weaker than the 4090.

Massive fucking scam again: the 5090 will be fucking big and powerful, and then we have the mid-tier 5080, with a massive gap in between.

This is Ada part 2: we will scam you again!

Aside from the fact that he was obviously being cheeky with the marketing tagline, I wonder if people realize he wasn't even remotely thinking of the consumer market during that presentation.
It was all aimed at businesses running their own server farms, not Average Joe using his 1070 to play video games.
 

SHA

Member
Nah man... we all know what we buy and for how much, and how these fit into our personal sense of value. Nvidia's practices are well known and unapologetic, arguably shameful, but it certainly doesn't constitute a scam in any sense of the term.
Jensen is always doubling down on technology, even though the silicon is failing him.
 

Gp1

Member
And here is my thoroughly PTSD'd 1060, waiting for a good xx60/60 Ti value offer for another generation...

Maybe I'll go with a 5070 if they don't f* us over on VRAM.

It's Nvidia; let's wait for AMD and see what the gimmicks and the fine print will be this time...
 
The 4090 has outsold the 3090 2x with zero crypto demand and no covid bucks, and the 4070's (same price bracket as 3080) have nearly 2x'd the 3080. No one is actually broke, that's just theater they post online. Dumbass half retarded early 20's kids with a GED making $30/hr out here now. A $2500 5090 will enjoy sub-1 minute sellouts for 6+ months just like the 4090 did.
I want to believe that, but the 4090 was always within reach for me, while the 3090 never was. Standing in line, trying to win spots in raffles. Maybe it was just bad luck?
 

nkarafo

Member
I'll fuck a duck if any cards are 8GB (outside of those oddball launches like a 5050 or something)
I expect the baseline to be 10-12GB now.
But can a 128-bit bus card have 12 GB? Isn't it restricted to either 8 or 16? Which automatically means 8, since it's Nvidia we are talking about here?

I'm calling it right now: the 5060 will be the same crappy deal as the 4060, 8GB at $400 and 16GB at $500, with very small performance gains over the 3060/4060.

My expectations are really low after waiting so many years for a decent deal on a mid-range Nvidia card.
 

Bojji

Member
But can a 128-bit bus card have 12 GB? Isn't it restricted to either 8 or 16? Which automatically means 8, since it's Nvidia we are talking about here?

I'm calling it right now: the 5060 will be the same crappy deal as the 4060, 8GB at $400 and 16GB at $500, with very small performance gains over the 3060/4060.

My expectations are really low after waiting so many years for a decent deal on a mid-range Nvidia card.

In theory they can use 3GB chips, so a 128-bit GPU can have 12GB of memory.
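The chip math behind the 8GB-or-16GB question, as a quick sketch: each GDDR chip exposes a 32-bit interface, so the bus width fixes the chip count, and chip count times per-chip capacity fixes the VRAM.

```python
# VRAM capacity from bus width and per-chip (module) capacity.
# Each GDDR6/GDDR7 chip has a 32-bit interface, so chip count = bus width / 32.
def vram_gb(bus_width_bits: int, chip_capacity_gb: int) -> int:
    chips = bus_width_bits // 32          # e.g. a 128-bit bus hosts 4 chips
    return chips * chip_capacity_gb

print(vram_gb(128, 2))  # 8  -> with today's 2GB modules
print(vram_gb(128, 3))  # 12 -> once 3GB modules are available
print(vram_gb(192, 2))  # 12 -> a 192-bit card like the 4070
```

(16GB on a 128-bit bus comes from running two chips per 32-bit channel in clamshell mode, which is how the 16GB 4060 Ti was done.)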

 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
But can a 128-bit bus card have 12 GB? Isn't it restricted to either 8 or 16? Which automatically means 8, since it's Nvidia we are talking about here?

I'm calling it right now: the 5060 will be the same crappy deal as the 4060, 8GB at $400 and 16GB at $500, with very small performance gains over the 3060/4060.

My expectations are really low after waiting so many years for a decent deal on a mid-range Nvidia card.


The 5060s are coming way down the line, when 3GB modules will be available.
So 12GB shouldn't be outside the realm of possibility... but as you said, it's Nvidia, and they are committed to GDDR7, so maybe they'll fuck the entry-level guys over again.
 

Klik

Member
Upgrading from an i5 12400F + RTX 3060 Ti, I wanna get a Ryzen 9900X3D + RTX 5080, but honestly I don't see any new UE5 game that would justify this GPU...

My RTX 3060 Ti is still running new games at 60-80fps with DLSS and high settings at 1440p...
 

hinch7

Member
And here is my thoroughly PTSD'd 1060, waiting for a good xx60/60 Ti value offer for another generation...

Maybe I'll go with a 5070 if they don't f* us over on VRAM.

It's Nvidia; let's wait for AMD and see what the gimmicks and the fine print will be this time...
Not sure if that's ever going to happen. If you're in the market for a sub-$400 GPU, I'd look into getting something now instead of waiting a year or two to get something marginally better, or worse, a regression in cost per frame.

6800s are going for around $360, which is already faster than a 4060 Ti and comes with 16GB of VRAM. And it's not like these lower-end SKUs are going to be that much more performant than the current ones.
 

simpatico

Member
GDDR7 and potentially larger cache will mitigate bandwidth worries. I'd be more concerned if they stuck with 12GB. In any case, I doubt Nvidia cares; they mostly just want to sell GB202s to gamers.

Not many gamers want to get financing on a GB202 and the power supply required to make it turn over, especially when they've all learned the shine wears off in 18 months. I've got a shelf in my workshop with the first two GTX Titan cards Nvidia ever came out with. They serve as my reminder, and I never dust them.
 

StereoVsn

Gold Member
Aside from the fact that he was obviously being cheeky with the marketing tagline, I wonder if people realize he wasn't even remotely thinking of the consumer market during that presentation.
It was all aimed at businesses running their own server farms, not Average Joe using his 1070 to play video games.
Yeah, he was referring to scaling in datacenters and power consumption: the more GPUs you buy, the faster you run your inference and training, the more power you save, and so on.

I mean, it's still bullshit, but it's not talking about regular consumers. It's trying to sell to hyperscalers by the literal truckload at $20-40K a card.
 
And here is my thoroughly PTSD'd 1060, waiting for a good xx60/60 Ti value offer for another generation...

Maybe I'll go with a 5070 if they don't f* us over on VRAM.

It's Nvidia; let's wait for AMD and see what the gimmicks and the fine print will be this time...

Performance
Price
VRAM

Nvidia: Choose two!
 

Puscifer

Member
I am really dreading the prices they're gonna charge for the 5080 and 5090...
Has it been confirmed yet? I can't lie, I've already decided the 40 series and PS5, along with my Deck and Pocket, are the last systems I'm owning in general, but damn, I can't do this pricing anymore lol
 

BennyBlanco

aka IMurRIVAL69
I am gonna sit this one out because there are legit zero demanding games coming out next year that I wanna play. What game is Nvidia gonna use to sell these? Wukong?
 

Celcius

°Temp. member
I am gonna sit this one out because there are legit zero demanding games coming out next year that I wanna play. What game is Nvidia gonna use to sell these? Wukong?
I'm hoping Gears of War E-Day has ray tracing (or maybe even path tracing)
 

analog_future

Resident Crybaby
I am gonna sit this one out because there are legit zero demanding games coming out next year that I wanna play. What game is Nvidia gonna use to sell these? Wukong?

Marvel 1943 will be a really good one. And maybe Fable, DOOM: The Dark Ages, and Perfect Dark as well.

Not to mention 2026 games (6090 still won't be out by then) like Gears of War: E-Day, GTA VI, and all kinds of stuff that hasn't been announced yet.
 