
NVIDIA GeForce RTX 50 “Blackwell” GPU Configs Revealed

DenchDeckard

Moderated wildly
I'm planning on getting a 7900XT as part of a new build before Christmas, should I hold off or is it likely nVidia's new stuff will drastically cut into my budget (€2000 which is €400 over my initial budget).

* Forgot to add I'd probably be looking at the 5070 gpu. The 80/90 cards are way out of bounds for me.

I would 100% wait and make a decision after the 6th of Jan.
 
I'm planning on getting a 7900XT as part of a new build before Christmas, should I hold off or is it likely nVidia's new stuff will drastically cut into my budget (€2000 which is €400 over my initial budget).

* Forgot to add I'd probably be looking at the 5070 gpu. The 80/90 cards are way out of bounds for me.
The 5070 is going to be 700 euro most likely, which seems to be around what the cheapest 7900XT goes for right now. Maybe see if there's a really good Black Friday deal next week on the 7900XT. If you don't have a gaming-capable GPU, it's still a long wait for the 5070: probably March 2025 for proper availability.
 

Bojji

Member
The 5070 is going to be 700 euro most likely, which seems to be around what the cheapest 7900XT goes for right now. Maybe see if there's a really good Black Friday deal next week on the 7900XT. If you don't have a gaming-capable GPU, it's still a long wait for the 5070: probably March 2025 for proper availability.

The 5070 with that 12GB of VRAM - avoid!

There are already games that go up to 13-14GB, even with DLSS Balanced or Performance at a 4K target with RT, FG etc.

The 5070 Ti could be interesting depending on the price, if it has 16GB.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
12gb vram is the only bogey

The 5070 is expected to be 12GB?

Damn.

I'm skipping the generation and haven't been following leaks like I usually would, cuz I lucked into a 4080 and don't want to feel FOMO; otherwise I might sell this thing before it's even had a serious workout.

Well, with no real competish, Nvidia is probably gonna fuck people over unless they course correct like they did with the RTX 30s.
 

Gubaldo

Member
The GTX 260 was the xx70 of its generation.
There was no GTX 270.
Nvidia used to reserve odd numbers for "Supers"... so your xx60 comment still doesn't hold water.

Look, I criticize Nvidia as much as the next guy... mostly because of pricing (my work needs CUDA so I need 'em).
I've been a second-tier buyer since my GTX 260 Core 216, but the third-tier card has never (not in my lifetime) been able to trump the previous generation's top-tier card.
The second-tier cards, yes, they used to be able to keep up with or beat the previous generation's top tier, but the 40 series pretty much broke that, though only in that the 4070 can't beat the 3090; it does keep up with the 3080, and if you include the 4070S then the status quo hasn't really changed.
The xx60s were generally a good buy because of price/performance... not because they kept up with the previous generation's top tier.
It holds water!!! See, even if we consider the GTX 260 a non-x60 card,
there was the GTS 250, which was slightly better than the 8800 GTX/Ultra.

There were four occasions when an x60-class card beat or matched the previous single-GPU high-end card:

6600 GT, which easily beat the 5950 Ultra
7600 GT, which slightly beat the 6800 Ultra
GTS 250, again slightly ahead of the 8800 Ultra
GTX 460 1GB, neck and neck with the GTX 285 (saw this in a TPU review)
 
I wonder how much faster the RTX 5090 will be compared to my RTX 4080S. If the 5090 is going to be 50% faster than the RTX 4090, and the RTX 4090 is up to 48% faster than my 4080S, I guess the RTX 5090 should be around 2x faster than my card? If that's the case, the RTX 5090 should run the most demanding PT games (Portal RTX, Cyberpunk, Alan Wake 2) at 4K DLSS Quality + FG at 100-120fps (I have similar results at 1440p).
 

Gubaldo

Member
I wonder how much faster the RTX 5090 will be compared to my RTX 4080S. If the 5090 is going to be 50% faster than the RTX 4090, and the RTX 4090 is up to 48% faster than my 4080S, I guess the RTX 5090 should be around 2x faster than my card? If that's the case, the RTX 5090 should run the most demanding PT games (Portal RTX, Cyberpunk, Alan Wake 2) at 4K DLSS Quality + FG at 100-120fps (I have similar results at 1440p).
On average the 4090 is ~30% over the 4080S.
I'd "at least" expect the 5090 to be 40% over the 4090.
That would put the 5090 about 80-90% over the 4080S.
Almost 2x.
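The estimate above is just the two uplifts multiplied together, since generational gains compound rather than add. A quick sketch (the 30% and 40% figures are the poster's guesses, not confirmed numbers):

```python
# Generational uplifts compound multiplicatively: a card 30% faster than
# the 4080S, beaten in turn by a card 40% faster than it, ends up at
# 1.30 * 1.40 = 1.82x the 4080S, i.e. ~82% faster.
uplift_4090_over_4080s = 1.30  # poster's estimate
uplift_5090_over_4090 = 1.40   # "at least" expected, also an estimate

total = uplift_4090_over_4080s * uplift_5090_over_4090
print(f"5090 over 4080S: {total:.2f}x ({(total - 1) * 100:.0f}% faster)")
# 5090 over 4080S: 1.82x (82% faster)
```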
 

hinch7

Member
I'm planning on getting a 7900XT as part of a new build before Christmas, should I hold off or is it likely nVidia's new stuff will drastically cut into my budget (€2000 which is €400 over my initial budget).

* Forgot to add I'd probably be looking at the 5070 gpu. The 80/90 cards are way out of bounds for me.
Yeah, I'd wait at this point. Feb for the 5070, or the 8800XT some time in Q1. Granted, the former will have 12GB of VRAM.

The 5070 should destroy the 7900XT at RT, as should the 8800XT... all while drawing way less heat/power.
 

Xdrive05

Member
VRAM is expected to be the same for all cards... except for the 5090 and there likely won't be a 5060 Ti 16 GB.

The refresh might increase it by 50%.

Agreed that this is the most likely situation. All indications are that the VRAM configs are exactly the same, with the eventual "Super" refresh using 3GB modules instead of 2GB modules, which gets you a possible "5060 Super 12GB", at least in theory, though still on the 128-bit bus. The bus would most likely not change with the refresh, but the capacity would go up by 50%.
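To make the bus/capacity relationship concrete: GDDR modules each sit on a 32-bit controller, so capacity is (bus width / 32) x module size. A minimal sketch (the card names in the comments are hypothetical examples matching the speculation above, not confirmed SKUs):

```python
# On a fixed bus width, swapping 2GB GDDR modules for 3GB modules
# raises capacity by 50% without changing the number of controllers.
def vram_capacity(bus_width_bits: int, module_gb: int) -> int:
    modules = bus_width_bits // 32  # one module per 32-bit controller
    return modules * module_gb

print(vram_capacity(128, 2))  # 8  (a launch 128-bit card with 2GB modules)
print(vram_capacity(128, 3))  # 12 (hypothetical "5060 Super 12GB" refresh)
print(vram_capacity(192, 2))  # 12 (a 5070-style 192-bit bus)
```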

No idea where they would put the pricing for the refresh. That probably depends on how the market is responding at that time.

I'm interested to see if the 50XX series leans into the "jack up the L2 cache instead of increasing the bus" approach again. They will at least match Ada Lovelace in that regard, and I'm expecting an increase of some degree.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
It holds water!!! See, even if we consider the GTX 260 a non-x60 card,
there was the GTS 250, which was slightly better than the 8800 GTX/Ultra.

There were four occasions when an x60-class card beat or matched the previous single-GPU high-end card:

6600 GT, which easily beat the 5950 Ultra
7600 GT, which slightly beat the 6800 Ultra
GTS 250, again slightly ahead of the 8800 Ultra
GTX 460 1GB, neck and neck with the GTX 285 (saw this in a TPU review)

deep-sigh-albert-wesker.gif


Okay, let's do this.



The 6600GT was the xx70 of its generation.
The 6700XL was a "Super" 6600; I think it was just one company that did some BIOS editing. Nvidia used to reserve odd numbers for "Supers".


The 7600GT was the xx70 of its generation.
There was no 7700; Nvidia used to reserve odd numbers for "Supers".


The GTS 250 is a rebadged 9800GTX; notice the chip is a G92, same as the 8800.
The 9800GTX+ inexplicably also beats the GTS 250 even though they are effectively the same thing, probably power related.


The GTX 460 vs GTX 285 is the closest it's been in my lifetime for a tier-3 card to keep up with a tier-1 card of the previous generation,
as I said in my initial post countering your claims.


Do we need to go even further back in time now??
 

Gubaldo

Member
deep-sigh-albert-wesker.gif


Okay, let's do this.



The 6600GT was the xx70 of its generation. --- Nope, the plain 6800 was.
The 6700XL was a "Super" 6600; I think it was just one company that did some BIOS editing. Nvidia used to reserve odd numbers for "Supers".


The 7600GT was the xx70 of its generation. - Again, there were a lot of cards in this gen; the 7800GS, 7900GS etc. were the x70 of that generation, and the GTX/Ultra cards were the higher end.
There was no 7700; Nvidia used to reserve odd numbers for "Supers".


The GTS 250 is a rebadged 9800GTX; notice the chip is a G92, same as the 8800. - Rebranded or not, it was the x60 of that generation.
The 9800GTX+ inexplicably also beats the GTS 250 even though they are effectively the same thing, probably power related.


The GTX 460 vs GTX 285 is the closest it's been in my lifetime for a tier-3 card to keep up with a tier-1 card of the previous generation,
as I said in my initial post countering your claims.


Do we need to go even further back in time now??
 
I'm interested to see if the 50XX series leans into the "jack up the L2 cache instead of increasing the bus" approach again. They will at least match Ada Lovelace in that regard, and I'm expecting an increase of some degree.

If anything the L2 cache might be cut on some models.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Gubaldo

Okay, we are gonna call the 6800 an xx70 now... sure, sure, sure. Especially considering the 6800LE was an even lower 6800, so the 6800LE must have been the xx60 then, I guess, huh?
Ohh, then that means the 6600GT was actually the xx50 of its generation... damn, Nvidia were so fire back in the day.

We're also gonna call the 7800 an xx70 too.

I don't really feel like going back and forth with you when the facts are the facts; alternative facts aren't for me.
So we can just leave it there.

2 times in the last 21 years sets such a high precedent; shame on Nvidia!

I'm out.

 

Gubaldo

Member
Gubaldo

Okay, we are gonna call the 6800 an xx70 now... sure, sure, sure. Especially considering the 6800LE was an even lower 6800, so the 6800LE must have been the xx60 then, I guess, huh?
Ohh, then that means the 6600GT was actually the xx50 of its generation... damn, Nvidia were so fire back in the day.

We're also gonna call the 7800 an xx70 too.

I don't really feel like going back and forth with you when the facts are the facts; alternative facts aren't for me.
So we can just leave it there.

2 times in the last 21 years sets such a high precedent; shame on Nvidia!

I'm out.
Dude, you yourself started calling x60 cards x70 cards a few posts back.

And yes, the 6800 LE, 6800, 6800 GS etc. were the x70
of that era.

The 7800GS, 7800GSO etc. too.
 

ap_puff

Banned
I'm planning on getting a 7900XT as part of a new build before Christmas, should I hold off or is it likely nVidia's new stuff will drastically cut into my budget (€2000 which is €400 over my initial budget).

* Forgot to add I'd probably be looking at the 5070 gpu. The 80/90 cards are way out of bounds for me.
I would wait to see what they have on offer with RDNA4/Blackwell.
 