
ZOTAC confirms GeForce RTX 5090 with 32GB GDDR7 memory, 5080 and 5070 series listed as well

Shodai

Member
Seriously, what is out or coming out that would get anywhere close to taxing these things? I can justify highly expensive Quadro cards for my work machines, but for gaming, how are these in any way being utilized?
4K 240Hz OLED monitors have been around for nearly a year. My games on a 4090 aren't pushing anywhere near 240fps, so...
 
 
The 4080 is approaching DOA. Stalker 2 is regularly hitting 15+ GB; I'm debating getting a 7900XTX.

I should've sprung for the 4090; this card is such a waste.
15+ GB VRAM with mods? Stalker 2 seems to be allocating around 10GB VRAM (real usage 8-9GB).



Some games might fill all available VRAM, but that doesn't mean they really need that much (there's a difference between what a game allocates and what the process actually uses). Stuttering or performance degradation is the only way to know for sure that a game is really VRAM-limited (there's also a quick way to check the allocated-vs-used split yourself; see the sketch at the end of this post).

As of now there are three games I'm aware of that can use more than 16GB VRAM for real.

- Warhammer 40K: Space Marine 2 with the 4K texture pack (this texture pack also requires a 12-core 7900X3D CPU to decompress texture data fast enough).

- Indiana Jones at native 4K with PT and maxed-out settings.

- Cyberpunk 2077 with mods at native 4K with PT + FG.

The RX 7900 XTX 24GB would help you in Space Marine 2, assuming you have a 7900X3D (although I haven't noticed any difference in texture quality even when I made screenshot comparisons between ultra and high settings), but in PT games an AMD card would not benefit you in any way. In Indiana Jones, PT will not run at all on the AMD card, and in Cyberpunk at native 4K you will get an unplayable frame rate with PT. On an RTX 4080 you can at least play both games with PT without any problems at reasonable settings. For example, DLSS Quality in Cyberpunk is enough to stay within a 16GB VRAM budget. In Indiana Jones at 4K I also need to set the texture streaming budget to High (and I haven't noticed any texture quality reduction).

Besides these rare exceptions, most games use around 9-12GB of VRAM at 4K. I would worry if my card had 12GB, but 16GB is still plenty. IMO VRAM requirements won't increase that much until 2028/2029, when developers start making games with PS6 hardware in mind.
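If you want to see the allocated-vs-used split yourself, NVML exposes both views. A minimal sketch using the pynvml bindings (assuming an NVIDIA card, a recent driver, and GPU index 0; this is illustrative, not a benchmarking tool):

```python
# pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Card-wide view: "used" counts everything resident in VRAM,
# including caches and buffers a game never touches every frame.
mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
print(f"total {mem.total / 2**30:.1f} GiB, used {mem.used / 2**30:.1f} GiB")

# Per-process view: what each process has allocated for itself.
# Games show up in the graphics list (compute apps have their own call).
for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(gpu):
    used_mib = (p.usedGpuMemory or 0) / 2**20  # can be None on some setups
    print(f"pid {p.pid}: ~{used_mib:.0f} MiB allocated")

pynvml.nvmlShutdown()
```

Even the per-process number is still an allocation figure, not a working set, which is why stutter under load remains the real tell.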
 

Puscifer

Member
15+ GB VRAM with mods? Stalker 2 seems to be allocating around 10GB VRAM (real usage 8-9GB). [...] Stuttering or performance degradation is the only way to know for sure that a game is really VRAM-limited. [...]

I'll screenshot, but last night while I was playing it was bouncing around between 14,500 and 15,700MB, and every time it approached 16GB I was getting some MASSIVE lag.
 
I was really hoping they were going to shift the 70-series card to 16GB. With Intel sticking 12GB on a $250 card at launch, and the prices AMD is charging for its 16GB cards, we know it isn't impossible to get more memory on these things.
 

Puscifer

Member
15+ GB VRAM with mods? Stalker 2 seems to be allocating around 10GB VRAM (real usage 8-9GB). [...]

Memory leak?
So nope, I played some Cyberpunk 2077 tonight without path tracing, and Stalker 2 again just to see, and they were topping 15GB at 4K with DLSS.

My rig is as follows:

4080 Founders
7800X3D
32GB DDR5 @ 6000

Averaging 14-15GB of VRAM.
 

The Cockatrice

I'm retarded?
I'll screenshot, but last night while I was playing it was bouncing around between 14,500 and 15,700MB, and every time it approached 16GB I was getting some MASSIVE lag.

You need to learn the difference between used VRAM and allocated VRAM. Games will allocate resources up to the maximum you have available, but in reality most use 10-12GB at most, and far less without ray tracing. Indiana Jones is, I think, the first to need a huge amount of VRAM when you use its path tracing, though it could also be unoptimized as shit, since it's the first time idTech has used PT. Cyberpunk never surpassed 11GB for me with PT and DLSS/FG at high/balanced at 1440p. Resolution of course matters, but the majority of gamers still play at 1080p and 1440p; 4K is still a niche for most PC gamers, even with DLSS. Remember, GAF is a massive minority when it comes to PC elitism/discussion.
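If anyone wants to check whether those "MASSIVE lags" are a real VRAM wall, log a run with PresentMon (or CapFrameX) and look for frametime spikes. A rough sketch that flags frames taking over twice the median; the CSV column name is an assumption, since it varies between PresentMon versions:

```python
# Flag frametime spikes in a PresentMon-style CSV capture.
# Column name is an assumption -- older PresentMon builds call it
# "MsBetweenPresents", newer ones "FrameTime"; adjust to match your log.
import csv
import statistics

def spikes(path, column="MsBetweenPresents", factor=2.0):
    with open(path, newline="") as f:
        times = [float(row[column]) for row in csv.DictReader(f)]
    median = statistics.median(times)
    # A frame taking 2x the median frametime reads as a visible hitch.
    return [(i, t) for i, t in enumerate(times) if t > factor * median]

for frame, ms in spikes("capture.csv"):
    print(f"frame {frame}: {ms:.1f} ms")
```

If the hitches cluster exactly when VRAM usage approaches the card's limit, that's a strong sign you're genuinely VRAM-bound rather than just seeing high allocation.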

 

Stuart360

Member
15+ GB VRAM with mods? Stalker 2 seems to be allocating around 10GB VRAM (real usage 8-9GB). [...] Some games might fill all available VRAM, but that doesn't mean they really need that much. [...]

Yeah, the problem is that people with high-end cards with a high amount of VRAM don't seem to realize that games will allocate more VRAM if it's available. That's why you'll get someone with, say, 24GB of VRAM play a game, see 12GB of "usage", and go "don't play this game on ultra if you have less than 12GB of VRAM!". Then someone with an 8GB card plays the game just fine and is like "eh?".

My last card was a 1080 Ti with 11GB, and at 1080p (and even 1440p sometimes) very few games even hit 8GB usage, even at max settings. Sure, you'll get the odd game where you might have to turn textures or shadows down one notch, but the majority will be fine on 8GB cards. It's the reason why I went for a 3070 with my upgrade.

Also, this is PC gaming we're talking about; you adjust settings when and where needed. You don't HAVE to play everything on ultra. In fact, many people say it's not worth maxing games out, as the visual differences can be very minor while the performance losses can be big.
 

rofif

Can’t Git Gud
I don’t think people buying them care.

It's hilarious how wasteful this is getting, considering the wattage the card will draw. GL with energy prices nowadays. Also: fuck the planet.
That's a good point. These cards alone draw what, 2-3x the wattage of a console? And they'll become obsolete sooner than they should due to low RAM amounts.

My 3080 is 330W and still plenty fast, but Nvidia crippled it with 10GB of VRAM and by not including any DLSS 3 stuff.
 

nkarafo

Member
240fps is surely approaching the realm of diminishing returns. 120 -> 240, are you really going to see the difference?
There are no diminishing returns yet with modern LCD screens.

If we were talking about CRT technology you would be right. But with LCDs, you need as high a refresh rate as you can get to completely get rid of the inherent motion blur.

I specifically bought a 240Hz panel instead of a 120/144Hz one because of this. And there is a big difference, assuming you can reach those numbers. I mostly test older games with unlocked fps to reach 240fps (the Quake 1/2 NightDive remasters are good for this too), and the difference in motion clarity between 120fps and 240fps is big. Though even at 240fps you still don't reach the crystal-clear clarity of a CRT monitor at 75/85Hz (which I still have and test). But it's close, so I think at 360Hz you'll be 100% there.

It's pretty impressive how superior CRT technology is at this, and most people have no idea.
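The numbers behind this: a sample-and-hold LCD/OLED keeps each frame lit for the whole refresh interval, so eye-tracked blur is roughly panning speed times hold time. A quick sketch (the 960px/s speed and the ~1ms phosphor decay are illustrative assumptions):

```python
# Eye-tracked motion blur on a sample-and-hold display:
#   blur (px) ~ panning speed (px/s) * frame hold time (s)
speed = 960  # px/s -- a common UFO-test panning speed (assumption)

for hz in (60, 120, 240, 360):
    hold_ms = 1000 / hz  # the frame stays lit for the full refresh interval
    print(f"{hz:>3} Hz: {hold_ms:5.2f} ms hold -> ~{speed * hold_ms / 1000:4.1f} px smear")

# A CRT lights each spot only for the phosphor decay (~1 ms assumed),
# so even at 85 Hz the smear is ~1 px -- why 240 Hz still loses to it.
crt_ms = 1.0
print(f"CRT 85 Hz: {crt_ms:5.2f} ms flash -> ~{speed * crt_ms / 1000:4.1f} px smear")
```

By that math 240Hz still smears ~4px where the CRT smears ~1px, and 360Hz gets you to under 3px, which is why it's close but not quite there.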
 

KungFucius

King Snowflake
Yeah, it's crazy they don't offer anything between 16 and 32GB. Unless those cards will be offered later.
And God I hope they'll drop 8GB; that shit would be lame as fuck.
Why is this crazy? Almost all HW makers pull this shit. They make every model below the top tier undesirable in some way and make the top-tier one 50% more expensive than the second best. If I were going to upgrade my 4090, I would have no choice but to get a 5090 without accepting some downgrade. If the 5080 had 24GB it would probably be significantly better than a 4090. It would also look like a good path for those on 3090s.
 

HeisenbergFX4

Gold Member
240fps is surely approaching the realm of diminishing returns. 120 -> 240, are you really going to see the difference?
The jump from 60 to 120 is pretty great, but no, the jump from 120 to 240 isn't nearly as noticeable, though I can tell a difference.

This monitor can do 4K 240Hz then flip to 1080p 480Hz, and it's super smooth running at 480, but it doesn't feel twice as "fast" as 240.
 

Brucey

Member
There are no diminishing returns yet with modern LCD screens. [...] With LCDs, you need as high a refresh rate as you can get to completely get rid of the inherent motion blur. [...]
Less motion blur with the OLEDs?
 

Bojji

Member
So nope, I played some Cyberpunk 2077 tonight without path tracing, and Stalker 2 again just to see, and they were topping 15GB at 4K with DLSS. [...]

CP can easily go above 14GB with PT and FG, but Stalker? It uses ~9.3GB at native 4K on the highest settings and less than 8GB with DLSS B (reserved):

[screenshots: Stalker 2 VRAM usage at native 4K and with DLSS]


Maybe it grows while playing, but to reach 15GB? That must be a memory leak or something.
 

Calverz

Member
I just had a look, and there aren't that many 4090s for sale, and they seem to hold their price a lot better than 4080s. I might quickly sell and go for the 5090. Anyone know off-hand what extra wattage is needed between a 4090 and a 5090? I'm sure I have overhead currently, but I would rather do a straight GPU swap without the hassle of changing the PSU too.
 

GHG

Member
[...] Anyone know off-hand what extra wattage is needed between a 4090 and a 5090? I'm sure I have overhead currently, but I would rather do a straight GPU swap without the hassle of changing the PSU too.

If your PSU is newer and natively has one of the 12VHPWR connectors, then you shouldn't have too much to worry about, provided you're over ~1000W.
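Rough math behind that number, treating the 5090's ~600W board power as the rumor it is, with the other draws as typical guesses:

```python
# Back-of-envelope PSU headroom check. Every figure is an estimate:
# the RTX 5090's ~600 W board power is rumored, the rest are typical values.
loads_w = {
    "GPU (RTX 5090, rumored)": 600,
    "CPU under load": 150,
    "Board / RAM / SSDs / fans": 75,
}

steady = sum(loads_w.values())   # ~825 W sustained
needed = steady * 1.2            # ~20% headroom rule of thumb

for psu_w in (850, 1000, 1200):
    verdict = "fine" if psu_w >= needed else "tight"
    print(f"{psu_w:>4} W PSU vs ~{steady} W load (want ~{needed:.0f} W): {verdict}")
```

On that math, ~1000W is right at the comfortable floor, and an ATX 3.x unit with a native cable is also rated for the brief power spikes these GPUs pull.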
 

Calverz

Member
If your PSU is newer and natively has one of the 12VHPWR connectors, then you shouldn't have too much to worry about, provided you're over ~1000W.
It's the Corsair RM1000x (2021) I have. Yeah, I use their bespoke 12VHPWR cable. Is the 5090 using that same connection again?!
 

StereoVsn

Gold Member
Kind of has to if it's 600W.
And depending on the rest of the system (CPU, etc.), a 1000W PSU might not be enough for a 5090 setup.

Edit: And Nvidia can go suck it for the 16GB 5080. It's ridiculous, and it basically means the only worthwhile cards will be the 5090 for $$$$$ or the 5070 Ti.

You might as well wait a year for the 5080 Super if you want that sort of card. I have a 3080 Ti and that's what I'm most likely going to do.
 