
Daniel Owen: How bad is 8GB of VRAM in 2024?

How much VRAM do you have?

  • 6GB: 22 votes (7.1%)
  • 8GB: 65 votes (20.8%)
  • 10GB: 27 votes (8.7%)
  • 12GB: 70 votes (22.4%)
  • 16GB or more: 128 votes (41.0%)
  • Total voters: 312

Deleted member 1159

Unconfirmed Member
When I bought my 3090 people were saying it's not a gaming card, nobody needs 24GB, and the 10GB 3080 is perfectly fine.
Who's laughing now? Mwahaha
Me, I had an 11GB (I think?) 1080 Ti for years and waited for the 4080, and its 16GB seems to be faring quite well
 

dave_d

Member
When I bought my 3090 people were saying it's not a gaming card, nobody needs 24GB, and the 10GB 3080 is perfectly fine.
Who's laughing now? Mwahaha
The ones that saved between $700 and $1000 by buying the cheaper card and invested it in Nvidia stock. (Have you seen how much it's gone up since 2020? Man, I wish I had invested 10k in Nvidia even back in 2016.)
 

FireFly

Member
For this GPU 24GB is overkill, but Nvidia likes to fuck consumers with "options" like that:

10GB -------------------(nothing)--------->24GB

The 3090 would be super fine with 16GB (at a lower price).
Well, there was the 12GB 3080 Ti, but AFAIK there were no 1.5 GB GDDR6X memory modules, so there was no way to get to, say, 16GB with a 384-bit bus.
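To put rough numbers on it: a GDDR6X package sits on a 32-bit channel, so capacity is just (bus width / 32) × package size. A minimal sketch, assuming only the 1 GB and 2 GB densities that actually shipped:

```python
# Back-of-the-envelope VRAM math: one GDDR6X package per 32-bit channel (assumption).
def capacity_gb(bus_width_bits: int, package_gb: float) -> float:
    return (bus_width_bits // 32) * package_gb

print(capacity_gb(384, 1))  # 12.0 -> 3080 12GB / 3080 Ti
print(capacity_gb(384, 2))  # 24.0 -> 3090 Ti
# 16GB on a 384-bit bus would need 16/12 ≈ 1.33 GB per package, a density that doesn't exist.
```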
 

Bojji

Member
I'm in the market for a graphics card (new build). Should I aim for 12GB or 16GB of VRAM? Budget for the graphics card is 600€.

I'm leaning towards the 4070 Super, but even after watching a bunch of benchmarks, that 12GB of VRAM seems a bit short?

12GB is the minimum now if you want to use max settings in most games (not all of them), and sometimes you won't be able to use Nvidia frame gen if you target 4K output (even with DLSS Performance). 16GB is the sweet spot right now. 20GB and more is useless most of the time.

Well, there was the 12GB 3080 Ti, but AFAIK there were no 1.5 GB GDDR6X memory modules, so there was no way to get to, say, 16GB with a 384-bit bus.

12GB cards were added later. The 3080 would be OK if it had been 12GB from the beginning.

Of course they designed this GPU with that 24GB target; if they wanted, they could have made cards with a different memory layout.
 

FireFly

Member
12GB cards were added later. The 3080 would be OK if it had been 12GB from the beginning.

Of course they designed this GPU with that 24GB target; if they wanted, they could have made cards with a different memory layout.
12GB cards require a full GA102 die, so they will have worse yields. Increasing the bus size further would have increased the cost and complexity, and according to Microsoft, large bus sizes are difficult to implement on GDDR6 due to the higher speeds.

They could have used the 3090's clamshell design on the 3080 for 20GB, but anything less would have required reducing the size of the memory bus and therefore lowering bandwidth.

Edit: Actually the 3080 Ti still isn't a full GA102 die, but enabling the extra SMs still isn't "free".
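Same back-of-the-envelope math as above for those two options (clamshell just puts a second package on each 32-bit channel):

```python
# Sketch only: one package per 32-bit channel, clamshell doubles the package count.
def capacity_gb(bus_width_bits: int, package_gb: float, clamshell: bool = False) -> float:
    packages = bus_width_bits // 32
    if clamshell:
        packages *= 2  # two packages share each channel, 3090-style
    return packages * package_gb

print(capacity_gb(320, 1, clamshell=True))  # 20.0 -> a hypothetical 20GB 3080
print(capacity_gb(256, 2))                  # 16.0 -> 16GB, but only by cutting the bus to 256-bit
```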
 
What's the highest VRAM usage anyone has seen in a game? Just curious. I'm on 8GB now and my next card will have 16GB as a floor.

When using ultra textures in Steelrising with my 3090 it'll use every KB of the 24GB available. The game will stutter too, so it probably wants to gobble up even more, even though the quality isn't much better than high textures, if at all.
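If anyone wants to log this themselves rather than eyeballing an overlay, here's a minimal sketch using NVML (assumes the nvidia-ml-py package is installed; note it reports total allocation on the card, not just the game):

```python
# Quick VRAM readout via NVML (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # GPU 0
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"{mem.used / 2**30:.1f} GiB used of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```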
 

Bojji

Member
12GB cards require a full GA102 die, so they will have worse yields. Increasing the bus size further would have increased the cost and complexity, and according to Microsoft, large bus sizes are difficult to implement on GDDR6 due to the higher speeds.

They could have used the 3090's clamshell design on the 3080 for 20GB, but anything less would have required reducing the size of the memory bus and therefore lowering bandwidth.

Edit: Actually the 3080 Ti still isn't a full GA102 die, but enabling the extra SMs still isn't "free".

The 3080 12GB only has a slightly bigger core than the 3080 10GB and a bus that allows 12GB. The 3080 12GB, 3080 Ti and 3090 all have buses for a 12/24GB config, and all of them have different core configs. The 3090 Ti is the full die.

They should have made a 20GB 3080 the same way the 4060 Ti exists with 8GB/16GB options; the 3070 should have gotten the same treatment. But of course they don't want to make their (not top) GPUs too good for consumers.
 

DeepEnigma

Gold Member
(reaction GIFs)
 

Nankatsu

Member
12GB is the minimum now if you want to use max settings in most games (not all of them), and sometimes you won't be able to use Nvidia frame gen if you target 4K output (even with DLSS Performance). 16GB is the sweet spot right now. 20GB and more is useless most of the time.

Won't be gaming at 4K at all. 1440p is the way.
 
Threads like this really show how out of touch NeoGAF is with the usual PC situation. According to the Steam hardware survey, over 75% of users have 10GB or less.

I'm aware. What's even crazier is that 4K is something like less than 5%. Lots of Steam users don't even have a dedicated GPU.

I think the VRAM panic is certainly a little overblown, but the fact remains: Nvidia is basically the world's leader in tech, and they should have sold an 80-class card with enough VRAM to run their own damn features.
 

Madflavor

Member
I got my 3080 10GB in 2020 and don't regret it. I ran RE4R on high settings, and with a few small adjustments it ran totally fine and hit 60+ fps. Cyberpunk 2077 also runs crisp on Ultra with ray tracing enabled.

Until I find myself going to Medium settings, I’m in no rush to upgrade.
 

Hugare

Member
Dude, testing on 4K or 1440p is missing the point.

8GB should be perfectly fine if you game at 1080p (especially with DLSS). When you add Lossless Scaling, it's even better.

I'm playing Cyberpunk with RT on Psycho at 60 FPS on my notebook equipped with a 3060. DLSS on Quality, 1080p, Lossless Scaling frame gen.

Problem is, there are too many unoptimized games being released nowadays. It's more about the tech being used and less about the hardware.
 

Bojji

Member
Dude, testing on 4K or 1440p is missing the point.

8GB should be perfectly fine if you game at 1080p (especially with DLSS). When you add Lossless Scaling, it's even better.

I'm playing Cyberpunk with RT on Psycho at 60 FPS on my notebook equipped with a 3060. DLSS on Quality, 1080p, Lossless Scaling frame gen.

Problem is, there are too many unoptimized games being released nowadays. It's more about the tech being used and less about the hardware.

GoT is perfectly playable with 4K DLSS Performance on the 16GB version and unplayable on the 8GB one. Many games also start to show problems at 1440p, and frame gen is unusable in some of them.

The 3070, for example, was never a 1080p card; it was a $500 mid-to-high-end GPU in 2020 (the 3080 was high end and the 3090 was enthusiast level), designed mostly for 1440p gaming. The 4060 Ti has about the same performance for $400 in 2023, and both GPUs share the same problems.

With more VRAM you have way more options. This video also focuses on 2024 games; there were many games where 8GB cards had missing textures and other problems in 2023.

You DON'T WANT an 8GB card in 2024, and everyone buying a GPU should know this.
 

Hugare

Member
GoT is perfectly playable with 4K DLSS Performance on the 16GB version and unplayable on the 8GB one. Many games also start to show problems at 1440p, and frame gen is unusable in some of them.

The 3070, for example, was never a 1080p card; it was a $500 mid-to-high-end GPU in 2020 (the 3080 was high end and the 3090 was enthusiast level), designed mostly for 1440p gaming. The 4060 Ti has about the same performance for $400 in 2023, and both GPUs share the same problems.

With more VRAM you have way more options. This video also focuses on 2024 games; there were many games where 8GB cards had missing textures and other problems in 2023.

You DON'T WANT an 8GB card in 2024, and everyone buying a GPU should know this.
"Many games also start to show problems at 1440p, frame gen is also unusable in some of them.", that was my point. But if you game at 1080p (specially with DLSS) you'll be fine most of the time.

And framegen is perfectly fine if your baseline is 30 FPS. As I said, I'm playing Cyberpunk at 60, and also Baldurs Gate 3. Wouldnt recomend it to online competitive games, but for most games should be more than fine.

Would I buy an 8GB card in 2024? Probably not.

Are 8GB cards totally unusable in 2024? Certainly not the case if you game at 1080p.
 

Bojji

Member
"Many games also start to show problems at 1440p, frame gen is also unusable in some of them.", that was my point. But if you game at 1080p (specially with DLSS) you'll be fine most of the time.

And framegen is perfectly fine if your baseline is 30 FPS. As I said, I'm playing Cyberpunk at 60, and also Baldurs Gate 3. Wouldnt recomend it to online competitive games, but for most games should be more than fine.

Would I buy an 8GB card in 2024? Probably not.

Are 8GB cards totally unusable in 2024? Certainly not the case if you game at 1080p.

Yeah, but you will still need to mess with the settings in some games even at 1080p.

DF talking about 8GB of VRAM (timestamped):

 
As others have said, it all depends on the settings. GPUs with 8GB of VRAM can no longer run every single game maxed out, but with a few tweaks I can still run almost every game without any problems (no VRAM-related stuttering or slowdowns to 10-20 fps).

So far, the only game I have tried that is really unacceptable on my 8GB GPU is Forspoken (I have only tried the demo version). I could only use the lowest texture settings in this game, and the texture quality looked extremely bad, like the PS2 era. There was a similar problem with TLOU1 at launch, but thanks to patches even this game runs fine at high texture settings on my GTX 1080.

Upscaling like FSR 2.0 also helps a lot with VRAM. In Ratchet and Clank I had stuttering with native TAA and high texture settings, but FSR Quality lowered the VRAM usage and now the game runs fine. What's great is that FSR2 looks better to me than native on a static image, and the game runs faster on my PC at 1440p FSR Quality than at 1080p with standard TAA. I have only two issues with FSR. In some games the FSR image is not sharp enough (Alan Wake 2) or is oversharpened (TLOU1), so I had to adjust FSR sharpening to make it look better than native on a static image (some games offer FSR sharpening, but if not I'm using ReShade with LumaSharpen + CAS sharpening). The second problem with FSR is slight artifacting / shimmering during motion, especially visible on grass. I wonder if DLSS has fewer artifacts in motion, because if that's the case then native TAA would look worse in every way.

So IMO 8GB GPUs are still usable, but I'm planning on buying a new GPU soon and I'm not willing to buy anything with less than 16GB of VRAM. I learned from my mistake in the past that buying a new GPU with a small amount of VRAM is not worth it. I bought a GTX 680 2GB in 2012 and soon (PS4 launch) I had to compromise on texture settings because more and more games were stuttering badly due to insufficient VRAM. My GTX 1080 and 1080 Ti (I had both) have more than enough VRAM, and that's one of the reasons why those GPUs lasted as long as they did. I feel like 12GB of VRAM is fine for modern games, but 16GB is definitely more future-proof, because with maxed-out settings some games already run badly on 12GB GPUs (Ratchet and Clank can eat up to 15GB of VRAM at 4K max settings, and that's real VRAM usage, meaning 12GB GPUs have extreme slowdowns because of insufficient VRAM).
 
My 3080 10GB is still serving me well, and I'm not having problems running games at 1440p high or ultra settings in a lot of cases. In a lot of games ultra settings are a waste of resources and end up not being well optimised anyway. In some games there isn't even a visual difference for some settings moving from high to ultra.

I think 10GB of VRAM is fine for anyone playing at 1440p at the moment, and until I see a game I want to play that doesn't run as well as I want it to at 1440p (not because of poor optimisation) I won't be upgrading, especially at these retarded prices.
 

Kenpachii

Member
Got a laptop with a 12GB 4080; yet to see any VRAM issues in any game. Even Cyberpunk with path tracing at 3440x1440 doesn't have VRAM issues.

Nice benchmark; sadly he couldn't use (I understand why) Nvidia cards with frame gen.
 

Bojji

Member
In one of his videos he shows that Forbidden West and Alan Wake (without RT!) fuck up 8GB GPUs even at 1080p.
 

SlimySnake

Flashless at the Golden Globes
Even my 10GB 3080 has issues in several games, especially when turning on RT at 4K 60 fps, even with DLSS on. RE4 crashes. Gotham Knights would have frame drops down to 2 fps for like 30 seconds. TLOU had the same 2 fps issues until they added a streaming setting.

This is a card made for ray tracing, and I've had to turn off RT in several games just to get consistent performance without major drops.
 

nkarafo

Member
What pisses me off with Nvidia is that all their magic software tricks only benefit themselves, as they allow them to nerf the hardware of their cards and sell them as if those nerfs never happened. Something like DLSS is used to balance things out and cover this performance loss. So you, as a gamer, think you get better performance from DLSS, but in reality you would get the same without DLSS if the hardware hadn't been nerfed in the first place. Plus, it's worse in the end for you, because software tricks like this don't work in all games and they can also cause artifacts.

So watch as Nvidia releases the 128-bit bus 5060 with 8GB of VRAM at $400 because, hey, you can run games at a lower resolution and DLSS will cover the loss!
 

StreetsofBeige

Gold Member
1080p laptop with a 4070. Should be fine, although I don't even play big games on PC. My old one was messing up, so I decided to get this one and give the old one away.
 

SoloCamo

Member
I should have listened. Happened to me with the GTX 680 only having 2GB of VRAM, and then I did it again with the 3080 10GB. Got the 680 for PhysX and the 3080 for ray tracing; I'm a sucker for those Nvidia features.


(George Bush GIF)


Got fooled again...

Funny how that works; same arguments I had when choosing a 7970 GHz Edition over a 680 (3GB of VRAM is more valuable than 2GB), and the same when choosing a 6900 XT over a 3080 (16GB vs 10GB + RT). I learned this lesson back when I had a GeForce4 Ti 4200 128MB in the very early 2000s, so it gets old warning people over and over not to undercut themselves on VRAM if they plan to use the card for a while, but alas, here we are.

Googled how much VRAM my 1080 Ti has.

11GB. Those retards went backwards hahahahah

Nvidia will never make the mistake of releasing a card like the 1080 Ti again at the price they asked for. The card was a unicorn, and Nvidia wouldn't want people to actually stick with their current card instead of buying the new and shiny next gen.
 

raduque

Member
I have an 8GB 2080, and it's starting to be a poor performer. I'm planning on buying a laptop sometime later this year if I can save enough, and I was looking at 8GB 4060 machines. I can barely afford $1k for an 8GB VRAM laptop; I'd never be able to afford $3,000 for a laptop with 16GB of VRAM.
 