Me, I had an 11GB (I think?) 1080ti for years and waited for the 4080, and its 16GB seems to be faring quite well.

> When I bought my 3090, people were saying it's not a gaming card, nobody needs 24GB, and the 10GB 3080 is perfectly fine.
> Who's laughing now? Mwahaha
Remember people saying "8 GB is okay for PS5 / SX gen"?
Same guys were saying "2 GB is okay for PS4 / One gen".

The ones that saved between $700 and $1000 by buying the cheaper card and invested it in Nvidia stock. (Have you seen how much it's gone up since 2020? Man, I wish I'd invested 10k in Nvidia even back in 2016.)

> When I bought my 3090, people were saying it's not a gaming card, nobody needs 24GB, and the 10GB 3080 is perfectly fine.
> Who's laughing now? Mwahaha
If you're using a 3080 at max settings, you're likely running at a lower resolution with DLSS.

> People who bought 3070s and 3080 10GB cards really got screwed.

When I bought my 3090, people were saying it's not a gaming card, nobody needs 24GB, and the 10GB 3080 is perfectly fine.
Who's laughing now? Mwahaha

> For this GPU 24GB is overkill, but Nvidia likes to fuck consumers with "options" like that:
> 10GB -------------------(nothing)---------> 24GB
> The 3090 would be super fine with 16GB (at a lower price).

Well, there was the 12GB 3080 Ti, but AFAIK there were no 1.5GB GDDR6X memory modules, so there was no way to get to, say, 16GB with a 384-bit bus.
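That 10GB→24GB gap falls straight out of per-channel arithmetic: GDDR6X hangs one chip off each 32-bit channel, so the bus width fixes the chip count and the per-chip density does the rest. A minimal sketch — the bus widths and densities are the real Ampere ones, the function itself is just illustrative:

```python
# GDDR6X puts one chip on each 32-bit channel, so bus width fixes chip count.
def vram_gb(bus_width_bits, chip_density_gb, clamshell=False):
    chips = bus_width_bits // 32       # chips = number of 32-bit channels
    if clamshell:
        chips *= 2                     # clamshell: two chips share a channel
    return chips * chip_density_gb

print(vram_gb(384, 1))                  # 12GB: 3080 Ti layout (12 x 1GB)
print(vram_gb(384, 1, clamshell=True))  # 24GB: 3090 layout (24 x 1GB, both board sides)
print(vram_gb(384, 2))                  # 24GB: 3090 Ti layout, once 2GB chips existed
# 16GB on a 384-bit bus would need 1.33GB chips, which were never made -
# hence the jump from 10/12GB straight to 24GB.
```
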
I'm in the market for a graphics card (new build): should I aim for 12GB or 16GB of VRAM? My budget for the graphics card is 600€.
I'm leaning towards the 4070 Super, but even after watching a bunch of benchmarks, that 12GB of VRAM seems a bit short?

> 12 GB cards require a full GA102 die so will have worse yields. Increasing the bus size further would have increased the cost and complexity, and according to Microsoft, large bus sizes are difficult to implement on GDDR6 due to the higher speeds.

12GB cards were added later; the 3080 would have been OK if it had been 12GB from the beginning. Of course they designed this GPU with that 24GB target. If they had wanted to, they could have made cards with a different memory layout.

What's the highest VRAM usage anyone has seen in a game? Just curious. I'm on 8GB now and my next card will have 16GB as a floor.
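If you want to measure this yourself rather than trust an overlay, here's a minimal sketch using NVIDIA's NVML Python bindings (the nvidia-ml-py package, imported as pynvml — assumes an NVIDIA card) that logs the peak while a game runs:

```python
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # GPU 0

peak = 0
try:
    while True:
        used = pynvml.nvmlDeviceGetMemoryInfo(handle).used   # bytes currently allocated
        if used > peak:
            peak = used
            print(f"new peak: {peak / 2**30:.2f} GiB")
        time.sleep(1.0)                                      # poll once a second
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Keep in mind this reports allocation, not need: many engines cache textures into whatever VRAM is spare, so the peak tends to overstate the true requirement.
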
12 GB cards require a full GA102 die so will have worse yields. Increasing the bus size further would have increased the cost and complexity, and according to Microsoft, large bus sizes are difficult to implement on GDDR6 due to the higher speeds.
They could have used the 3090's clamshell design on the 3080 for 20GB, but anything less would have required reducing the size of the memory bus and therefore lowering bandwidth.
Edit: Actually the 3080 Ti still isn't a full GA102 die, but enabling the extra SMs still isn't "free".
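For concreteness, that 20GB clamshell figure comes from the same per-channel math as above, just with the 3080's real 320-bit bus plugged in (again only a sketch):

```python
chips = 320 // 32       # 320-bit bus -> 10 GDDR6X chips on a 3080
print(chips * 1)        # 10GB: the 3080 as shipped (one 1GB chip per channel)
print(chips * 1 * 2)    # 20GB: a clamshell 3080 (two chips per channel, like the 3090)
print((256 // 32) * 1)  # 8GB: the narrower-bus alternative, which also cuts bandwidth
```
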
12GB is the minimum now if you want to use max settings in most games (not all of them), and sometimes you won't be able to use Nvidia frame gen if you target 4K output (even with DLSS Performance). 16GB is the sweet spot right now. 20GB or more is useless most of the time.
Won't be gaming at 4K at all. 1440p is the way.
> People who bought 3070s and 3080 10GB cards really got screwed.

Threads like this really show how out of touch NeoGAF is with the usual PC situation. According to the Steam hardware survey, over 75% of users have 10GB or less.

> People who bought 3070s and 3080 10GB cards really got screwed.

I had a 3070 and upgraded to a 4070. Didn't get screwed.

> We tried telling people.

I should have listened. Happened to me with the GTX 680 only having 2GB of VRAM, and then I did it again with the 3080 10GB. Got the 680 for PhysX and the 3080 for ray tracing; I'm a sucker for those Nvidia features.

> I had a 3070 and upgraded to a 4070. Didn't get screwed.

VRAM issues on the 3070 never affected me.

> Threads like this really show how out of touch NeoGAF is with the usual PC situation. According to the Steam hardware survey, over 75% of users have 10GB or less.

Dude, testing at 4K or 1440p is missing the point.
8GB should be perfectly fine if you game at 1080p (especially with DLSS). When you add Lossless Scaling, it's even better.
I'm playing Cyberpunk with RT on Psycho at 60 FPS on my notebook equipped with a 3060: DLSS on Quality, 1080p, Lossless Scaling frame gen.
Problem is, there are too many unoptimized games being released nowadays. It's more about the tech being used and less about the hardware.
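For context on the resolution point: the standard DLSS presets render internally at a fixed fraction of the output resolution and upscale from there. A quick sketch of the arithmetic (the scale factors are the commonly cited per-axis ones):

```python
# Per-axis render scale for the standard DLSS quality presets.
SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_res(out_w, out_h, mode):
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(1920, 1080, "Quality"))      # (1280, 720): what 1080p DLSS Quality renders
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080): what 4K DLSS Performance renders
```

Which is why an 8GB card at "1080p with DLSS Quality" is really rendering 720p frames internally — lighter on the shaders and on the render targets, even if texture memory stays the same.
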
"Many games also start to show problems at 1440p, frame gen is also unusable in some of them.", that was my point. But if you game at 1080p (specially with DLSS) you'll be fine most of the time.GoT is perfectly playable with 4K DLSS performance on 16GB version and unplayable on 8GB one. Many games also start to show problems at 1440p, frame gen is also unusable in some of them.
3070 for example never was a 1080p card, it was 500$ mid-high end GPU in 2020 (3080 was HE and 3090 was Enthusiast level) designed mostly for 1440p gaming. 4060ti has about the same performance for 400$ in 2023, both GPUs share the same problems.
With more VRAM you have way more options. This video also focuses on 2024 games, there were many games where 8GB cards had missing textures and other problems in 2023.
You DON'T WANT 8GB card in 2024 and everyone buying GPU should know this.
"Many games also start to show problems at 1440p, frame gen is also unusable in some of them.", that was my point. But if you game at 1080p (specially with DLSS) you'll be fine most of the time.
And framegen is perfectly fine if your baseline is 30 FPS. As I said, I'm playing Cyberpunk at 60, and also Baldurs Gate 3. Wouldnt recomend it to online competitive games, but for most games should be more than fine.
Would I buy an 8GB card in 2024? Probably not.
Are 8GB cards totally unusable in 2024? Certainly not the case if you game at 1080p.
This.

3080 Ti with 12GB. 12GB masterrace. /not really

> I should have listened. Happened to me with the GTX 680 only having 2GB of VRAM, and then I did it again with the 3080 10GB. Got the 680 for PhysX and the 3080 for ray tracing; I'm a sucker for those Nvidia features.

Got fooled again...

Googled how much VRAM my 1080 Ti has.
11GB. Those retards went backwards hahahahah
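No need to google it, for what it's worth — the same NVML bindings from earlier will just tell you (again assumes an NVIDIA card and nvidia-ml-py installed):

```python
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
total = pynvml.nvmlDeviceGetMemoryInfo(handle).total   # bytes
print(f"{total / 2**30:.0f} GiB")                      # a 1080 Ti reports ~11 GiB
pynvml.nvmlShutdown()
```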