
Pragmata PC Performance Raises Concerns for 8 GB GPU Users, Digital Foundry Reports

It is true. The mean GPU sold is a 5070, but that average is dragged up by the 5090 and the like.

We are still seeing a massive chunk of the market on 8GB. If 40% of the market is on 8/9GB in perpetuity, then yes, that's all that can be afforded.


No, it's a sign that that's all gamers can afford at a $299-349 MSRP.

Well we can say goodbye to the prospect of developers designing their games around Nvidia's path tracing technologies then. AMD budget cards are going to command a massive advantage on that front if Nvidia doesn't raise the bar to 16GB for their lower end cards.

Why even make a 6060 card if all the goodies like RT are kneecapped by lack of VRAM? Why push path tracing at all?
 
Hogwarts looks like this:

[two screenshots: Hogwarts Legacy texture comparison]




(timestamped) The game is just not loading quality textures, even right in your face. And this is not the only game doing that.

Forcing settings that require more VRAM than you have and hoping the game will manage it is not a good idea.

Did you read what I just said? The texture quality setting here alters texture streaming: it changes the quality of distant textures. Setting it to medium on an RTX 3070 would be barely noticeable unless you were playing at higher resolutions or zooming in on distant assets.




So yeah, setting it to max on an 8GB VRAM card is going to cause problems, even though the setting was designed explicitly to account for different VRAM amounts, with minimal-to-no impact on image quality, and to correspond to specific resolutions.
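
To illustrate what I mean, here's a minimal sketch of how such a setting can work under the hood: it sizes the streaming pool from the detected VRAM and only biases distant mips, leaving close-up quality alone. All names here are hypothetical, not any actual engine's API:

```cpp
// Hypothetical sketch of a VRAM-aware texture quality setting: pick a
// streaming pool size from detected VRAM instead of changing source assets.
#include <algorithm>
#include <cstdint>

struct StreamingConfig {
    uint64_t poolBytes;      // budget for resident texture mips
    int      distantMipBias; // extra mip bias applied only to far-away assets
};

StreamingConfig PickTexturePool(uint64_t vramBytes) {
    // Reserve headroom for render targets, RT structures, etc. (~3 GB).
    const uint64_t headroom = 3ull << 30;
    uint64_t pool = vramBytes > 2 * headroom ? vramBytes - headroom
                                             : vramBytes / 2;
    // Past a point, a bigger pool stops being visible.
    pool = std::min(pool, 12ull << 30);

    // A small pool drops mips on distant textures first, which is why the
    // difference is hard to spot at 1080p/1440p unless you zoom in.
    return StreamingConfig{pool, pool < (6ull << 30) ? 1 : 0};
}
```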
 
Did you read what I just said? The texture quality setting here alters texture streaming: it changes the quality of distant textures. Setting it to medium on an RTX 3070 would be barely noticeable unless you were playing at higher resolutions or zooming in on distant assets.




So yeah, setting it to max on an 8GB VRAM card is going to cause problems, even though the setting was designed explicitly to account for different VRAM amounts, with minimal-to-no impact on image quality, and to correspond to specific resolutions.


But you were talking about games doing automatic adjustments, and that's clearly not the case for many titles.

I actually played HL on a 3060 Ti, and medium textures were noticeable (vs. high) when moving the character. Who knows which version that was; the game got tons of patches over the years.
 
But you were talking about games doing automatic adjustments, and that's clearly not the case for many titles.
I very clearly said there were cases where the setting itself alters it.
Hogwarts Legacy and RE Engine games are cases where the texture setting itself alters (or can alter) texture streaming rather than texture quality, i.e. only distant textures may appear blurrier. RE Engine even has explicit settings for texture pool size vs. texture quality. Other examples of this approach include Expedition 33 and Indiana Jones and the Great Circle.
Not that the method matters much when it's designed for the same purpose.

I actually played HL on a 3060 Ti, and medium textures were noticeable (vs. high) when moving the character. Who knows which version that was; the game got tons of patches over the years.
Yeah, and some people noticed no difference at all. One guy in a benchmark even said to crank everything up to max, leave textures at low, and you'd barely notice a difference compared to ultra.
 
I very clearly said there were cases where the setting itself alters it.




Yeah, and some people noticed no difference at all. One guy in a benchmark even said to crank everything up to max, leave textures at low, and you'd barely notice a difference compared to ultra.

The setting alters textures and how much memory they use in many games (most games)?

What Ubisoft Massive is doing with automatic adjustment is very uncommon. Most games let users max out settings beyond how much VRAM they have, and once the game goes over, issues appear (stutter, CTDs, or ugly textures).

The memory pool setting was also not working correctly at launch in HL; the game was still performing like crap on 8GB cards. They never really fixed this fucking game.
 
The setting alters textures and how much memory they use in many games (most games)?
There's no universal standard for how these settings are labelled (which admittedly sucks), but I've yet to see a game that genuinely looks or plays worse on an 8GB card compared to a PS5 (assuming settings are correctly adjusted) that wasn't broken at release or simply looks/plays bad in general.

The memory pool setting was also not working correctly at launch in HL; the game was still performing like crap on 8GB cards. They never really fixed this fucking game.
Broken at release or not, it currently runs on a 6GB RTX 3050 at 1080p ultra with DLSS Quality, so it's certainly not broken anymore, nor did it have some inherent VRAM problem.

 
There's no universal standard for how these settings are labelled (which admittedly sucks), but I've yet to see a game that genuinely looks or plays worse on an 8GB card compared to a PS5 (assuming settings are correctly adjusted) that wasn't broken at release or simply looks/plays bad in general.


Broken at release or not, it currently runs on a 6GB RTX 3050 at 1080p ultra with DLSS Quality, so it's certainly not broken anymore, nor did it have some inherent VRAM problem.



This game has the correct setting for 8GB GPUs in the DF video (medium). Does it look good with medium textures, as good as the PS5 version?



[screenshot]
 
This game has the correct setting for 8GB GPUs in the DF video (medium). Does it look good with medium textures, as good as the PS5 version?



[screenshot]

Dude, my very first post in this conversation with you pointed out that DS2 automatically adjusts texture pool size to VRAM (with video proof). Go back through the posts lest we keep going in circles.

This game has the correct setting for 8GB GPUs in the DF video (medium).
When did I ever say the correct setting for Death Stranding 2 is medium? Or are you thinking that because it was correct for Hogwarts (at release only, apparently, as this was clearly improved on) it is also correct for every single other game out there? Are you going to play CS 1.6 on medium texture settings too?
 
Dude, my very first post in this conversation with you pointed out that DS2 automatically adjusts texture pool size to VRAM (with video proof). Go back through the posts lest we keep going in circles.


When did I ever say the correct setting for Death Stranding 2 is medium? Or are you thinking that because it was correct for Hogwarts (at release only, apparently, as this was clearly improved on) it is also correct for every single other game out there? Are you going to play CS 1.6 on medium texture settings too?

Nixxes games drop performance like a rock when you run out of VRAM.

[benchmark screenshot]




The video you posted uses 1080p and 1440p output; DF uses 4K output (to match the PS5 version).
 
Nixxes games drop performance like a rock when you run out of VRAM.

[benchmark screenshot]



Same problem as before: running very high texture quality when 8GB should be running high in this game. The high setting in particular here doesn't alter texture quality, just the streaming.

The video you posted uses 1080p and 1440p output; DF uses 4K output (to match the PS5 version).
The video I posted also cranked everything up to max, whereas the PS5 performance mode runs at 1440p internal res with a mix of high and medium settings.
 
Same problem as before: running very high texture quality when 8GB should be running high in this game. The high setting in particular here doesn't alter texture quality, just the streaming.


The video I posted also cranked everything up to max, whereas the PS5 performance mode runs at 1440p internal res with a mix of high and medium settings.

I don't have an 8GB GPU and DS2 to test it.

But based on other Nixxes games, the guy testing it could potentially get higher performance with lower texture settings on that 3060 Ti. That would mean the game is not scaling perfectly with memory.
 
But based on other Nixxes games, the guy testing it could potentially get higher performance with lower texture settings on that 3060 Ti. That would mean the game is not scaling perfectly with memory.
Dunno; like I said, there's no universal standard for these graphics settings.

The whole point of this conversation is that all the issues with 8GB VRAM cards that DF and others keep screaming about have been easily solved with proper texture pool management and streaming, and the "problematic" cases usually presented are just people not understanding what the texture quality setting is actually doing (which admittedly isn't always clear).
 
Dunno; like I said, there's no universal standard for these graphics settings.

The whole point of this conversation is that all the issues with 8GB VRAM cards that DF and others keep screaming about have been easily solved with proper texture pool management and streaming, and the "problematic" cases usually presented are just people not understanding what the texture quality setting is actually doing (which admittedly isn't always clear).

I agree that you can still play OK on 8GB cards, and that requires the correct settings.

But the ability to match PS5 quality is out of the question for many games. For example, Pragmata has hair strands, the highest texture quality, and RT on PS5, and we know that the 4060 (which has comparable raster performance and better RT performance) can't play the game smoothly at similar settings: it stutters and fps tanks in bigger levels.
 
But the ability to match PS5 quality is out of the question for many games. For example, Pragmata has hair strands, the highest texture quality, and RT on PS5, and we know that the 4060 (which has comparable raster performance and better RT performance) can't play the game smoothly at similar settings: it stutters and fps tanks in bigger levels.
We know? All the benchmarks I've seen with this range of cards had the game performing very close at those settings. Also worth mentioning that the fidelity mode on PS5 (which has all these settings like hair strands cranked up) often has fps drops.
 
We know? All the benchmarks I've seen with this range of cards had the game performing very close at those settings. Also worth mentioning that the fidelity mode on PS5 (which has all these settings like hair strands cranked up) often has fps drops.

That is what Richard was talking about in the video mentioned in the OP. Do you think he is lying?



He talked about the demo area running perfectly but later, more open areas starting to show problems. Many comparison videos on YouTube etc. focus on areas that are often not very taxing (so performance charts can be misleading).
 
Love how this thread is Nvidia users justifying <12GB RTX cards by using DLSS, but someone using FSR 4+ on a 7900 XTX at 1440p/Ultra (RT high) to get a locked 60fps is suddenly trolling. I love the "who cares, RDNA 2 is shit" takes, yet the PS5 & Series X perform about the same as a 3060 Ti ~ 3070, which Digital Foundry has proven in a ton of videos.
 
That is what Richard was talking about in the video mentioned in the OP. Do you think he is lying?



He talked about the demo area running perfectly but later, more open areas starting to show problems. Many comparison videos on YouTube etc. focus on areas that are often not very taxing (so performance charts can be misleading).

Lying? No. Is the article/thread title greatly exaggerated? Yeah.

...the experience was less consistent on hardware such as an RTX 4060 with 8 GB of VRAM.

On mid-range systems, the game was able to approach 60 fps at 1440p using DLSS Balanced settings, though performance dips were observed. Frame rates occasionally dropped into the 50s and even high 40s, particularly during demanding scenes and cutscenes. The report indicates that these issues were not purely tied to raw compute power, but instead linked closely to VRAM limitations.
These 'concerns' were the fps dropping to the high 40s in such scenes with these cards,

which just happen to be the same drops you'd observe in the PS5 quality mode.



So, yeah. Potatoes potatoes
 
Lying? No. Is the article/thread title greatly exaggerated? Yeah.


These 'concerns' were the fps dropping to the high 40s in such scenes with these cards,

which just happen to be the same drops you'd observe in the PS5 quality mode.



So, yeah. Potatoes potatoes


The game going out of VRAM:

[screenshot]


And this happened after he dropped some settings (demo area was running perfectly in comparison).
 
The game going out of VRAM:

[screenshot]


And this happened after he dropped some settings (demo area was running perfectly in comparison).
From the video, this happened specifically during a certain cutscene transition. It's far more likely to be a bug or jank, and may not even be reproducible. Not exactly a good indicator of performance (the guy in the video even implies as much).
 
From the video, this happened specifically during a certain cutscene transition. It's far more likely to be a bug or jank, and may not even be reproducible. Not exactly a good indicator of performance (the guy in the video even implies as much).

This looks like the game going out of memory; a similar thing happened with Dead Space in the past on 8 or even 10GB GPUs (the game actually jumps to 12GB in that moment and spills over). Timestamped video:



A full analysis will show us what's going on.
 
Anyone who bought an 8GB card knowing consoles had 16GB is an idiot and needs to turn in their PC gaming card. I don't want you on my team. You are dumb and you should go buy a Switch.

That 16GB is shared memory for both data & graphics, so they can't use the full 16GB for graphics, but apparently some console games use up to 10.5GB for graphics (I saw The Last of Us on PS5 mentioned), so yeah, it's still more than 8GB.

Either way, I agree 8GB of VRAM has been a problem for a while. You know it's getting bad when you have to lower graphics settings at 1080p to keep VRAM capacity happy, or just deal with the wonderful stuttering.
 
This looks like the game going out of memory; a similar thing happened with Dead Space in the past on 8 or even 10GB GPUs (the game actually jumps to 12GB in that moment and spills over). Timestamped video:



A full analysis will show us what's going on.

It did not happen on 8GB cards but happened on a 10GB one; the video says so. That's why I said it's more likely a bug. It may be memory related, but it's not because of low VRAM; it's just an error in its management.
 
Steam Machine is looking worse day by day... And it's not even out.
Dunno about that



Valve's VRAM optimizations for Linux, on a 4GB VRAM GPU: in Alan Wake 2 they boosted it from 15 fps to 40 fps.


And the new Proton 11 Beta fixes a lot of stuff and enables NTSYNC by default. Plus, Valve engineers have started merging ray tracing fixes into Mesa (AMD's open-source drivers on Linux); I already see gains on my RX 9070 XT, and the Steam Machine isn't even out yet.

And with ProtonPlus you can install GE-Proton, EM-Proton, and DWProton: custom Protons with FSR4 support for both RDNA3 and RDNA4.
 
It did not happen on 8GB cards but happened on a 10GB one; the video says so. That's why I said it's more likely a bug. It may be memory related, but it's not because of low VRAM; it's just an error in its management.

It did not happen at 1440p output; if he tried 4K output he would see exactly the same issue (probably even worse).

I had the exact same problem in Dead Space when I had a 3070 or 3060 Ti (first I had the 3070, the 3060 Ti much later), and it does not happen in one place but in several. The game jumps VRAM usage by 2GB in one second and fps drops to single digits; it fixes itself after a minute or two. You can replicate it whenever you want (if you have the right hardware).

I had a 3070 from 2020 and played at 4K output, so I know what VRAM limitations look like; at that output, problems happened much faster than for lower-resolution users. And this GPU is 2080 Ti level, so it's not weak; it's just fucked up by low VRAM. Even dropping DLSS to performance doesn't always fix the issues, because 4K output still uses much more memory than 1080p.

Edit: Dead Space at 4K output, DLSS Quality, and max settings runs with ~8.5GB of reserved memory; when you start this cutscene, it jumps up to ~10.2GB.

[two screenshots: Dead Space VRAM readings]




That's why the game was going out of VRAM on the 3080 (and I even think he used a lower DLSS resolution).

There are a few more points like that across the game. Zero automatic scaling for VRAM; it just spills over.
 
Well we can say goodbye to the prospect of developers designing their games around Nvidia's path tracing technologies then
Practically all path-traced games run on 8GB Nvidia cards. Your perceptions about VRAM are completely warped.

Besides, what do you think RTX Neural Texture Compression is for?
AMD budget cards are going to command a massive advantage on that front if Nvidia
Putting lipstick on a pig has never worked as a strategy for them, and they have abandoned it with RDNA5.

There is a reason 9060 XT 16GB and 7060 XT are such commercial failures.

The RX 1070 XT is 18GB, the 1070 is 15GB, the 1070 GRE is 12GB, and there is no 1060 etc. Get with the program.
on that front if Nvidia doesn't raise the bar to 16GB for their lower end cards.
This is completely batshit and stupid.

16GB is a mid-to-high-end amount of VRAM. The number of games that even theoretically make use of more than 12GB is in the single digits. This is the usual gamer demand for excessive memory and specs in search of a use case.

Low end is 8-9GB. It will be the same in 2028. Get with the program.

The PS6 competes with a 6070 desktop PC. It's a very high-end machine with a total cost of ownership over $1500. It won't set the floor, nor can it.
Why even make a 6060 card if all the goodies like RT are kneecapped by lack of VRAM? Why push path tracing at all?
If you are worried about that, you pay the $450 for a 6060 Ti 12GB.
 
Did some testing with Pragmata last night, and at 1080p max settings with path tracing the game was using anywhere from 6.5GB to 10GB of VRAM, although for most of the test it was hovering in the 8-9GB range.
 
"Usage"

We've been through this movie before. What a game uses when it has 32GB to use isn't what it'd use when it has 8-9-10-11-12GB. That's what's relevant to see.

RE8 was going out of VRAM on the 3070 with max texture settings (and dropping to single digits) at 1080p output; at the time I didn't know what was going on.

RE4 was crashing to desktop when you turned on RT on 8GB cards.

Don't expect games to manage VRAM automatically without issues if you use settings that require more than 8GB of memory.
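
On Windows you can actually watch both numbers yourself: the OS reports a per-process budget alongside the committed usage, and crossing the budget is where the trouble starts. A minimal sketch via DXGI 1.4 (error handling trimmed, first enumerated adapter assumed, link against dxgi.lib):

```cpp
// Minimal sketch: query the OS-reported VRAM budget vs. this process's
// committed usage via DXGI 1.4. Windows only.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);   // first (usually primary) GPU

    ComPtr<IDXGIAdapter3> adapter3;
    adapter.As(&adapter3);

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // Budget is what the OS will let this process keep resident right now;
    // CurrentUsage is what it has actually committed. Crossing the budget
    // triggers eviction, which is when the stuttering starts.
    std::printf("budget: %.2f GB\n", info.Budget / 1e9);
    std::printf("usage:  %.2f GB\n", info.CurrentUsage / 1e9);
}
```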
 
"Usage"

We've been through this movie before. What a game uses when it has 32GB to use isn't what it'd use when it has 8-9-10-11-12GB. That's what's relevant to see.

That's kind of a weird statement, as if nothing would ever actually use over 8GB. You don't think sites like TechPowerUp know how to get accurate readings?

All I know is that when I played High on Life 2 on my 3060 Ti 8GB it was a stuttering mess; VRAM usage was always pegged at 7.8GB in MSI Afterburner (guessing the other 200MB is reserved for Windows etc.).
I upgraded to a 5070 12GB and now VRAM usage shows 8.5GB minimum indoors and 10.5GB outdoors, with maxed settings + DLAA + 2x frame gen at 1080p. The game has far less stuttering too. But Unreal Engine 5 is known to be a VRAM hog, and I haven't played Pragmata.
 
Completely wrong as usual compared to what I got on my 4070 Ti.
Unless that's actual allocation and not VRAM usage, as games will allocate a lot of VRAM if it's available, like on the 5090 they are using. It doesn't mean a game needs that much to run, though.

Allocation is enough to break performance; you don't need usage for that.

Dead Space goes above 10GB of allocation (over 8GB in usage) and it fucks up performance on the 10GB 3080.

 
Allocation is enough to break performance; you don't need usage for that.

Dead Space goes above 10GB of allocation (over 8GB in usage) and it fucks up performance on the 10GB 3080.


Dead Space ran fine for me when I had a 3070, to be honest.

And I was just making the point that when you have a card with a lot of VRAM, like the 5090, games will allocate more VRAM since a lot is available.
 
Dead Space ran fine for me when I had a 3070, to be honest.

And I was just making the point that when you have a card with a lot of VRAM, like the 5090, games will allocate more VRAM since a lot is available.

Some will do that, some won't. Every game is different in this respect.

I played DS on an 8GB card and it was going out of VRAM at 4K output.
 
8GB was already a concern back in 2020 for anyone who wanted to run at mostly max settings. I remember people being concerned about the launch 3080 and its 10GB of VRAM. It shouldn't even be a consideration in 2026.
 
I'm due for a monitor upgrade but am currently on a 1080p 165Hz panel, so I've got a bit of headroom.

I set everything to max with PT and DLSS on Quality and it seemed smooth; I didn't check FPS.

What's the verdict on 2x frame gen? From what I read, it sounds like it's worth just having on.

I will test it out when I play it again tomorrow.

9800X3D / 5070 Ti.
 
That's kind of a weird statement, as if nothing would ever actually use over 8GB. You don't think sites like TechPowerUp know how to get accurate readings?
I am not saying that. And no, they don't.

The issue, again, is that testing VRAM usage on cards with effectively unlimited VRAM isn't right, because games can adjust depending on how much is available.

The right way is using the VRAM-limited SKUs, the ones with 8-9-10-11-12GB of VRAM, and seeing if they have slowdowns, stuttering, pop-in, lower detail, etc.
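
And "adjust" concretely means something like the sketch below: shrink the streaming pool when committed memory creeps past the OS budget instead of letting allocations spill into system RAM. Hypothetical names again; this is roughly the behavior posters here attribute to DS2 and say Dead Space lacks:

```cpp
// Hypothetical sketch of automatic VRAM adjustment: give back pool space
// when committed usage exceeds the OS budget, so nothing spills over.
#include <cstdint>

uint64_t ClampPoolToBudget(uint64_t requestedPool,
                           uint64_t currentUsage,
                           uint64_t osBudget) {
    if (currentUsage <= osBudget)
        return requestedPool;                 // healthy: keep the pool

    // Over budget: spilling into system RAM is what turns 60 fps into
    // single digits, so shed the overshoot plus a safety margin.
    const uint64_t overshoot = currentUsage - osBudget;
    const uint64_t margin    = 256ull << 20;  // 256 MB
    return requestedPool > overshoot + margin
               ? requestedPool - overshoot - margin
               : requestedPool / 2;
}
```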
 
Why even make a 6060 card if all the goodies like RT are kneecapped by lack of VRAM? Why push path tracing at all?

Ideally the 8/9GB cards would be OEM-only. But that would mean the entry-level card would be $500 or more.

Although in a world where the PS5 is $599, maybe that's how it should be.
 
Practically all path-traced games run on 8GB Nvidia cards. Your perceptions about VRAM are completely warped.

Besides, what do you think RTX Neural Texture Compression is for?

Putting lipstick on a pig has never worked as a strategy for them, and they have abandoned it with RDNA5.

There is a reason 9060 XT 16GB and 7060 XT are such commercial failures.

The RX 1070 XT is 18GB, the 1070 is 15GB, the 1070 GRE is 12GB, and there is no 1060 etc. Get with the program.

This is completely batshit and stupid.

16GB is a mid-to-high-end amount of VRAM. The number of games that even theoretically make use of more than 12GB is in the single digits. This is the usual gamer demand for excessive memory and specs in search of a use case.

Low end is 8-9GB. It will be the same in 2028. Get with the program.

The PS6 competes with a 6070 desktop PC. It's a very high-end machine with a total cost of ownership over $1500. It won't set the floor, nor can it.

If you are worried about that, you pay the $450 for a 6060 Ti 12GB.

I'm skeptical that Nvidia's "95% market share" is going to gaming consumers. Data shows AMD and 16GB cards besting Nvidia and 8GB cards at retail.



RTX Neural Texture Compression is designed for high-end cards, which will come with far more than 8GB of VRAM. It's for developers to push even more detail within the same high VRAM budget.

Remember that games are still designed around the memory constraints of Xbox Series S.

The PS6 may be a high-end machine, but as we saw with the PS4 Pro/Xbox One X, developers are going to optimise their games primarily around the configurations that high-end users have. As heavier ray tracing models are embraced, the PS5 will be running with low textures/low resolutions at 9GB of VRAM, and the Series S will be dropped. 12GB+ will soon be mandatory for non-soupy textures.
 
I'm skeptical that Nvidia's "95% market share" is going to gaming consumers. Data shows AMD and 16GB cards besting Nvidia and 8GB cards at retail
It's going to gamers. Stop embarrassing yourself. Mindfactory is irrelevant.

Mindfactory sold 2k GPUs in 2 weeks. In that period, around 2 million desktop dGPUs were sold. It's a small, very low-volume subset that is so biased it's not representative of anything.

We have the numbers from Jon Peddie Research, and you can see AMD's earnings for yourself. Nvidia is at 93%+.

The PS6 may be a high-end machine, but as we saw with the PS4 Pro/Xbox One X, developers are going to optimise their games primarily around the configurations that high-end users have
Nonsense; look at the PS5 Pro. Developers keep putting out patch jobs. Nothing major.

I am not saying developers will ignore the PS6; I am saying no game will require a PS6 or be designed around having it, especially since the PS6 handheld exists and the PS5 will be supported. Smh.

8GB will be low end next gen. It's entry level now.
 
I'm skeptical that Nvidia's "95% market share" is going to gaming consumers. Data shows AMD and 16GB cards besting Nvidia and 8GB cards at retail.



RTX Neural Texture Compression is designed for high-end cards, which will come with far more than 8GB of VRAM. It's for developers to push even more detail within the same high VRAM budget.

Remember that games are still designed around the memory constraints of Xbox Series S.

The PS6 may be a high-end machine, but as we saw with the PS4 Pro/Xbox One X, developers are going to optimise their games primarily around the configurations that high-end users have. As heavier ray tracing models are embraced, the PS5 will be running with low textures/low resolutions at 9GB of VRAM, and the Series S will be dropped. 12GB+ will soon be mandatory for non-soupy textures.

This "data" is pure copium based on sales in one shop in one country.
 