Where is Neural Texture Compression, Nvidia? We need you.
The Pragmata demo sold me on the game, but unfortunately my 1060 made it not worth playing on my current rig. It runs smoothly, but the graphics had to be set so low it looked like I was playing a PS3 game at 480p.
should still run better than the Switch 2 version on a 1060 tho.
Sure, but I'm not going to play Switch 2 levels of graphics on my just-upgraded PC (everything except the GPU) with a 32" monitor.
should be Series S level actually.
I do the same as you with a 3060 Ti, and at 1080p I have zero issues with performance. I will be upgrading in the future, but I don't really worry about it at all.
Lmfao
The article literally doesn't test or even mention performance at the resolution I'm using. It completely dodges the exact scenario I described: RTX 3070 8GB running every modern 2025/2026 release at 1080p, stable 50-60 FPS, no stuttering, no crashes, no texture pop-in. That's what you called 'inadequate for far too long'.
Next time link something that actually addresses my setup instead of proving you can't read past the headline.
Weak af
Looking at those RT reflections, it seems they're using a poor-quality variant; it really shouldn't be that bad.
I'm just comparing that to, say, Cyberpunk 2077 on the Pro, which has surprisingly high-resolution RT reflections.
Also, it doesn't seem to support RT shadows, and it needs path tracing to actually do decent shadows.
I guess my 10GB card is fine?
Let's cope together, brother.
It has been a problem since 2020, when consoles had 16 gigs, as they are the lowest common denominator. Capcom will be developing their games around having 16 gigs of VRAM.
another foolish developer that will be deleted sooner or later.
Wat
It doesn't matter if 8GB VRAM is enough or not.
8GB VRAM is all gamers can afford right now.
Meh, it depends on resolution. Some people are still okay with 1080p.
People that don't know what they are doing. AMD has offered cheaper cards with more VRAM for years, Intel as well. Nvidia also "by mistake" released the 12GB 3060 in 2021, which is the most popular GPU right now.
No, on consoles that's the full, shared memory. It also counts as regular RAM.
They offered substantially inferior cards for slightly less money. They are quite literally half a console gen to a full console gen behind in GPU architecture, like every gen. They then put them on maintenance driver support practically 2-4 years in. That's why RDNA2 is double-digit percent better on Linux than Windows. AMD froze the RDNA2 driver, for all intents and purposes, a long time ago. They just fix critical issues and security vulnerabilities.
AMD Cards are what we call "value traps". It's the appearance of value for frugal suckers without actual value.
You buy a worse GPU from a worse brand so you pay a lower price. Real gamers understand this. It's why 9070 XT ends up competing with 9070 / 5070 in the real world.
I swore I would never buy an AMD GPU after my X1950 Pro was put on "legacy driver" status before I knew it. I could play some newer, lighter games on my older/weaker Nvidia card that wouldn't even launch on the newer AMD one, because the drivers didn't support them.
No matter the inferior upscaling or RT, you still won't be VRAM limited on AMD cards.
Most consumers don't agree. It's a valid view held by a niche subset of real gamers.
There are raster games that require more than 8GB to function properly on console settings, like FFXVI.
PS5 rush ports.
Exactly.
Last time AMD was relevant was the ~5000 series, when they had around half of the market. Since then, Nvidia has been dominating, and this was long before DLSS or RT.
Equivalence fallacy.
8 gigs is not enough and manufacturers should not be offering it. They will keep selling supposedly mid-range GPUs with 8 gigs of RAM when the PS6 is out, even if that console has 24 gigs or more, if people defend it. Consoles should be seen as the minimum target, which they are for developers. It's not just about resolution; games will be designed around whatever the next generation has, and then Pragmata will be the norm. It's going to happen as it has done before, several times. Nothing new. The only difference is that people didn't defend GPUs with 512MB when the PS4 was out. That would be the equivalent of defending 8 gigs when the PS6 is out, and apparently that's next year.
And Maxwell offered no real advantages over AMD, no RT/ML-like stuff - it was just more efficient.
Yes.
they only have about 13GB usable by devs, and you can probably expect around 3 to 4 GB of that being used by the CPU and not as VRAM.
8/9 GB is going to be around for a while... because NV wants to hit price points and still have some margin.
Anyone who bought 8GB cards knowing consoles had 16GB is an idiot and needs to turn in their PC gaming card. I don't want you on my team. You are dumb and you should go buy a Switch.
But that's 16GB of total RAM. What's the difference between that and a PC with separate VRAM and system RAM?
No doubt, but it will set a precedent for the next generation. I am skeptical, but if the next consoles are anywhere near 30 gigs it will 100% cause problems. Not going to bet on an exception. I had a 4870 XT back in the day with 2 gigs and it was perfectly fine during the Xbox 360/PS3 era but died when the PS4 came out. My GeForce 4 became useless the generation before that. One thing I agree with Digital Foundry on was the discussion around Doom: The Dark Ages. People don't accept having to upgrade anymore, which was the norm with PCs. Maybe it will be different this time with the Steam Deck etc. I still don't agree with Nvidia offering it just because they can.
512MB would be way below the minimum requirements of games from that era. The correct equivalent would be 2GB, since that amount was barely cutting it.
The problem is path tracing. That's what Alex was talking about.
The demo looked and ran great at 4K (I think with one of the DLSS settings, but I can't remember) and over 60 FPS on my 8GB 3070 Ti. I don't know how much the later parts of the game push things.
You're moving the goalposts tho.
The fact is that devs already have to optimise for systems with around 8 to 9 GB of VRAM. The consoles don't have 16GB of VRAM, only realistically 8 to 9, due to a massive portion being taken up by the OS and another sizable chunk by the CPU.
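Back-of-the-envelope version of that budget (a minimal sketch; the OS reservation and CPU-side figures below are commonly cited assumptions, not official numbers):

```python
# Rough current-gen console memory budget (all figures are assumptions, not official specs)
total_memory_gb = 16.0  # PS5 / Series X unified GDDR6 pool
os_reserved_gb = 3.0    # portion reserved for the OS (~2.5-3.5 GB commonly cited)

usable_by_devs_gb = total_memory_gb - os_reserved_gb   # ~13 GB available to the game
cpu_side_gb = 3.5                                       # game logic, audio, streaming buffers (assumed 3-4 GB)
effective_vram_gb = usable_by_devs_gb - cpu_side_gb     # ~8.5-9.5 GB left for GPU data

print(f"Usable by devs: {usable_by_devs_gb:.1f} GB")
print(f"Effective 'VRAM' budget: {effective_vram_gb:.1f} GB")
```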
It sucks that most gaming PCs will struggle with Pragmata.
Switch 2 master race?
Nvidia margin is law at GeForce.
Means more VRAM => Higher MSRP. Inescapably.
In the past they had to compete with AMD and rely on gaming for profits. Neither of those is the case now, so the margins will not come down.
Then there's an opening for AMD to undercut and deliver value.
Except that the opening is fighting against a massive deficit in engineering competence. "Your margin is my opportunity" only works when technical competence is equalized. Furthermore, tape-out and mask set costs are rising, and they add fixed costs that hurt a lot more when you sell a fraction of the volume.
No one needs most of Nvidia's features.
They want Nvidia features because the features are genuinely revolutionary and industry-shaking. The idea that for less than 1 ms you can generate 15/16 pixels artificially at native-like quality and motion fluidity is absolutely nuts. Same for Nvidia's RTX neural texture compression.
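For what it's worth, the 15/16 figure falls out of simple arithmetic if you assume DLSS Performance upscaling combined with 4x multi-frame generation (that combination is my assumption here, not something stated above):

```python
# Where "15 of 16 pixels are generated" comes from (assumed DLSS Performance + 4x MFG)
upscale_factor = 4    # Performance mode renders ~1/4 of the output pixels (e.g. 1080p -> 4K)
frame_gen_factor = 4  # 4x multi-frame generation: 1 rendered frame per 4 displayed frames

rendered_fraction = (1 / upscale_factor) * (1 / frame_gen_factor)  # 1/16 of displayed pixels
generated_fraction = 1 - rendered_fraction                         # 15/16 are reconstructed

print(f"Natively rendered: {rendered_fraction:.4f}  (1/16)")
print(f"Generated/upscaled: {generated_fraction:.4f}  (15/16)")
```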
RDNA4 and beyond are actually competent at upscaling and RT.
Competent, yet still architecturally worse than Ada.
PS6 having 30GB (24GB+ VRAM) will set the standard.
It will set the standard for nothing mainstream. It's a dedicated gaming-only device with a >$750 initial price and likely well over $1500 in total cost of ownership over its lifetime.
8GB is just useless for driving 4K displays.
Nonsense. Most older games will work, and many games will just have pop-in or acceptable slowdowns. 4K gaming is just not that mainstream; it's more $500 GPU territory.
People aren't in the market for new GPUs/consoles to play old games. New GPUs come with the expectation of better graphics, and 1440p/4K TVs and monitors assist with this.
GeForce is just so much better. Just compare the 9070 XT (N4C, 357 mm², 640 GB/s) and the RTX 4080 Super (N5B, 379 mm², 736 GB/s). The 4080 Super quite literally wins on all metrics, including being half a console generation ahead architecturally.
8GB is fine. You're acting like it's a problem that a $300 GPU has compromises of any sort.
If you say that acceptable slowdown = 10-20 FPS with awful stuttering...
Not really the norm. I can tell you don't play on PC.
The PS5 Pro or even Xbox Series X's "outdated" AMD RDNA GPUs have been besting new Nvidia RTX cards.
They don't. XSX / PS5 are strictly inferior to an RTX 3060 in every way, including VRAM.
Not MSRP. It's a $600 card vs a $1000 card.
Yes, they have similar costs, but Nvidia provides substantially more value. That was my point: if one looks at the silicon and the memory, and considers that Ada is late 2022 and RDNA4 is early 2025, it's disgraceful that RDNA4 is so much worse than Ada.
The 9070 XT should be compared to Nvidia's 4070 Super or 5070 cards.
Yes, and that's how it is in reality.
51% of the Steam hardware survey still use 1080p.
My 3080 Ti is path tracing everything thus far, precisely because I can live with 1080p. I can't see myself moving on from it. Whatever extra power I get, I'll want it in effects.