STALKER 2 - PC System Requirements Revealed

Textures/assets. Diablo 4, for example, will fill up your VRAM and then start grabbing system RAM to store more. The game even warns that you need 32GB of system RAM for the Ultra 4K texture setting.

that must be a massive jump in texture quality then.

because if a game suggests 16GB of system memory, it will usually use maybe 6-8GB at most; the 16GB is simply given as a requirement so there's a big enough buffer that users won't run into issues with programs running in the background, and of course because Windows uses a lot of it as well.

so if 16GB is recommended here for Medium and 32GB is recommended for High, either the game has a ridiculously big jump in asset quality between these two settings, or the safety buffer for Medium is really damn small.

Cyberpunk 2077, I think, uses at most 9GB of system memory on absolutely maxed-out settings with Psycho RT.

and even 9GB of memory used by a game is usually not an issue if you have 16GB... at least if you don't run OBS, two game launchers, and 100 Chrome tabs at the same time.


my bet is that they are being overly cautious here, just so they don't get any bad feedback if someone does have issues.
and I bet that even fully maxed out, the game will run fine with 16GB
 
and I bet that even fully maxed out, the game will run fine with 16GB
Yeah, we won't be at a spot where most of even the most graphically intensive games will need more than 16GB of ram for a while yet. Unless the highest of high settings here are so significantly above that of the Series consoles, that it requires considerably more ram, I'm sure 16 will be just fine.
 
For Nvidia Ada owners willing to upscale and frame gen it, these can be helpful to see, too:

[chart: GeForce RTX desktop GPU performance, 3840x2160, NVIDIA DLSS 3]

[chart: GeForce RTX desktop GPU performance, 2560x1440, NVIDIA DLSS 3]

[chart: GeForce RTX desktop GPU performance, 1920x1080, NVIDIA DLSS 3]

[chart: GeForce RTX laptop GPU performance, 2560x1440, NVIDIA DLSS 3]
Something's REALLY off about these numbers
 
Why? It will play it at decent settings with DLSS; it's still a beast.
Because it feels like not too long ago I could get much better advertised performance at 1440p. I'm not saying anything is out of the norm, since everything moves so fast, but it just feels like a 2x4 in the face when I see system requirements these days. It feels like, in the blink of an eye, I went from 120fps@1440p to barely 60fps.
 
This is definitely one of my biggest reasons for wanting to build a new computer setup once the 5000-series GPUs are out. The requirements don't seem crazy, but I hope the game feels polished out of the gate... I still remember the original game.
 
Looks to be CPU limited as fuck around 80FPS. Not a good sign at all...
Even at 4K. 10% faster than 4080s? 50% faster than a 4070? It should be much higher than that.

UE5 too. This game has disastrous PC performance written all over it.
 
Even at 4K. 10% faster than 4080s? 50% faster than a 4070? It should be much higher than that.

UE5 too. This game has disastrous PC performance written all over it.

Yep. Are they testing with a 7800X3D or 9800X3D? A 14900K?

Without frame gen this could be unplayable on many CPUs.
 
Even at 4K. 10% faster than 4080s? 50% faster than a 4070? It should be much higher than that.

UE5 too. This game has disastrous PC performance written all over it.

It doesn't actually say, but maybe it's RT performance? Perhaps the RT features are poorly implemented, and that's why the numbers are weird.
 
Looks similar performance-wise to Wukong; just like that game, I bet not playing on max gets some big performance gains.
 
This still does not explain how there is only a 1 FPS difference in 1440P between the 4080 and 4090.
How so? If the CPU/IO is unable to feed the GPU more than X frames, it won't go higher. UE5 sadly isn't really strong on the CPU side; you are very much limited by single-core performance, given that one core is the "main" thread and the others are auxiliary. UE5 is no id Tech 7, sadly.
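To illustrate the point (a toy model, not anything engine-specific): the frame rate you see is capped by whichever stage is slower, so once the CPU is the bottleneck, a faster GPU barely moves the number. The fps values below are hypothetical, chosen to mirror the ~85fps cap being discussed here.

```python
# Toy bottleneck model: presented fps is limited by the slower of the
# two stages (CPU preparing frames vs. GPU rendering them).
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# Hypothetical numbers: if the CPU can only prepare ~85 frames/s,
# stepping up from a 4080-class to a 4090-class GPU changes nothing.
print(effective_fps(85, 95))   # CPU-bound: 85
print(effective_fps(85, 130))  # still 85, the GPU headroom is wasted
```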
 
How so? If the CPU/IO is unable to feed the GPU more than X frames, it won't go higher. UE5 sadly isn't really strong on the CPU side; you are very much limited by single-core performance, given that one core is the "main" thread and the others are auxiliary. UE5 is no id Tech 7, sadly.
UE5 also looks much better than id Tech 7, sadly.

Both Indy and the new Doom are nowhere near the best UE5 games graphics-wise.
 
The 4080 here is almost double compared to the 4070. So 50% is believable.
How does this post make sense? The 4090 is over double the performance of the 4070 in this chart. Clearly, if it's only 50% faster, the difference is only half as big as it usually is.
 
It also look much better than id tech 7, sadly.

Both indy and new doom are nowhere near the best ue5 games graphic wise.
True, but that was more about the technical side of things, the core architecture. Graphically, UE5 is better.
 
How so? If the CPU/IO is unable to feed the GPU more than X frames, it won't go higher. UE5 sadly isn't really strong on the CPU side; you are very much limited by single-core performance, given that one core is the "main" thread and the others are auxiliary. UE5 is no id Tech 7, sadly.
How is it able to reach 85 FPS at 1080p when CPU-limited, but only 74 FPS at 1440p? That's not how a CPU limit works. The 4080 and 4090 can't both be at around 74 FPS if the CPU can handle at least 85, obviously.

About id Tech: at least it runs buttery smooth, unlike UE. There is no other engine that runs this smooth except Source, I would say.
 
How is it able to reach 85 FPS at 1080p when CPU-limited, but only 74 FPS at 1440p? That's not how a CPU limit works. The 4080 and 4090 can't both be at around 74 FPS if the CPU can handle at least 85, obviously.
I didn't see that other graph; you are correct. I'm on my phone currently. However, it could be due to Nanite, which is obviously more demanding as the resolution grows.
 
For Nvidia Ada owners willing to upscale and frame gen it, these can be helpful to see, too:

[chart: GeForce RTX desktop GPU performance, 3840x2160, NVIDIA DLSS 3]

[chart: GeForce RTX desktop GPU performance, 2560x1440, NVIDIA DLSS 3]

[chart: GeForce RTX desktop GPU performance, 1920x1080, NVIDIA DLSS 3]

[chart: GeForce RTX laptop GPU performance, 2560x1440, NVIDIA DLSS 3]
Lol, what did I fucking tell you... low system requirements MY ASS.
They are running it on a 4090 at sub-1080p with fucking lame frame generation and getting 85fps.
Is it CPU limited? Frame gen is not CPU limited, is it?
 
Lol, what did I fucking tell you... low system requirements MY ASS.
They are running it on a 4090 at sub-1080p with fucking lame frame generation and getting 85fps.
Is it CPU limited? Frame gen is not CPU limited, is it?
It's 85fps without frame generation at native 1080p. Flipping on DLSS 3 toggles DLSS Quality + Frame Generation, and the fps doubles as a result.
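As rough arithmetic (assuming, as the chart seems to show, that frame generation inserts one interpolated frame per rendered frame, i.e. close to a 2x multiplier on presented fps):

```python
# Frame generation roughly doubles the presented frame rate by inserting
# one interpolated frame between every pair of rendered frames.
# In practice the multiplier is a bit under 2x due to overhead; this
# sketch ignores that, and also ignores the extra rendered-fps gain
# that DLSS Quality upscaling itself would add on top.
def presented_fps(rendered_fps: float, frame_gen: bool = True) -> float:
    return rendered_fps * 2 if frame_gen else rendered_fps

print(presented_fps(85, frame_gen=False))  # native 1080p: 85
print(presented_fps(85))                   # with frame gen: ~170 on the chart
```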
 
It's 85fps without frame generation at native 1080p. Flipping on DLSS 3 toggles DLSS Quality + Frame Generation, and the fps doubles as a result.
Ah ok, I am a fucking idiot. This graph sucks ass.
Still seems more demanding than the sys requirements, no?
 
Ah ok, I am a fucking idiot. This graph sucks ass.
Still seems more demanding than the sys requirements, no?
Yes, a lot more. The page says Epic settings, which I assume are max settings, and 4K needs a 4080, but even a 4090 doesn't cut it. The official requirements probably include DLSS.
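If the official specs really were measured with DLSS (an assumption, since the requirements page doesn't say), the internal render resolution is much lower than the listed output resolution. DLSS Quality typically renders at 1/1.5 of the output resolution per axis:

```python
# Internal render resolution under DLSS Quality, which typically uses a
# 1/1.5 per-axis scale factor (an assumption here; the game's exact
# scaling mode is not stated on the requirements page).
def dlss_quality_internal(width: int, height: int) -> tuple[int, int]:
    scale = 1 / 1.5
    return round(width * scale), round(height * scale)

print(dlss_quality_internal(3840, 2160))  # (2560, 1440): "4K" rendered internally at 1440p
print(dlss_quality_internal(2560, 1440))  # (1707, 960)
```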
 
PC masterracists realizing they might have to turn down a few settings to run at 60fps will never not be funny to me... Unless a game has path tracing, the difference between High and Ultra settings is pretty fucking unnoticeable nowadays, unless you zoom in 400 times or play on a 70-inch TV.
Anywho, it's UE5, so it doesn't inspire much confidence, but I'll give them the benefit of the doubt for now. This apparent CPU heaviness may actually be warranted due to all the simulations the game might do in a huge open world; after all, the first three games are pretty well known for their worlds being quite dynamic with or without player influence, so I expect no less from this one.
 
this apparent CPU heaviness may actually be warranted due to all the simulations the game might do in a huge open world

the game supposedly runs at 60fps on Series X...

if that laptop grade Zen2 can hit 60fps, I don't really think there's anything to worry about.
CPU requirements on these spec sheets are always far off and often even entirely nonsensical.

the fact that Medium at 1080p and High at 1440p suddenly jumps not only from Zen 2 to Zen 3, but also from 16GB to 32GB of RAM, only makes sense if the High preset has heavy ray tracing and a ridiculous jump in asset quality.

I doubt that either is true here, so I assume these specs are not really anything to go by.
 
The devs just confirmed it: no pre-load on PC. A bummer considering the game is 150GB big.
What's interesting is that you can pre-load on Xbox even if you didn't buy the game or don't have Game Pass. The complete opposite of what PC folks are (not) getting.

So, for example, you can download it now instead of waiting for a retail copy and then having to download the stuff that didn't fit on the disc.
 
People are getting banned on the Steam forums for giving out info the community manager said on Discord.

DLSS or other upscalers are required for those settings.
 


People are getting banned on the Steam forums for giving out info the community manager said on Discord.

DLSS or other upscalers are required for those settings.



[attached image: xIUXNEW.jpeg]


The dev community manager responded in regards to the ban, but not the upscaling, which means what the banned user said is true. I'm still day one if it runs decent; I don't care about its most likely poor RT implementation, even though DF will drool over it as if it's Jesus coming back.
 
Too heavy for me at the moment; 160GB is a lot.
Why is it so big? Is there that much dialogue/audio and animation in the game?
 
Have we got a peep on when the review embargo will lift?
I think someone said 2 hours before the game's launch.

All signs point to this being a problematic launch, unfortunately: trying to hide potential negativity that would make people cancel their pre-orders.
 
Eesh, those Nvidia numbers sure are interesting. I'll be holding off until nearer the holidays to grab this anyway; I'm sure it will have had a bunch of patches by that point.

I'm sure my 3080, 12700K and 32GB DDR4 will be just fine at 1440p high settings.
 
if that laptop grade Zen2 can hit 60fps, I don't really think there's anything to worry about.

I wonder how big the gap between console settings and maxed out on PC is.
If a 14900K is limited to 85 FPS (Nvidia 1080p chart), then it must be big, because it's more than twice as fast as the console CPUs.
Also, this isn't even with hardware Lumen yet, which will be patched in later on PC.
 