
[MLiD] PS6 & PSSR2 Dev Update

I've got no issue with using AIs like Copilot and ChatGPT in general, but proxying my point through one while obviously not understanding it yourself is not the same thing. You clearly don't grasp the significance of the advent of hardware-accelerated z/stencil buffering to 3D graphics: the last fragment to be shaded in a game frame can't complete, without tearing, unless it passes that test, and at low resolution that test becomes the main delay on the critical path once shader units and ROPs are abundant enough to accelerate the fragment shaders. That is hardly the same as me conducting the same discussion with ChatGPT myself and correcting it when it misunderstands the context, forgets that OpenGL/Vulkan are both client/server models, or forgets that client submission is single-threaded on the main CPU core.

It didn't even seem from your response like ChatGPT accepted that the primary CPU core and the GPU run in lockstep for interactive game logic/simulation: the GPU can't predict the future before the gamer has interacted with the present rendered frame. So if you think that wall of info covered up the lack of real game rendering knowledge in your previous comment, then whatever,
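For anyone following along, this is roughly the loop shape I mean; every name below is a made-up stand-in for illustration, not a real API:

```cpp
#include <cstdio>

// Toy single-threaded game loop, illustrating the lockstep described above:
// the primary CPU core can't simulate frame N+1 until the player's input
// against the currently presented frame is known, and (with v-sync) present
// blocks until the frame has fully cleared depth/stencil testing and scanout.
struct Input {};
Input poll_input() { return {}; }      // stand-in: read controller state
void simulate(const Input&) {}         // stand-in: game logic on primary core
void record_and_submit_draws() {}      // stand-in: client-side submission,
                                       // single-threaded on the main core
void present_blocking_on_vsync() {}    // stand-in: returns when frame is done

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        Input in = poll_input();        // depends on the frame shown *now*
        simulate(in);                   // so logic and rendering stay in lockstep
        record_and_submit_draws();
        present_blocking_on_vsync();    // critical path ends at the last fragment
        std::printf("frame %d complete\n", frame);
    }
    return 0;
}
```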

but it doesn't change the reality that a v-synced frame in a game can't correctly finish rendering, and so advance the CPU logic, until the very last fragment from the projected geometry passes or fails its z-buffer/stencil test. So at low resolution, as we tend towards zero, z-buffering/stencilling becomes the most important constraint on the critical path, even when combined with BVH hidden-surface removal lowering the overdraw to an optimal one and a half per pixel.
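To put rough numbers on that scaling (the resolutions are arbitrary; the 1.5 overdraw figure is the optimum mentioned above):

```cpp
#include <cstdio>

// Toy count of depth/stencil tests per frame: every rasterised fragment must
// pass this test before it can be shaded, so the work on this stage of the
// critical path scales with resolution x overdraw. Numbers are illustrative.
int main() {
    struct Res { const char* name; int w, h; };
    const Res targets[] = {{"1080p", 1920, 1080}, {"480p", 854, 480}, {"360p", 640, 360}};
    const double overdraw = 1.5; // ~optimal overdraw after BVH hidden-surface removal

    for (const Res& r : targets) {
        const double tests = 1.0 * r.w * r.h * overdraw;
        std::printf("%-6s ~%5.1f million depth/stencil tests per frame\n",
                    r.name, tests / 1e6);
    }
    return 0;
}
```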
I mean, ChatGPT 5.2 Codex Max, Claude Opus 4.5, and Gemini 3 Pro all agree you're factually wrong on quite a number of points. The advantage of work paying for a GitHub Copilot license is that using these advanced coding models is easy.

And, well, every YouTube video and piece of technical testing I can find shows the complete opposite as well. I've provided quite a few; curiously, you have not.

You should be a game developer; apparently Larian, CDPR, Ubisoft, Asobo, and a few others don't realize it's a simple matter of just lowering resolution to alleviate CPU bottlenecks. The Switch 2, despite having games upscaled from 360p, still faces CPU bottlenecks, and that's from some of the most technically competent game studios in the world.
 
Yet again, you're not following the context. When a game is designed for the hardware, the limiting factor on the critical path at low resolution will be the z-buffer/stencil test clearing the last pixel to be shaded on any given untorn frame, and even more so with cascaded shadow map updates, because the PS6 Portable probably won't use RT shadows.

All the other reasons you were putting forward come down to software not being tailored to the PS6 Portable's 4 cores: letting work that could be moved to an underutilised GPU, or subdivided, remain a bottleneck on the 3 other CPU cores that block the primary core's flow control, or leaving work on the GPU that exceeds the needs of the lower resolution.

You can use other studios as a strawman, but PlayStation is going to be the lead platform, even more so when the PS6 Portable hits. And if the PlayStation 6 uses a low-power mode as the baseline for the main console (for non-cross-gen games), those games will scale with resolution on the PS6 Portable, as was my original point, because all the other ChatGPT reasons will already have been moved off the critical path that blocks frame rates from rising above 30fps to 60.

But feel free to contradict yourself and continue the back and forth on this pretty simple issue.
 
One could say the Switch 2 doesn't exactly have a perfectly designed CPU either, eh.
 
Mate, you started it with your "lack of real first hand know how of graphics programming", which was hilarious alongside your claim that "so zbuffering and stencil test has never been done on the main CPU core in 3D games or the CPU's other cores for that matter", when Quake existed. Hence, you know, my entire original point: lowering the resolution used to increase FPS by lowering the load on the CPU, and, as I pointed out, that hasn't been the case for ages. It's like you didn't even read my original post.

It's frankly simple: lowering resolution frees up GPU resources, but it doesn't free up CPU resources. If you are CPU-capped at 1080p, be it via game logic, physics, animation, etc., then lowering the resolution or upscaling from 360p ain't really going to alleviate that. In short, lowering resolution does not increase frame rate if you are CPU-limited; it can only increase frame rate if you are GPU-limited on pixel/fragment work. That is factually true, and I can bet your next rebuttal will offer zero evidence to imply otherwise.
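In pseudo-numbers, that's the usual bottleneck model, frame time = max(CPU time, GPU time). A minimal sketch with invented costs:

```cpp
#include <algorithm>
#include <cstdio>
#include <initializer_list>

// Standard bottleneck model: the frame takes as long as the slower processor.
// Lowering resolution scales GPU pixel work down but leaves the CPU's
// per-frame cost (game logic, physics, animation) untouched.
int main() {
    const double cpu_ms = 20.0;          // hypothetical CPU cost per frame
    const double gpu_ms_at_1080p = 10.0; // hypothetical GPU cost at 1080p

    for (double res_scale : {1.0, 0.25, 0.1}) {           // fraction of 1080p pixels
        const double gpu_ms = gpu_ms_at_1080p * res_scale;
        const double frame_ms = std::max(cpu_ms, gpu_ms); // slower side wins
        std::printf("res scale %.2f -> %.1f fps (CPU %.0f ms, GPU %.1f ms)\n",
                    res_scale, 1000.0 / frame_ms, cpu_ms, gpu_ms);
    }
    return 0;
}
```

On these numbers the game is pinned at 50fps no matter how far the resolution drops, because the 20ms CPU cost never moves.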

It's been so obviously that way for ages now that I'm surprised anyone is arguing against it.
 
[GIF: waiting-hurry.gif]


It already is 2026 in Spain, Mark Cerny!! Your most important market!
 
I never claimed otherwise; the Switch 2 CPU is very limited. But it proves my point perfectly. CPU power between docked and handheld is basically identical, while GPU power is not: it's almost cut in half. Now, a game running in docked mode can face drops due to the CPU; Cyberpunk 2077 comes to mind, dropping all the way to 18fps. In docked mode the game runs at 1080p, upscaled from a DRS range of 720p-1080p. In the portable performance mode it upscales to 720p from a low of 360p via DLSS, and yet, what do you know, it still has the exact same CPU drops. Dropping the rendering resolution to almost a quarter, basically the minimum any game should render at, did nothing to alleviate the CPU bottleneck.

For the PS6 Portable, devs are just going to need to make cutbacks on the portable versions of games, as it will have half the number of CPU cores and a much reduced frequency versus the main PS6 console. Lowering resolution is not going to help the CPU at all (the GPU, yes). Game logic and simulation would need to take a cut, and that's exactly what we see happening with the Switch 2 as well. Beyond the obvious graphical cutbacks in SW Outlaws and AC Shadows, we also see reduced water and cloth simulation, as well as things like reduced animation detail compared to the PS5.
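For concreteness, the pixel ratio in that Cyberpunk example works out to roughly 9x (using the resolutions quoted above; a trivial sketch):

```cpp
#include <cstdio>

// Pixel-count comparison for the Cyberpunk 2077 example: the internal render
// resolution drops roughly 9x between the docked DRS ceiling and the handheld
// DLSS floor, yet the CPU-bound drops reportedly stay the same.
int main() {
    const long docked   = 1920L * 1080; // docked DRS ceiling, 1080p
    const long handheld = 640L * 360;   // handheld DLSS floor, 360p
    std::printf("docked: %ld px, handheld floor: %ld px, ratio: %.1fx\n",
                docked, handheld, static_cast<double>(docked) / handheld);
    // ~9x fewer pixels to rasterise and shade, with no change in the CPU-side
    // frame drops -> the bottleneck is not in the pixel pipeline.
    return 0;
}
```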
 
I think it's more likely for devs to take the 60 FPS settings for the PS6, cut internal res by 4x and frame rate by 2x and call it a day.
That's the most likely scenario, but I'm assuming the worst case: a very CPU-heavy, demanding game. Those should be rare, though. I think it depends a lot on what the clock speed difference between the two is.
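The back-of-the-envelope on that suggestion, counting only the pixel-proportional part of the GPU load:

```cpp
#include <cstdio>

// Back-of-the-envelope for "quarter the internal resolution, halve the frame
// rate": pixel work per frame shrinks ~4x and the frame budget doubles, so
// the pixel-proportional GPU load per unit time falls ~8x. Fixed per-frame
// costs (geometry, BVH builds, etc.) don't scale this way.
int main() {
    const double res_factor = 4.0; // 4x fewer internal pixels
    const double fps_factor = 2.0; // 30fps instead of 60fps -> double the budget
    std::printf("pixel-rate headroom: ~%.0fx\n", res_factor * fps_factor);
    return 0;
}
```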
 
You are moving the goalposts. How is a native-1080p PS6 Portable LED/OLED screen, being served by a lower-resolution render that PSSR2 ML-upscales to 1080p, going to be CPU-capped at 1080p native rendering? That's a strawman on top of it.

You are also going completely by PC, and as Bojji's first set of screenshots (the ones I could see in the UK) showed, there was no real change in VRAM at an artificial 50% resolution. Meaning the cascaded shadow maps, which are all part of lower resolution freeing up the critical path, were never reduced by 50%, which you would do on a portable: at that minification size you won't see shadow mapping issues as badly when PSSR2 is scaling up from 360p or 480p using a cascade of sparse 2K shadow maps, rather than the 8K or 16K's worth on the main PS6, or none at all if doing RT/PT shadows.

So, for any modern game you believe is fully CPU-limited at low resolution, presumably not using RT/PT, try turning shadow maps to lowest or off as well, and then see if the VRAM use drops as expected and the frame rate increases.
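To ballpark what that test should show, here's a sketch assuming 4 cascades of 32-bit depth; real engines vary in format, cascade count, and sparse allocation:

```cpp
#include <cstdio>
#include <initializer_list>

// Rough cascaded-shadow-map VRAM at different cascade resolutions, assuming
// 4 cascades of 32-bit (D32) depth. Treat these as order-of-magnitude figures.
int main() {
    const int cascades = 4;
    const int bytes_per_texel = 4; // D32 depth
    for (int size : {2048, 4096, 8192}) {
        const double mib = static_cast<double>(size) * size
                         * bytes_per_texel * cascades / (1024.0 * 1024.0);
        std::printf("%dx%d x %d cascades: ~%.0f MiB\n", size, size, cascades, mib);
    }
    return 0;
}
```

Going from 8K-class to 2K-class cascades frees on the order of a gigabyte in this toy configuration, which is the sort of VRAM delta the test above should surface.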

For PlayStation 4 or 5 ports using 6 CPU cores, I suspect they are indeed more CPU-limited, by porting decisions that push IO onto those cores. But for the rest, which are still mainly GPU-heavy and use 2-3 CPU cores, you should expect a massive lowering of (z-buffer) resolution to result in frames finishing much quicker, allowing the primary core to generate workload for more frames per second.
 

VRAM going up or down is purely game dependent; some games show big differences, some don't.

Screenshots with 720p output, 1080p output (both at 100% resolution) and 4K output (not imgur):


VRAM increases with the (GPU-limited) 4K output (and at native res).
 
Shadow map caches occupy a large amount of that base 6GB VRAM usage. Turning shadow maps off or to low, which is all part and parcel of the native resolution, makes a difference in any game not blocked by some slave-core CPU workload or a heavy GPU workload like RT/PT. I mean, if you are rendering at 720p, does the shadow map cache need to be more than 2K?
 
I'm not moving any goalposts. My very first post simply stated that lowering resolution to a massively low amount won't give you any meaningful performance back if you are already heavily CPU-limited. Which is still true.

Disabling other graphical effects (which may very well carry CPU overhead, like view distance, shadow-casting lights, or some forms of object/world detail) or trying to reduce the VRAM footprint is something completely different from what I am claiming. Stuff like that can indeed claw CPU (and GPU) performance back. But not just dropping the resolution, which is what I have been arguing all along.
 
[GIF: mark-cerny-ps5.gif]


He walks onto the stage and announces: "First game with PSSR2 support launches in December 2026".
I have no clue what this thread is about, but I have to mention this



I did play Death Stranding 2 on PS5 Pro just a week ago and indeed noticed that famous shimmering in foliage, along with certain other graphical issues like lights flickering when moving the camera. Interesting stuff, as this game doesn't even use ray tracing.

There's something seriously wrong with PSSR.
 


Interesting. IQ at launch was good overall, with some slight foliage problems and aliasing in some parts of the image. It wouldn't be the first game fucked up by a PSSR "update". Fucking devs, OPTIONS are needed for this stuff. Of course, I don't know if this is true or not.

Launch reconstruction had issues like that, for example:

 
Yeah, funnily enough, I didn't notice that kind of problem in the game. There's this comparison shared by that poster:



It looks similar to what I've experienced at certain times.
 