
[Digital Foundry] PlayStation 5 Pro Hands-On: 11 Games Tested, PSSR and RT Upgrades Revealed, Developers Interviewed!

Rudius

Member
That moment when you realize this is the new sales pitch between hardware iterations.

FF7R-2.jpg
EX2poBh.jpeg
 
Sadly, I don't think we're going to see a single game outside of Sony built with the Pro in mind. The best bet would be some tacked-on "extras", which will be nice, but developers are notorious for not putting effort into niche configurations.

I'm still at a loss as to why they couldn't just put in a slightly larger CPU of the same variant to give a 30% boost, which would have allowed almost all games to reach 60fps. The 10% in the Pro isn't even going to be felt in most titles.

It also would've allowed the Pro to actually get good RT performance out of the 2-3x increase, since RT relies to a large degree on the CPU. Now devs have to choose FPS or RT... sucks.
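As a rough illustration of that argument (purely hypothetical minimum framerates, and assuming a fully CPU-limited scene scales linearly with CPU throughput, which real games only approximate):

```python
# Back-of-the-envelope only: assumes a fully CPU-limited scene where framerate
# scales linearly with CPU throughput, which real games only approximate.
def fps_after_cpu_uplift(base_fps: float, uplift: float) -> float:
    """Projected framerate after a fractional CPU speedup (0.10 = +10%)."""
    return base_fps * (1.0 + uplift)

for base in (40, 46, 52):  # hypothetical CPU-limited minimum framerates
    print(f"{base} fps -> +10%: {fps_after_cpu_uplift(base, 0.10):.0f} fps, "
          f"+30%: {fps_after_cpu_uplift(base, 0.30):.0f} fps")
# A 46 fps dip only reaches ~51 fps with +10%, but ~60 fps with +30%.
```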
 

Zathalus

Member
Since when do we need to take into account the absolute minimum framerate in order to get into 60fps territory? Show me what CPU is needed to get a locked 60fps on PC here; is there even such a CPU? Oddly, we only do that on consoles when talking about those "30fps" games. The game is already mostly 60fps in the other areas.
A locked 60fps can probably be done with a 7600 or 13400. It's mostly the Act 3 city area in Baldur's Gate that's the problem. But the PC video was only there to show how heavy that specific area is on the CPU.
 

PaintTinJr

Member
Here is BG3, max settings at 1080p and max settings at 720p in the Lower City in Act 3. It's 100% CPU limited there. RTX 4090 + i9 13900K + 32GB DDR5 6000MHz.

3LQ4K6d.png

ZMju370.png


Absolutely no difference in performance whatsoever. Resolution has no impact on CPU unless there are specific aspects of the rendering tied to it, as @winjer mentioned. I still don't understand how PSSR will alleviate CPU bottlenecks, but I guess we'll have to wait and see if PaintTinJr is onto something or not.
Why use max settings?

What the fps graph comparison in the DF video (Ryzen 3600 + RTX 4090 vs i9-12900K + RTX 4090) showed is that the frame-rates tracked each other at a constant offset, indicating the draw calls are only being prepared by a single core of each CPU. Using max settings just compounds that problem, because the effect of lowering resolution isn't really being tested if the max-settings workload is already the bottleneck.
 

lh032

I cry about Xbox and hate PlayStation.
Here is BG3, max settings at 1080p and max settings at 720p in the Lower City in Act 3. It's 100% CPU limited there. RTX 4090 + i9 13900K + 32GB DDR5 6000MHz.

3LQ4K6d.png

ZMju370.png


Absolutely no difference in performance whatsoever. Resolution has no impact on CPU unless there are specific aspects of the rendering tied to it, as @winjer mentioned. I still don't understand how PSSR will alleviate CPU bottlenecks, but I guess we'll have to wait and see if PaintTinJr is onto something or not.
but the ps5 pro will not be running the game at "max settings"
 

lh032

I cry about Xbox and hate PlayStation.
Why not? The base PS5 runs the game at max settings. The only scaling the game has is 1440p native for Quality and 1440p FSR 2 quality for Performance. All other settings are a match for the PC Ultra settings.
Is it confirmed that the base PS5 is running the game at PC max settings?
 
Are we going to get any decent Pro footage from TGS? GT7 and FFVII Rebirth are supposed to be playable there

Review units will probably go out to publications either before the next pre-order phase on 10/10 or before launch on 11/7.

I'm still surprised that some specific games weren't highlighted as receiving pro updates

Cyberpunk, Wukong, Baldur's Gate, Elden Ring, FF16.

Elden Ring should probably get a locked 4K60 via Boost Mode; it'll be interesting to see if it can handle the RT mode at 60. FF16 had DRS, but I think some things are locked feature-wise to fidelity mode.

All of these games should benefit from boost mode, but could have gone further.
 

lh032

I cry about Xbox and hate PlayStation.
Yeah, that’s what DF found in the analysis, everything is a match for Ultra.
Is it the performance mode or the quality mode? I'm actually interested in whether the Pro can improve the framerate in the performance mode.
 
Here is BG3, max settings at 1080p and max settings at 720p in the Lower City in Act 3. It's 100% CPU limited there. RTX 4090 + i9 13900K + 32GB DDR5 6000MHz.

3LQ4K6d.png

ZMju370.png


Absolutely no difference in performance whatsoever. Resolution has no impact on CPU unless there are specific aspects of the rendering tied to it, as @winjer mentioned. I still don't understand how PSSR will alleviate CPU bottlenecks, but I guess we'll have to wait and see if PaintTinJr is onto something or not.
Not GPU limited at 1080p using... a 4090. This is DF logic at its best here.

Show me the same spots with a 2700 or a 5700 XT, please. Then we'll really know if it's 100% limited by the CPU or if the GPU (or bandwidth) plays some part, however small, in the framerate.
 

CloudShiner

Member
Review units will probably go out to publications either before the next pre-order phase on 10/10 or before launch on 11/7.

I'm still surprised that some specific games weren't highlighted as receiving pro updates

Cyberpunk, Wukong, Baldur's Gate, Elden Ring, FF16.

Elden Ring should probably get a locked 4K60 via Boost Mode; it'll be interesting to see if it can handle the RT mode at 60. FF16 had DRS, but I think some things are locked feature-wise to fidelity mode.

All of these games should benefit from boost mode, but could have gone further.
I'd love Cyberpunk 2077 to get a Pro patch. You're saying even if it doesn't, then the dynamic resolution (I always played in performance mode) will still be higher than on regular PS5?
 

lh032

I cry about Xbox and hate PlayStation.
For both; the only thing that changes is that instead of native res it switches to FSR2 quality for the performance mode.
I'm very interested in the performance comparison between the base PS5 and the PS5 Pro now.
Either there will be no improvement at all, or there will be an increase in framerate.
 

SenkiDala

Member
WHY the fuck are those videos always "90% of the time is our faces and 10% is the games"? I'd rather see the gameplay loop again and again than those destitute faces. I don't mind how people look, how gaming journalists look, I want to see the games. This is a video comparing the GRAPHICS of PS5 and PS5 Pro; nobody asked to see your faces for 45 minutes... Damn...
 

PaintTinJr

Member
Alex was stress testing the game, so it makes sense that his FPS was lower. However, the point is that lowering the resolution to 720p, while decreasing settings to their minimum and likely lowering the DLSS cost, increased performance by only 19%.

What more evidence do you need to see to conclude that the game is largely CPU limited in the city in Act 3? And moreover think about the hypothetical case where no performance patches had been released. What do you think the Pro's chances of hitting a locked 60 would have been in that case?
But in your 19% video they were only testing patches at that resolution, not comparing the 3600 CPU with two different GPUs.

From the multitude of videos, we saw that day-1 code on a Ryzen 3600, while dropping resolution to 720p on a weaker RTX 3070 (with DLSS to 1440p), outperformed an RTX 4090 at just 1080p on the same CPU by 10fps in the same troubled area.

Meaning that the resolution reduction had more of a positive impact than the difference between a 3070 and a 4090, which suggests the GPUs aren't able to parallelize their work well, and the RTX 4090 appears handicapped by its caches, VRAM or PCIe bus: the modest 2x increase in resolution didn't even let it match the much weaker 3070, but instead left it 10fps behind.
 
I'd love Cyberpunk 2077 to get a Pro patch. You're saying even if it doesn't, then the dynamic resolution (I always played in performance mode) will still be higher than on regular PS5?

That's right. The performance mode pushes for 60 fps, so it will reduce the resolution in order to stick the landing on that (and it still drops in frame rate). The resolution should be significantly more stable and the game hopefully runs at a smooth 60. DF mentions that they thought some of it was CPU related, so without a patch there might still be some dips.

It's interesting to me that more games aren't getting day 1 patches, but I'm also interested to see what simple Boost mode does as well.

It should be a pretty instructive exercise to see the difference between Pro Enhanced and Boosted games. A lot of games have unlocked frame rates or dynamic resolution, so we should get a better sense of just how big the GPU difference is from the unpatched games. Once you throw in PSSR, you're in apples-to-oranges territory.
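For anyone unfamiliar with how dynamic resolution behaves under Boost, here's a minimal sketch of a generic DRS controller; it is not CDPR's actual implementation, just the common feedback-loop idea that explains why a faster GPU settles at a higher resolution even without a patch.

```python
TARGET_MS = 1000.0 / 60.0   # 60 fps frame budget

def next_render_scale(scale: float, last_frame_ms: float,
                      lo: float = 0.5, hi: float = 1.0) -> float:
    """Nudge the render scale so frame time stays under budget (illustrative only)."""
    if last_frame_ms > TARGET_MS:          # missed budget -> drop resolution
        scale *= 0.95
    elif last_frame_ms < TARGET_MS * 0.9:  # comfortable headroom -> raise it
        scale *= 1.02
    return max(lo, min(hi, scale))

# On a faster GPU (e.g. the Pro running base-PS5 code in boost mode) frames finish
# sooner, so this loop settles at a higher scale -- hence a higher average resolution
# even without a dedicated Pro patch. CPU-side dips are untouched by this loop.
```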
 

Bojji

Member
I thought PS5 was Ultra for everything? Did that change?

It's on max settings in quality mode, you are right:

BmYeGIR.jpeg


What I posted was to show the difference in fps between the highest and lowest settings in a CPU-limited scenario.

But some people just can't comprehend what a CPU limit is and what it looks like, so it's a waste of time...
 

PaintTinJr

Member
Lowest vs. highest:

TLVAzuq.jpeg
And that right there shows that the cost of issuing the extra GPU work takes 25% of the CPU performance. At 720p the GPU isn't limited in rendering all the FX on that card, so we should have seen zero performance degradation if it were CPU-limited by game logic or simulation; instead the CPU incurs more draw-call preparation and loses 25% of the frame-rate, meaning the draw-call preparation is at least one major bottleneck.
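One way to sanity-check a comparison like the screenshot above is to convert the framerate gap into milliseconds per frame. The figures below are placeholders, not the numbers in Bojji's capture:

```python
# Placeholder numbers only (the real values are in the screenshot above); the point
# is that fps deltas are only comparable once expressed as per-frame milliseconds.
def frame_ms(fps: float) -> float:
    return 1000.0 / fps

low_settings_fps, max_settings_fps = 80.0, 60.0   # hypothetical CPU-limited results
extra_ms = frame_ms(max_settings_fps) - frame_ms(low_settings_fps)
print(f"Extra per-frame cost at max settings: {extra_ms:.1f} ms")   # ~4.2 ms
# Whether those milliseconds go to draw-call preparation, crowd simulation or
# animation is exactly what a settings A/B on its own cannot isolate.
```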
 

Zathalus

Member
And that right there shows that the cost of issuing the extra GPU work takes 25% of the CPU performance. At 720p the GPU isn't limited in rendering all the FX on that card, so we should have seen zero performance degradation if it were CPU-limited by game logic or simulation; instead the CPU incurs more draw-call preparation and loses 25% of the frame-rate, meaning the draw-call preparation is at least one major bottleneck.
That is not just a comparison of graphical settings. The lowest setting disables dynamic crowds and lowers animation quality, both of which are CPU-related performance options, especially in the city area, which has hundreds of NPCs.
 

yamaci17

Member
It's on max settings in quality mode, you are right:

BmYeGIR.jpeg


What I posted was to show the difference in fps between the highest and lowest settings in a CPU-limited scenario.

But some people just can't comprehend what a CPU limit is and what it looks like, so it's a waste of time...
Yeah, not to mention that if a game's code is borked on the CPU, hardly any setting will make a noticeable difference.
It's just in vain.
 

FireFly

Member
But in your 19% video they were only testing patches at that resolution, not comparing the 3600 CPU with two different GPUs.

From the multitude of videos, we saw that day-1 code on a Ryzen 3600, while dropping resolution to 720p on a weaker RTX 3070 (with DLSS to 1440p), outperformed an RTX 4090 at just 1080p on the same CPU by 10fps in the same troubled area.

Meaning that the resolution reduction had more of a positive impact than the difference between a 3070 and a 4090, which suggests the GPUs aren't able to parallelize their work well, and the RTX 4090 appears handicapped by its caches, VRAM or PCIe bus: the modest 2x increase in resolution didn't even let it match the much weaker 3070, but instead left it 10fps behind.
The change with the biggest positive impact was neither lowering the resolution and settings, nor swapping out the GPU. Indeed the 3070 was seemingly "outperforming" the 4090 across the two benchmarks even at 1440p with DLAA!

The change with the biggest impact was swapping out the CPU, which roughly doubled the frame rates on the 4090. Even if you're right that the 4090 is underperforming in some specific way in CPU limited scenarios, that's not relevant to the question at hand which is whether Act 3 is CPU limited in the first place! So can you finally agree, based on the evidence provided that Baldur's Gate 3 is predominantly CPU limited in the Act 3 city areas?
 

PaintTinJr

Member
That is not just a comparison of graphical settings difference. The lowest setting disables dynamic crowds and lowers animation quality, both of which are CPU related performance options, especially in the city area which has hundreds of NPCs.
They are just supplying more GPU data - assuming it needs to be streamed - and more draw calls, because animation blending with quaternions has been hardware-accelerated since the PSP added that feature to its PS2-esque portable hardware. The number of collision tests isn't changing massively in that 10fps-drop screenshot, because the number of NPCs is the same or almost the same from manually counting what I think are and aren't NPC models.

So that comparison only differs in the GPU data and draw calls being prepared by the CPU for the GPU, and it's hardly two generational jumps beyond AC2 (or later) running on the single-core, 2-way (SMT/HT) PPU of a PS3 Cell BE CPU, so it doesn't justify a Ryzen 3600 struggling to hit 60fps with any modern GPU at 720p.

At some point you just have to say that the developers are better game designers than game developers, and that technical incompetence is on show for the PC release.

Which brings me back to a question you asked me many pages ago that I didn't answer (what will happen with the Pro version?), and my answer is that they'll probably get help from PlayStation to ship it in a quality state running at 60fps on the Pro.
 

Zathalus

Member
They are just supplying more GPU data - assuming it needs to be streamed - and more draw calls, because animation blending with quaternions has been hardware-accelerated since the PSP added that feature to its PS2-esque portable hardware. The number of collision tests isn't changing massively in that 10fps-drop screenshot, because the number of NPCs is the same or almost the same from manually counting what I think are and aren't NPC models.

So that comparison only differs in the GPU data and draw calls being prepared by the CPU for the GPU, and it's hardly two generational jumps beyond AC2 (or later) running on the single-core, 2-way (SMT/HT) PPU of a PS3 Cell BE CPU, so it doesn't justify a Ryzen 3600 struggling to hit 60fps with any modern GPU at 720p.

At some point you just have to say that the developers are better game designers than game developers, and that technical incompetence is on show for the PC release.

Which brings me back to a question you asked me many pages ago that I didn't answer (what will happen with the Pro version?), and my answer is that they'll probably get help from PlayStation to ship it in a quality state running at 60fps on the Pro.
The reason the CPU cost is so high in the city area is that every single object in the game is simulated in a very large area around the player (well outside of visual range). You can leave a trail of explosives through the entire city and blow up everything at once, and that brings even a 14900K to well under 20fps. Hence the dynamic crowds and animation setting having such a large performance impact, as it influences not just the NPCs you can actually see.

Looking at how DD2 has very little (if any) gain on the Pro when it comes to the CPU, I highly doubt this would reach 60fps in Act 3.
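A toy illustration of why simulating everything in a wide radius is so expensive; the entity counts and per-entity cost below are invented, not Larian's numbers:

```python
# Invented numbers, purely to show the scaling: simulating a whole district costs
# roughly (entity count) x (per-entity cost) per frame, visible or not.
PER_ENTITY_US = 8            # hypothetical CPU cost per simulated object, microseconds

def sim_ms(entity_count: int) -> float:
    return entity_count * PER_ENTITY_US / 1000.0

visible_only = 300           # rough guess at on-screen NPCs/objects
whole_district = 4000        # everything kept "live" well outside visual range

print(f"visible only: {sim_ms(visible_only):.1f} ms, "
      f"whole district: {sim_ms(whole_district):.1f} ms per frame")
# 2.4 ms vs 32 ms -- the latter alone blows a 16.7 ms (60 fps) frame budget,
# regardless of resolution or GPU.
```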
 

Gaiff

SBI’s Resident Gaslighter
Not GPU limited at 1080p using... a 4090. This is DF logic at its best here.
No, it's PC logic 101 that every reputable reviewer like GN, HU, Debauer, and everyone else uses.
Show me the same spots with a 2700 or a 5700 XT, please. Then we'll really know if it's 100% limited by the CPU or if the GPU (or bandwidth) plays some part, however small, in the framerate.
It's no longer just about CPU if you go down to a 2700. You go down from DDR5 to DDR4, you get fewer PCIe lanes, and everything gets a lot slower. Clearly, if it was GPU-limited at 1080p, the performance would increase substantially at 720p. It doesn't because the GPU isn't the limiting factor.

7ISPotD.png


I sure hope you don't think a 4090 is the reason it tanks to 40-46fps here. Sorry for the overexposed photos. Fucking Windows HDR messes up the pictures.
 

Taycan77

Neophyte
After reading through the thread, it seems to me that the old CPU is not a big deal for the PS5 Pro. The number of CPU-limited games this gen (as far as hitting 60 fps goes) can be counted on one hand. If the CPU were the be-all and end-all, then no game that shipped with a 60 fps mode on XSX/PS5 should have lacked a 60 fps mode on XSS. Heck, Tom Warren expected the XSS to outperform the PS5 (in fps) due to its higher-clocked CPU. However, that never panned out, and many third parties simply skip 60 fps modes on XSS.
More often than not, whenever we see a title with inconsistent framerates, there are engine/dev issues. Time and again we hear how a particular title is too ambitious for current hardware, only to see significant performance gains months down the line. We are also told the CPU is the overwhelming culprit for poor FPS, only to see the XSS often capped at 30FPS, indicating there is far more to it.

Still, it's all largely irrelevant as so few titles designed for consoles are CPU limited.

I have to say I'm pretty impressed with what I'm seeing from PS5 Pro and this will become even more obvious as games become more demanding. The whole 8K thing is a bit of a meme at this point, but it does highlight the headroom a number of AAA titles have. I'm excited to see how devs utilise the hardware and features. Some won't make much effort, but even PSSR utilisation will make a dramatic improvement to some titles. Everyone talks about the major titles, but on PS4 Pro some of the most interesting updates came from AA and Indie devs.
 

PaintTinJr

Member
Because that's what the consoles use and we want to maximize the load on the CPU. Max settings add more NPCs and other effects that also require more CPU horsepower.
But it proves nothing about the PC version, which is independent of the console and uses different hardware bindings. The discussion about the game on PC and lowering resolution - separate from the discussion of PSSR lowering resolution on the Pro - was about whether it is CPU bound, and max settings just distort that claim by becoming the bottleneck themselves, preventing the lower resolution from showing that the game can run faster on the same CPU.
 

Gaiff

SBI’s Resident Gaslighter
But it proves nothing about the PC version, which is independent of the console using different hardware bindings. The discussion about the game on PC and lowering resolution - independent of PSSR lowering resolution discussion on Pro - was that it is CPU bound, when max settings just distorts the claim by becoming the bottleneck to stop the lower resolution showing that it can run the game faster with the CPU being the same.
Dude, you're not making any sense. Rendering the game at 1080p or 720p sees no difference in performance, and you're here arguing that max settings are the cause? What kind of nonsense is that? If the GPU had been the limiting factor, chopping down the resolution to 720p would have increased the performance. Resolution is often the single biggest thing hitting the GPU. Max settings in BG3 won't bog down a 4090 so much that it can't break past 120fps at 720p. This is ridiculous.

Whether or not this applies to PSSR, I don't know, but resolution has no impact on CPU performance unless under very specific circumstances. Every hardware reviewer when running CPU tests will pair the most powerful GPU on the market with the test system and usually run the games at 1080p. BG3 in my example is CPU-limited and there's no way to argue that.

I have no idea how PSSR which is strictly related to the resolution will help CPU performance, but you obviously know something that the rest of us don't. Those discussions are often annoying because people are presented with mountains of evidence, but all they do is stonewall. They never provide their own footage or proof to defend their claims. You've been saying a lot of stuff over the last few pages, but provided nothing tangible to back you up. You simply deny what's there on unfounded grounds. If at least you put in some effort to show us proof of your claims, this would be a lot better. As it stands, you simply stonewall.
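For what it's worth, the resolution-scaling test described above boils down to something like this; the 10% threshold is arbitrary, the logic is the point:

```python
# A GPU-limited scene speeds up when you cut pixel count; a CPU-limited one barely moves.
def likely_bottleneck(fps_high_res: float, fps_low_res: float,
                      threshold: float = 0.10) -> str:
    gain = (fps_low_res - fps_high_res) / fps_high_res
    return "GPU-limited" if gain > threshold else "CPU-limited"

print(likely_bottleneck(fps_high_res=52, fps_low_res=53))   # CPU-limited (the BG3 case above)
print(likely_bottleneck(fps_high_res=52, fps_low_res=78))   # GPU-limited
```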
 

Bojji

Member
Dude, you're not making any sense. Rendering the game at 1080p or 720p sees no difference in performance, and you're here arguing that max settings are the cause? What kind of nonsense is that? If the GPU had been the limiting factor, chopping down the resolution to 720p would have increased the performance. Resolution is often the single biggest thing hitting the GPU. Max settings in BG3 won't bog down a 4090 so much that it can't break past 120fps at 720p. This is ridiculous.

Whether or not this applies to PSSR, I don't know, but resolution has no impact on CPU performance unless under very specific circumstances. Every hardware reviewer when running CPU tests will pair the most powerful GPU on the market with the test system and usually run the games at 1080p. BG3 in my example is CPU-limited and there's no way to argue that.

I have no idea how PSSR which is strictly related to the resolution will help CPU performance, but you obviously know something that the rest of us don't. Those discussions are often annoying because people are presented with mountains of evidence, but all they do is stonewall. They never provide their own footage or proof to defend their claims. You've been saying a lot of stuff over the last few pages, but provided nothing tangible to back you up. You simply deny what's there on unfounded grounds. If at least you put in some effort to show us proof of your claims, this would be a lot better. As it stands, you simply stonewall.

I think he is trolling us at this point.
 

PaintTinJr

Member
Dude, you're not making any sense. Rendering the game at 1080p or 720p sees no difference in performance, and you're here arguing that max settings are the cause? What kind of nonsense is that? If the GPU had been the limiting factor, chopping down the resolution to 720p would have increased the performance. Resolution is often the single biggest thing hitting the GPU. Max settings in BG3 won't bog down a 4090 so much that it can't break past 120fps at 720p. This is ridiculous.

Whether or not this applies to PSSR, I don't know, but resolution has no impact on CPU performance unless under very specific circumstances. Every hardware reviewer when running CPU tests will pair the most powerful GPU on the market with the test system and usually run the games at 1080p. BG3 in my example is CPU-limited and there's no way to argue that.

I have no idea how PSSR which is strictly related to the resolution will help CPU performance, but you obviously know something that the rest of us don't.
The CPU-to-GPU workload is the bottleneck, as we can clearly see from all the videos of the PC with different configs. As you reduce the CPU-to-GPU work by dialling back the resolution and the settings that need additional CPU instructions that block the CPU, you then see the frame-rate rise.
 

Gaiff

SBI’s Resident Gaslighter
The CPU-to-GPU workload is the bottleneck, as we can clearly see from all the videos of the PC with different configs. As you reduce the CPU-to-GPU work by dialling back the resolution and the settings that need additional CPU instructions that block the CPU, you then see the frame-rate rise.
Resolution has no impact on CPU performance unless under specific circumstances. Otherwise, you'd better tell the whole tech community that they've been conducting their tests incorrectly for years and that you know better, even without data to support yourself.

This debate has grown stale because it's just stonewalling on your part. No point in continuing this.
 

onQ123

Member
For BG3 those launch comparisons are pretty obsolete, as the game has received significant performance patches since launch. For PC CPU performance they had an update here:



For consoles here:



The 3600 goes up to the low 40s; the PS5 went from the low 20s to the low-to-mid 30s. Basically, Act 3 is still a killer on the CPU, albeit not as much as before. I'm not sure how the Pro can take 32fps all the way up to a locked 60, as the game is certainly not GPU limited. Even using the 10% faster CPU and lowering the internal resolution to 720p via PSSR, that's not going to increase your CPU performance by 90%. Maybe if another impactful performance patch rolls out.

The only PS5 games I'm aware of that have issues with CPU performance are BG3, DD2, and Space Marine 2. Talking about stuff people care about, not shit like Gotham Knights. For upcoming releases, maybe MH Wilds and KC Deliverance, but that's an unknown. So it's a tiny number of games and certainly not indicative of the PS5 library as a whole.


I believe the changes to the cache and the dual-issue compute will do more than people think.
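For reference, the arithmetic behind the ~90% figure in the quote above:

```python
# Going from ~32 fps to a locked 60 fps in a CPU-limited area needs the CPU part of
# the frame to shrink by almost half, assuming it scales cleanly (it rarely does).
cpu_limited_fps = 32.0
target_fps = 60.0
required_speedup = target_fps / cpu_limited_fps - 1.0
print(f"Required CPU speedup: {required_speedup:.0%}")   # ~88%
# Whatever a ~10% clock bump doesn't cover would have to come from the cache and
# dual-issue changes (onQ123's bet) or from further game patches.
```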
 
Yeah, massive difference. Hitman shows how much it can change the look of the game:



24E0gXb.jpeg
rOUX3bF.jpeg


SSR can look good at times, but on mirrors and bodies of water it completely ruins the immersion. I cannot ignore the fact that the SSR reflections fade when the camera moves vertically (up and down).
 

Lysandros

Member
The Ryzen 3600 is certainly not enough to match PS5 CPU performance. It has fewer CPU cores and it also has to work a lot harder to make up for the lack of a decompression chip. If the game is built around the PS5 decompression chip (like the TLOU1 remake), the 3600 cannot run such a port well.



Dips below 60fps, almost 100% CPU usage due to decompression, resulting in stuttering.

I cannot understand why Digital Foundry keeps using the 3600 in their PS5 comparisons. They mislead people because that CPU is not up to the job.

Not to play DF's advocate here (I would be one of the last people to do so), but... some important facts to remember: the R5 3600 can clock higher than the PS5's CPU, 4.2 GHz vs 3.5 GHz. That's a substantial difference of 20%. 1 to 1.5 cores of the PS5's CPU are reserved exclusively for the OS and cannot be accessed by the game engine as a direct performance resource, so it's essentially 12 vs 12 threads. The R5 3600 also has four times as much L3 cache as the PS5 CPU, 32 MB vs 8 MB. The PS5 performing better than the R5 3600 in properly optimized games in CPU-bound scenarios is the direct result of the inherent advantages of a fixed-spec APU platform, a lower-level API and the dedicated I/O hardware, not of the CPU hardware in itself.

Consoles performing better than close/equivalent PC hardware should be the expected result and should be presented as such. It would be unfair and faulty methodology to use even more powerful CPUs to skew the results. In fact, DF has an even closer-matching CPU (literally the XSX CPU), which most often performs noticeably worse than the R5 3600, and I think it should be the one used more often to make a closer, hardware-based comparison.
 

ChiefDada

Gold Member
In the new DF video, Oliver confirmed the Jedi Survivor footage was running at 1080p internal resolution with PSSR upscaling. This mode brings back RTGI and RT reflections. Alex let out a surprised "oh wow!", then proceeded to doubt the PS5 Pro's ability to hold 60fps because of the heavy RT and the insignificant CPU upgrade, despite the devs saying otherwise. I guess we'll see soon, but at some point this guy has to start accepting that the console setup isn't the same as PC.
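Quick pixel-count arithmetic for the mode described above (just the raw ratio; PSSR's own runtime cost isn't included):

```python
# Rendering internally at 1080p and reconstructing to 4K means shading a quarter of
# the output pixels per frame; the freed GPU time is what pays for RTGI/RT reflections.
internal_pixels = 1920 * 1080
output_pixels = 3840 * 2160
print(f"Internal/output pixel ratio: {internal_pixels / output_pixels:.2f}")   # 0.25
# None of this helps the CPU side, which is where Alex's scepticism about a locked 60 comes from.
```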
 
In the new DF video, Oliver confirmed the Jedi Survivor footage was running at 1080p internal resolution with PSSR upscaling. This mode brings back RTGI and RT reflections. Alex let out a surprised "oh wow!", then proceeded to doubt the PS5 Pro's ability to hold 60fps because of the heavy RT and the insignificant CPU upgrade, despite the devs saying otherwise. I guess we'll see soon, but at some point this guy has to start accepting that the console setup isn't the same as PC.
Like always, typical Bugaga nonsense.
clint-eastwood-gran-torino.gif
 

Bojji

Member
In the new DF video, Oliver confirmed the Jedi Survivor footage was running at 1080p internal resolution with PSSR upscaling. This mode brings back RTGI and RT reflections. Alex let out a surprised "oh wow!", then proceeded to doubt the PS5 Pro's ability to hold 60fps because of the heavy RT and the insignificant CPU upgrade, despite the devs saying otherwise. I guess we'll see soon, but at some point this guy has to start accepting that the console setup isn't the same as PC.

Jedi is still very heavy on the CPU with RT even after many, many patches, so we will see. Before they turned RT off in the performance mode on consoles, the fps was dropping like a motherfucker and it was most likely caused by the CPU.
 

Fafalada

Fafracer forever
the fps was dropping like a motherfucker and it was most likely caused by the CPU.
We don't know that though - especially given that console RT is almost always GPU-limited, and RT is only CPU-heavy on console if you choose to make it so (i.e. if you have CPU cycles to spare, or you just couldn't be bothered).

Resolution has no impact on CPU performance unless under specific circumstances.
On most consoles - CPU/GPU memory contention means that the GPU completing frames faster (with free time in the frame) = better CPU performance. This could be anywhere from single-digit % of CPU performance to stupid things like 80-90% (two particular consoles had a broken memory subsystem that could cripple the CPU if you didn't manually throttle the GPU work).
So since higher resolution = more GPU time, that has always had an effect on CPU performance as far as consoles are concerned (a rough toy model is sketched below).
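A purely made-up toy model of that coupling; the formula and penalty values aren't from any console's documentation, they only show the direction of the effect:

```python
# Toy model only: CPU work stretches as the GPU occupies more of the shared memory
# bus during the frame. Penalty values are invented.
def effective_cpu_ms(base_cpu_ms: float, gpu_busy_fraction: float,
                     contention_penalty: float) -> float:
    return base_cpu_ms * (1.0 + contention_penalty * gpu_busy_fraction)

print(effective_cpu_ms(10.0, gpu_busy_fraction=1.0, contention_penalty=0.05))  # 10.5 (mild case)
print(effective_cpu_ms(10.0, gpu_busy_fraction=1.0, contention_penalty=0.8))   # 18.0 (broken case)
# Lower resolution -> GPU finishes earlier -> lower busy fraction -> the CPU gets
# some of that time back, which is the console-specific coupling described above.
```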

It doesn't - however, impact draw-calls in the way that has been debated in this thread. The specific scenarios where CPU/GPU fences are used for synchronization are - usually - not going to have the hardware idle when waiting on such sync points, and most of the CPU work submitting draw-calls will be done entirely async (so not limited by draw-call execution at all).
It 'may' however - run slower due to the first thing I mentioned, but - that affects all of CPU work, nothing specific to draw-calls.

The only way this can be true is if a game has an LOD system that is tied to screen resolution. So a higher resolution will call higher-detail LODs, which will have more objects and more complexity.
I mean - strictly speaking - this is the only correct* way to handle LOD (it should be based on screen coverage). It's how texture LOD has worked for as long as GPUs have been able to mipmap textures, so the last 30+ years, give or take. It's also at the heart of virtualized geometry approaches like Nanite, which do exactly this.
That said - virtualized geometry will draw more detail depending on screen-coverage - but it wouldn't really impact draw-calls (or their numbers).

*Yes - I am aware many games use much hackier methods like 'distance-based' or even worse. It's just something of a pet peeve of mine that this is still happening at all. It's a sad state of affairs that even many professionals in the tech industry still associate LODs with distance from the camera/viewer.
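A hedged sketch of what coverage-based LOD selection looks like, analogous to mip selection; the function names and thresholds below are invented for illustration:

```python
import math

def projected_radius_px(world_radius: float, distance: float,
                        fov_y_rad: float, screen_height_px: int) -> float:
    """Approximate screen-space radius (in pixels) of an object's bounding sphere."""
    return world_radius / (distance * math.tan(fov_y_rad * 0.5)) * (screen_height_px * 0.5)

def select_lod(coverage_px: float) -> int:
    """Pick detail from projected coverage; halving coverage drops one LOD, like mips."""
    if coverage_px >= 256: return 0   # full detail
    if coverage_px >= 128: return 1
    if coverage_px >= 64:  return 2
    return 3                          # lowest detail / impostor

# The same object at the same distance covers half as many pixels at 1080p as at 4K,
# so it lands on a coarser LOD -- resolution drives detail, not camera distance.
for height in (2160, 1080):
    px = projected_radius_px(world_radius=1.0, distance=10.0,
                             fov_y_rad=math.radians(60), screen_height_px=height)
    print(height, select_lod(px))   # 2160 -> LOD 1, 1080 -> LOD 2
```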
 

Zathalus

Member
In the new DF video, Oliver confirmed the Jedi Survivor footage was running at 1080p internal resolution with PSSR upscaling. This mode brings back RTGI and RT reflections. Alex let out a surprised "oh wow!", then proceeded to doubt the PS5 Pro's ability to hold 60fps because of the heavy RT and the insignificant CPU upgrade, despite the devs saying otherwise. I guess we'll see soon, but at some point this guy has to start accepting that the console setup isn't the same as PC.
He doesn't just mention it without evidence; he shows that even on the latest patch the 7800X3D runs at under 60fps in one scene due to the CPU overhead in this title with RT enabled. As he rightly points out, I wouldn't trust the Jedi Survivor devs with this, considering their track record. It's not a problem with the hardware in this example, it's a problem with the developers.
 