
[Digital Foundry] PlayStation 5 Pro Hands-On: 11 Games Tested, PSSR and RT Upgrades Revealed, Developers Interviewed!

winjer

Gold Member
Here is BG3. Max settings at 1080p and max settings at 720p in the Lower City in Act 3. It's 100% CPU limited there. RTX 4090 + i9-13900K + 32GB DDR5-6000.

3LQ4K6d.png

ZMju370.png


Absolutely no difference in performance whatsoever. Resolution has no impact on the CPU unless specific aspects of the rendering are tied to it, as winjer mentioned. I still don't understand how PSSR will alleviate CPU bottlenecks, but I guess we'll have to wait and see if @PaintTinJr is onto something or not.

I have a suggestion. Use RTSS and enable per-core CPU monitoring.
That way we can see exactly what the CPU usage is at each resolution, in that scene where the frame rate is identical regardless of resolution.
My bet is that CPU usage will be very similar.
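
If you'd rather log it than eyeball an overlay, a quick Python sketch along these lines would do the same job (a minimal sketch; psutil, the sample count, and the filename are my assumptions, nothing RTSS-specific):

    # Log per-logical-core CPU usage once per second to a CSV.
    # Run one pass at 1080p and one at 720p, then diff the two logs.
    import psutil

    SAMPLES = 60  # one minute of data

    with open("cpu_log.csv", "w") as f:
        cores = psutil.cpu_count()
        f.write("t," + ",".join(f"core{i}" for i in range(cores)) + "\n")
        for t in range(SAMPLES):
            usage = psutil.cpu_percent(interval=1.0, percpu=True)  # blocks for 1s
            f.write(f"{t}," + ",".join(f"{u:.1f}" for u in usage) + "\n")

If the busiest core reads the same at both resolutions, the scene is CPU-bound.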
 

ChiefDada

Gold Member
He doesn’t just mention it without evidence; he shows that even on the latest patch, the 7800X3D is running at under 60fps in one scene due to the CPU overhead in this title with RT enabled. As he rightly points out, I wouldn’t trust the Jedi Survivor devs with this, considering their track record. It’s not a problem with the hardware in this example; it’s a problem with the developers.

And my point is that's not the be-all and end-all of evidence to apply here, because it's focused purely on the PC environment. The CPU has many more tasks and much more overhead to juggle on PC compared to console. And EA are clearly confident that RT won't bottleneck the 60fps mode. They worked on the game; they know how it runs. Alex doesn't.

In Quality Mode, we offer our highest-ever console resolution at 2160p. At the same time, Performance Mode continues to deliver a solid 60 fps—but now with higher resolutions and the added visual depth of ray tracing for reflections and ambient occlusion. The new PSSR upscaling ensures a sharper visual experience in both modes.
 

Zathalus

Member
And my point is that's not the be-all and end-all of evidence to apply here, because it's focused purely on the PC environment. The CPU has many more tasks and much more overhead to juggle on PC compared to console. And EA are clearly confident that RT won't bottleneck the 60fps mode. They worked on the game; they know how it runs. Alex doesn't.
Yes, PCs and consoles are different, but on no planet will the PS5 CPU outperform a 7800X3D. And trusting a press blurb from EA? They were confident in the launch version as well; how did that pan out for them exactly? But maybe the game will get additional optimisation before launch.
 

kevboard

Member
Here is BG3. Max settings at 1080p and max settings at 720p in the Lower City in Act 3. It's 100% CPU limited there. RTX 4090 + i9-13900K + 32GB DDR5-6000.

3LQ4K6d.png

ZMju370.png


Absolutely no difference in performance whatsoever. Resolution has no impact on the CPU unless specific aspects of the rendering are tied to it, as winjer mentioned. I still don't understand how PSSR will alleviate CPU bottlenecks, but I guess we'll have to wait and see if @PaintTinJr is onto something or not.

It clearly won't have any impact on CPU performance. It's wishful thinking from many that the Pro will make 30fps games run at 60fps.

For some games it might actually happen: games where the CPU limitations meant they would have been in the 45-50fps no man's land, where the small CPU bump might actually get them to 60fps, or allow for an unlocked mode that works well with VRR.
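
Napkin math, assuming frame time is simply whichever of the CPU or GPU stage is slower and that CPU work scales perfectly with clock speed (it won't, so treat this as the best case; the numbers are made up for illustration):

    # A frame takes as long as the slower of the two stages.
    def fps(cpu_ms, gpu_ms):
        return 1000.0 / max(cpu_ms, gpu_ms)

    cpu_ms = 1000.0 / 48   # CPU-limited to ~48fps -> ~20.8ms of CPU work per frame
    gpu_ms = 10.0          # GPU comfortably fast at the chosen resolution
    print(round(fps(cpu_ms, gpu_ms), 1))         # 48.0 at the base clock
    print(round(fps(cpu_ms / 1.10, gpu_ms), 1))  # 52.8 with a 10% CPU clock bump

So a ~10% clock bump alone turns ~48fps into ~53fps: enough for VRR, not for a locked 60.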

Almost every 30fps game this gen is CPU limited, or limited to 30fps by pretentious devs (Hellblade 2).
GPU limitations that keep games from hitting 60fps aren't really a thing this gen.
 

Lysandros

Member
Do we have official confirmation that the PS5 Pro CPU runs at a 10% higher frequency by default? As far as I know there isn't any official info about the machine's clock speeds, be it GPU or CPU (which is pretty nuts by the way; those are elementary specs).
 

Kangx

Member from Brazile

For F1, I took a look at this clip again. Oliver is kinda right though. Under the right circumstances, like in this video, it can make a huge difference.

Here are the differences I noticed in favor of the Pro. The reflections are so much better on the Pro. The cars look like they are floating on the PS5 compared to the Pro. There are pit crews on the side that are missing on the PS5. The ground looks clearer on the Pro due to much better AF. The white paint lines on the track are faded or completely gone in the Silverstone rain footage. Also, it's small, but image clarity is a bit better on the Pro too, if you look straight at the ROLEX lettering behind the 2D start light. The only downside, like Oliver stated, is image stability due to all those ray tracing effects when zoomed in, but it's hardly noticeable without the zoom.

How did Codemasters do it, when Polyphony Digital has barely managed to put in just ray-traced reflections while compromising image quality vs native 4K? And they are first party too.

The CPU on consoles is a big part of the ray tracing bottleneck, but both of these racing games seem pretty light on CPUs given they can hit 120fps. So in theory, Polyphony should be able to bring more improvements and ray tracing to the Pro, but they could not. For many, including myself, it's a huge disappointment, and I think that's warranted.
 
Yes PCs and consoles are different but on no planet will the PS5 CPU outperform a 7800x3D. And trusting a press blurb from EA? They were confident in the launch version as well, how did that pan out for them exactly? But maybe the game will get additional optimisation before launch.
I thought the current performance mode on console was 60fps 99% of the time?
 

Bojji

Member
I thought the current performance mode on console was 60fps 99% of the time?

It is, but that's without RT, and there is a massive difference in CPU usage between RT on and off.

Developers say a lot of things, so I wouldn't trust the EA devs that launched this game in such shitty condition. It took them a year to fix most of the issues, and still not all of them.
 

JaksGhost

Member
For F1, I took a look at this clip again. Oliver is kinda right though. Under the right circumstances, like in this video, it can make a huge difference.

Here are the differences I noticed in favor of the Pro. The reflections are so much better on the Pro. The cars look like they are floating on the PS5 compared to the Pro. There are pit crews on the side that are missing on the PS5. The ground looks clearer on the Pro due to much better AF. The white paint lines on the track are faded or completely gone in the Silverstone rain footage. Also, it's small, but image clarity is a bit better on the Pro too, if you look straight at the ROLEX lettering behind the 2D start light. The only downside, like Oliver stated, is image stability due to all those ray tracing effects when zoomed in, but it's hardly noticeable without the zoom.

How did Codemasters do it, when Polyphony Digital has barely managed to put in just ray-traced reflections while compromising image quality vs native 4K? And they are first party too.

The CPU on consoles is a big part of the ray tracing bottleneck, but both of these racing games seem pretty light on CPUs given they can hit 120fps. So in theory, Polyphony should be able to bring more improvements and ray tracing to the Pro, but they could not. For many, including myself, it's a huge disappointment, and I think that's warranted.
Codemasters' engine has had a ray tracing implementation since their 2021 entry, so they've had much more time to play with the technology. They're also just using settings that would've been used on a higher-tier equivalent graphics card, unlike Polyphony, who had the base PS5 as their target hardware and scaled down from there for PS4. Gran Turismo is also already filled with a huge number of options for the consumer, including VR support.
 

Kangx

Member from Brazile
In the new DF video, Oliver confirmed the Jedi Survivor footage was running at 1080p internal resolution with PSSR upscaling. This mode brings back RTGI and RT reflections. Alex let out a surprised "oh wow!", then proceeded to doubt the PS5 Pro's ability to hold 60fps because of the heavy RT and the insignificant CPU upgrade, despite the devs saying otherwise. I guess we'll see soon, but at some point this guy has to start accepting that the console setup isn't the same as PC.
I think this is an interesting trade-off: this upgrade versus raising the res to 1440p and using PSSR up to 4K.

1440p with PSSR can look a bit better than 4K with TAA on consoles. 1080p with PSSR will definitely look softer, though. Most console gamers sit farther away, so the benefit of comparable 4K isn't worth it versus better lighting and reflections.

The question is, will 1080p with PSSR look better than, or comparable to, the 30fps mode in image clarity?
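
For scale, the raw pixel budgets behind that trade-off (simple arithmetic, nothing PSSR-specific):

    # How much of a native 4K frame each internal resolution provides.
    total_4k = 3840 * 2160
    for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
        print(f"{name}: {w * h:,} px, {w * h / total_4k:.0%} of native 4K")
    # 1080p: 2,073,600 px, 25% of native 4K
    # 1440p: 3,686,400 px, 44% of native 4K

So the upscaler has to invent three of every four pixels from 1080p, versus a bit over half from 1440p.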
 
Last edited:
Do we have official confirmation that the PS5 Pro CPU runs at a 10% higher frequency by default? As far as I know there isn't any official info about the machine's clock speeds, be it GPU or CPU (which is pretty nuts by the way; those are elementary specs).

I don't think it's gonna be the default. It's still using AMD SmartShift, so if you put 10% more on the CPU, the GPU clock is lowered.

Sooner or later the real clocks will be known... :D
 

Lysandros

Member
I don't think it's gonna be the default. It's still using AMD SmartShift, so if you put 10% more on the CPU, the GPU clock is lowered.

Sooner or later the real clocks will be known... :D
That was not the case for the PS5, where there was ample power to run both at their max frequencies most of the time, according to Cerny's technical presentation. AMD SmartShift was there to divert any unused power from the CPU to the GPU, as a 'complementary' algorithm to their custom variable clocks solution. So the CPU's 3.5 GHz was more like the default value. We knew the frequencies for the PS5 from the get-go. The PS5 Pro situation seems quite odd to me in this regard.
 
Last edited:
That was not the case for the PS5, where there was ample power to run both at their max frequencies most of the time, according to Cerny's technical presentation. AMD SmartShift was there to divert any unused power from the CPU to the GPU, as a 'complementary' algorithm to their custom variable clocks solution. So the CPU's 3.5 GHz was more like the default value. We knew the frequencies for the PS5 from the get-go. The PS5 Pro situation seems quite odd to me in this regard.

My idea about the GPU clocks is that the default mode is 2.23 GHz, like the standard model: perfect for backwards compatibility with existing PS5 games.

You have a minimum of 2.18 GHz and a maximum of 2.35 GHz, so the standard 2.23 GHz sits right in that range.

If the 3.85 GHz CPU clock is called "High CPU Frequency Mode", I don't think it's gonna be the default mode.

But hey, I'm just guessing
 

Kangx

Member from Brazile
It looks really good for an internal 1080p res upscaled to 4K with PSSR, even in this crappy YouTube capture.
DV2gX16.jpeg
Yeah, I went back and checked. I think going for 1080p with ray tracing is the right call on this one.

The PS5 fidelity mode is 1152p, which is barely higher than 1080p. So essentially they just replaced FSR with PSSR. The image quality looks set to be substantially better than the fidelity mode.
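
The gap really is tiny in raw pixels (assuming both are 16:9 frames):

    px_1152p = 2048 * 1152   # 2,359,296
    px_1080p = 1920 * 1080   # 2,073,600
    print(f"{px_1152p / px_1080p - 1:.0%} more pixels than 1080p")  # 14%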

Bonus, lol: in the water section at the end, the fidelity mode drops down to around 10fps. Let's see how the Pro handles that.
 
Last edited:
I don't think it's gonna be the default. It's still using AMD SmartShift, so if you put 10% more on the CPU, the GPU clock is lowered.

Sooner or later the real clocks will be known... :D

To confirm this, from the Tom Henderson leak:

In High CPU Frequency Mode, more power is allocated to the CPU and will downclock the GPU by around 1.5%, resulting in roughly 1% lower GPU performance.
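
Taking those leaked figures at face value (rumoured numbers, not official specs), the trade looks like this:

    cpu_base, cpu_high = 3.5, 3.85        # GHz; +10% in "High CPU Frequency Mode"
    gpu_max = 2.35                        # GHz; rumoured max Pro GPU clock
    gpu_high_cpu = gpu_max * (1 - 0.015)  # ~1.5% downclock per the leak
    print(f"CPU: {cpu_base} -> {cpu_high} GHz (+{cpu_high / cpu_base - 1:.0%})")
    print(f"GPU: {gpu_max} -> {gpu_high_cpu:.2f} GHz (~1% frame-rate cost)")

A cheap trade if a game is CPU-bound; a pointless one if it isn't.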
 

sachos

Member
Jedi Survivor footage was running at 1080p internal resolution with PSSR upscaling. This mode brings back RTGI and RT reflections.
Matches pretty well with my predictions based on PC performance. Look at the 1080p native Ultra + RT performance on similar cards. This is amazing if they can get a locked 60.
 

ChiefDada

Gold Member
Matches pretty well with my predictions based on PC performance. Look at the 1080p native Ultra + RT performance on similar cards. This is amazing if they can get a locked 60.

These TechPowerUp benchmarks look strange sometimes. A 3070 is able to achieve a 56fps minimum even though those settings supposedly require 10GB+, i.e. 2GB in excess of its framebuffer?
 
Matches pretty well with my predictions based on PC performance. Look at the 1080p native Ultra + RT performance on similar cards. This is amazing if they can get a locked 60.
There is no such thing as a locked 60fps. They said it runs as well as the base performance mode.
 

sachos

Member
even though those settings supposedly require 10GB+, i.e. 2GB in excess of its framebuffer?
Yeah, not exactly sure how to read that data. Maybe it indicates that if the card had more memory it would have performed even better.

There is no such thing as a locked 60fps. They said it runs as well as the base performance mode.
I thought it was running pretty well as of the last patch.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
These TechPowerUp benchmarks look strange sometimes. A 3070 is able to achieve a 56fps minimum even though those settings supposedly require 10GB+, i.e. 2GB in excess of its framebuffer?

Yeah, not exactly sure how to read that data. Maybe it indicates that if the card had more memory it would have performed even better.
I didn't check for Jedi Survivor, but you have to be careful with those benchmarks because they often don't paint the full picture. In a lot of games now, instead of paging textures in and out of system RAM when VRAM runs out (which cripples performance), the game will lower the texture cache size (resulting in much more pop-in close to the camera, plus worse LOD and draw distances) or destroy textures altogether; there's a toy sketch of the idea at the end of this post. This results in no visible performance loss, but your game will look like dirt. It's a good way to let VRAM-starved cards maintain good performance.

Hey, look, the 8GB 3070 is destroying the 16GB 6800, right?

Rc46w0t.png


Look at the ground textures lol.
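
A toy sketch of that policy, with made-up numbers and function names (not any real engine's API):

    def fit_texture_pool(vram_budget_mb, other_allocs_mb, pool_mb, min_pool_mb=512):
        # Halve the streaming texture pool until everything fits in VRAM,
        # rather than paging. Smaller pool -> top mips dropped, more pop-in.
        while other_allocs_mb + pool_mb > vram_budget_mb and pool_mb > min_pool_mb:
            pool_mb //= 2
        return pool_mb

    # 8GB card with ~6GB usable by the game, settings asking for a 4GB pool:
    print(fit_texture_pool(vram_budget_mb=6144, other_allocs_mb=3500, pool_mb=4096))
    # -> 2048: the frame rate stays high, but texture quality visibly drops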
 
Last edited:
I didn't check for Jedi Survivor, but you have to be careful with those benchmarks because they often don't paint the full picture. In a lot of games now, instead of paging textures in and out of system RAM when VRAM runs out (which cripples performance), the game will lower the texture cache size (resulting in much more pop-in close to the camera, plus worse LOD and draw distances) or destroy textures altogether. This results in no visible performance loss, but your game will look like dirt. It's a good way to let VRAM-starved cards maintain good performance.

Hey, look, the 8GB 3070 is destroying the 16GB 6800, right?

Rc46w0t.png


Look at the ground textures lol.


I never understood how the 3070 could be an 8 GB card, while the 3060 was a 12 GB card.
 

Gaiff

SBI’s Resident Gaslighter
I never understood how the 3070 could be an 8 GB card, while the 3060 was a 12 GB card.
It’s NVIDIA deliberately scamming their customers. They could have gone with 16GB, but that would have meant the card lasting too long for their liking. As for the 3060, sure, it’s 12GB (there’s also a lower-VRAM version), but it’s also a much weaker card that’s still slower than the 2070, so this hardly matters.

8GB on the 3070 is planned obsolescence. NVIDIA has been doing this for years: 2GB when AMD equivalents were giving 3-4GB, 4GB when AMD gave 8GB, 8GB vs 12-16GB, etc.
 
Last edited:
As for the 3060, sure, it’s 12GB (there’s also a lower-VRAM version), but it’s also a much weaker card that’s still slower than the 2070, so this hardly matters.

That's my point: why would a weaker card need 12 GB??

And then for the ones in the tier above, the 3070/3070 Ti, 8 GB should be enough instead???
 

Gaiff

SBI’s Resident Gaslighter
That's my point: why would a weaker card need 12 GB??
It doesn't, but given the memory configuration, NVIDIA only had the choice between 6GB and 12GB, and 6GB would have been a bad joke. So 12GB it is, but since the card is kind of weak it will fall by the wayside sooner anyway; even though its VRAM pool is huge relatively speaking, the rest of the card is quite slow. Plus, it's a segment where NVIDIA doesn't have huge profit margins, so they might as well attract customers there.
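
The arithmetic behind that choice, assuming the usual 32-bit GDDR6 channels and the 1GB/2GB chip densities available at the time:

    def vram_options(bus_width_bits, densities_gb=(1, 2)):
        chips = bus_width_bits // 32   # one GDDR6 chip per 32-bit channel
        return [chips * d for d in densities_gb]

    print(vram_options(192))  # RTX 3060, 192-bit bus: [6, 12] GB
    print(vram_options(256))  # RTX 3070, 256-bit bus: [8, 16] GB; NVIDIA chose 8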
And then for the ones in the tier above, 3070/3070 Ti, 8 GB should be enough instead???
Those would have needed at the very least 10GB, but ideally 12 or even 16. However, that would have meant they'd stay useful for years. The 3070 is effectively a crippled card in this day and age, and that's by design. It looked like an amazing deal at first: 2080 Ti performance two years later at half the price? Where do I sign? Less than a year later, it was already having VRAM issues.
 
Last edited:

SolidQ

Member
Look at the ground textures lol.
Not only the ground; the whole scene has bad textures on the 3070.

I never understood how the 3070 could be an 8 GB card, while the 3060 was a 12 GB card.
NV wants money. They make cards with low VRAM knowing that when the next gen arrives, customers are going to buy new ones.
 
Last edited:

RJMacready73

Simps for Amouranth
For F1, I took a look at this clip again. Oliver is kinda right though. Under the right circumstances, like in this video, it can make a huge difference.

Here are the differences I noticed in favor of the Pro. The reflections are so much better on the Pro. The cars look like they are floating on the PS5 compared to the Pro. There are pit crews on the side that are missing on the PS5. The ground looks clearer on the Pro due to much better AF. The white paint lines on the track are faded or completely gone in the Silverstone rain footage. Also, it's small, but image clarity is a bit better on the Pro too, if you look straight at the ROLEX lettering behind the 2D start light. The only downside, like Oliver stated, is image stability due to all those ray tracing effects when zoomed in, but it's hardly noticeable without the zoom.

How did Codemasters do it, when Polyphony Digital has barely managed to put in just ray-traced reflections while compromising image quality vs native 4K? And they are first party too.

The CPU on consoles is a big part of the ray tracing bottleneck, but both of these racing games seem pretty light on CPUs given they can hit 120fps. So in theory, Polyphony should be able to bring more improvements and ray tracing to the Pro, but they could not. For many, including myself, it's a huge disappointment, and I think that's warranted.

This is the one game I've seen that is a truly night and day difference between PS5 Quality and PS5 Pro mode; the PS5 version looks "gamey" whereas the Pro looks "realistic". If they can show more games where the lighting goes from gamey to realistic thanks to RTGI and other effects, it could sway me to drop the £££ on it, but I need to see proper tangible differences like F1 first; higher frame rates and more distant trees simply don't cut the mustard for me. I'm surprised GG didn't show off Horizon with RT lighting; as much as I love Forbidden West and its insane graphics, the lighting looks gamey.
 

Radical_3d

Member
Rift Apart is next on the video comparison list.

iD9GMht.jpeg
Interesting one, but when he's not zooming (and without my spectacles) I couldn't see the difference in most instances on my 4K TV. When there is movement, though, FSR breaking down is evident everywhere, so I guess the comparison is more valid in those takes, since a game is not about holding your character still and looking at a wall.
 

th4tguy

Member
I get caught up in the discourse of these technical comparisons and feel like the Pro is a must, but then I start FF7 Remake in performance mode and remember that, despite what the internet says, the game still looks great to me.
I thought that might change when I got my new TV this year, upgrading from a 2009 Bravia LCD, but it really hasn't.
(Bravia 9, 65")
 

proandrad

Member
I get caught up in the discourse of these technical comparisons and feel like the Pro is a must, but then I start FF7 Remake in performance mode and remember that, despite what the internet says, the game still looks great to me.
I thought that might change when I got my new TV this year, upgrading from a 2009 Bravia LCD, but it really hasn't.
(Bravia 9, 65")
Remake looks great, but Rebirth's resolution in performance mode is PS3 levels of blurry.
 

saintjules

Gold Member
Interesting one, but when he's not zooming (and without my spectacles) I couldn't see the difference in most instances on my 4K TV. When there is movement, though, FSR breaking down is evident everywhere, so I guess the comparison is more valid in those takes, since a game is not about holding your character still and looking at a wall.
Something like this I guess.

tLwrRK7.png
 
Last edited: