
PS5 Pro Specs Leak Is Real, Releasing Holiday 2024 (Insider Gaming)

320 INT8 TOPS is actually the same number as the 4070 Ti.

4070 SUPER has 284 INT8 TOPS

So it's a lot of TOPS then... Sony is extremely fussy about the silicon area and feature set of the APU; if they're adding that amount of ML performance, it could be a good sign for how performant PSSR will be.
 

Mr.Phoenix

Member
At this point every console gamer will take anything but FSR.
I personally feel that the biggest mistake of this gen was Sony and MS not having dedicated ML hardware in their consoles from launch. Even during the pre-launch speculation threads I used to say that reconstruction was the single biggest rendering feature of the last decade and that I expected everyone to be doing it. Then again... we have AMD to thank for this oversight.
 
And Serrano jinxed it


Oct release date confirmed

 

Loxus

Member
AMD Radeon RX 7900 XTX and XT Review: Shooting for the Top

AI Accelerators / Tensor Cores, INT8:
PS5 Pro - 300 TOPS
7700 XT - 70 TOPS
7800 XT - 75 TOPS
7900 XT - 103 TOPS
7900 XTX - 123 TOPS
3080 Ti - 273 TOPS
3090 Ti - 320 TOPS
4080 - 780 TOPS
4090 - 1321 TOPS

PS5 Pro making next-gen leaps in a mid-gen refresh.

Still waiting to hear what some of you think the PS5 and PS5 Pro RT is equivalent to.

Note:
The 4090's 1321 TOPS figure is still INT8, but with sparsity enabled, which gives a 2× improvement.
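To put that note in perspective, here's a minimal sketch that normalizes the sparse figures back to dense INT8 (assuming the 4080 entry in the list above is also quoted with sparsity, which the post doesn't state):

```python
# Convert sparsity-enabled INT8 TOPS back to dense throughput (sparsity = 2x).
sparse_int8 = {"RTX 4090": 1321, "RTX 4080": 780}  # figures from the list above
dense_int8 = {gpu: tops / 2 for gpu, tops in sparse_int8.items()}
print(dense_int8)  # {'RTX 4090': 660.5, 'RTX 4080': 390.0}

# On dense numbers, a ~300 TOPS part sits in 3090 Ti / 4070 Ti territory,
# not anywhere near the 4090.
print(300 / dense_int8["RTX 4090"])  # ~0.45
```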
 

ChiefDada

Gold Member
320 INT8 TOPS is actually the same number as the 4070 Ti.

4070 SUPER has 284 INT8 TOPS

Has anyone done comparisons to see how the figure scales with the quality of AI workloads such as DLSS upscaling, ray reconstruction, etc.? Seems like this is the successor to native pixel counting.

But a 4070 Ti level of AI throughput sounds awesome to me.
 

Loxus

Member
4070S in real world RT performance of actual games.
I said the same thing in a previous post. Actually, it was the 4070 Ti.
PS5 Pro RT should be around the 7800 XT / RTX 3070 Ti (33 fps), maybe the 4060 Ti (38 fps).

PS5 RT isn't better than the 6700 XT (13 fps), and the leak states 2-3 times better, sometimes 4×.

3× is most likely the average, which matches this test based on averages.
For example:
12 × 2 = 24 fps (worst case scenario)
12 × 3 = 36 fps (average)
12 × 4 = 48 fps (best case scenario)

Radeon RX 7800 XT reference review
[full path tracing benchmark chart]


The above benchmark is full path tracing, which properly tests RT performance.


Hybrid Raytracing is a bit more reasonable.
22 × 2 = 44 fps (worst case scenario)
22 × 3 = 66 fps (average)
22 × 4 = 88 fps (best case scenario)

That should put it around the 4070 Ti (65 fps), which is a crazy uplift if you think about it. It's definitely using RDNA4 RT, which probably gives a ~20% uplift in performance.

Radeon RX 7800 XT reference review
[hybrid ray tracing benchmark chart]
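A tiny sketch of the estimate above, treating the leak's "2-3 times better, sometimes 4" purely as multipliers on a PS5-class baseline (the 12 fps and 22 fps baselines are the ones quoted in this post, not official figures):

```python
def pro_rt_estimate(ps5_class_fps: float) -> dict:
    """Scale a PS5-class RT result by the leaked 2x / 3x / 4x multipliers."""
    return {
        "worst case (2x)": ps5_class_fps * 2,
        "average (3x)": ps5_class_fps * 3,
        "best case (4x)": ps5_class_fps * 4,
    }

print(pro_rt_estimate(12))  # full path tracing baseline -> 24 / 36 / 48 fps
print(pro_rt_estimate(22))  # hybrid ray tracing baseline -> 44 / 66 / 88 fps
```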
 

Loxus

Member
Has anyone done comparisons to see how the figure scales with the quality of AI workloads such as DLSS upscaling, ray reconstruction, etc.? Seems like this is the successor to native pixel counting.

But a 4070 Ti level of AI throughput sounds awesome to me.
Here.

Stable Diffusion Benchmarks: 45 Nvidia, AMD, and Intel GPUs Compared
We've benchmarked Stable Diffusion, a popular AI image generator, on 45 of the latest Nvidia, AMD, and Intel GPUs to see how they stack up.
[Stable Diffusion benchmark charts]


Edit:
Just realized this test isn't using INT8.

Nvidia's Tensor cores clearly pack a punch, except as noted before, Stable Diffusion doesn't appear to leverage sparsity with the TensorRT code. (It doesn't use FP8 either, which could potentially double compute rates as well.)

Stable Diffusion is most likely only using FP16 for all cards tested.

Nvidia Tensor Cores' FP16-to-INT8 ratio is 1:2,
while AMD AI Accelerators' FP16-to-INT8 ratio is 1:1.

This is the FP16 comparison, which matches the Stable Diffusion benchmarks above.
[FP16 TOPS comparison chart]


So PS5 Pro's AI Accelerators using FP16 would put it just under the 4090, but with INT8 the difference would look more like the chart below. It still puts PS5 Pro AI performance around the 4070 Ti, since FP16 and INT8 scale 1:1 on AMD.
[INT8 TOPS comparison chart]
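As a sketch of that ratio argument, here's the FP16-to-INT8 conversion under the ratios described above (1:2 on Nvidia Tensor Cores, 1:1 on AMD's AI Accelerators); the 150 FP16 TOPS input is just an illustrative placeholder, not the spec of any real card:

```python
# INT8 throughput relative to FP16, per the ratios mentioned in the post above.
INT8_PER_FP16 = {"nvidia": 2, "amd": 1}

def int8_tops(fp16_tops: float, vendor: str) -> float:
    """Estimate peak INT8 TOPS from an FP16 figure for the given vendor."""
    return fp16_tops * INT8_PER_FP16[vendor]

# Hypothetical 150 FP16 TOPS on each vendor:
print(int8_tops(150, "amd"))     # 150 -> unchanged, FP16 and INT8 are 1:1
print(int8_tops(150, "nvidia"))  # 300 -> INT8 doubles over FP16
```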
 

Bitstream

Member
Here's hoping it's 4070 Ti performance; I'd love to see something like a 40 Hz path tracing mode released for Cyberpunk.
 

Panajev2001a

GAF's Pleasant Genius
So it's a lot of TOPS then... Sony is extremely fussy about the silicon area and feature set of the APU; if they're adding that amount of ML performance, it could be a good sign for how performant PSSR will be.
Until we can see it in a die layout, like the one below for AMD's Strix Point, I would not assume PS5 Pro has a full NPU. I think a full NPU might be what we get for PS6, and that the 300 TOPS number is achieved with the vector ALUs in the CUs, with maybe some small additional HW support (even if it were just additional registers and the hardwiring of instructions).

[AMD Strix Point die layout]

A full NPU rated for 40+ TOPS is massive; see the diagram above.
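For a rough sense of how ~300 TOPS could fall out of the CUs themselves rather than a discrete NPU, here's a back-of-the-envelope sketch; the CU count, clock, and INT8 ops-per-CU-per-clock below are illustrative assumptions, not confirmed PS5 Pro figures:

```python
def peak_int8_tops(cus: int, clock_ghz: float, int8_ops_per_cu_per_clock: int) -> float:
    """Peak INT8 TOPS = CUs * INT8 ops per CU per clock * clock (GHz) / 1000."""
    return cus * int8_ops_per_cu_per_clock * clock_ghz / 1000

# Hypothetical: 60 CUs at 2.2 GHz, 2048 INT8 ops per CU per clock -> ~270 TOPS.
# A slightly higher per-CU rate or clock is enough to reach the leaked ~300.
print(peak_int8_tops(60, 2.2, 2048))  # 270.336
```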
 

Gaiff

SBI’s Resident Gaslighter
Here's hoping it's 4070 Ti performance; I'd love to see something like a 40 Hz path tracing mode released for Cyberpunk.
That's extremely unlikely. The 4070 Ti is well over twice the performance of the regular PS5, so it's hard to believe Sony's internal document would have downplayed a 2x improvement to just 45%.

The RTX 4070, as Heisenberg so often said, sounds like a safe bet. I don't see how such a small GPU could get up to the 4070 Ti's tier of performance.
 

Bitstream

Member
Much of the discussion around the Pro's enhancements revolves around PSSR and AI upsampling, but I'm curious if they're able to get some sort of frame gen incorporated into the machine as well. (Non-FSR3, obviously.)
 
That's extremely unlikely. The 4070 Ti is well over twice the performance of the regular PS5, so it's hard to believe Sony's internal document would have downplayed a 2x improvement to just 45%.

The RTX 4070, as Heisenberg so often said, sounds like a safe bet. I don't see how such a small GPU could get up to the 4070 Ti's tier of performance.

I referenced the 4070 Ti but ONLY in relation to the INT8 TOPS numbers (320 vs. 300 for PS5 Pro).

It's impossible that the rasterization performance will be close to that GPU.

Even a vanilla 4070 would be amazing!
 

winjer

Gold Member
Has anyone done comparisons to see how the figure scales with quality of ai workloads such as dlss upscaling, ray reconstruction, etc.? Seems like this is the successor to native pixel counting.

But 4070ti level of AI throughput sounds awesome to me.

Some time ago, a user on Reddit used a 4090 and Nsight to check how much of the tensor cores DLSS was using.
Basically, 4K Balanced used around 20% of the tensor cores. But this was not a constant workload; it would spike every frame, as the upscaling was done.
While the rest of the frame was being rendered, there was very low, almost no tensor core usage.

One thing to consider is that the more pixels DLSS has to interpolate, the more work needs to be done.
So, for example, 4K Quality will use the tensor cores less per frame than 4K Performance.
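A quick sketch of that last point: the fewer pixels rendered internally, the larger the share of the 4K output the upscaler has to reconstruct. The per-axis scale factors below are the commonly cited ones for each preset, used here as assumptions:

```python
# Share of 4K output pixels reconstructed by the upscaler per DLSS preset,
# assuming the usual per-axis render scales (Quality 0.667, Balanced 0.58,
# Performance 0.50).
OUTPUT_PIXELS = 3840 * 2160
PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

for preset, scale in PRESETS.items():
    rendered = int(3840 * scale) * int(2160 * scale)
    share = 1 - rendered / OUTPUT_PIXELS
    print(f"{preset}: ~{share:.0%} of output pixels reconstructed")
# Quality ~56%, Balanced ~66%, Performance ~75% -> more tensor work per frame
```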
 

ChiefDada

Gold Member
That's extremely unlikely. The 4070 Ti is well over twice the performance of the regular PS5, so it's hard to believe Sony's internal document would have downplayed a 2x improvement to just 45%.

The RTX 4070, as Heisenberg so often said, sounds like a safe bet. I don't see how such a small GPU could get up to the 4070 Ti's tier of performance.

Well, it's possible you'll argue with the game used as an example, but at the end of the day it is a real game that exists, regardless of opinions on optimization.

Richard tested the 4070S vs. TLOU Part I at console settings and found a mere 36% uplift. If you believe the 45% raster uplift of the PS5 Pro, that would place it within 2% of a 4070 Ti. We should not be surprised to see many first-party games running on the Pro comfortably in 4070S-4070 Ti territory as the generation progresses.





[TLOU Part I console-settings benchmark chart]
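The arithmetic behind that "within 2%" claim, as a sketch; the ~8% gap between the 4070 Ti and 4070S used here is a rough review-index placeholder, not a number measured in TLOU:

```python
ps5 = 1.00               # PS5 as the baseline in TLOU Part I at console settings
rtx_4070s = ps5 * 1.36   # DF's measured ~36% uplift for the 4070S in this game
ps5_pro = ps5 * 1.45     # leaked 45% raster uplift for the Pro

print(ps5_pro / rtx_4070s)      # ~1.07 -> Pro ~7% ahead of the 4070S here
rtx_4070ti = rtx_4070s * 1.08   # assumed ~8% 4070 Ti advantage over the 4070S
print(ps5_pro / rtx_4070ti)     # ~0.99 -> within a couple percent of a 4070 Ti
```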
 

Gaiff

SBI’s Resident Gaslighter
Well, it's possible you'll argue with the game used as an example, but at the end of the day it is a real game that exists, regardless of opinions on optimization.

Richard tested the 4070S vs. TLOU Part I at console settings and found a mere 36% uplift. If you believe the 45% raster uplift of the PS5 Pro, that would place it within 2% of a 4070 Ti. We should not be surprised to see many first-party games running on the Pro comfortably in 4070S-4070 Ti territory as the generation progresses.
Even among first-party games, TLOU is an outlier. Across all the games tested, the 4070S was right around twice the performance of the PS5. What we care about is the norm, not exceptions. 99.9% of the PS5's library will be third-parties. This is more representative of what we'll get than first-party games, which are only a handful of games.

If you wanna argue that TLOU Part III will perform like a 4070S+, on the Pro, sure, why not? But I don't think this is representative of the general performance.
 

ChiefDada

Gold Member
Some time ago, a user on Reddit used a 4090 and Nsight to check how much of the tensor cores DLSS was using.
Basically, 4K Balanced used around 20% of the tensor cores. But this was not a constant workload; it would spike every frame, as the upscaling was done.
While the rest of the frame was being rendered, there was very low, almost no tensor core usage.

One thing to consider is that the more pixels DLSS has to interpolate, the more work needs to be done.
So, for example, 4K Quality will use the tensor cores less per frame than 4K Performance.

Very interesting. They must have a plan for putting such a relatively large amount of ML horsepower to use. We already know Sony wants developers not to chase native 4K, with the emphasis on the PS5 Pro graphics mode. I really want devs to stay at 1080p-1440p and go crazy with visuals/RT.
 
Their theories all fall apart when you mention 8K, because the spec they think it is doesn't support it.

That alone shows we don't know what the PS5 Pro is capable of; everyone is just randomly guessing.
After the teraflop thing between Xbox Series X and PS5, I'll never just look at specs and pretend to know what's going on. Most of GAF should probably do the same.

Hell, even Digital Foundry fell for that drama, and for what?
 

ChiefDada

Gold Member
Even among first-party games, TLOU is an outlier.

You can say that about Cyberpunk RT on AMD cards, Black Myth Wukong on PS5, Returnal on PS5, and many others. They are still valid data points.

Across all the games tested, the 4070S was right around twice the performance of the PS5. What we care about is the norm, not exceptions. 99.9% of the PS5's library will be third-parties. This is more representative of what we'll get than first-party games, which are only a handful of games.

Maybe, but I personally care much more about the first party games.
 

Bojji

Member
Even among first-party games, TLOU is an outlier. Across all the games tested, the 4070S was right around twice the performance of the PS5. What we care about is the norm, not exceptions. 99.9% of the PS5's library will be third-parties. This is more representative of what we'll get than first-party games, which are only a handful of games.

If you wanna argue that TLOU Part III will perform like a 4070S+, on the Pro, sure, why not? But I don't think this is representative of the general performance.

I think TLOU was shown not to utilize PC GPUs properly.

The 4070S and 4070 Ti are around 2× PS5 performance. Again, people are hoping that 45% will magically turn into 100%.

RT games will show bigger gains for sure, but no game is just RT; only part of the calculations will see that 2-4× uplift.

Not to mention UE5 games will use software Lumen, Nanite, and VSM. RT hardware won't help at all there, and the raster difference is 45%...
 

Gaiff

SBI’s Resident Gaslighter
You can say that about Cyberpunk RT on AMD cards, Black Myth Wukong on PS5, Returnal on PS5, and many others. They are still valid data points.
I think Wukong runs fine on PS5 though? Equivalent PC parts don't do much better. Cyberpunk is an outlier without a doubt. However, this massive difference is mainly for games with insane ray tracing, which are, again, a minority.
Maybe, but I personally care much more about the first party games.
Fair, but there really aren't that many of them.
 

bitbydeath

Member
Black Myth Wukong, Lords of the Fallen, and Immortals of Aveum all beg to differ.
I believe they were referring to the stutter issues that plague PC. Consoles don't have that. Those games you mentioned, e.g. Black Myth, don't hold a steady framerate on consoles. That's a fixable issue given the right hardware/optimisations.
 

Gaiff

SBI’s Resident Gaslighter
I believe they were referring to the stutter issues that plague PC. Consoles don't have that. Those games you mentioned, e.g. Black Myth, don't hold a steady framerate on consoles. That's a fixable issue given the right hardware/optimisations.
Consoles do have traversal stutters. Black Myth Wukong, thankfully, has no shader compilation stutters but does have traversal stutters. On PC, they can range from very bad to a minor annoyance depending on your hardware and just luck. On PS5, they seem to be bearable in general.
 

bitbydeath

Member
Consoles do have traversal stutters. Black Myth Wukong, thankfully, has no shader compilation stutters but does have traversal stutters. On PC, they can range from very bad to a minor annoyance depending on your hardware and just luck. On PS5, they seem to be bearable in general.
Are you talking about frame rates dipping when entering busier areas? Because that is different. If you're referring to something else, can you link it?
 

Gaiff

SBI’s Resident Gaslighter
Are you talking about frame rates dipping when entering busier areas? Because that is different. If you're referring to something else, can you link it?
Yes, traversal stutters. It's not necessarily busy areas. It's when you cross invisible boundaries and the area loads in the background. The game will have a slight hiccup.
 

bitbydeath

Member
Yes, traversal stutters. It's not necessarily busy areas. It's when you cross invisible boundaries and the area loads in the background. The game will have a slight hiccup.
I'm happy to be proven wrong (with a link), but the stutters have always been a PC exclusive and don't feature on consoles, because UE5 was built to take advantage of the unique console hardware/setup.

But please provide a link where it's mentioned. I know Black Myth doesn't have these issues, and Digital Foundry doesn't make mention of it either.
 

winjer

Gold Member
Very interesting. They must have a plan for putting such a relatively large amount of ML horsepower to use. We already know Sony wants developers not to chase native 4K, with the emphasis on the PS5 Pro graphics mode. I really want devs to stay at 1080p-1440p and go crazy with visuals/RT.

At this point we don't know exactly what the ML units in the Pro will be like.
But if they are similar to RDNA3, as in having WMMA extensions in the shaders, that means they are not fully dedicated tensor cores like Nvidia's.
So the shader units will be doing either shader calculations or ML calculations.
Those estimated TOPS are for all the shaders being used at the same time.
What will probably happen is that some shaders will be doing ML upscaling while others render the frame.
But also consider that the ML pass is done at the end of each frame, so it's not like there will be a group of shader units just doing ML calculations.
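A rough frame-budget sketch of what that sharing means in practice: if the ML upscale pass runs on the same shader ALUs at the end of the frame, its cost comes directly out of the frame time. The 2 ms figure below is a made-up placeholder, not a measured PSSR cost:

```python
# Hypothetical frame-time budget with an ML upscale pass sharing the shader ALUs.
frame_budget_ms = 1000 / 60   # ~16.7 ms per frame at 60 fps
ml_upscale_ms = 2.0           # assumed cost of the end-of-frame ML pass

render_budget_ms = frame_budget_ms - ml_upscale_ms
print(f"{render_budget_ms:.1f} ms of {frame_budget_ms:.1f} ms left for rendering")
# ~14.7 ms: the upscale shares the ALUs, so its time comes out of the render budget
```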
 

Bojji

Member
OK, same excuse for HFW?

The PS5 performs better than the 3070, and even using a 1.45× multiplier from that baseline we are above 4070S territory.

[Horizon Forbidden West benchmark chart]

Hmm? The PS5 version uses dynamic resolution with checkerboarding in the 40 fps mode, and the 30 fps mode is probably not 4K all the time either. Not to mention I doubt the PS5 settings are comparable to max settings.

But even if it is 4K all the time at max settings in the 30 fps mode, you still see the 4070 Ti at almost 60 fps. So again, 2× performance.
 

Gaiff

SBI’s Resident Gaslighter
I'm happy to be proven wrong (with a link), but the stutters have always been a PC exclusive and don't feature on consoles, because UE5 was built to take advantage of the unique console hardware/setup.

But please provide a link where it's mentioned. I know Black Myth doesn't have these issues, and Digital Foundry doesn't make mention of it either.
DF does mention traversal stutters in Black Myth Wukong. What you're thinking of are shader compilation stutters which the consoles indeed don't have.

We have the thread here:

https://www.neogaf.com/threads/digi...t-visuals-but-too-many-tech-problems.1674119/
 

bitbydeath

Member
DF does mention traversal stutters in Black Myth Wukong. What you're thinking of are shader compilation stutters which the consoles indeed don't have.

We have the thread here:

https://www.neogaf.com/threads/digi...t-visuals-but-too-many-tech-problems.1674119/
They also mention it’s fixable, so it is just framerate fluctuations, and not similar to the UE5 engine stutters found on PC.

The comment I originally quoted mentioned how new hardware wouldn't resolve UE5 stutters (the PC-exclusive feature).
 

Gaiff

SBI’s Resident Gaslighter
They also mention it’s fixable, so it is just framerate fluctuations, and not similar to the UE5 engine stutters found on PC.

The comment I originally quoted mentioned how new hardware wouldn’t resolve it.
Shader compilation stutters are fixable as well... they don't occur in Wukong. What happens in both the PC and PS5 versions of Wukong is traversal stutters. They're fixable on both platforms and, indeed, more powerful hardware won't just make the issue go away. The dev has to put in the time to do it.

You also get traversal stutters in Dead Space on consoles, but they're not as bad as on PC. PC also gets shader compilation stutters in addition to traversal stutters.

 

bitbydeath

Member
Shader compilation stutters are fixable as well... they don't occur in Wukong. What happens in both the PC and PS5 versions of Wukong is traversal stutters. They're fixable on both platforms and, indeed, more powerful hardware won't just make the issue go away. The dev has to put in the time to do it.

You also get traversal stutters in Dead Space on consoles, but they're not as bad as on PC. PC also gets shader compilation stutters in addition to traversal stutters.



You sure?
while shader compilation problems also rear their ugly head (on PC, not on PS5).
 