
PS5 Pro Specs Leak Is Real, Releasing Holiday 2024 (Insider Gaming)

Bojji

Member
But using PSSR, if they decide to run the game at a lower resolution and then use PSSR to upscale it, wouldn't that extra headroom allow higher frame rates?

Yes, framerate or graphics, but the problem is that many games already run at resolutions too low even for PSSR/DLSS to handle a 4K screen well. Some quick comparisons, without RT in play:

1. Game runs at 1440p on PS5 at 60FPS. Devs can keep that 1440p for Quality PSSR and have 45% more power left for better framerate or graphics quality (or both), OR they can drop to 1080p and have more than a 100% performance increase in total. This is the ideal scenario.

2. Game runs 1080p on PS5 in 60FPS, devs can realistically only switch from FSR to PSSR so they have 45% more performance left. This will be the most typical scenario based on resolutions we have right now.

3. Game runs at 1080p on PS5 targeting 60FPS but struggles to keep it, sitting around 45-50FPS. Most of the performance uplift will go into fixing performance. We could see games that look essentially identical in graphical features but have better performance and image quality (while keeping the same resolution).

4. Game runs in 900p or 720p in 60FPS on PS5, all that + power of the Pro will go into fixing image quality of the game, increasing resolution to something that PSSR can change into good 4K experience. No change in performance or graphics.

5. Game runs in 900p or 720p but struggles to reach 60FPS, I won't even comment on that and we have games like that on PS5 :messenger_tears_of_joy:

Even DLSS can't produce good IQ (in motion) from 720p to 4K so some games will look like shit anyway...
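If it helps, the arithmetic behind these scenarios can be sketched in a few lines. This is my own back-of-the-envelope model: it assumes GPU-bound frame cost scales linearly with pixel count and a flat 45% Pro uplift, both simplifications real games only approximate.

```python
# Back-of-the-envelope headroom model for the scenarios above.
# Assumptions (mine, not Sony's): frame cost scales linearly with
# pixel count, and the Pro gives a flat 45% raster uplift over PS5.

PIXELS = {
    "720p": 1280 * 720,
    "900p": 1600 * 900,
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
}

def total_headroom(base_res, target_res, uplift=1.45):
    """Combined performance multiplier from the hardware uplift plus
    lowering the internal resolution (e.g. as a PSSR input)."""
    return uplift * PIXELS[base_res] / PIXELS[target_res]

# Scenario 1: keep 1440p internal -> just the 45% uplift.
same_res = total_headroom("1440p", "1440p")   # 1.45x
# Scenario 1, option B: drop 1440p -> 1080p on top of the uplift.
dropped = total_headroom("1440p", "1080p")    # ~2.58x, i.e. >100% total
```

Plugging in the other scenarios (1080p kept, 900p/720p raised) is the same one-liner with different keys.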
 

IDWhite

Member
What SRAM and coherency hardware in the SoC?
Are you talking about the SSD controller and its cache?

[image]


The coherency engines can communicate with the CPU, GPU and all the coprocessors to improve cache utilisation.
 

twilo99

Gold Member
I beg to differ. I've been playing on PC since 1999, and to me there's been a steady decline in the PC gaming experience, for various reasons you probably know or understand.

Consoles are far from perfect but at least they are fluid and deliver the goods without too much hassle. Aka, gaming.

You and I have completely opposing experiences with PC gaming in the last 25 years.
 
I was saying this from day one, yet people are going 4070... 4070S... 4080 next week?

A 7700XT will be comparable to a 4070 in rasterization but way off on RT; the PS5 Pro, however, will have better RT capabilities than a 7700XT.

Yes, framerate or graphics, but the problem is that many games already run at resolutions too low even for PSSR/DLSS to handle a 4K screen well. Some quick comparisons, without RT in play:

1. Game runs at 1440p on PS5 at 60FPS. Devs can keep that 1440p for Quality PSSR and have 45% more power left for better framerate or graphics quality (or both), OR they can drop to 1080p and have more than a 100% performance increase in total. This is the ideal scenario.

2. Game runs 1080p on PS5 in 60FPS, devs can realistically only switch from FSR to PSSR so they have 45% more performance left. This will be the most typical scenario based on resolutions we have right now.

3. Game runs at 1080p on PS5 targeting 60FPS but struggles to keep it, sitting around 45-50FPS. Most of the performance uplift will go into fixing performance. We could see games that look essentially identical in graphical features but have better performance and image quality (while keeping the same resolution).

4. Game runs in 900p or 720p in 60FPS on PS5, all that + power of the Pro will go into fixing image quality of the game, increasing resolution to something that PSSR can change into good 4K experience. No change in performance or graphics.

5. Game runs in 900p or 720p but struggles to reach 60FPS, I won't even comment on that and we have games like that on PS5 :messenger_tears_of_joy:

Even DLSS can't produce good IQ (in motion) from 720p to 4K so some games will look like shit anyway...

This is just a baseline example. 45% more headroom is massive. In reality, if a game runs at 720p and struggles to reach 60fps on PS5, then it's a pile of shit and wouldn't run well on a 4070 either. It's all relative, right? Some games just aren't saveable no matter how much brute force you throw at them.
 

Bojji

Member
A 7700XT will be comparable to a 4070 in rasterization but way off on RT; the PS5 Pro, however, will have better RT capabilities than a 7700XT.



This is just a baseline example. 45% more headroom is massive. In reality, if a game runs at 720p and struggles to reach 60fps on PS5, then it's a pile of shit and wouldn't run well on a 4070 either. It's all relative, right? Some games just aren't saveable no matter how much brute force you throw at them.

4070 is still above that by more than 10% and 3080 is almost 20% better in raster.

[images: relative performance charts]


45% is big and also... not big. It will take a 30fps game to ~45FPS, and a game that runs at 45FPS (like Wukong hahaha) to a stable 60. A nice improvement, but nothing earth-shattering.

When it comes to resolution, 45% won't even get you from 1080p to 1440p...
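For what it's worth, the fps claims here are plain multiplication. A sketch assuming fps scales linearly with a flat 45% GPU uplift, which is a simplification:

```python
# The fps claims above, as plain multiplication. Assumes fps scales
# linearly with a flat 45% GPU uplift (a simplification).
UPLIFT = 1.45

fps_from_30 = 30 * UPLIFT   # 43.5 -> a 30fps game lands around 45
fps_from_45 = 45 * UPLIFT   # 65.25 -> enough margin for a locked 60
```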
 
4070 is still above that by more than 10% and 3080 is almost 20% better in raster.

[images: relative performance charts]


45% is big and also... not big. It will take a 30fps game to ~45FPS, and a game that runs at 45FPS (like Wukong hahaha) to a stable 60. A nice improvement, but nothing earth-shattering.

When it comes to resolution, 45% won't even get you from 1080p to 1440p...

Firstly, that's relative performance assuming PC DX11/12. PS5 vs 2080Ti shows a 25% relative performance advantage to the 2080Ti, but I can play Fortnite at a very stable 120fps on PS5 while getting around 70-80fps on a 2080Ti at the same native resolution. PS5/Pro have other advantages over PCs.

There are PS5 games that have offered similar performance to a 2080Ti at the same settings, which defies expectations of the PS5; not all engines are created equal.

Secondly, you can't have it both ways: either 45% is huge or 10% is nothing. It will be comparable to a 4070 and will surpass it in some titles.

Wukong will do 120fps on PS5 Pro, PSSR-upscaled from 1080p with the same settings and better IQ, thanks to the 45% uplift in rasterization and the improved 4x RT, assuming they patch it.
 

Bojji

Member
Firstly, that's relative performance assuming PC DX11/12. PS5 vs 2080Ti shows a 25% relative performance advantage to the 2080Ti, but I can play Fortnite at a very stable 120fps on PS5 while getting around 70-80fps on a 2080Ti at the same native resolution. PS5/Pro have other advantages over PCs.

There are PS5 games that have offered similar performance to a 2080Ti at the same settings, which defies expectations of the PS5; not all engines are created equal.

Secondly, you can't have it both ways: either 45% is huge or 10% is nothing. It will be comparable to a 4070 and will surpass it in some titles.

Wukong will do 120fps on PS5 Pro, PSSR-upscaled from 1080p with the same settings and better IQ, thanks to the 45% uplift in rasterization and the improved 4x RT, assuming they patch it.

In 2020, when the PS5/XSX released, both consoles were compared to a 2070S - 2080. Nothing has changed since then other than Nvidia and developers no longer giving a fuck about the Turing architecture. The raw power of the PS5 is still around that level. The 2080Ti is comfortably above it, and your anecdotal evidence is just that: anecdotal. Go watch some DF videos.

Wukong runs at 45FPS at 1080p; you need that 45% just to reach a stable 60 at 1080p. It also uses software Lumen, so 45% is all you get from the raster improvement. If they patch it, it will be heavier anyway; hardware Lumen performs worse than the software version even on Ada cards with the best RT hardware available.

120fps only with frame generation...
 

Gaiff

SBI’s Resident Gaslighter
Firstly, that's relative performance assuming PC DX11/12. PS5 vs 2080Ti shows a 25% relative performance advantage to the 2080Ti, but I can play Fortnite at a very stable 120fps on PS5 while getting around 70-80fps on a 2080Ti at the same native resolution. PS5/Pro have other advantages over PCs.
PS5 outperforming the 2080 Ti by 50%? I don't think so.
There are PS5 games that have offered similar performance to a 2080Ti at the same settings, which defies expectations of the PS5; not all engines are created equal.
There is one game: The Last of Us Part I.
 
In 2020, when the PS5/XSX released, both consoles were compared to a 2070S - 2080. Nothing has changed since then other than Nvidia and developers no longer giving a fuck about the Turing architecture. The raw power of the PS5 is still around that level. The 2080Ti is comfortably above it, and your anecdotal evidence is just that: anecdotal. Go watch some DF videos.

Wukong runs at 45FPS at 1080p; you need that 45% just to reach a stable 60 at 1080p. It also uses software Lumen, so 45% is all you get from the raster improvement. If they patch it, it will be heavier anyway; hardware Lumen performs worse than the software version even on Ada cards with the best RT hardware available.

120fps only with frame generation...

I stated it will run at 120fps if they patch it to include PSSR as an upscaler, running at the same settings, not improved settings.

I don't need to watch Digital Foundry videos. I have two rigs here with a 2080Ti and a 4090, as well as a PS5, so I can run my own comparisons. The PS5 performs at a 2080Ti level in some games because of the other benefits of the PS5's architecture and APIs vs a PC. Of course nothing improved at a hardware level; nobody claimed that. This will be my last response to you, because you don't actually know anything; you're just parroting things Digital Foundry have said and quoting 45% from developer documentation like it's going to be a straight 45% uplift. Like I said, it's a baseline: most games will offer more than a 45% improvement overall on PS5 Pro vs PS5, some will offer less, depending on the engine and the developers.
 
PS5 outperforming the 2080 Ti by 50%? I don't think so.

There is one game: The Last of Us Part I.

Where did I say the PS5 outperforms a 2080Ti by 50%? I used one example, Fortnite, where the 2080Ti struggles at the same settings as the PS5 and gets outperformed by 30-40%, unless I drop to DX11 on the 2080Ti, where it performs within 5% of the PS5.
 

Gaiff

SBI’s Resident Gaslighter
Where did I say the PS5 outperforms a 2080Ti by 50%? I used one example, Fortnite, where the 2080Ti struggles at the same settings as the PS5 and gets outperformed by 30-40%, unless I drop to DX11 on the 2080Ti, where it performs within 5% of the PS5.
70-80fps to 120 is a 50% increase using 80fps as the baseline. This never happens.

"I don't need DF." You sure as shit do because your magic PS5 is performing almost like a 3090 and we're all dying to see it in action.
 

Bojji

Member
Where did I say the PS5 outperforms a 2080Ti by 50%? I used one example, Fortnite, where the 2080Ti struggles at the same settings as the PS5 and gets outperformed by 30-40%, unless I drop to DX11 on the 2080Ti, where it performs within 5% of the PS5.

You forget that Fortnite uses dynamic resolution on PS5 so good luck comparing this title.
 

Gaiff

SBI’s Resident Gaslighter
120fps vs 80fps is a 33.33% uplift in FPS; the PS5 performs nothing like a 3090 and I haven't claimed that anywhere.
This is not how basic mathematics work.

The baseline is 80fps. You add 50% to 80fps, what do you get? 120. 80 x 1.5 = 120. The PS5 performs 50% better than the 2080 Ti in this case or the 2080 Ti performs 33% worse than the PS5. Either way, that's 3090 territory and it's bollocks.
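The percentage asymmetry here trips people up, so here it is spelled out (plain arithmetic, nothing console-specific):

```python
def pct_faster(a, b):
    """How much faster a is than b, as a percentage of the baseline b."""
    return (a / b - 1) * 100

# PS5 at 120fps vs a 2080 Ti at 80fps: the direction you divide matters.
ps5_vs_ti = pct_faster(120, 80)   # +50.0 -> the PS5 is 50% faster
ti_vs_ps5 = pct_faster(80, 120)   # -33.3 -> the Ti is 33% slower
```

Both statements describe the same gap; they just use different baselines.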
 
Stop comparing the 7700XT to the PS5 Pro in such a rudimentary manner. It has a much better memory setup.

No doubt about that: the memory setup is the same as the 7900 GRE

256 bit/16 GB/18 Gbps

13.7 GB is available for games, so still more than 12 GB.
 

Bojji

Member
Yet here we are:




Stop comparing the 7700XT to the PS5 Pro in such a rudimentary manner. It has a much better memory setup.


It has faster memory for sure, but you forget about the super-fast L3 memory that both the 6800 and 7700XT have while the PS5 Pro doesn't.

The NMS update probably talks about the upper bounds of its dynamic resolution; we will see the actual stats in game.
 

ChiefDada

Gold Member
It has faster memory for sure, but you forget about the super-fast L3 memory that both the 6800 and 7700XT have while the PS5 Pro doesn't.

I didn't forget anything. IC means jack shit at the output resolution levels PS5 Pro will be rendering.

The NMS update probably talks about the upper bounds of its dynamic resolution; we will see the actual stats in game.

No, lower bounds.
 

Aaravos

Neo Member
Is the machine learning for the Pro AMD tech or Sony's own custom solution? Kepler was suggesting yesterday that it's likely AMD, but the documentation Sony sent out said fully custom, so I don't know.
 

Bojji

Member
It talks about lower bounds of dynamic res
[image]

I didn't forget anything. IC means jack shit at the output resolution levels PS5 Pro will be rendering.



No, lower bounds.

If Infinity Cache means jack shit at ~1080p, then the extra memory bandwidth of the PS5 Pro (vs. the 7700XT) means that as well...

The game probably doesn't have to drop res as low thanks to that INCREASED MEMORY BANDWIDTH of the PS5 Pro vs. the PS5.

1080p - 2 073 600 pixels
1440p - 3 686 400 pixels

1440p has ~78% more pixels to push.
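The pixel counts above come straight from the resolutions (assuming render cost scales with pixel count, as the post does):

```python
# The pixel counts behind the "78% more" figure, assuming render
# cost scales with pixel count.
p_1080 = 1920 * 1080          # 2,073,600
p_1440 = 2560 * 1440          # 3,686,400
extra = p_1440 / p_1080 - 1   # ~0.78 -> ~78% more pixels
```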
 

Zathalus

Member
Doesn't Fortnite drop to around 720p (upscaled with TSR to 1440p) on the PS5 for the 120fps mode? The 60fps mode drops to 864p, so the 120fps mode might be even lower.

You telling me a 2080ti won’t get over 200fps with those settings?
 
Doesn't Fortnite drop to around 720p (upscaled with TSR to 1440p) on the PS5 for the 120fps mode? The 60fps mode drops to 864p, so the 120fps mode might be even lower.

You telling me a 2080ti won’t get over 200fps with those settings?

I wasn't aware of the PS5 using TSR and dynamic resolution; the IQ is very, very good in that case on PS5. I haven't checked Fortnite on the 2080Ti for a while, but I'd expect it to easily hit 240fps in DX11. I don't think it would get anywhere close to 200fps with DX12 and Nanite/Lumen enabled, even with TSR and dynamic res, based on past performance.
 

Gaiff

SBI’s Resident Gaslighter
I wasn't aware of the PS5 using TSR and dynamic resolution; the IQ is very, very good in that case on PS5. I haven't checked Fortnite on the 2080Ti for a while, but I'd expect it to easily hit 240fps in DX11. I don't think it would get anywhere close to 200fps with DX12 and Nanite/Lumen enabled, even with TSR and dynamic res, based on past performance.
Fortnite on the consoles uses dynamic resolution with TSR with a 4K target, and the range is 864p-1836p. The average, according to Epic, is 55% of 4K, or 1188p. DF's analysis said it's comparable to High settings on PC. That's the 60fps mode. The 120fps mode also gets rid of software virtual shadow maps, Lumen and Nanite.
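A note on Epic's "55% of 4K" figure: since 55% of 2160 lines is exactly 1188p, it reads as a per-axis scale rather than a pixel share, which means the average frame has only about 30% of 4K's pixels. That's my interpretation, but it's worth keeping in mind when comparing:

```python
# Interpreting "average 55% of 4K" as a per-axis scale factor
# (my reading, since 55% of 2160 lines is exactly 1188p).
scale = 0.55
avg_height = 2160 * scale    # 1188 -> matches Epic's "1188p"
pixel_share = scale ** 2     # ~0.30 -> only ~30% of 4K's pixels
```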

I can't find something similar for the 2080 Ti because it's too old, but here is a 4060 running it at 1080p/High/Nanite and Lumen off and no dynamic resolution. He gets 200fps+.



Shortly after, he also runs it at 1440p with the same settings and gets over 130fps.

No way does a 2080 Ti, at the PS5's 120fps-mode settings and resolution, get a paltry 70-80fps.
 

Bojji

Member
Me coming onto this thread seeing we're arguing about the PS5 Pro's PC counterpart for the millionth time:


[image]

What do you expect when they've been packing near-retail PC hardware into consoles since 2013?

With IBM, PowerVR, or Toshiba tech... this discussion would be much more interesting.
 

Aaravos

Neo Member
Kinda have to. In the absence of an actual PS5 Pro, all we can do is use PC parts in a similar ballpark to get an idea.
I think it adds to the discussion and it can be informative, but some posters use the PC comparison to belittle the Pro. Not saying you; I think you're always fair when it comes to console and PC chat.
 

Zathalus

Member
PCMR is more interested in the PS5 Pro than people that will actually buy it....

LOL

It tells you everything you need to know
People will keep jumping in and correcting people if they keep posting things about PC that are blatantly wrong.

As for those that will actually buy it, well I’m one of them. Just because I’m PC first doesn’t mean I don’t extensively play PS5 as well.
 

winjer

Gold Member
Consoles use a lot of PC hardware, so of course comparisons are made.
It's just logical.
And in a thread about the hardware specs, people will talk about hardware.
 

PaintTinJr

Member
This is from the same article.

Now, let's talk about that 123TFLOP FP16 number that AMD claims. While it is technically correct, there are significant limitations behind it. Looking at the RDNA3 ISA documentation, there is only one VOPD instruction that can dual-issue packed FP16 math, along with another that can work with packed BF16 numbers.

[image]
These are the 2 VOPD instructions that can use packed math.

This means that the headline 123TF FP16 number will only be seen in very limited scenarios, mainly in AI and ML workloads although gaming has started to use FP16 more often.
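For context, that headline number falls out of simple arithmetic. A sketch reproducing AMD's 123TF FP16 claim using commonly cited 7900 XTX figures (96 CUs, 64 lanes per CU, ~2.5 GHz boost; the per-lane factors are 2 ops per FMA, 2x VOPD dual issue, 2x packed FP16 - the assumed inputs are mine):

```python
# Reconstructing the 123 TFLOP FP16 headline for the 7900 XTX.
# Assumed figures: 96 CUs, 64 lanes per CU, ~2.5 GHz boost clock.
cus, lanes, clock_hz = 96, 64, 2.5e9
fma = 2           # multiply + add count as 2 ops
dual_issue = 2    # RDNA3 VOPD dual issue
packed_fp16 = 2   # two FP16 values packed per 32-bit lane

fp32_tflops = cus * lanes * clock_hz * fma * dual_issue / 1e12  # ~61.4
fp16_tflops = fp32_tflops * packed_fp16                         # ~122.9
```

Which is exactly why the 123TF figure evaporates whenever the dual-issue and packed-FP16 conditions aren't met.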



In the article, it states Dual-Issue can work with FP16 & BF16 instructions.
Below we can see WMMA utilizing FP16, BF16, & Int8 instructions.
[image]


From my understanding, it seems the SIMD32s utilize the AI Accelerators for higher throughput when doing matrix operations, but the AI Accelerators aren't dedicated in the same way as CDNA2 Matrix Cores. Maybe this changes in RDNA4.

You can read more on RDNA4 here.
Examining AMD’s RDNA 4 Changes in LLVM

Yeah, I already discussed that quite a few messages back and showed exactly how those two instructions expand, and then in my next message in the thread I showed how that aligns with the GoW Ragnarok ML AI implementation.

Thanks for the link, that was a great read. Interestingly, assuming I understood the article (especially the relevant paragraph halfway down about V_DUAL_DOT2ACC_F32_F16) and how the AMD ISA denotes the two instructions:

V_DUAL_DOT2ACC_F32_F16
and
V_DUAL_DOT2ACC_F32_BF16

Going by what the article said, I'm taking the first instruction as a dual-issue instruction, where presumably two vectors v1 [x, y, z, w] and v2 [s, t, q, r] carry eight FP16 components in total (packed in pairs into FP32 registers), so:

1st instruction: [x,y] dot [s,t] = RPM (x * s) + (y * t) = A
2nd instruction: [z,w] dot [q,r] = RPM (z * q) + (w * r) = B
Accumulate: A + B

And the same for the BF16 datatype, which AFAIK is a neural-network-oriented FP format (popularised by Google) that trades 3 of FP16's mantissa bits for 3 more exponent bits, giving it the same dynamic range as FP32. That makes it ideal for AI/ML inference that needs FP32-like range while only using 16 bits of data.

Each inference, depending on the variable count, is effectively broken down into a multitude of those dual-issue dot2 accumulations, as expanded below (assuming I haven't misinterpreted anything). For the purposes of PSSR, that would mean that between RPM and dual issue you effectively get 4 times the throughput of FP32, with FP32-like dynamic range if using BF16:

i = x * s + y * t + z * q + w * r ........

As for WMMA, I might be misunderstanding, but I thought it was just a hardware scheduler that could spot opportunities to improve throughput by identifying scalar ops inside vector ops and eliminating redundant work, like multiplications by zero, which is common in a lot of matrix maths such as ML regression training and transforms. For inference that wouldn't be the case, since all the vector components are needed and multiplications by zero are rare (AFAIK) when inferencing a pixel prediction in PSSR. So ignoring the WMMA scheduler and hand-writing inferencing to use either of those two dual-issue packed BF16 or FP16 instructions would still give far better performance and precision IMO, especially as BF16 would yield the quality of eight FP32 vector components in 4 multiplications and 4 accumulations per instruction.
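To make the dot2-accumulate expansion above concrete, here is a scalar emulation of how I read the instruction's semantics. This is an illustrative sketch, not vendor code; the bfloat16 helper truncates the mantissa rather than rounding, which real hardware may not do:

```python
import struct

def to_bf16(x):
    """Reduce a float to bfloat16 precision by truncating the FP32
    mantissa to 7 bits (illustrative; hardware may round-to-nearest)."""
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    return struct.unpack("<f", struct.pack("<I", bits & 0xFFFF0000))[0]

def dual_dot2acc(v1, v2, acc=0.0):
    """Emulate one dual-issued pair of dot2-accumulate ops: two packed
    element pairs per 'instruction', then a single accumulate."""
    (x, y, z, w), (s, t, q, r) = v1, v2
    a = x * s + y * t   # 1st instruction: [x,y] dot [s,t]
    b = z * q + w * r   # 2nd instruction (dual-issued): [z,w] dot [q,r]
    return acc + a + b  # accumulate A + B

result = dual_dot2acc((1.0, 2.0, 3.0, 4.0), (5.0, 6.0, 7.0, 8.0))
# 1*5 + 2*6 + 3*7 + 4*8 = 70
```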
 

ChiefDada

Gold Member
Me coming onto this thread seeing we're arguing about the PS5 Pro's PC counterpart for the millionth time:


[image]

Honestly, the only reason I'm on here going back and forth with people is love, and being cognizant of the fact that in another week or so this thread will effectively die. I'm sad about it.




Anyways, back to arguing...


If infinity cache means jack shit for ~1080p then more memory bandwidth of PS5 Pro (vs. 7700XT) means that as well...

Lol 1080p? Are you crazy, my friend!? Try 4K output, always. Even when upscaling with PSSR it will always be a 4K output, aka a 4K-size framebuffer and 4K memory requirements.
 

Aaravos

Neo Member
Honestly, the only reason I'm on here going back and forth with people is love, and being cognizant of the fact that in another week or so this thread will effectively die. I'm sad about it.




Anyways, back to arguing...




Lol 1080p? Are you crazy, my friend!? Try 4K output, always. Even when upscaling with PSSR it will always be a 4K output, aka a 4K-size framebuffer and 4K memory requirements.
Hi, does this thread definitely close when the PS5 Pro launches? That would be a shame; there's a lot of good discussion here.
 

Mr.Phoenix

Member
Ok fine I will post some charts...

Using the base PS5 as the starting point, it's easy to see what we should expect from the Pro.

PS5 | PS5 Pro
Fidelity Mode: 1440p-4K @ 30fps | 1440p + PSSR > 4K @ 40fps+, or ~1296p + PSSR > 4K @ 60fps
Performance Mode: 900-1080p @ 60fps | 1080p + PSSR > 4K @ 60fps+ (maybe as high as 80fps), or 1080p + PSSR > 1440p + RT @ 60fps

4070 is still above that by more than 10% and 3080 is almost 20% better in raster.

[images: relative performance charts]


45% is big and also... not big. It will take a 30fps game to ~45FPS, and a game that runs at 45FPS (like Wukong hahaha) to a stable 60. A nice improvement, but nothing earth-shattering.

When it comes to resolution, 45% won't even get you from 1080p to 1440p...
Truth be told, most of this stuff is moot. I think a better way to look at it... is something like this instead.

Let's take a game like Star Wars, for instance. Here it is running at max settings at 1440p:
[image: 1440p benchmark chart]


Notice the 7700XT is at ~40fps? That's about the minimum we should expect from the PS5pro. The difference being that in a game like this it may tweak its IQ presets a bit (so not high/ultra across the board), maybe even drop the rez from 1440p to 1296p, then add RT better than whatever is used in the base PS5 and use PSSR to get it to 4K. That would be the PS5pro fidelity mode. In a situation like this, depending on how the devs want to push it, you could even get a 60fps fidelity mode in such a game.

Now let's look at the same game at 1080p:
[image: 1080p benchmark chart]


Notice the game is now running at 64fps? Again, the PS5pro will have tweaked IQ settings, which would typically mean it can get closer to 75fps+ in this mode. But it's likely that they would then go on to add RT in this mode and have the game average 60fps.

For me... I don't bother with all that relative performance stuff, as there can be a lot of little differences between GPUs that we can't account for. I just look at game performance... I believe the PS5pro will be, at the very least, as good as the 7700XT. So I use that as my baseline, look at how it performs at 1440p and 1080p... and extrapolate from there. That's a $450 GPU. And mind you, the PS5pro has more RAM, more bandwidth, better RT and PSSR.
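That extrapolation method boils down to a one-line helper. Entirely speculative: the 7700XT-as-floor assumption is the poster's, and the ~15% gain from console-tuned presets is an illustrative guess, not a measured figure:

```python
def estimate_pro_fps(benchmark_fps, settings_gain=1.15):
    """Extrapolate PS5 Pro fps from a 7700XT benchmark result,
    treating the 7700XT as the floor and assuming console-tuned
    presets buy ~15% more (an illustrative guess, not measured)."""
    return benchmark_fps * settings_gain

fidelity = estimate_pro_fps(40)      # from the ~40fps 1440p chart
performance = estimate_pro_fps(64)   # from the ~64fps 1080p chart
```

Any RT added on top would then eat back into those estimates, which is why the 1080p mode lands around a 60fps average in this reading.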
 