PS5 Die Shot has been revealed

From the most recent DF comparisons, both consoles look to be performing better than an equivalent (MHz and CU count factored in) RDNA 2 card on PC, so I'm not sure I would get so invested in "X chip lacks X part" or "X part is version X"... It's more important how the chip functions as a whole. Without further info from the architects/designers we're still pissing into the wind.

Both consoles offer great value if you can get them at RRP... and so far comparable performance.
 
I don't know why Locuza uses the word "downgrade" for it; this kind of tweaking (removing some redundant parts) is common in the console space, and it's more than likely the same or similar was done for XSX's CPU. Also, I think he was a bit too quick to claim half-width (erroneously, it seems) for the FPU.
Ah, here we go with the downgrade debate. Again, in Road to the PS5 Cerny spent a lot of time explaining that a console is not like a PC: he designed the Navi GPU, like the whole SoC, for his own purposes, and he admitted he didn't know what would have been better for the PC scenario. He also said, more or less, that some hardware parts important on PC are completely pointless on a console, and, surprise, he omits a CPU part. Calling it a downgrade is sensationalist and very stupid; this guy should think for a second about whether the SoC cost would have gone up with such a part included before invoking a downgrade. IMO.
 
When I did this I separated out the GPU as two components: CUs (incl. RT) and other. There is a big difference in 'other' between the two. I also split out the RAM bus.
Can you post your estimate / link to post? Curious to see what you found.
 
I don't get what is going on here. Can we have coloured headlines in the future? Dark blue if the content makes the ponies' panties wet and green for the Xboys?
 
Isn't 10 mm² too small a difference for the I/O, knowing all the extra coprocessors and SRAM that it has? PS5's die is confirmed to be 308 mm² as far as I know; maybe a fair part of these 8 mm² can be added to the I/O size?
Do we have a list of the extra coprocessors? It's probably more or less the same number as the XSX in the end.
 
Isn't 10 mm² too small a difference for the I/O, knowing all the extra coprocessors and SRAM that it has? PS5's die is confirmed to be 308 mm² as far as I know; maybe a fair part of these 8 mm² can be added to the I/O size?
Ya, I was just labeling that 40-50 mm² as I/O and other stuff because I don't know all the little stuff that is on an SoC.

For example, XSX has a decent-sized area between its CPU clusters labeled multimedia. Idk what exactly that is, so I just put it into the I/O + other area.

The PS5 definitely has more dedicated die space for I/O hardware. If it didn't, the die would be even smaller.
 
Ah, here we go with the downgrade debate. Again, in Road to the PS5 Cerny spent a lot of time explaining that a console is not like a PC: he designed the Navi GPU, like the whole SoC, for his own purposes, and he admitted he didn't know what would have been better for the PC scenario. He also said, more or less, that some hardware parts important on PC are completely pointless on a console, and, surprise, he omits a CPU part. Calling it a downgrade is sensationalist and very stupid; this guy should think for a second about whether the SoC cost would have gone up with such a part included before invoking a downgrade. IMO.
Locuza called out by assurdum, here things get serious hahah
 
From the most recent DF comparisons, both consoles look to be performing better than an equivalent (MHz and CU count factored in) RDNA 2 card on PC, so I'm not sure I would get so invested in "X chip lacks X part" or "X part is version X"... It's more important how the chip functions as a whole. Without further info from the architects/designers we're still pissing into the wind.

Both consoles offer great value if you can get them at RRP... and so far comparable performance.
Well said :).
 
So what was this "key difference between the machines" that Matt Hargett was talking about? He put pretty heavy emphasis on CPU latency reduction on Twitter, and we now know that it is not a unified L3 cache. I remain puzzled.
 
As I said on the first page, it doesn't matter considering how games are performing. Just curious how another thread about Sony became about Xbox.

And IIRC neither offered SoC shots for last gen.
That was my point: nothing to be inferred from Sony not offering an SoC shot, really, as they stopped that kind of deep dive into low-level tech with the PS3.
They went to the same level of detail with the PS4 (when they apparently knew they had the superior SoC) as they did with the PS5 SoC, so I am not sure what the point of raising it would be, as if they had something to hide, the way you were winking and nudging there.

Still, sure, some people have war in their hearts, and either Sony fanboys or Xbox fans would have brought up the other HW for comparison, with some waiting for that with popcorn 🍿 on their laps :)
 
I have not run the math, but AVX makes your power and thermal values go through the roof. The question is whether, in an environment such as a console with hard power and thermal limits, there is a major benefit from the AVX instruction set. Is your CPU primarily bound by power and thermal limitations? Then it does not add much.

On the PC, where you can play with both the thermal solution and the power supply, the situation is very different. I can see how AVX adds very little in a console environment, actually.
AVX-512 is a power virus according to many, most prominently Linus Torvalds: https://www.zdnet.com/article/linus-torvalds-i-hope-intels-avx-512-dies-a-painful-death/
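Purely as an illustration (a minimal sketch, not code from either console): this is the kind of 256-bit FMA loop that keeps a full-width AVX2 unit busy on every cycle, which is exactly why sustained AVX code drives power and heat up so sharply.

```c
#include <immintrin.h>
#include <stddef.h>

// Hypothetical example: y[i] += a * x[i] (SAXPY) with 256-bit AVX2/FMA.
// Each _mm256_fmadd_ps does 8 single-precision multiply-adds at once;
// a steady stream of these keeps the entire 256-bit FPU switching every
// cycle, which is the power-dense case the posts above describe.
void saxpy_avx2(float *y, const float *x, float a, size_t n) {
    __m256 va = _mm256_set1_ps(a);           // broadcast a into all 8 lanes
    size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 vx = _mm256_loadu_ps(x + i);  // load 8 floats (unaligned ok)
        __m256 vy = _mm256_loadu_ps(y + i);
        vy = _mm256_fmadd_ps(va, vx, vy);    // 8 fused multiply-adds
        _mm256_storeu_ps(y + i, vy);
    }
    for (; i < n; ++i)                       // scalar tail for leftovers
        y[i] += a * x[i];
}
```

The same loop written with scalar math exercises an eighth of the FPU per instruction, which is why a chip can sustain higher clocks on scalar-heavy code than on a wall of 256-bit FMAs.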
 
That was my point: nothing to be inferred from Sony not offering an SoC shot, really, as they stopped that kind of deep dive into low-level tech with the PS3.
They went to the same level of detail with the PS4 (when they apparently knew they had the superior SoC) as they did with the PS5 SoC, so I am not sure what the point of raising it would be, as if they had something to hide, the way you were winking and nudging there.

Still, sure, some people have war in their hearts, and either Sony fanboys or Xbox fans would have brought up the other HW for comparison, with some waiting for that with popcorn 🍿 on their laps :)


Where did I "wink and nudge" on that post? Another disingenuous accusation.

Funny it took you this long to reply while trying to encourage someone else to antagonize me. That's why people called you out as a disingenuous poster who likes to play both sides just to add fuel to the fire.

Next time don't hide behind others trying to encourage flamebait. It makes you look bad, regardless of your bad reputation here already.
 
Uhm, and what about it being an RDNA 2-based GPU with all RDNA 2 features - does that make sense performance-wise? lol

because I know what you, him, and etho are trying to allude to
 
That was my point: nothing to be inferred from Sony not offering an SoC shot, really, as they stopped that kind of deep dive into low-level tech with the PS3.
They went to the same level of detail with the PS4 (when they apparently knew they had the superior SoC) as they did with the PS5 SoC, so I am not sure what the point of raising it would be, as if they had something to hide, the way you were winking and nudging there.

Still, sure, some people have war in their hearts, and either Sony fanboys or Xbox fans would have brought up the other HW for comparison, with some waiting for that with popcorn 🍿 on their laps :)
Honestly, don't bother with him; at best he won't read or understand.
 
I think people keep going crazy over the mention of RDNA3.

Having a tweak in PS5/XSX that ends up also being in RDNA3 really wouldn't be surprising. Thinking that tweak is gonna boost performance 2x is crazy, though.
Fanboys read one thing, take it out of context, and then run with it. Longdi just did that, claiming PS5 lacks AVX-256 as fact. He didn't bother reading what others had written and just assumed it from that one line.
No one has claimed PS5 has RDNA3 features. But custom features developed with AMD might make it into a future AMD GPU spec.
 
Fanboys read one thing, take it out of context, and then run with it. Longdi just did that, claiming PS5 lacks AVX-256 as fact. He didn't bother reading what others had written and just assumed it from that one line.
No one has claimed PS5 has RDNA3 features. But custom features developed with AMD might make it into a future AMD GPU spec.
Hey hey hey... stop making sense right now. This is NeoGAF; we argue about out-of-context sentences and speculation as truth here.
How-dare-you.gif

But yeah, didn't they always say that they co-created features with AMD that may or may not be in future cards?
 
Maybe I'm out of the loop and people really are claiming PS5 is using full-on RDNA 3 or something, but I just assume that when people talk about RDNA 3 they mean some small customizations will end up integrated into RDNA 3. Like the cache scrubbers or something.
It's like how the PS4 Pro had some Vega tech before Vega launched. I'm sure some people do know this and are being disingenuous.
 
Still, sure, some people have war in their hearts, and either Sony fanboys or Xbox fans would have brought up the other HW for comparison, with some waiting for that with popcorn 🍿 on their laps :)

The sad thing is that the design choices made by both are really interesting, but all discussion gets dragged down by idiots jumping on pics they don't understand or marketing phrases that they elevate to magic.

The more interesting debate is whether either should have waited to release... only AMD first-gen ray tracing, Infinity Cache (which would have helped a lot with bandwidth issues, especially as consoles aim for 4K on 4K screens), current chip supply issues, 5nm later this year, the Ryzen 5 series... etc.

I hope that both are able to take full advantage of a future non-Nvidia DLSS-like feature; it would fix some of the current shortcomings in performance.
 
Yeah, the praise for the PS5 has been consistent. But if there were problems with the XSX like some have been trying to claim, then I would have thought that an insider or a dev (anonymously), or both, would have spilled the beans by now.
I do not think people not as famous as Gabe Newell would ever go on record with their negative concerns (and even Gabe walked it back a bit in public when demoing the Orange Collection on stage at the PS3 E3 presser).
Then again, multiple devs (two very senior ones from id, for example) went on record with their technical concerns about the XSS and the baseline it would set, and people still refuse to believe them at all.
 
PlayStation 5 is especially challenging because the CPU supports native 256-bit instructions that consume a lot of power.

These are great here and there, but presumably only minimally used. Or are they? If we plan for major 256-bit instruction usage, we need to set the CPU clock substantially lower, or noticeably increase the size of the power supply and fan.
Guys, just put it to rest. It's not a downgrade in the way you may think. Obviously, this is a special design solution so that these native 256-bit instructions can actually be used in games and not just for show, as is often the case. It's the practical part that is important, and AVX code is very power-hungry. As a result, the CPU generates a huge amount of heat.
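To put rough numbers on Cerny's "set the CPU clock substantially lower" point, here is a back-of-the-envelope sketch. It assumes the textbook dynamic-power model (P ∝ f·V², with voltage rising roughly in step with frequency, so power grows close to f³); the frequencies are made-up illustration values, not PS5 specs.

```c
#include <stdio.h>
#include <math.h>

// Assumed model: dynamic power ~ f * V^2, and V scales roughly linearly
// with f, so power scales roughly with f^3. Frequencies are hypothetical.
int main(void) {
    double f_hi = 3.5;  // GHz, hypothetical scalar-code clock
    double f_lo = 3.0;  // GHz, hypothetical clock under heavy AVX-256 load
    double ratio = pow(f_lo / f_hi, 3.0);
    printf("%.1f -> %.1f GHz cuts dynamic power to ~%.0f%% of the original\n",
           f_hi, f_lo, ratio * 100.0);
    return 0;
}
```

Under those assumptions, even a modest clock drop claws back roughly a third of the dynamic power, which is the lever a fixed power budget forces you to pull when AVX-heavy code shows up.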
 
Sure, but AVX-256 could also fall under that category.
To a lesser degree, yes, sure. But PS5 would handle it more easily, I think, thanks to its custom variable frequency + SmartShift feature. Besides, Cerny already confirmed native AVX-256 instructions.
 
No need to read much more than the last page: the same spec-thread trolls derailing another thread yet again.
It is a bit ironic, since you are not adding much to this thread besides such comments (as if you just wanted to stir the pot: get fanboys arguing for your amusement while taking potshots from the gallery).
 
PS5 got "Large" SRAM in I/O complex with high probaility for similiar task.
Sure, but that would have a much different effect than the unified CPU cache (it is way farther and possibly much higher latency than the L3 would be I would think, I might be wrong).
 
Man, it's the PS5 that is rumored to be lacking RDNA 2 features, not the XSX, and we can't compare the gain, or whether it's because of the IC (we don't know for sure about PS5); this % at this point is just academic chatter... when we know that one tops out at 10 and the other at 12 TF, one has 36 and the other 52 CUs, and we especially know every single IP that's inside the XSX SoC... stop spinning.

Me SPINNING?? WTF?

So, once again. Quoting an Xbox engineer. Since you love PS5 engineers sooo much.

Here is the official full Hot Chips conference talk for XSX. Of course, there isn't a slide for every word they've said. It's impossible to make that many slides. LOL

But anyway:

Architecturally these CUs have 25% better performance per clock on average graphics workloads relative to the GCN generation



Timestamped at 16:50

 
Me SPINNING?? WTF?

So, once again. Quoting an Xbox engineer. Since you love PS5 engineers sooo much.

Here is the official full Hot Chips conference talk for XSX. Of course, there isn't a slide for every word they've said. It's impossible to make that many slides. LOL

But anyway:





Timestamped at 16:50


It has been posted many times to him. Give up.
 
PS5 got "Large" SRAM in I/O complex with high probaility for similiar task.
I can't see how that would be the case, the SRAM in question is closely tied to the coprocessors within its own subsystem, it has a different (I/O throughput) purpose. Besides, it seems to be far apart, this would increase latency drastically instead of reducing it.
 
Why is everyone suddenly so eager to accept that PS5's CPU doesn't have native AVX-256, despite Cerny's clear statement about it? Am I missing something? It was just an early wild guess from Locuza; they now think that Sony just removed an FADD block deemed redundant.
 
Why is everyone suddenly so eager to accept that PS5's CPU doesn't have native AVX-256, despite Cerny's clear statement about it? Am I missing something? It was just an early wild guess from Locuza; they now think that Sony just removed an FADD block deemed redundant.
If I'm not wrong, Cerny already stated that on PS5 some parts would be removed if not necessary. As long as native AVX-256 is still there, who cares, with all respect.
 
WTF is this guy's problem? He adds nothing to the conversation but snarky comments.
Fanboyism has screwed his mind. The funny thing is he has the nerve to blame everything on Sony fanboys when his major contribution is just laughing at other posters, with mostly nonsense arguments and zero knowledge of what he's talking about.
 
Why is everyone suddenly so eager to accept that PS5's CPU doesn't have native AVX-256, despite Cerny's clear statement about it? Am I missing something? It was just an early wild guess from Locuza; they now think that Sony just removed an FADD block deemed redundant.
Even if they did, it is nobody's business. They are the engineers; they know what they're doing. The console is still performing well. It is perhaps interesting from an academic point of view, but other than that, I'd say both consoles are achieving what they set out to do.
 
I do not think people not as famous as Gabe Newell would ever go on record with their negative concerns (and even Gabe walked it back a bit in public when demoing the Orange Collection on stage at the PS3 E3 presser).
Then again, multiple devs (two very senior ones from id, for example) went on record with their technical concerns about the XSS and the baseline it would set, and people still refuse to believe them at all.
Yeah, I remember the XSS comments.

Still, if there were any issues, I would have thought that they would have surfaced by now: an anonymous dev with some sort of corroboration from an insider like Rees god Matt. We will see in the near future if there is, I feel.
 
PS5 SOC

~ 300 mm^2 die area
~ 1.78 ratio rectangular die
~ 13 mm wide x ~ 23.25 mm long

~ 210 mm^2 for GPU + RAM bus
~ 40 mm^2 for CPU
~ 50 mm^2 for I/O + misc


XSX SOC

360 mm^2 die area
~ 1.5 ratio rectangular die
~ 15.5 mm wide x ~ 23.25 mm long

~ 280 mm^2 for GPU + RAM bus
~ 40 mm^2 for CPU
~ 40 mm^2 for I/O + misc
These numbers here make no sense.
The CPU is smaller on the PS5 die due to the smaller FPU.
I will try to get a better pixel measurement later.
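For what it's worth, the headline dimensions in the quoted estimate do multiply out. A trivial check, using only the numbers quoted above (which are estimates, not official figures):

```c
#include <stdio.h>

// Sanity check of the quoted (estimated) die dimensions.
int main(void) {
    double ps5 = 13.0 * 23.25;   // ~302 mm^2, close to the ~300-308 mm^2 cited
    double xsx = 15.5 * 23.25;   // ~360 mm^2, matching the stated die area
    printf("PS5 ~ %.0f mm^2 (ratio %.2f), XSX ~ %.0f mm^2 (ratio %.2f)\n",
           ps5, 23.25 / 13.0, xsx, 23.25 / 15.5);
    return 0;
}
```

So the totals are self-consistent; the open question is only how the area splits between CPU, GPU, and I/O.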
 
PS5 got "Large" SRAM in I/O complex with high probaility for similiar task.
yeah like they showed in a Foil in Cernys Talk but it is named "on chip ram" , if they did for some unknown reasons not to choose exact implementation of infinity Cache but develoip their own more fitting solution thy maybe not allowed to name it as "infinity cache" anyway.

Oh no Nxgamer... Not you..
Don't know what you mean here - he is giving the alterations to the Zen 2 cores a thumbs up, not down.
 
Yeah, I remember the XSS comments.

Still, if there were any issues, I would have thought that they would have surfaced by now: an anonymous dev with some sort of corroboration from an insider like Rees god Matt. We will see in the near future if there is, I feel.

Also, other devs came out and said exactly the opposite.
 
I don't know why Locuza uses the word "downgrade" for it; this kind of tweaking (removing some redundant parts) is common in the console space, and it's more than likely the same or similar was done for XSX's CPU. Also, I think he was a bit too quick to claim half-width (erroneously, it seems) for the FPU.
Series X has a normal-sized Zen 2 FPU.
 