PS5 Die Shot has been revealed

You amaze me:


"RGT is probably the best sourced youtuber but he got the unified ZEN 3 cache wrong. --->>>

Also the upgrade to 38CUs, upgrade to 16 gbpp GDDR6 iirc." (wrong)

"He gets info by piecing the bits he hears over internet + fishing for it via DMs. Believe me, he doesnt know much"

"So no unified L3 cache? I guess we can all blacklist RedGamingTech now."


"Yes I am happy he is caught to have a bad source."

"Eyeballing it, looks like 2 X 4MB L3 for the CPU so same size as XSX / Renior.
Surprise to no-one, I'm sure.
Also lmao Red Gaming Tech."

etc etc etc
I would be more careful in dancing victory laps. There has been speculation about two things for the PS5 die: That the new GE is part of the AMD roadmap and then people have speculated about the cache sizes (and the only way the PS5 could have a large cache is by having an off-die cache - not enough mm2 for that on the die). It is the GE that has had the 'RDNA3' rumour attached to it.

The middle GE and command processor section in this die shot is massive compared to the graphics cards and the XSX. So the first speculation - the key one - could very much be legit.
 
It's MonarchJT as usual, taking one thing and running with it. I personally think he's jealous...
Nah, I just don't have any "coven" to defend. I understand why you guys attack Digital Foundry and call them biased, yet were so impressed that you had to defend that guy (RedGamingTech). He has been wrong multiple times and will be wrong again; as far as I am concerned, on consoles he is completely unreliable as well as biased. I don't want to continue this discussion, even if it's relevant that he speculated about probable RDNA3 features, the IC and other nonsense. I know we will think differently, so let it be.
 
Why should people care WHY games generally run better on PS5? All that matters is that they do. If it's some sort of custom RDNA1, I frankly don't care, as long as games run better than, or at least on par with, the Series X... I just don't understand why it would matter.

You're not devs; you won't understand what each manufacturer did in terms of customizations anyway.

Look at the games and the results. They speak for themselves. The PS5 is extremely well designed (I should say better designed); otherwise games wouldn't run like they do.

And that's BEFORE Sony's biggest first-party games, which means the first-party games will destroy everything else available on the competitor's platform, as usual...
They'd better, because there is no competing platform version for an exclusive game.
 
It's all we ever wanted. No wonder it has a performance delta over the competition. That Geometry Engine smack in the middle is big and sexy as hell.

Lol at Cornholio with the RDNA 1 FUD. Keep at it champ.

Honestly, the CUs are RDNA2, and the GPU itself resembles desktop GPUs more than the XSX's does. That's what I got from this die shot.

What I'm wondering is whether there's any difference between the PS5's CUs and the XSX's.

If they are both vanilla RDNA2 CUs, they should look the same. If they are not, there should be some visible differences between them.

Very interesting how the GE is smack dab in the middle of the PS5's GPU while the GE in the XSX's chip is off to the side.

I wonder why they did that?
 
Why should people care WHY games generally run better on PS5? All that matters is that they do. If it's some sort of custom RDNA1, I frankly don't care, as long as games run better than, or at least on par with, the Series X... I just don't understand why it would matter.

You're not devs; you won't understand what each manufacturer did in terms of customizations anyway.

Look at the games and the results. They speak for themselves. The PS5 is extremely well designed (I should say better designed); otherwise games wouldn't run like they do.

And that's BEFORE Sony's first-party games designed for it, which means the first-party games will destroy everything else on the competitor's platform, as usual...
Season 4 Crying GIF by Fuller House
 
Well, I like that the mystery is becoming more... mysterious.

Seriously, the die shot is welcome. More tech speculation and debates. I love it.
 
B6YSjIL.jpg

wFyFcOG.jpg



They seem basically the same, just arranged differently, with different numbers of CUs and a different memory bus.

PS5 seems to dedicate more of its smaller die to I/O, I think.

I wonder why they're arranged so differently. The PS5 is laid out more like AMD's PC GPUs.
 
I'm also slightly curious about the number of ACEs; maybe the high-resolution shot will provide some info about that.
 
Nah, I just don't have any "coven" to defend. I understand why you guys attack Digital Foundry and call them biased, yet were so impressed that you had to defend that guy. He has been wrong multiple times and will be wrong again; as far as I am concerned, on consoles he is completely unreliable as well as biased. I don't want to continue this discussion, even if it's relevant that he speculated about probable RDNA3 features and the IC. I know we will think differently, so let it be.
Lmao man... just keep it real. Jesus.

That's why we like Ricky. He may not be too bright, but he doesn't hide which platform he prefers. You can respect that.
 
Why should people care WHY games generally run better on PS5? All that matters is that they do. If it's some sort of custom RDNA1, I frankly don't care, as long as games run better than, or at least on par with, the Series X... I just don't understand why it would matter.

You're not devs; you won't understand what each manufacturer did in terms of customizations anyway.

Look at the games and the results. They speak for themselves. The PS5 is extremely well designed (I should say better designed); otherwise games wouldn't run like they do.

And that's BEFORE Sony's biggest first-party games, which means the first-party games will destroy everything else available on the competitor's platform, as usual...
So you will care if multiplats eventually run slightly worse than on XSX? That's a bit sad.
 
Tempest is a CU but with its caches stripped down, so it could look completely different from an RDNA2 CU.
I don't think it's in one of the two disabled WGPs either; those are disabled for redundancy and higher yields.
It can't be one of the active CUs.
Have you ever heard of a GPU with 35 or 55 active CUs?
It's most likely 36 active, 3 disabled, 1 repurposed as the Tempest Engine.

Why would Cerny waste CUs like that?
Cerny is smart enough not to waste CUs by disabling them, but rather to repurpose them.
 
You amaze me:


"RGT is probably the best sourced youtuber but he got the unified ZEN 3 cache wrong. --->>>

Also the upgrade to 38CUs, upgrade to 16 gbpp GDDR6 iirc." (wrong)

"He gets info by piecing the bits he hears over internet + fishing for it via DMs. Believe me, he doesnt know much"

"So no unified L3 cache? I guess we can all blacklist RedGamingTech now."


"Yes I am happy he is caught to have a bad source."

"Eyeballing it, looks like 2 X 4MB L3 for the CPU so same size as XSX / Renior.
Surprise to no-one, I'm sure.
Also lmao Red Gaming Tech."

etc etc etc
Seriously, dude, are you pretending to be dense, or are you stupid? I can't tell anymore...
Did you ever listen to what he said, or read anything he said?
What did he get wrong about the unified Zen 3 cache?
38 CUs and an upgrade to 16 Gbps GDDR6? Did you watch that video? He was asked whether Sony could do any last-minute upgrades like they did with the PS4 and 8 GB of RAM. He said it was probably too late because these specs are set in stone years in advance. Then he SPECULATED about what could possibly be done if Sony did make any last-minute upgrades. Yet again you make a fool of yourself.
No unified L3 cache? I already answered this in an earlier post; again, it was speculation as to why the PS5 CPU was performing so well.
I think you need help. Jealousy is a bad disease.
 
It seems history repeats with PS5/XSX (PS4 Pro/One X)... imagine if MS hadn't announced the price of the Series X first, what Sony would have charged for their RDNA 1.5/Zen 1 heater.
 
The die shot does show that it closely resembles a desktop RDNA2 GPU, so that's probably the case. I'm not saying the XSX isn't RDNA2, but for some reason it looks a lot different to me. I honestly have no idea why it looks the way it does.

74560_01_this-is-the-gpu-die-shot-of-xbox-series-rdna-2-looks-delicious_full.jpg


rdna-2-arch-100867215-large.jpg

evEpzRW.jpg


Maybe it's nothing though.
Talking real estate (so, not capabilities, but the placement of transistor clusters), the PS5 is visibly in line with RDNA2, while the XSX looks more like the APU-style layouts from the GCN era.
 
Desktop Zen 2 and Zen 3:
o59e4sz43vy51.png

XSX and XSS die-shots:
Die_Shot_Comp.jpg
Those (top) are 3xxx-series Zen 2 ("Matisse") cores. It looks like both the PS5 and Series S/X are using Zen 2 "Renoir" cores as found in mobile parts (4xxx series): that's 4 MB of L3 cache per 4-core CCX.

Both PS5 and Series CPU cores look very similar to Renoir cores imo; there are some images already in this thread, e.g. here https://www.neogaf.com/threads/ps5-die-shot-has-been-revealed.1591559/page-3#post-262358033

tldr: both Series X/S and PS5 have mobile Zen 2 cores, closer to "supercharged" mobile Zen chips (with extra CUs)...
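If it helps, the cache arithmetic behind the Matisse-vs-Renoir call is simple enough to sketch. The per-CCX figures below are AMD's published desktop and mobile Zen 2 numbers; mapping the Renoir figure onto the consoles is the eyeballed die-shot guess above, not anything confirmed:

```python
# Per-CCX L3 sizes for desktop ("Matisse") vs mobile ("Renoir") Zen 2.
l3_per_ccx_mb = {"matisse": 16, "renoir": 4}  # AMD's published figures

def total_l3_mb(core_config, ccx_count=2):
    # An 8-core Zen 2 part is built from two 4-core CCXs.
    return l3_per_ccx_mb[core_config] * ccx_count

assert total_l3_mb("matisse") == 32  # desktop Ryzen 3000, 8 cores
assert total_l3_mb("renoir") == 8    # mobile Ryzen 4000; matches the
                                     # 2 x 4MB eyeballed on PS5/XSX above
```

So the "2 x 4MB L3" reading of the die shot lines up with mobile Renoir, not with desktop Matisse's 2 x 16MB.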
 
It can't be one of the active CUs.
Have you ever heard of a GPU with 35 or 55 active CUs?
It's most likely 36 active, 3 disabled, 1 repurposed as the Tempest Engine.

Why would Cerny waste CUs like that?
Cerny is smart enough not to waste CUs by disabling them, but rather to repurpose them.
The Tempest Engine isn't one of the GPU's CUs. There are four disabled for redundancy.
 
I wonder what they cut, as Cerny specifically mentioned the CPU has 256-bit operation support (I was rewatching the Road to PS5 talk today of all days). I do not think they went with the old two-cycle AVX256, where one 256-bit instruction is executed as two 128-bit ones, one per cycle.

Why would it consume a lot less power to process AVX256 as two 128-bit instructions over two cycles rather than as a single-cycle 256-bit operation? I'm still a bit confused, I will be honest.
Because it means a lot less pressure on the power system. The AVX256 thing is just a guess, though. Zen 2 is fast even without it, and AVX instructions are a nice-to-have: you can do everything without AVX, but AVX support makes it faster/more efficient. More efficient doesn't mean it needs less power; it needs more power in a shorter time. E.g. Intel CPUs clock themselves down into sub-3 GHz territory when AVX(256) is heavily used, and the CPUs still draw a lot of power and generate a lot of heat. You can do the same work with less power and less heat, but you also need more time.

Heat and power are essential for the PS5, because if the CPU needs more power, it is taken from the shared budget, which means the PS5's GPU must clock down because it has less power available. If you remove AVX256 from the equation, you might need a bit longer on the CPU side for your calculations, but you don't need a bigger chunk of the power budget.

AVX instructions are great but can really hurt your power budget. Btw, you need more than 2x the cycles if you want to do 256-bit calculations with only 128-bit support. But there is a high chance that games won't use those instructions too often, so removing them might be a great way to reduce the overall power draw of the CPU.

But that is all speculation, and not some dubious insider info ;)
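To make the "two 128-bit halves" point concrete, here's a toy sketch in Python (purely illustrative, nothing like real silicon): a 256-bit lane of four 64-bit values gives the same result whether it's processed in one pass or double-pumped as two 128-bit halves; the difference is only how many passes, i.e. cycles, it costs:

```python
def add_256(a, b):
    # one hypothetical single-cycle 256-bit add: all four elements at once
    return [x + y for x, y in zip(a, b)]

def add_2x128(a, b):
    # the same add "double-pumped" as two 128-bit halves (two passes)
    lo = [a[0] + b[0], a[1] + b[1]]   # first 128-bit half
    hi = [a[2] + b[2], a[3] + b[3]]   # second 128-bit half
    return lo + hi

a = [1.0, 2.0, 3.0, 4.0]
b = [10.0, 20.0, 30.0, 40.0]
assert add_256(a, b) == add_2x128(a, b) == [11.0, 22.0, 33.0, 44.0]
```

Same answer either way; the single-pass version just concentrates the work (and the power draw) into one cycle, which is the trade-off being discussed above.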
 
It can't be one of the active CUs.
Have you ever heard of a GPU with 35 or 55 active CUs?
It's most likely 36 active, 3 disabled, 1 repurposed as the Tempest Engine.

Why would Cerny waste CUs like that?
Cerny is smart enough not to waste CUs by disabling them, but rather to repurpose them.
It is one CU outside the GPU's 40 CUs.
It is a silicon unit outside the GPU proper.
 
I would be more careful in dancing victory laps. There has been speculation about two things for the PS5 die: That the new GE is part of the AMD roadmap and then people have speculated about the cache sizes (and the only way the PS5 could have a large cache is by having an off-die cache - not enough mm2 for that on the die). It is the GE that has had the 'RDNA3' rumour attached to it.

The middle GE and command processor section in this die shot is massive compared to the graphics cards and the XSX. So the first speculation - the key one - could very much be legit.
Look, everything is possible, and trusting Locuza, I'm sure he will release more info about the die shot very soon. I have my theory, and I also asked ethomaz to bet with me about it; he knows what I'm pointing at. As for the RDNA3 stuff, to me it's all total BS to gain clicks from avid fanboys.
 
It can't be one of the active CUs.
Have you ever heard of a GPU with 35 or 55 active CUs?
It's most likely 36 active, 3 disabled, 1 repurposed as the Tempest Engine.

Why would Cerny waste CUs like that?
Cerny is smart enough not to waste CUs by disabling them, but rather to repurpose them.
It's normal to deactivate CUs for yield reasons. If that news about ~50% yields is correct, then with every CU active the yield would be 30% or maybe lower...
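That yield intuition fits a toy binomial defect model (the per-CU defect rate below is a made-up illustrative number, not anything from TSMC, and real yield depends on far more than CU defects): with 40 physical CUs and 4 spares, a die survives up to 4 bad CUs, while an all-active part dies on a single defect:

```python
from math import comb

def yield_rate(n_cu, max_defects, p_defect):
    # P(at most max_defects of n_cu CUs are defective), binomial model
    return sum(
        comb(n_cu, k) * p_defect**k * (1 - p_defect) ** (n_cu - k)
        for k in range(max_defects + 1)
    )

p = 0.115  # illustrative per-CU defect probability, chosen so the
           # 36-of-40 yield lands near the rumoured ~50%

with_spares = yield_rate(40, 4, p)   # 36 active, 4 disabled for redundancy
all_active  = yield_rate(40, 0, p)   # every CU must be perfect

assert with_spares > all_active
assert all_active < 0.01  # all-active yield collapses to under 1%
```

Under this (assumed) defect rate, requiring all 40 CUs doesn't just drop yield from ~50% to ~30%; it craters it, which is why every console GPU ships with disabled spares.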
 
Honestly, the CUs are RDNA2, and the GPU itself resembles desktop GPUs more than the XSX's does. That's what I got from this die shot.

What I'm wondering is whether there's any difference between the PS5's CUs and the XSX's.

If they are both vanilla RDNA2 CUs, they should look the same. If they are not, there should be some visible differences between them.

Very interesting how the GE is smack dab in the middle of the PS5's GPU while the GE in the XSX's chip is off to the side.

I wonder why they did that?

Yes, in IPC gains. The XSX's CUs have 25% better perf/clock compared to last gen; the PS5's CUs have 50%.

ethomaz can correct me
 
B6YSjIL.jpg


Vsa8j1d.jpg



They seem basically the same, just arranged differently, with different numbers of CUs and a different memory bus.

PS5 seems to dedicate more of its smaller die to I/O, I think.

I wonder why they're arranged so differently. The PS5 is laid out more like AMD's PC GPUs.

You made a small mistake:


Some part of the lower WGP region is still GPU front end.
The small white frame marks the 28 dual CUs.
01e4871fca1be99b3c7e16780ac539e3efcb0113.jpeg
 
One thing's for sure: they are definitely designed differently.

The die shot comparisons are undeniable.
 
The die shot does show that it closely resembles a desktop RDNA2 GPU, so that's probably the case. I'm not saying the XSX isn't RDNA2, but for some reason it looks a lot different to me. I honestly have no idea why it looks the way it does.

74560_01_this-is-the-gpu-die-shot-of-xbox-series-rdna-2-looks-delicious_full.jpg


rdna-2-arch-100867215-large.jpg

evEpzRW.jpg


Maybe it's nothing though.
The top-down appearance of the SoCs does not tell the whole story. Both GPUs are custom; they were developed alongside RDNA2.
 
B6YSjIL.jpg


Vsa8j1d.jpg



They seem basically the same, just arranged differently, with different numbers of CUs and a different memory bus.

PS5 seems to dedicate more of its smaller die to I/O, I think.

I wonder why they're arranged so differently. The PS5 is laid out more like AMD's PC GPUs.

I mean, the PS5 is supposed to have something called the I/O complex on it, so it makes sense that it would have more I/O hardware.

It's just weird how the XSX is arranged compared to your typical AMD GPU. I have no idea if moving that stuff around has any impact on performance.
 
I would be more careful in dancing victory laps. There has been speculation about two things for the PS5 die: That the new GE is part of the AMD roadmap and then people have speculated about the cache sizes (and the only way the PS5 could have a large cache is by having an off-die cache - not enough mm2 for that on the die). It is the GE that has had the 'RDNA3' rumour attached to it.

The middle GE and command processor section in this die shot is massive compared to the graphics cards and the XSX. So the first speculation - the key one - could very much be legit.
Quickly playing with superimposing them, using the GDDR6 memory interface as a reference, I would say it's ~15% bigger than the one in the RDNA1 shot we have.
There is no real RDNA2 die shot so far (or I didn't find one).
 
The top-down appearance of the SoCs does not tell the whole story. Both GPUs are custom; they were developed alongside RDNA2.

This is correct. They have lots of freedom in how large blocks (IP, cache, processors) are laid out on a chip. You can't deduce anything from the layout alone.
 
The top-down appearance of the SoCs does not tell the whole story. Both GPUs are custom; they were developed alongside RDNA2.

My guess.

Sony were probably fine with how a typical RDNA2 GPU is arranged, so they used that design. Microsoft, for whatever reason, needed theirs to be different. Maybe it was due to them wanting more CUs per shader array, or something else.
 
DeepEnigma

I take back what I said. You're right.

We are hootin' and hollerin' now.

Mothers Day No GIF by IFC


"Oh noooos, he implied my master Cerny lied to me." Garbage like you should just be removed from the forums. You don't like a discussion, so you launch insults at people. Sad little person, lol.

I find it really odd that his post ("get fucked ya fuckin idiot") went completely ignored. Like, wtf? Get that shit outta here. Unless he and MC are buddies who always joke around like that; if that's the case, carry on.
 
Wait, a whole-lot-of-nothing picture that most people initially joked about has turned into an 8-page war in a Sony thread. Where are all the Sony fans crying that the xboys are shitting up yet another Sony thread? Nowhere, it seems.

Concern, where's your concern about this thread being ruined by fanboys spreading the usual FUD? You're quick to screech if it's an Xbox thread.
 
Yes, in IPC gains. The XSX's CUs have 25% better perf/clock compared to last gen; the PS5's CUs have 50%.

ethomaz can correct me
Do you have any proof/sources for that claim?

I just asked about the 256-bit support in the Road to PS5 talk.



Seems like it's not settled yet that the 256-bit FPU was cut.

Nobody said it is really missing, but it would be the first logical thing to cut. They might just have cut something else; it is pure speculation.
 
Nah, I just don't have any "coven" to defend. I understand why you guys attack Digital Foundry and call them biased, yet were so impressed that you had to defend that guy (RedGamingTech). He has been wrong multiple times and will be wrong again; as far as I am concerned, on consoles he is completely unreliable as well as biased. I don't want to continue this discussion, even if it's relevant that he speculated about probable RDNA3 features, the IC and other nonsense. I know we will think differently, so let it be.
Who are these "you guys" who attack DF? I only said that if it's a Tom Morgan video I know what to expect; the rest are fine.
He's been wrong about what? Do you know the difference between speculation and fact? His coverage of Xbox has been spot on and fair. He's a PC gamer first. If you can't differentiate between fact and him giving his opinion and speculation, you need help, badly.
 
Seriously, dude, are you pretending to be dense, or are you stupid? I can't tell anymore...
Did you ever listen to what he said, or read anything he said?
What did he get wrong about the unified Zen 3 cache?
38 CUs and an upgrade to 16 Gbps GDDR6? Did you watch that video? He was asked whether Sony could do any last-minute upgrades like they did with the PS4 and 8 GB of RAM. He said it was probably too late because these specs are set in stone years in advance. Then he SPECULATED about what could possibly be done if Sony did make any last-minute upgrades. Yet again you make a fool of yourself.
No unified L3 cache? I already answered this in an earlier post; again, it was speculation as to why the PS5 CPU was performing so well.
I think you need help. Jealousy is a bad disease.

Lol man, you are out of it. Those were things that people were writing on B3D. Wake up. And I'm not interested in learning more anyway. RedGamingTech has been proven absolutely unreliable in the console space.
 
My guess.

Sony were probably fine with how a typical RDNA2 GPU is arranged, so they used that design. Microsoft, for whatever reason, needed theirs to be different. Maybe it was due to them wanting more CUs per shader array, or something else.
We would need a full log of the development process for both chips to know for sure how each of them came to be the way it is now.
 