PS5 Die Shot has been revealed

The article says it is not 100% RDNA2 because there is no Infinity Cache.

"making the console not fully RDNA 2 as previously thought.
What is missing from the appeal within PS5, it would seem to be the Infinity Cache..."

Do you know any other console that doesn't have Infinity Cache?

Was it ever thought that it was full RDNA2? There was a ton of speculation about what the chip actually contained because they kept calling it a "custom RDNA2". I never saw any legit statement that it was full RDNA2.
 
Nowhere does it state that AVX256 was a 'problem' that needed to go away, or a necessary evil for blade servers.
Please, for Christ's sake, drop that FUD spun by geordiemp.
These technical white papers are part of marketing and PR; no company will say something like 'we need to have a discussion about AVX' or 'top 10 reasons we need to do away with AVX'... and then keep it on their chip.

My suspicion is that MS just took whatever was on the CPU core, adjusted the caches and their own custom blocks, and went with it, which is pretty much what everyone expects.

Maybe they have a use for it (it seems some games use it on PC)... but as far as I know it's a bit redundant in the context of these console APUs, where you can count on the presence of a powerful GPU and heavy AVX code could create contention issues for memory access. For tasks that are better handled on the GPU to begin with, it may be a good idea to take the option off the table and use the silicon for something else.

It's not like it was clearly an error to keep that on the chip, but it's not clear there is a benefit to it either, and this is not the kind of thing most people would have bothered with while working on a chip design made of who knows how many parts that all have to work well together.
 
The YouTube video is far longer than some tweet, as he explained in the video.

Don't spin his words
The YouTube video is the same as what he wrote after his research (and all the tweets); he asked if people would like video content of the same text version. I watched both and both arrive at the same conclusion. Stop spreading FUD and moving the goalposts already, young warrior.
 
Scaling down is what some devs have been attempting in these latest 9th-gen ports. The result has been the Series S version lacking features like raytracing altogether.
When actual 9th-gen engines start appearing that use raytracing for e.g. shadows without a fallback to traditional shadowmaps, how are they going to solve that on the Series S? The only way is to develop for the Series S first, Series X second.




Yes, but that always happens. Cores maintain coherency (i.e. checking each others' calculation results) by snooping through the others' last level cache (LLC). The LLC in Jaguar was the L2 that served 4-core blocks, and in Zen1/2/3 the LLC is the L3.
If the cores couldn't access the instruction + data in each others' LLC, then they'd need to maintain coherency through the RAM, which is very slow in comparison.

The big difference between a "unified cache" (Zen3) and two blocks of cache that can be accessed by cores external to that complex (Zen2) is the number of cycles needed to access the cache in that "far end" CCX.
I don't know the numbers so this is just an example, but if in Zen3 it takes 8 cycles to access any part of the unified L3, in Zen2 it takes 8 cycles to access the L3 in its own CCX, but 40 cycles to access the L3 in the other CCX.
This makes a sizeable difference in gaming performance when the GPU isn't a bottleneck (usually high framerates), and it's a good part of the reason why Zen3 is so good for games.
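
To make that concrete, here is a minimal sketch (not from the post above): a Linux-only core-to-core "ping-pong" test that bounces a cache line between two pinned threads. On a Zen 2 part you would expect a noticeably higher round-trip time when the two cores sit in different CCXs than when they share one. The core IDs are placeholders, since the core-to-CCX mapping varies by SKU.

[CODE]
// Build (assumed): g++ -O2 -pthread pingpong.cpp -o pingpong
#include <atomic>
#include <chrono>
#include <cstdio>
#include <pthread.h>
#include <thread>

static std::atomic<int> flag{0};
constexpr int kIters = 1'000'000;

// Pin the calling thread to a single logical core.
static void pin_to_core(int core) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}

// The "pong" side: wait for 1, answer with 0.
static void ponger(int core) {
    pin_to_core(core);
    for (int i = 0; i < kIters; ++i) {
        while (flag.load(std::memory_order_acquire) != 1) { /* spin */ }
        flag.store(0, std::memory_order_release);
    }
}

int main() {
    // Placeholder core IDs: try two cores in the same CCX, then two in
    // different CCXs, and compare the reported round-trip times.
    const int core_a = 0, core_b = 4;

    std::thread t(ponger, core_b);
    pin_to_core(core_a);

    const auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < kIters; ++i) {
        flag.store(1, std::memory_order_release);
        while (flag.load(std::memory_order_acquire) != 0) { /* spin */ }
    }
    const auto end = std::chrono::steady_clock::now();
    t.join();

    const double ns = std::chrono::duration<double, std::nano>(end - start).count();
    std::printf("avg core %d <-> core %d round trip: %.1f ns\n", core_a, core_b, ns / kIters);
    return 0;
}
[/CODE]

The absolute numbers don't matter much; the point is the gap between the same-CCX and cross-CCX pairings, which is exactly the penalty a more unified L3 (or a customized interconnect) would shrink.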



There are two blocks of L3 seen in the PS5's die shot (one for each CCX), but it's also true that both CCXs are close together, and there's a relatively large unidentified space between the CPU and GPU blocks (which could be just glue logic).
It could be that the PS5's custom Zen2 has specific optimizations for reducing the cycles for inter-CCX L3 accesses, to the point where it behaves closer to a unified L3 than a separate one.

Something is responsible for the PS5 getting consistently better performance at framerate levels where the CPU is usually the bottleneck (120FPS modes), despite the Series X having a higher, non-variable clock rate and a full-fledged Zen 2 FPU.
And I don't buy the "current devkits are bad but future ones will bring the magic sauce" theories for the Series X. I expect both consoles to mature in parallel.
Let's not forget the CPU also has no I/O load. Microsoft claims only a small part of one core (10%) for I/O, but maybe that's the best case and it can take more? Even if not, it's still spending cycles and will reduce the Xbox's CPU efficiency compared to the PS5, which has zero I/O load on the CPU, all of it offloaded.

As said earlier, add in the PS5's advantages of the clocks, all the caches being faster, cache scrubbers for fewer misses, much better I/O and maybe more small gains, and we should see why the PS5 is revolutionary and performing above the expected paper specs.
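
For rough scale, taking the quoted figure at face value: 10% of one core on an 8-core CPU is 0.10 / 8 ≈ 1.25% of total CPU throughput, so even if the real-world cost were several times that best case it would still sit in the low single digits of the whole CPU.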

There are unknown things in the PS5 SoC, and the most surprising is that the FPU is around half the size of a normal Zen 2 FPU (or even a Zen 1 FPU), which is leading to several theories like the lack of native 256-bit support or the cut of FDAA.
But silicon-to-silicon comparisons are not matching either theory.
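
For context on what "no native 256-bit support" would mean in practice (a hedged illustration, not something taken from the die shot analysis): AVX/AVX2 code would still run, a 256-bit instruction would just be split into two 128-bit micro-ops internally, the way Zen 1 did it, so the difference is throughput rather than compatibility.

[CODE]
#include <immintrin.h>

// One 256-bit add: a single wide op on a full Zen 2 FPU, effectively two
// internal 128-bit ops on a "double-pumped" design like Zen 1.
__m256 add256(__m256 a, __m256 b) {
    return _mm256_add_ps(a, b);
}

// The same work written explicitly as two 128-bit adds, which is roughly
// what the narrower hardware does under the hood. The results are identical.
void add256_as_two_128(const float* a, const float* b, float* out) {
    __m128 lo = _mm_add_ps(_mm_loadu_ps(a),     _mm_loadu_ps(b));
    __m128 hi = _mm_add_ps(_mm_loadu_ps(a + 4), _mm_loadu_ps(b + 4));
    _mm_storeu_ps(out,     lo);
    _mm_storeu_ps(out + 4, hi);
}
[/CODE]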

Outside of that... the CPU is very similar to Zen 2, so it doesn't have the shared L3 cache of Zen 3.

In the GPU, the core part is bigger than in actual RDNA/RDNA2 GPUs, while the other parts are very similar (CUs, TMUs, cache, etc.).

There is a big part of the silicon (close to the size of the CPU) that is probably where the Tempest Engine, I/O complex, additional cache for I/O (?), etc. live, but nobody knows which block is which or how big each one is.

The FPU Differences:

[images: FPU comparison]


Overall silicon layout (note there are some mistakes in the labels because the colors were photoshopped; e.g. the 4MB L3 should be 2MB L3):

[image: annotated die shot layout]


The gray parts with no label are the ones we don't know the purpose of.

Here is a Paint comparison with Series X with a lot of guesstimates:

[image: PS5 vs Series X die comparison]


As for the PS5 vs SX chip comparisons, it's a bit above my level, but the gaps between the RAM on the PS5 are larger; is this anything meaningful?

It was mentioned that the SX has more CUs per shader array, 7 is it, with the PS5 having 5.

Xbox says it has 5MB of L2 cache for its CUs but the PS5 has 4MB of L2 for its CUs, meaning more cache per CU; is this correct?

Damn, really need someone to break all this down.

Also, is this anything worth noting? Are fanboys going to go to war?
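
On the L2-per-CU question: roughly yes, assuming the 5MB / 4MB figures above are right and using the publicly stated CU counts (52 active CUs on Series X, 36 on PS5):

5 MB / 52 CUs ≈ 98 KB of L2 per CU (Series X)
4 MB / 36 CUs ≈ 114 KB of L2 per CU (PS5)

So the PS5 ends up with a bit more GPU L2 per CU even though its total L2 is smaller.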


 
Oh wow, so it's emerging that the PS5 isn't doing hardware VRS. This is HUGE if true (it isn't just this tweet; on Twitter a lot of people are asking about this).

 
Oh wow, so it's emerging that the PS5 isn't doing hardware VRS. This is HUGE if true (it isn't just this tweet; on Twitter a lot of people are asking about this).


No Vaseline Rendering Solution in the standard fare, HUGE!

Activision dev posted on RE that Sony has a VRS solution in the GE, so it's whatever. They just chose to update their software solution on a cross-gen port rather than dive into the GE from the ground up on an already coded and established game and engine on that platform.
 
Last edited:
The Metro devs are saying they're doing it in software on the PS5 and in hardware on the SX. People are just asking them this on Twitter.
The guy asked why VRS is listed for the PS5 when it doesn't have support for the DX12U API.
The devs replied that they are using their own solution for VRS on the PS5... that solution still uses hardware, just like the DX12U API solution does.
 
No Vaseline Rendering Solution in the standard fare, HUGE!

Activision dev posted on RE that Sony has a VRS solution in the GE, so it's whatever. They just chose to update their software solution on a cross-gen port rather than dive into the GE from the ground up on an already coded and established game and engine on that platform.
Interesting, can you point me to what the dev said? ...on Reddit??
 
Oh wow, so it's emerging that the PS5 isn't doing hardware VRS. This is HUGE if true (it isn't just this tweet; on Twitter a lot of people are asking about this).



Doesn't Sony have their own versions of VRS and Mesh Shaders?

 
The guy asked why VRS is listed for the PS5 when it doesn't have support for the DX12U API.
The devs replied that they are using their own solution for VRS on the PS5... that solution still uses hardware, just like the DX12U API solution does.
No, they're saying they're updating the software version of it from the PS4... and then Leviathan came in and said

 
No, they're saying they're updating the software version of it from the PS4... and then Leviathan came in and said


Yes, just like MS updated DX12 to DX12U.
It is software.

Now, if you ask where it is being done at the hardware level, I believe even that dev can't answer you.
 
No, they're saying they're updating the software version of it from the PS4... and then Leviathan came in and said


The next question is how much CPU and GPU time it takes away from other rendering budgets.

Much like MS still uses a percentage of the CPU for decompression...

...it seems Sony spends that on VRS and other stuff.

Another wash.
 
The YouTube video is the same as what he wrote after his research (and all the tweets); he asked if people would like video content of the same text version. I watched both and both arrive at the same conclusion. Stop spreading FUD and moving the goalposts already, young warrior.

Me spreading FUD, and calling me a warrior? LOL. You Xbox fans are damn messed up. Shit, man.

Locuza said this in his video and you're trying to spin it. The man whom you trust, and you're trying to spin his words:
but is Microsoft really using the most advanced technology and are there no hardware differences between Xbox Series X/S and RDNA 2 GPUs from AMD? The answer to that is..... NO!
 
The next question is how much CPU and GPU time it takes away from other rendering budgets.

Much like MS still uses a percentage of the CPU for decompression...

...it seems Sony spends that on VRS and other stuff.

Another wash.
I honestly don't know who this Leviathan guy is, but he is stating that the PS5 does this via software using the GE. Is this guy reliable? Lots of people are asking him questions.

 
Me spreading FUD, and calling me a warrior? LOL. You Xbox fans are damn messed up. Shit, man.

Locuza said this in his video and you're trying to spin it. The man whom you trust, and you're trying to spin his words:
He said there are (large) differences, but that it is an RDNA2-based console. Please stop it, you're just spreading FUD. Go ask him whether he said what I said or what you said, and come back with the (for you) sad answer.
 
I honestly don't know who this Leviathan guy is, but he is stating that the PS5 does this via software using the GE. Is this guy reliable? Lots of people are asking him questions.
The GE is hardware accelerated. The SW side is the same as, or akin to, the SW side of the DX12U updates. APIs still receive SW updates for the hardware they are tooled for.
 
He said there are (large) differences, but that it is an RDNA2-based console. Please stop it, you're just spreading FUD. Go ask him whether he said what I said or what you said, and come back with the (for you) sad answer.

Me spreading FUD? His statement refers to MS's crappy PR about FULL RDNA 2. Again, you're messed up.

but is Microsoft really using the most advanced technology and are there no hardware differences between Xbox Series X/S and RDNA 2 GPUs from AMD? The answer to that is..... NO!
 
I honestly don't know who this Leviathan guy is, but he is stating that the PS5 does this via software using the GE. Is this guy reliable? Lots of people are asking him questions.
That is the biggest question.
VRS indeed needs hardware to work.
MS's VRS is a software API that uses some part of the RDNA2 hardware which is said to be exclusive to RDNA2, but AMD didn't give details on that.
The Metro devs use their own VRS software API that used some parts of the GCN hardware and has now been updated to use some parts of the RDNA 2 hardware found in the PS5.

Is the Metro VRS solution using the same parts of the hardware as the MS VRS solution? Nobody knows.
Is the Metro VRS solution using the GE? Nobody knows.

We can ask them, but I don't believe they will answer.

There is another question: is Metro using the MS DX12 VRS API or the VRS API solution they created themselves? Because the game is already made and was probably using their own VRS API solution on the Xbox One and Xbox One X... so will they switch to the DX12U VRS API on the Series X? They were not clear about that either.
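
For reference, on the DX12U side the per-draw (Tier 1) VRS path being discussed looks roughly like the sketch below. This is illustrative only: it is not Metro's actual code, and it says nothing about how Sony exposes the GE path on the PS5. It assumes you already have a working D3D12 device and command list.

[CODE]
#include <d3d12.h>

// Check for DX12U variable rate shading support and, if present, ask the
// GPU to shade subsequent draws once per 2x2 pixel block.
bool SetCoarseShadingRate(ID3D12Device* device, ID3D12GraphicsCommandList5* cmdList)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &opts6, sizeof(opts6))) ||
        opts6.VariableShadingRateTier == D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED)
    {
        return false;  // no hardware VRS exposed through DX12U on this adapter
    }

    // Passing nullptr keeps the default combiners (no per-primitive or
    // screen-space shading-rate image override).
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    return true;
}
[/CODE]

Tier 2 adds per-primitive rates and a screen-space shading-rate image on top of this; any PS5 equivalent would go through Sony's own API rather than anything shown here.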
 
The GE is hardware accelerated. The SW side is the same as, or akin to, the SW side of the DX12U updates. APIs still receive SW updates for the hardware they are tooled for.
He's saying something else (he explains it in the tweets), but after I understood who he is I'm not interested anymore... but if you trust him you can follow the link.
 
Me spreading FUD? His statement refers to MS's crappy PR about FULL RDNA 2. Again, you're messed up.
It means that there are differences between the design of the XSX and an AMD RDNA2 GPU, not that the XSX is not RDNA2-based... My English is shitty, but gosh, what he wrote is very clear.
 
He's saying something else (he explains it in the tweets), but after I understood who he is I'm not interested anymore... but if you trust him you can follow the link.
I'm not even talking about him, I am talking about reality. The GE is hardware accelerated and Sony customized their APU and API toolset to take advantage of it.
 
Yeah, that MS PR about full RDNA 2 on the XSX is a lie.
Please show the quote where MS actually said that, you can't because they didn't.

They said the Series consoles are the only consoles with hardware support for the features AMD revealed that day.

VRS, Mesh Shaders and SFS.

"Xbox Series X|S are the only next-generation consoles with full hardware support for all the RDNA 2 capabilities AMD showcased today.........including hardware accelerated DirectX Raytracing, Mesh Shaders, Sampler Feedback and Variable Rate Shading."
 
No Vaseline Rendering Solution in the standard fare, HUGE!

Activision dev posted on RE that Sony has a VRS solution in the GE, so it's whatever. They just chose to update their software solution on a cross-gen port rather than dive into the GE from the ground up on an already coded and established game and engine on that platform.


VRS cannot be done on the GE; it's Variable Rate Shading, and no shading is being done on the GE.
 
VRS cannot be done on the GE; it's Variable Rate Shading, and no shading is being done on the GE.
So go to RE and argue about it with the Activision developer.

 
Please show the quote where MS actually said that, you can't because they didn't.

They said the Series consoles are the only consoles with hardware support for the features AMD revealed that day.

VRS, Mesh Shaders and SFS.

"Xbox Series X|S are the only next-generation consoles with full hardware support for all the RDNA 2 capabilities AMD showcased today.........including hardware accelerated DirectX Raytracing, Mesh Shaders, Sampler Feedback and Variable Rate Shading."
How the narrative shifted 😳
 
How the narrative shifted 😳

What narrative? That's the quote from the press release. People not being able to read, or not understanding what is being said, is not Microsoft's problem; it was all there in black and white.

Is a console APU going to be identical to a PC GPU? Only in the mind of the uninformed.
 
Please show the quote where MS actually said that, you can't because they didn't.

They said the Series consoles are the only consoles with hardware support for the features AMD revealed that day.

VRS, Mesh Shaders and SFS.

"Xbox Series X|S are the only next-generation consoles with full hardware support for all the RDNA 2 capabilities AMD showcased today.........including hardware accelerated DirectX Raytracing, Mesh Shaders, Sampler Feedback and Variable Rate Shading."

Thank you
 