
PS5 Die Shot has been revealed

ethomaz

Banned
Just basing what is and isn't RDNA2 on clock speeds is really dumb. The boost clock of a 5700 XT can be higher than the Game Clock of a 6800; does that make it RDNA2?
Neither of you answered my question on Compute Units: which 6800-series card has only 36 Compute Units? Does this mean the PS5 can't be RDNA2?
There will be 6000-series cards with lower clocks and fewer Compute Units, and guess what, they will all be RDNA2 cards.

Microsoft had a target of 12 teraflops; they even said very recently that they would have clocked the XSX lower if yields had been better and they could have had more Compute Units. That wouldn't change the XSX being RDNA2, same as if yields had been worse and they had needed a higher clock to get to that 12 teraflops.

This is the stupidest thread on GAF when AMD says both the PS5 and Xbox Series consoles are RDNA2, and let's face it, they should know.
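(For context, the 12 TF figure falls straight out of the usual FP32 throughput formula. A back-of-envelope sketch using the public console specs; the helper function is purely illustrative:)

```python
# FP32 TFLOPS = CUs x 64 shaders/CU x 2 ops/clock (FMA) x clock in GHz / 1000
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"XSX: {tflops(52, 1.825):.2f} TF")  # ~12.15 -- Microsoft's 12 TF target
print(f"PS5: {tflops(36, 2.23):.2f} TF")   # ~10.28 at the PS5's max boost clock
```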
RX 5700 XT (Boost Clock: 1905 MHz, but it runs lower than that)
[image: clocks-and-thermals.jpg]


RX 5700 (Boost Clock: 1725 MHz, and it runs lower than that)
[image: clocks-and-thermals.jpg]


Can't you see the change in clocks? RDNA 2 reaches way higher clocks, and does so consistently... AMD GPUs used to run below their Boost Clock; now with RDNA 2 they run well above it.
 

Old Empire.

Member
Read it again.



They're specifically talking about the features that were showcased today.

Why would they need things such as "Sampler Feedback" and "DirectX Raytracing" when they have their own Raytracing and texture streaming solution based on their own IO structure? If/When it's officially announced that Sony has their own VRS, then people are going to look pretty dumb falling victim to Microsoft PR BS all over again.

MS said their GPU has hardware-accelerated capabilities, including Mesh Shaders, Sampler Feedback and Variable Rate Shading. This is not patched in by Microsoft.

AMD announced these new features for RDNA2.

Digital Foundry already said they have not seen any game using VRS on PS5 yet. On Xbox, yes, they have.
 

Hashi

Member
We had silly threads claiming the PS5 was RDNA3; that was a good laugh. The die shot confirms the PS5 GPU is not RDNA2.

AMD only announced the full RDNA2 feature set for one console: the Xbox.

This is old news since MS revealed all the details in Oct 2020.

This quote was overlooked.
Xbox Series X|S are the only next-generation consoles with full hardware support for all the RDNA 2 capabilities AMD showcased today.

Do you think Sony would let MS get away with saying that if it were untrue?
Wait till the Horizon Forbidden West engine is loaded. We have a surprise...
:messenger_smiling_with_eyes:
The PS5 APU is RDNA-based, with features that RDNA1 and RDNA2 don't have.
;)
Have a nice day, everybody.
 

DForce

NaughtyDog Defense Force
MS said their GPU has hardware-accelerated capabilities, including Mesh Shaders, Sampler Feedback and Variable Rate Shading. This is not patched in by Microsoft.

AMD announced these new features for RDNA2.

Digital Foundry already said they have not seen any game using VRS on PS5 yet. On Xbox, yes, they have.

You guys try hard to ignore facts and keep repeating the same stuff over and over again.

Sony doesn't need all those features because they have their OWN. Sony doesn't need DirectX Raytracing, and if Sony doesn't have that, then MS can say they're the only console that supports the features that were shown today.

Digital Foundry doesn't work on the console, and they're only speculating based on what they have seen so far. If they KNEW, then they wouldn't need to ask Sony for an answer and have them break NDA.

Repeating the same line isn't going to work. It's just going to make you look like you're in denial.
 

jroc74

Phone reception is more important to me than human rights
RX 5700 XT (Boost Clock: 1905 MHz, but it runs lower than that)
[image: clocks-and-thermals.jpg]


RX 5700 (Boost Clock: 1725 MHz, and it runs lower than that)
[image: clocks-and-thermals.jpg]


Can't you see the change in clocks? RDNA 2 reaches way higher clocks, and does so consistently... AMD GPUs used to run below their Boost Clock; now with RDNA 2 they run well above it.
I thought with actual RDNA 2 cards out, people would stop going back to the 5700 and 5700 XT. That DF test with them was flawed from the beginning. I get why they did it, but still. It's time to move on from that.

The move from RDNA 1 to RDNA 2 is what made the 2GHz GitHub data make sense.

Everything about the GitHub data made more sense once AMD officially revealed RDNA 2. AdoredTV and an AMD engineer, a few weeks before the reveal, made sense in hindsight. One mentioned RDNA 2 going over 2GHz; the other mentioned both next-gen consoles doing ray tracing.

To those still living on that GitHub-data, pre-RDNA 2 hill... it's time to come down.
 

martino

Member
RX 5700 XT (Boost Clock: 1905 MHz, but it runs lower than that)
[image: clocks-and-thermals.jpg]


RX 5700 (Boost Clock: 1725 MHz, and it runs lower than that)
[image: clocks-and-thermals.jpg]


Can't you see the change in clocks? RDNA 2 reaches way higher clocks, and does so consistently... AMD GPUs used to run below their Boost Clock; now with RDNA 2 they run well above it.
But the 5700 is 36 CUs and consumes ~160W for that.
The 5700 XT with 40 CUs is ~210W.
For me that doesn't square with the XSX total system at ~210W with 52 CUs.
From a perf/watt perspective, I doubt it's RDNA1.
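(A quick watts-per-CU pass over those numbers; a rough sketch only, since board power and whole-console power aren't directly comparable:)

```python
# Rough W/CU comparison; the XSX figure is whole-system power (CPU, RAM,
# PSU losses included), so the gap vs RDNA1 is actually understated.
parts = {
    "RX 5700 (RDNA1, GPU only)":    (36, 160),
    "RX 5700 XT (RDNA1, GPU only)": (40, 210),
    "XSX (entire console)":         (52, 210),
}
for name, (cus, watts) in parts.items():
    print(f"{name}: {watts / cus:.1f} W per CU")
# ~4.4 and ~5.3 W/CU for RDNA1 vs ~4.0 W/CU for the whole Series X.
```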
 

S0ULZB0URNE

Member
Sony says it's not yet introduced. So you're confirming the PS5 GPU is not RDNA2, since VRS is hardware-accelerated on RDNA2 GPUs and should be there as standard!!!
Everything MS claimed about XSX has been introduced?

Sony has its own custom features they added to its RDNA 2 "based" GPU.
 

Old Empire.

Member
Everything MS claimed about XSX has been introduced?

Sony has its own custom features they added to its RDNA 2 "based" GPU.
The topic of the thread is the PS5 die. If the PS5 GPU had the full set of RDNA2 capabilities, then why is Sony saying they'll be adding a software version of VRS later, at some unspecified, unknown time? It should already be there for devs to use. So we have a checkmate here: the Sony GPU is missing AMD hardware VRS. PS5 guys/women on here will keep telling you MS is lying.
 

jroc74

Phone reception is more important to me than human rights
Sony says it's not yet introduced. So you're confirming the PS5 GPU is not RDNA2, since VRS is hardware-accelerated on RDNA2 GPUs and should be there as standard!!!

The topic of the thread is the PS5 die. If the PS5 GPU had the full set of RDNA2 capabilities, then why is Sony saying they'll be adding a software version of VRS later, at some unspecified, unknown time? It should already be there for devs to use. So we have a checkmate here: the Sony GPU is missing AMD hardware VRS. PS5 guys/women on here will keep telling you MS is lying.
You are trying too hard.

Look, the PS5 doesn't have hardware VRS, you happy now?

Mind you, the XSX isn't full RDNA 2. Neither console is. The XSX has more of the vanilla features.

I'm more interested in the CUs though, because Cerny explicitly stated the PS5 has RDNA 2 CUs. Right now that might be a difference between the PS5 and Series consoles.

But the 5700 is 36 CUs and consumes ~160W for that.
The 5700 XT with 40 CUs is ~210W.
For me that doesn't square with the XSX total system at ~210W with 52 CUs.
From a perf/watt perspective, I doubt it's RDNA1.

But XSX runs at RDNA 1 clock speeds.
Yup, this is why I'm interested in the CUs now.
 

martino

Member
But XSX runs at RDNA 1 clock speeds.
Are you trying to say mobile RDNA2 will not be RDNA2 if it runs lower clocks because of the thermal constraints of that use case?
The clocks are also a consequence of the case, the cooling capacity, and the various targets that come with it (silence, for example).
 

Riky

$MSFT
RX 5700 XT (Boost Clock: 1905 MHz, but it runs lower than that)
[image: clocks-and-thermals.jpg]


RX 5700 (Boost Clock: 1725 MHz, and it runs lower than that)
[image: clocks-and-thermals.jpg]


Can't you see the change in clocks? RDNA 2 reaches way higher clocks, and does so consistently... AMD GPUs used to run below their Boost Clock; now with RDNA 2 they run well above it.

And so what? You don't know what the upper bounds of the XSX clock speeds are; the clock speed was set, as I said, for a very specific purpose: to get to 12 TFLOPs.
Read what MS just said: they would have clocked it lower if yields had allowed more usable Compute Units on the wafer.
So we simply don't know how high the XSX could have been clocked.
 

HoofHearted

Member
It runs way higher than that.
All reviewers pointed that out.

"which can be boosted up to 2105 MHz" is marketing PR that turned out to be false... it runs higher than that most of the time.

The AMD marketed clocks are not accurate... they are doing what nVidia has done for generations already.
Understood - and that's pretty much expected for any GPU today (at least in the PC market). They all now come with a base clock and a boost clock. You could also get varying results based on binning, etc. The ONLY thing guaranteed here is the base clock. That way, if you don't get the 2100 MHz ("up to") - you can't claim that the GPU/card isn't meeting spec. In theory - you could get the bottom of the bin pile and end up with a GPU that only runs at 1700 MHz.

NVidia cards do the exact same thing (I get quite a significant perf/boost bump out of my water-cooled 2080 FE above the spec'd base/boost clocks).

The first thing everyone does is try to OC the damn things to see what the optimal performance for the card is. That's what TechPowerUp did in the article you quoted.

Generally speaking - GPU manufacturers report minimum base and boost clocks for the GPU chip.

Then, depending on the actual card design (AIB vs Reference), in most instances you'll see higher performance based on several other factors with respect to the card/cooling solution implemented.

Old days - "Reference" designs were just that - you wouldn't expect much of a bump over base specs, and typically you wouldn't want to buy the reference card unless you planned to water cool it.

AMD and Nvidia changed that recently because they wanted to be more competitive in the market to actually sell their own branded cards.

Extending this further - you could buy a "lesser" card (i.e. 1070), water cool it, and OC the crap out of it to get 1080+ performance for significantly less investment.

With respect to the XSX vs PS5 comparison in GPU clocks... IMHO - we're seeing a similar approach in their respective target architectures/designs.

I'm not surprised that we're seeing these early current/cross-gen games run "better" (at least in certain instances) on the PS5, due to its higher (variable) clock rate compared to the locked/lower clock rate of the XSX.

Regardless of all of this chatter around the IC and RDNA 1v 2 debate...

The real question is - over time, as the various studios/developers update and transition their respective overall software architecture/game engines to implement and adopt these newer technologies within their games (VRS, ML, Mesh Shading, etc.) - will the "fast and narrow" design of the PS5 be able to keep up with the "fat/wide/slower" design of the XSX?

There was a very similar discussion years ago between buying the 2080 and the 1080Ti - as they both performed similarly in raw/basic FPS and game-engine rasterization at the time. When it was released - it didn't really make much sense to "upgrade" to a 2080 versus owning a 1080Ti.

However, since then - games have been updated to take advantage of the new capabilities/features of the 2080.
 

S0ULZB0URNE

Member
The topic of the thread is the PS5 die. If the PS5 GPU had the full set of RDNA2 capabilities, then why is Sony saying they'll be adding a software version of VRS later, at some unspecified, unknown time? It should already be there for devs to use. So we have a checkmate here: the Sony GPU is missing AMD hardware VRS. PS5 guys/women on here will keep telling you MS is lying.
The actual claim is RDNA 2 BASED.
Features not needed were removed and custom features were added by Sony engineers.
Simple
 

ethomaz

Banned
But the 5700 is 36 CUs and consumes ~160W for that.
The 5700 XT with 40 CUs is ~210W.
For me that doesn't square with the XSX total system at ~210W with 52 CUs.
From a perf/watt perspective, I doubt it's RDNA1.
52 CUs is exactly why the Series X runs at a lower clock than the RX 5700.
BTW, the power draw for the Series X is for the APU (CPU included) while the RX 5700 figure is only for the GPU... they're not comparable.
 

3liteDragon

Member
RDNA 2.0 Feature | PlayStation 5 | Xbox Series X
Ray tracing | Yes | Yes
VRS | No | Yes
Mesh Shaders | No | Yes
Machine Learning | No | Yes
Performance Per Watt | Yes | Yes, but using RDNA 1.0 clocks
Infinity Cache | No | No

Using this metric, I would say the PS5 is RDNA 1.33 and XSX is RDNA 1.66
FTFY. RDNA 1.xxxxx, Jesus fucking Christ lol. Both consoles are custom RDNA 2-based ffs.

RDNA 2 Feature | PlayStation 5 | Xbox Series X
Ray-Tracing | Yes (custom) | Yes (custom)
VRS | Yes (custom) | Yes (PC RDNA 2)
Mesh Shaders | Yes (custom) | Yes (PC RDNA 2)
Machine Learning | Yes | Yes
Perf. Per Watt | Yes | Yes (but using RDNA 1 clocks)
Infinity Cache | No | No
 

SlimySnake

Flashless at the Golden Globes
Infinity Cache is contributing a lot to perf/W in RDNA 2 GPUs, so I'm pretty confident that neither console has real parity in that regard. Sadly I didn't find the time to do my power scaling tests when I had the 6800 lying around here. We will know for sure when the RX 6700 series launches, I guess.

As a reminder, this was how RDNA 1 behaved:

[image: powerscalinggpuonlyuljwr.png]
Oh, I know your graph like the back of my hand, I've posted it so many times lol.

The higher power consumption is what I was referring to. Your graph showed 220W for a 5700 XT at just 2.15 GHz. That would've been almost 250W for a 2.23 GHz GPU, just for the GPU die. So to me, whatever arch or node improvements allowed AMD to reduce that level of power consumption are in the PS5 GPU. I am not well-versed enough in GPU die speculation to tell you what those improvements are, but they aren't node improvements, since AMD said they are not using 7nm+ for RDNA 2.0 chips.
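(The "almost 250W" figure checks out under the usual rule of thumb that dynamic power scales with f·V², and voltage rises roughly with frequency near the top of the V/f curve, so P ~ f³. A sketch, not measured data:)

```python
# Crude extrapolation of the quoted 5700 XT scaling graph using P ~ f^3.
p_measured = 220       # W at 2.15 GHz, per the graph SlimySnake cites
f0, f1 = 2.15, 2.23    # GHz
p_est = p_measured * (f1 / f0) ** 3
print(f"Estimated GPU-only power at {f1} GHz: {p_est:.0f} W")  # ~245 W
```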

The PS5 seems to be outperforming the 5700 XT despite its lower CU count, due to its higher TFLOPs/clocks. So I wouldn't be surprised if its Fire Strike score is somewhere in the 28k-30k region despite 4 fewer CUs.

[image: resultsshjg4.png]
 

Old Empire.

Member
Yeap, the PS5 doesn't support any DX12U features at all.
It would be weird if it supported those DX12U features.
DirectX is meaningless here; I'm not getting your point. The only listed DX feature is for ray tracing.

Ray tracing performance depends on the spec of the hardware, not the software.

The PS5 has 36 CUs and the Series 52 CUs. With ray tracing on, the Xbox at the same resolution as the PS5 should run better.
 

ToTTenTranz

Banned
RDNA 2.0 Feature | PlayStation 5 | Xbox Series X
Ray tracing | Yes | Yes
VRS | No | Yes
Mesh Shaders | No | Yes
Machine Learning | No | Yes
Performance Per Watt | Yes | Yes, but using RDNA 1.0 clocks
Infinity Cache | No | No

This table is erroneous in the sense that it's confusing a bunch of hardware capabilities with the marketing names they were given for their DX12 Ultimate implementations.
You might as well just remake that table like this:


Xbox Series DX12 Ultimate feature | PlayStation 5 | Xbox Series X
DX Raytracing | No | Yes
DX12U VRS | No | Yes
DX12U Mesh Shaders | No | Yes
DirectML | No | Yes
Direct Storage w/ HW BCPack Decompression | No | Yes
DX12U Sampler Feedback | No | Yes

Or like this:

PS5 GNM/GNMX feature | PlayStation 5 | Xbox Series X
GNM/GNMX Raytracing | Yes | No
GNM/GNMX Foveated Rendering | Yes | No
PSSL Primitive Shaders | Yes | No
PSSL Machine Learning | Yes | No
GNM/GNMX HW Kraken Decompression | Yes | No
GNM/GNMX I/O Streaming | Yes | No


Both these tables are useless for comparing RDNA versions.



I'm still confused as to why full RDNA2 wasn't just used across both platforms.

The SoCs on both consoles serve different purposes.
The Project Oberon SoC on PS5 was designed to go into a gaming machine exclusively. Project Scarlett was designed to serve both the SeriesX platform (local console or XCloud) and Azure servers for general-purpose HPC and remote virtual machines:

with around 11 per cent of the space given over to what Microsoft describes as server-class Zen 2-based CPU clusters. A similar amount of area is also consumed by the GDDR6 memory controllers - there are ten of these in total - and while they address 16GB of total RAM in a retail console, the channels are also good for 40GB of memory in the Project Scarlett devkit, and we should assume that once integrated into the Azure cloud, the chips will be using some other kind of non-retail memory set-up.
(...)
We can also assume that these server-class CPUs will also take their place as standard Windows servers when not used for gaming


This alone is reason for both SoCs to be very different, which will affect their R&D priorities and eventually the ISA for each chip.

Microsoft clearly wants to do GPGPU on the SeriesX SoC. Proof of that is the fact that they've been targeting 12 TFLOPs throughput first and foremost, which is why at some point they entertained the idea of using all of the 28 WGPs in the chip at the cost of lower 1675MHz GPU clocks to increase TFLOPs-per-watt.

[image: zkSSZZf.jpeg]




As sebbbi / Sebastian Aaltonen (ex-Ubisoft dev, Claybook dev, Unity engine dev) kindly reminded us, the 26 WGP @ 1825MHz version is obviously going to perform better. This is apparently a sacrifice Microsoft had to make to increase gaming performance at the cost of a less power-efficient HPC solution. HPC often only cares about bandwidth and floating-point / integer throughput; fill-rate and geometry performance don't matter much, which is why e.g. Vega 20 is still pretty good as an HPC solution despite being mediocre for gaming.
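(In FLOPS terms the two Scarlett configurations land on the same ~12 TF target, which is the whole tradeoff; a quick sketch, with 1 WGP = 2 CUs:)

```python
# TFLOPS = CUs x 64 shaders/CU x 2 ops/clock x GHz / 1000
wide_slow   = 28 * 2 * 64 * 2 * 1.675 / 1000  # ~12.01 TF, better perf/W for HPC
narrow_fast = 26 * 2 * 64 * 2 * 1.825 / 1000  # ~12.15 TF, better for gaming (shipped)
print(f"28 WGP @ 1675 MHz: {wide_slow:.2f} TF; 26 WGP @ 1825 MHz: {narrow_fast:.2f} TF")
```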




Xbox Series X|S are the only next-generation consoles with full hardware support for all the RDNA 2 capabilities AMD showcased today.

Why would MS say that when the PS5 exists?? Sony would sue the crap out of them for lying. Wake up.

Microsoft didn't lie, because during the 6800 series' launch AMD only presented DX12 features, none of which are supported by the PS5.
Their marketing team simply used wordplay that makes them look good in comparisons but doesn't mean much.
 

Riky

$MSFT
If Lisa Su says it's RDNA2 (and she does/did), that's as official as it gets.

I just do not get why people are still arguing the point.

Exactly.

Everyone saying otherwise is just being stupid, as there would only be two possible conclusions:

AMD is lying.
AMD doesn't know what RDNA2 is, despite creating it.
 

TheContact

Member
If Lisa Su says it's RDNA2 (and she does/did), that's as official as it gets.

I just do not get why people are still arguing the point.

Question: Do you have any semi-custom options in the server space?

Su: There are opportunities for us in the semi-custom space as it relates to — obviously game consoles have been the largest piece of it. But there are opportunities outside of game consoles, including in the datacenter.

 

Old Empire.

Member
Sony is not saying anything because they don't have to.

They already confirmed that it's RDNA 2.

Xbox fanboys want the information so they can confirm it has fewer RDNA features than the Xbox Series X. PS fans want it so Xbox fanboys will stop with the RDNA 1.5 BS.

It's an interesting debate.

My view here is that the PS5 GPU is missing a lot of core features for it to truly be called next-gen RDNA2.

At the end of the day, real game performance is what matters to gamers. We'll know a lot more about the ins and outs of the hardware when next-gen games arrive.

It'll be interesting to see what the PS5 and Series are capable of then. GPU arguments will matter little if the PS5 holds its own against the Series.

I expect Hitman to be the level we'll see going forward: PS5 not hitting 4K but getting very close to it; Series at 4K.

AMD's DLSS-like feature would be interesting. Personally I think resolution is a waste of resources. I'd rather see devs increase the realism in games, and not spend that power on clarity.
 

John Wick

Member
Not the Game Clock it doesn't; quoting charts of the boost clock is irrelevant. As I said, the XSX runs at about the Game Clock of a 6800; AMD says so.

But by the perverse logic people are using, which of the 6800-series cards has only 36 Compute Units? Answer: none. So does this make the PS5 not RDNA2 also?

Answer: no, because AMD says the XSX, XSS and PS5 are all RDNA2, and they actually make them. Making most of this thread stupid and irrelevant.
Riky talking sense. I'm amazed, seriously.
 

Clear

CliffyB's Cock Holster
It's an interesting debate.

My view here is that the PS5 GPU is missing a lot of core features for it to truly be called next-gen RDNA2.

At the end of the day, real game performance is what matters to gamers. We'll know a lot more about the ins and outs of the hardware when next-gen games arrive.

It'll be interesting to see what the PS5 and Series are capable of then. GPU arguments will matter little if the PS5 holds its own against the Series.

I expect Hitman to be the level we'll see going forward: PS5 not hitting 4K but getting very close to it; Series at 4K.

AMD's DLSS-like feature would be interesting. Personally I think resolution is a waste of resources. I'd rather see devs increase the realism in games, and not spend that power on clarity.

The better implementations of CBR look virtually as good as native res anyway unless you are a pixel-counter, which is why it kinda bugs me when people like DF diminish its effectiveness by always talking about the un-reconstructed res whilst basically talking up DLSS output as if it were actual native res.
 

John Wick

Member
Not the Game Clock it doesn't; quoting charts of the boost clock is irrelevant. As I said, the XSX runs at about the Game Clock of a 6800; AMD says so.

But by the perverse logic people are using, which of the 6800-series cards has only 36 Compute Units? Answer: none. So does this make the PS5 not RDNA2 also?

Answer: no, because AMD says the XSX, XSS and PS5 are all RDNA2, and they actually make them. Making most of this thread stupid and irrelevant.
Riky talking sense. I'm amazed, seriously.
It's an interesting debate.

My view here is that the PS5 GPU is missing a lot of core features for it to truly be called next-gen RDNA2.

At the end of the day, real game performance is what matters to gamers. We'll know a lot more about the ins and outs of the hardware when next-gen games arrive.

It'll be interesting to see what the PS5 and Series are capable of then. GPU arguments will matter little if the PS5 holds its own against the Series.

I expect Hitman to be the level we'll see going forward: PS5 not hitting 4K but getting very close to it; Series at 4K.

AMD's DLSS-like feature would be interesting. Personally I think resolution is a waste of resources. I'd rather see devs increase the realism in games, and not spend that power on clarity.
Doesn't matter. Unreal Engine is probably the most popular engine for third parties, and with UE5 more devs will use it. Sony is covered, as Epic will add support for the HW features. It's already amazing without using any PS5-specific features besides the IO and SSD.
 

SlimySnake

Flashless at the Golden Globes
FTFY. RDNA 1.xxxxx, Jesus fucking Christ lol. Both consoles are custom RDNA 2-based ffs.

RDNA 2 Feature | PlayStation 5 | Xbox Series X
Ray-Tracing | Yes (custom) | Yes (custom)
VRS | Yes (custom) | Yes (PC RDNA 2)
Mesh Shaders | Yes (custom) | Yes (PC RDNA 2)
Machine Learning | Yes | Yes
Perf. Per Watt | Yes | Yes (but using RDNA 1 clocks)
Infinity Cache | No | No
It's just a way to measure performance. Nothing more. RDNA 1.xxx could also mean custom RDNA2. You are acting as if I am Riky or someone who thinks Mark Cerny and Lisa Su are lying about the PS5 being a custom RDNA 2.0 GPU.

The fact of the matter is that even going by your updated table, neither the PS5 nor the XSX is fully RDNA 2.0, which relies on the Infinity Cache to boost performance. That's a 6-billion-transistor part, a transistor count equivalent to the entire PS4 Pro GPU. They wouldn't have added it in there if it wasn't a big fucking deal. So if the PC GPUs with Infinity Cache are full RDNA 2.0, then surely the XSX and PS5 should be under 2.0.
 

kyliethicc

Member
The GPU in the PS5 is 36 CUs, nowhere in the road map to be RDNA2. It's similar to a GPU released in 2019, far too early to have RDNA2 hardware features. Sony did some tweaking of their own, to the clocks, to get better performance from the CUs.

Reality is the PS5 is RDNA1, while the Xbox Series is mostly RDNA2. MS waited a bit longer for the refresh.
Wrong.

Big Navi, the 80 CU die, aka Navi21, took the 40 CU die of Navi10 and doubled the number of shader engines, from 2 to 4.

The upcoming Navi22 die is the same 40 CUs as the PS5.

PS4, PS4 Pro, PS5 and Navi10, Navi21, Navi22 all use the same 5 DCUs per SA, 2 SA per SE layout. Navi21 just has 4 SE not 2.

XSX uses the same 2 shader engines as PS5, Navi22 and Navi10, but added 2 extra DCUs per shader array. Not like Navi21.


RDNA1 Navi10 - 40 CUs (RX 5700XT, 5700)
5 Dual Compute Units per Shader Array
2 Shader Arrays per Shader Engine
2 Shader Engines
5x2x2x2=40 CUs
2560 shaders
64 ROPs
160 TMUs

RDNA2 Navi21 - 80 CUs (RX 6900XT, 6800XT, 6800)
5 Dual Compute Units per Shader Array
2 Shader Arrays per Shader Engine
4 Shader Engines
5x2x2x4=80 CUs
5120 shaders
128 ROPs
320 TMUs

RDNA2 Navi22 - 40 CUs (RX 6700XT, 6700)
5 Dual Compute Units per Shader Array
2 Shader Arrays per Shader Engine
2 Shader Engines
5x2x2x2=40
2560 shaders
64 ROPs
160 TMUs

PS5 GPU - 40 CUs
5 Dual Compute Units per Shader Array
2 Shader Arrays per Shader Engine
2 Shader Engines
5x2x2x2=40
2560 shaders
64 ROPs
160 TMUs

XSX GPU - 56 CUs
7 Dual Compute Units per Shader Array
2 Shader Arrays per Shader Engine
2 Shader Engines
7x2x2x2=56
3584 shaders
64 ROPs
224 TMUs

XSS GPU - 24 CUs
6 Dual Compute Units per Shader Array
2 Shader Arrays per Shader Engine
1 Shader Engine
6x2x2x1=24
1536 shaders
32 ROPs
96 TMUs

And of course the consoles disable 4 CUs for yields, which also disables 16 TMUs.
And some of the PC GPUs get cut down for segmentation and yields.
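(All of that layout arithmetic in one small sketch, using the figures above; the helper function is just for illustration:)

```python
# Each RDNA dual compute unit (DCU/WGP) holds 2 CUs; each CU has
# 64 shaders and 4 TMUs. ROPs are tied to the shader arrays, not the CUs.
def gpu_config(dcu_per_sa: int, sa_per_se: int, se: int, disabled_cus: int = 0):
    cus = dcu_per_sa * 2 * sa_per_se * se - disabled_cus
    return cus, cus * 64, cus * 4  # (CUs, shaders, TMUs)

print(gpu_config(5, 2, 2))                  # Navi10 / Navi22: (40, 2560, 160)
print(gpu_config(5, 2, 4))                  # Navi21: (80, 5120, 320)
print(gpu_config(5, 2, 2, disabled_cus=4))  # PS5 as shipped: (36, 2304, 144)
print(gpu_config(7, 2, 2, disabled_cus=4))  # XSX as shipped: (52, 3328, 208)
print(gpu_config(6, 2, 1))                  # XSS die: (24, 1536, 96)
```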
 

Old Empire.

Member
The better implementations of CBR look virtually as good as native res anyway unless you are a pixel-counter, which is why it kinda bugs me when people like DF diminish its effectiveness by always talking about the un-reconstructed res whilst basically talking up DLSS output as if it were actual native res.

I just want the next-gen Battlefield to play now. I want no drop in realism just to hit the magic 4K resolution.

In some ways I hope AMD's DLSS is a game-changer. It would be good for all gamers for resolution debates to be a thing of the past. I want to be blown away by the physics of explosions going off, tanks that look like real life, stuff like that.
 

SlimySnake

Flashless at the Golden Globes
This table is erroneous in the sense that it's confusing a bunch of hardware capabilities with the marketing names they were given for their DX12 Ultimate implementations.
You might as well just remake that table like this:


Xbox Series DX12 Ultimate feature | PlayStation 5 | Xbox Series X
DX Raytracing | No | Yes
DX12U VRS | No | Yes
DX12U Mesh Shaders | No | Yes
DirectML | No | Yes
Direct Storage w/ HW BCPack Decompression | No | Yes
DX12U Sampler Feedback | No | Yes

Or like this:

PS5 GNM/GNMX feature | PlayStation 5 | Xbox Series X
GNM/GNMX Raytracing | Yes | No
GNM/GNMX Foveated Rendering | Yes | No
PSSL Primitive Shaders | Yes | No
PSSL Machine Learning | Yes | No
GNM/GNMX HW Kraken Decompression | Yes | No
GNM/GNMX I/O Streaming | Yes | No


Both these tables are useless for comparing RDNA versions.
Pretty sure mesh shaders and VRS were terms before RDNA 2 was revealed. Nvidia was doing mesh shader demos in 2019, and they released a VRS benchmarking tool last year, before RDNA 2.0 was officially revealed.

Other than that, I don't disagree with your point. It does seem like MS is taking DX12 features and pretending the PS5 doesn't support them. Maybe if Sony had lifted their gag order on Mark Cerny, we would get more clarification on just what kind of support the PS5 has for mesh shaders, VRS and machine learning. We shouldn't have to look at die shots to get clarification on basic hardware features AFTER the console has been released.
 

kyliethicc

Member
It's just a way to measure performance. Nothing more. RDNA 1.xxx could also mean custom RDNA2. You are acting as if I am Riky or someone who thinks Mark Cerny and Lisa Su are lying about the PS5 being a custom RDNA 2.0 GPU.

The fact of the matter is that even going by your updated table, neither the PS5 nor the XSX is fully RDNA 2.0, which relies on the Infinity Cache to boost performance. That's a 6-billion-transistor part, a transistor count equivalent to the entire PS4 Pro GPU. They wouldn't have added it in there if it wasn't a big fucking deal. So if the PC GPUs with Infinity Cache are full RDNA 2.0, then surely the XSX and PS5 should be under 2.0.


The PlayStation 5 and the Xboxes have custom GPUs based on AMD's RDNA 2 architecture.

There is no such thing as RDNA 1.5 or whatever.

The GPUs are literally "custom RDNA 2 based GPUs." AMD & Sony & Microsoft have made this very fucking clear.

Any other label (like RDNA 1.x) is false.


[image: 00ZJdQ0.jpg]
 
The PlayStation 5 and the Xboxes have custom GPUs based on AMD's RDNA 2 architecture.

There is no such thing as RDNA 1.5 or whatever.

The GPUs are literally "custom RDNA 2 based GPUs." AMD & Sony & Microsoft have made this very fucking clear.

Any other label (like RDNA 1.x) is false.


[image: 00ZJdQ0.jpg]

Bill O'Rights

With such an overwhelming amount of evidence from official sources, can the admins take action against anyone calling either system anything other than a custom RDNA2 one?

This reminds me of all the claims of the PS5 running at 8TFs, or the brute-force XSX ones, that we had in the past. It might be a good idea to take some action, otherwise many discussions will get derailed over and over again.

Thank you for looking at this.
 

Riky

$MSFT
It's just a way to measure performance. Nothing more. RDNA 1.xxx could also mean custom RDNA2. You are acting as if I am Riky or someone who thinks Mark Cerny and Lisa Su are lying about the PS5 being a custom RDNA 2.0 GPU.

What are you talking about? I've said the exact opposite several times in this thread, you absolute tool.
 

ToTTenTranz

Banned
Pretty sure mesh shaders and VRS were terms before RDNA 2 was revealed. Nvidia was doing mesh shader demos in 2019, and they released a VRS benchmarking tool last year, before RDNA 2.0 was officially revealed.
Neither nVidia nor AMD established the name for Variable Rate Shading. Microsoft did, because they're the ones establishing the names for DirectX features.
VRS is just Microsoft's name for foveated rendering in DX12 Ultimate, just like Khronos calls foveated rendering "Fragment Shading Rate" in Vulkan while using the exact same hardware.

Microsoft is just claiming exclusivity to a name they invented themselves.



Other than that, I don't disagree with your point. It does seem like MS is taking DX12 features and pretending the PS5 doesn't support them. Maybe if Sony had lifted their gag order on Mark Cerny, we would get more clarification on just what kind of support the PS5 has for mesh shaders, VRS and machine learning. We shouldn't have to look at die shots to get clarification on basic hardware features AFTER the console has been released.
It would be more fun for us if Cerny could give more details on the PS5's architecture, but in the grand scheme of things, him talking more about specs wouldn't translate into more sales, and it would risk diluting their message. We need to acknowledge we're not a significant portion of Sony's (or Microsoft's) total addressable market.
 