
Next-Gen PS5 & XSX |OT| Console tEch threaD


THE:MILKMAN

Member
The RAM chips are piquing my interest. That said, if you had access to the devkit PCB, why not just anonymously post pics instead?

I could Google all those parts and make a believable spec list. A picture of this PCB in the devkit would be harder to argue against.
 

ethomaz

Banned
We need the bus width to know the bandwidth and the real clock of the RAM.

With 16 chips it can be either 256-bit (576GB/s) or 512-bit (1152GB/s) with the RAM running at 18000MHz effective (18Gbps)... the clock can be lower than the manufacturer's spec to lower the vcore.
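A quick sanity check on those numbers (a sketch using the standard GDDR6 math; 18Gbps is the rated speed of the chips in the leak):

```python
# Peak GDDR6 bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

print(peak_bandwidth_gbs(256, 18))  # 576.0 GB/s
print(peak_bandwidth_gbs(512, 18))  # 1152.0 GB/s
print(peak_bandwidth_gbs(256, 17))  # 544.0 GB/s (the downclocked case)
```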
 
Great - if it's fake they did a good job.

What it said | Summary | Notes
monolithic die ~22.4mm by ~14.1mm | die size 316mm² | close to people's expectations
16 Samsung K4ZAF325BM-HC18 | 32GB GDDR6 @ 18Gb/s, 1152GB/s bandwidth | WOW!
memory VRM seems like overkill | devkit overengineered | nothing to see here
3 Samsung K4AAG085WB-MCRC, 2 of those close to the NAND acting as DRAM cache | 3 x 2GB DDR4 | so 2GB DDR4 for the OS, and 4GB as SSD cache - this must be Sony's secret sauce for load times
4 NAND packages soldered to the PCB, Toshiba TH58LJT2T24BAEG | NVMe SSD, 2TB total | obviously more than we'll get in the baseline console
Phison PS5016-E16 | PCIe Gen 4.0 controller | makes stuff work, nothing to see here
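(Quick check on the die-size row: the quoted dimensions do multiply out to the stated area.)

$$22.4\,\text{mm} \times 14.1\,\text{mm} = 315.84\,\text{mm}^2 \approx 316\,\text{mm}^2$$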

Sounds good.

Surprised about the monolithic die - I was expecting chiplets on an interposer - maybe it still is, but they didn't take the lid off.

Can someone check I got the bandwidth calc right?
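For anyone checking: assuming all 16 chips run in x32 (non-clamshell) mode, it works out as

$$16 \text{ chips} \times 32\,\text{bits/chip} \times 18\,\text{Gb/s} \div 8\,\text{bits/byte} = 1152\,\text{GB/s}$$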
Wait, is that a 2TB SSD on a console (not dev kit)?

He said 4 chips and each one is 512GB.

Here it also mentions 2TB, but it's hybrid (like 2TB HDD + 256GB SSD cache):


2TB sounds batshit insane for a console! Some other rumors mention 1TB SSD (no HDD) for both Sony and MS.

Honestly, 32GB GDDR6 sounds a lot more believable than slapping 2TB of SSD storage.

Either way, I get the impression that Sony has multiple PS5 iterations in their labs, so multiple leaks from various sources (minus fake ones) might be explained by that.

Does anyone remember the 2 PS4 Pro iterations? The one we got had Jaguar, the 2nd one ($499 SKU) had Zen 1 CPU. The 2nd one is probably what became the proto-PS5.

We need the bus width to know the bandwidth.

With 16 chips it can be either 256-bit or 512-bit.
No way it's gonna be a big fat 512-bit bus! Unless it's $599 minimum.

I'll take an avatar bet if that happens on a consumer console.
 

ethomaz

Banned
No way it's gonna be a big fat 512-bit bus! Unless it's $599 minimum.

I'll take an avatar bet if that happens on a consumer console.
Well, 256-bit gives a bandwidth of 576GB/s if the GDDR6 is running at spec speed (18000MHz effective).

But remember, consoles tend to run the RAM at a lower speed to decrease the vcore, so less power consumption.

At 17000MHz that gives 544GB/s, for example.
 

Ar¢tos

Member
Wait, is that a 2TB SSD on a console (not dev kit)?

He said 4 chips and each one is 512GB.

Here it also mentions 2TB, but it's hybrid (like 2TB HDD + 256GB SSD cache):


2TB sounds batshit insane for a console! Some other rumors mention 1TB SSD (no HDD) for both Sony and MS.

Honestly, 32GB GDDR6 sounds a lot more believable than slapping 2TB of SSD storage.

Either way, I get the impression that Sony has multiple PS5 iterations in their labs, so multiple leaks from various sources (minus fake ones) might be explained by that.

Does anyone remember the 2 PS4 Pro iterations? The one we got had Jaguar, the 2nd one ($499 SKU) had Zen 1 CPU. The 2nd one is probably what became the proto-PS5.


No way it's gonna be a big fat 512-bit bus! Unless it's $599 minimum.

I'll take an avatar bet if that happens on a consumer console.
It's a dev kit; don't expect 2TB storage or 32GB RAM in the retail unit.
 

JohnnyFootball

GerAlt-Right. Ciriously.
"Better" in terms of what? Rasterization? Efficiency? Software-based vs hardware-based RT?

Vega is a compute monster, but it's very power hungry as well. Just calling a spade a spade.

Let's say that Sony hits 13TF and MS miraculously reaches 15TF. How many watts do you think they would need for that?

Vega will have less efficiency than Navi, just like with every old vs new GCN microarchitecture comparison.
MS is supposed to release two consoles, one less powerful and another more powerful. We don't know if these specs are for the weaker or the stronger one.
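For scale, here's the rough shader math behind TF figures like 13-15TF (the CU counts and clocks below are purely illustrative, not from any leak):

```python
# FP32 throughput of a GCN/Navi-style GPU: CUs x 64 lanes x 2 ops (FMA) x clock (GHz)
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(56, 1.81))  # ~13.0 TF
print(tflops(60, 1.95))  # ~15.0 TF
```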
 

xool

Member
Wait, is that a 2TB SSD on a console (not dev kit)?

He said 4 chips and each one is 512GB.

My bad (corrected the table) - I was trying to read descriptions for TH58LJT2T24BAEG in Chinese..

Here it also mentions 2TB, but it's hybrid (like 2TB HDD + 256GB SSD cache):


2TB sounds batshit insane for a console! Some other rumors mention 1TB SSD (no HDD) for both Sony and MS.

Honestly, 32GB GDDR6 sounds a lot more believable than slapping 2TB of SSD storage.

Either way, I get the impression that Sony has multiple PS5 iterations in their labs, so multiple leaks from various sources (minus fake ones) might be explained by that.

Does anyone remember the 2 PS4 Pro iterations? The one we got had Jaguar, the 2nd one ($499 SKU) had Zen 1 CPU. The 2nd one is probably what became the proto-PS5.


No way it's gonna be a big fat 512-bit bus! Unless it's $599 minimum.

I'll take an avatar bet if that happens on a consumer console.

I always expect dev kits to have more memory and HDD/SSD than the final product, so 32GB/2TB is the upper limit..
 
Since he said clamshell configuration (x16 vs x32 mode), each GDDR6 chip will only contribute 16 bits, so 16 bits x 16 chips = 256-bit bus.

The OG PS4 also had a clamshell memory setup, which is very convenient for cost savings.
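A quick sketch of why clamshell caps this at 256-bit (standard GDDR6 behavior: two chips share one 32-bit channel, each dropping to x16; the chip count is from the leak):

```python
# Clamshell mode: two GDDR6 chips share one 32-bit channel, so each chip runs x16
chips = 16
print(chips * 16)    # 256 -> 256-bit bus in clamshell (x16 per chip)
print(chips * 32)    # 512 -> a 512-bit bus would need every chip in x32 mode
print(256 * 18 / 8)  # 576.0 GB/s at 18Gbps on the 256-bit clamshell bus
```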


 

xool

Member
I've put a spec analysis in the comments on these rumored devkit part infos:

Devkit parts

Those seem reasonable for $500, I think

One thing says FAKE, FAKE, FAKE here - and that's the die size - there's really no way they could know that without taking the lid off the APU - which would almost certainly break it, or need re-lidding .. how the fuck could they do that in a dev environment without getting the sack/into a ton of trouble .. ??

[edit - screw this - too much/too little coffee - I was wrong]

Unless there's an explanation for this, I'd say it's a well-made fake leak.
 
One thing says FAKE, FAKE, FAKE here - and that's the die size - there's really no way they could know that without taking the lid off the APU - which would almost certainly break it, or need re-lidding .. how the fuck could they do that in a dev environment without getting the sack/into a ton of trouble .. ??

Unless there's an explanation for this, I'd say it's a well-made fake leak.
What do you mean? APUs and GPUs have no lid (if you meant the heat spreader, only CPUs have that).

Why would it break?

Either way, we'll know for sure the exact dimensions of the APU die (assuming it's monolithic) by the end of 2020.
 

xool

Member
What do you mean? APUs and GPUs have no lid (if you meant the heat spreader, only CPUs have that).

Why would it break?

Either way, we'll know for sure the exact dimensions of the APU die (assuming it's monolithic) by the end of 2020.

You're right, what was I thinking (actually I was thinking the PS4 APU had a lid) - but checking images I now see that you can see the backside of the dark shiny silicon (or whatever it is..)
 

xool

Member
Thanks for detailing the analysis for me; I was on my phone when I posted on reddit, and I usually do all posting on PC
Well thank you - you might want to check it (my stuff) again, because I've had to make 2 major corrections .. oops
 

Ovech-King

Member
Well thank you - you might want to check it (my stuff) again, because I've had to make 2 major corrections .. oops

Great write-up, and thanks for correcting me; 32GB and 2TB SSD make sense for devkits, though. I expect the final product to cut that in half, just like the Xbox One X devkits had 24GB vs 12GB on the final product, so I guess 16GB GDDR6 (+ 4GB (?) DDR4) and a 1TB SSD will be the final specs
 

ethomaz

Banned
I'm not sure if I'm allowed to post ERA links, but I think this post is very thoughtful:


I guess Switch revolutionized the gaming field in a way? People want minimum friction these days, no time to waste on console booting/excessive loading times.
Sony's key marketing at the PS4 reveal was games playing without any load times, and most Sony first-party games hide the load times while you watch the cutscenes... Killzone at launch, for example, gives you gameplay in less than 15s from a console cold boot.

They are only following the Cerny goals that started on PS4.
 

CrustyBritches

Gold Member
In which games do you suspect this to be the case?
Don't get me wrong, I don't believe your theory, but if there is some merit to it I'm curious to see it.
It's not a theory, I tested it before I posted. :messenger_grinning_sweat: FH4 and TW3.

I don't believe the RX 480 or RX 580 is enough to match Xbox X results.
A 6TF RX 480 can match the X1X in FH4 and TW3 with tweaking and a decent mem OC. A good OC should yield 2250MHz on either the RX 480 or RX 580; I don't believe the max mem clock ever changed. Polaris 10 has nasty power consumption scaling at max clocks. The X1X also has 326GB/s memory bandwidth; I'm stuck around 273GB/s, and the RX 480/580 tops out at 288GB/s.
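Those figures line up with the GDDR5 math (a sketch; 256-bit for the RX 480/580 and 384-bit at 6.8Gbps for the X1X are the public specs, while the 2133MHz figure is inferred from the 273GB/s quoted above):

```python
# Peak GDDR5 bandwidth: effective data rate is 4x the memory clock
def gddr5_bw_gbs(mem_clock_mhz: float, bus_bits: int) -> float:
    return bus_bits * (mem_clock_mhz * 4 / 1000) / 8

print(gddr5_bw_gbs(2133, 256))  # ~273 GB/s (roughly where the OC is stuck)
print(gddr5_bw_gbs(2250, 256))  # 288.0 GB/s (the max OC)
print(gddr5_bw_gbs(1700, 384))  # ~326 GB/s (X1X: 6.8Gbps over 384-bit)
```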
---
This is why I'm saying it's better to think of the X1X as a ~155W (peak draw) RX 480 with better memory bandwidth and more L2 cache than as a 205W RX 580. That is to say, in a $499 console with 175W total system consumption, the GPU should have around 155W peak power consumption, not 205W.
 

SonGoku

Member
New Xbox's GPU is based off the combination of New Vega (7nm) and the Arcturus 12 GPU
This doesn't make sense. It reads like a forum poster's theory.

Does this josh guy have a reputable track record?
 

CrustyBritches

Gold Member
Bu buu Buut that's because of the trash CPU skewing things
If the X had a decent CPU it would match or surpass the RX 580
They're GPU bound in both scenarios. If the X1X had a better CPU then it would either consume more than 175W or end up having a lower power budget for the GPU.

It's not about which is more powerful, it's about finding the sweet spot in terms of perf/watt. The RX 580 is in a bad spot for Polaris: it has poor perf/watt compared to the bog-standard RX 480, and it's hardly a worthy jump over the 390. That's why I brought up the 390. In a way, MS rectified the compute-to-memory-bandwidth disparity Polaris 10 has with a more Hawaii-gen ratio. That's why it's more like a 155W power-capped RX 480 with higher mem bandwidth rather than an RX 580. Otherwise you could end up thinking they stuffed a 205W GPU into the X1X, and estimates for next-gen TDP would then be assumed to be higher. In theory, at least.
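Putting rough numbers on the perf/watt point (TFLOPS and board-power values are the public reference specs, rounded, so treat them as approximate):

```python
# Rough perf-per-watt from reference specs (FP32 TFLOPS / board power)
cards = {
    "RX 480": (5.8, 150),  # TFLOPS, board power in watts
    "RX 580": (6.2, 185),
    "R9 390": (5.1, 275),
}
for name, (tf, watts) in cards.items():
    print(f"{name}: {tf / watts * 1000:.0f} GFLOPS/W")
# RX 480 ~39, RX 580 ~34, R9 390 ~19 - the 580 regresses vs the 480
```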
 

SonGoku

Member
They're GPU bound in both scenarios. If the X1X had a better CPU then it would either consume more than 175W or end up having a lower power budget for the GPU.

It's not about which is more powerful, it's about finding the sweet spot in terms of perf/watt. The RX 580 is in a bad spot for Polaris: it has poor perf/watt compared to the bog-standard RX 480, and it's hardly a worthy jump over the 390. That's why I brought up the 390. In a way, MS rectified the compute-to-memory-bandwidth disparity Polaris 10 has with a more Hawaii-gen ratio. That's why it's more like a 155W power-capped RX 480 with higher mem bandwidth rather than an RX 580. Otherwise you could end up thinking they stuffed a 205W GPU into the X1X, and estimates for next-gen TDP would then be assumed to be higher. In theory, at least.
I get what you are trying to say, but it can be a bit misleading, since it can be read as the X GPU being less capable than what its specs say.
The GPU inside the X is just as capable as the RX 580 while consuming less power; MS managed to put a 180W GPU in there.
 
