
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.
LOL DF are shameless
btw 8tf isn't even the worst part; the worst part of their thought process is building a chip around 7nm's weakness (clocks) instead of its strength (density increase)
They have a PCMR-esque mindset and of course this type of mindset doesn't apply to consoles and their philosophy of doing things in the most efficient manner.
 

quest

Not Banned from OT
Even accounting for the RT bits, a 64CU APU would be around 380 mm2 and a 72CU one around 400 mm2.
With a 6nm node shrink in the near future enabling cost reductions, it would be shortsighted to go with a small chip.

I think 64 with 4 disabled would be the magic number for one of them. I can't see either remotely using 40 with 4 disabled. It would be nearly impossible to market the next gen as not much ahead of the X in raw power. I think the magic number is at least 10 tf for marketing purposes. The best would be 12, double the X and the traditional 10x over the previous-gen PS4.
 

CrustyBritches

Gold Member
LOL DF are shameless
btw 8tf isn't even the worst part; the worst part of their thought process is building a chip around 7nm's weakness (clocks) instead of its strength (density increase)
They base it around APU size.

I'm going by the mid-gen consoles' peak system power consumption compared to Navi 10's TBP and expected card-only average gaming consumption.

I didn't see that video yet. Is their prediction 40 CUs with 4 disabled for both systems?
If you've been following Richard's coverage, he has a theory that PS5's BC, similar to PS4 Pro, is more hardware based than Xbox, so the PS5 would need to be built in a manner to accommodate that. Additionally, he believes that Gonzalo is PS5's APU, and Navi Lite. Lastly, he's theorized based on the recent deal between Sony and MS for Azure to host PS4/PS5 PSNow games in the future, that PS5 games can basically run on MS hardware. Thus, GCN/RDNA Hybrid...Navi Lite.

So this is mainly about PS5. I too, believe Gonzalo is PS5.



 

SonGoku

Member
I think 64 with 4 disabled would be the magic number for one of them.
With RDNA DCU config you must disable 8CUs.
56CU enabled is the baseline of my prediction
56CUs @1540Mhz = 11TF
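For anyone wanting to check that number: peak FP32 TF is just CUs x 64 shaders x 2 FLOPs per clock (FMA) x clock. A quick sketch using the standard GCN/RDNA shader counts, nothing insider here:

```python
def peak_tflops(cus: int, clock_mhz: float) -> float:
    """Peak FP32 throughput: CUs x 64 shaders x 2 FLOPs/clock (FMA)."""
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

print(round(peak_tflops(56, 1540), 2))  # 11.04
print(round(peak_tflops(64, 1500), 2))  # 12.29
```

The same formula reproduces the PS4's 1.84TF (18 CUs @ 800MHz) and the Pro's 4.2TF (36 CUs @ 911MHz), so the 11TF figure checks out arithmetically.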
They base it around APU size.
I do too:
72CU APU = 399.1 mm2 (64 enabled)
64CU APU = 385.6 mm2 (56 enabled)

With the possibility of quick cost reductions come 2021:

Launch die size at 7nm | "6nm" die size (15% reduction)
400 mm2                | 340 mm2
390 mm2                | 331.5 mm2
380 mm2                | 323 mm2
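The "6nm" column is just a flat 15% area reduction applied to the launch die size (the 15% is the poster's assumption for the half-node shrink, not a TSMC figure); as a sanity check:

```python
def optical_shrink(area_mm2: float, reduction: float = 0.15) -> float:
    """Die area after a flat optical-shrink reduction (assumed 15% for 7nm -> '6nm')."""
    return area_mm2 * (1 - reduction)

for die in (400, 390, 380):
    print(die, "->", round(optical_shrink(die), 1))  # 340.0, 331.5, 323.0
```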
I'm going by the mid-gen consoles' peak system power consumption compared to Navi 10's TBP and expected card-only average gaming consumption.
A bigger, lower-clocked chip is the best option to hit consoles' power-consumption sweet spot.
A 36CU chip's power consumption at 1.8GHz would be just as high as, if not higher than, a 64CU chip's at 1600MHz.
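The wider-but-slower argument in rough numbers: dynamic power scales roughly with active units x frequency x voltage squared, and past the sweet spot you need disproportionately more voltage per MHz. The voltages below are purely illustrative, not leaked specs:

```python
def rel_dynamic_power(cus: int, clock_ghz: float, volts: float) -> float:
    """Relative dynamic power: ~ active CUs x frequency x V^2 (capacitance folded in)."""
    return cus * clock_ghz * volts ** 2

narrow = rel_dynamic_power(36, 1.8, 1.10)  # small chip pushed past the sweet spot
wide = rel_dynamic_power(64, 1.6, 0.85)    # big chip, undervolted at lower clocks
print(round(narrow, 1), round(wide, 1))    # the wide chip comes out lower
```

With these (hypothetical) voltages the 64CU config lands below the 36CU one despite nearly double the CUs, which is the whole claim.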
If you've been following Richard's coverage, he has a theory that PS5's BC, similar to PS4 Pro, is more hardware based than Xbox, so the PS5 would need to be built in a manner to accommodate that.
Sony wouldn't repeat the Wii U mistake of sacrificing performance for backwards compatibility.
They are more likely to dump 5 million into an emulation team to create a software layer that assists hw based emulation.
 
Last edited:

LordOfChaos

Member
Trying to steer this back into relevance:


Node      | 16 nm | 10 nm | 7 nm  | 7nm/10nm Δ
Gate      | 90 nm | 66 nm | 57 nm | 0.86x
Min Metal | 64 nm | 42 nm | 40 nm | 0.95x


Never realized that the 16nm-to-10nm jump was actually quite substantial, while the 10nm-to-7nm one is much less so, though the transistor profile improved. TSMC calling this 7nm was even more aggressively optimistic than fab naming bs already was: either "7nm" was the feature size on a ring oscillator or some other non-chip test structure, or it was an original target long since abandoned whose name stuck. The fin is also very tall.


Transistor Profile
Node       | 10 nm | 7 nm  | Δ
Fin Pitch  | 36 nm | 30 nm | 0.83x
Fin Width  | 6 nm  | 6 nm  | 1.00x
Fin Height | 42 nm | 52 nm | 1.24x

7nm+ is where density would get somewhat closer, and is in fact closer to Intel's 10nm.

"2nd Generation 7nm
TSMC also developed a 2nd generation of their 7nm process. This is an optimized process which uses the same design rules and DUV and is unrelated to 7nm+ which is EUV-based. This process is entirely design-compatible with the first generation but enjoys additional power and performance enhancements. For their second generation process, TSMC made some additional optimizations.

  • Fin profile Optimizations
  • Epi Optimizations
  • MOL resistance Optimizations
  • FEOL capacitance
  • Metal gate Optimizations
All in all, the 2nd-generation 7nm process is said to deliver over 5% improvement in performance. Additionally, at the same leakage, at high frequencies, the second-generation 7nm process has improved the Vmin by 50 mV. Qualcomm has stated that the second-generation 7-nanometer node will be used for their next-generation cellular 5G modem which will deliver 2x peak data rate over their first-generation 5G modem."
 
Last edited:

CrustyBritches

Gold Member
With RDNA DCU config you must disable 8CUs.
How do they end up with 40CU 5700XT and 36CU 5700 Pro?
Sony wouldn't repeat the Wii U mistake of sacrificing performance for backwards compatibility.
They are more likely to dump 5 million into an emulation team to create a software layer that assists hw based emulation.
Like they did with PS4 Pro?
---
I've yet to hear an adequate explanation of how you plan to go from the 1.6GHz 36CU 5700 Pro that consumes a baseline 180W TBP (probably more like 190W average gaming/200W peak) to a 56CU system that consumes even less at similar clocks, while having an additional 16GB of GDDR6 (over Navi 10's 8GB) and RT thrown in the mix.
 

SonGoku

Member
How do they end up with 40CU 5700XT and 36CU 5700 Pro?
2SEs
Like they did with PS4 Pro?
The PS4 Pro didn't sacrifice performance for BC, and even if it did, it's a mid-gen refresh meant to make enhancements as costless as possible for devs, not a next-gen machine.
I've yet to hear an adequate explanation how you plan to go from 1.6GHz 36CU 5700 Pro that consumes baseline 180W TBP(probably more like 190W average gaming/200W peak)
I could ask you the same question lol: how do you plan to go from a 1.6GHz 36CU 5700 Pro that consumes a baseline 180W TBP to a 36CU system that consumes even less at 1.8GHz?

To answer your question:
I believe the 5700 uses the default voltage (used to boost all the way up to 1950MHz) in all clock modes.
A console GPU could be undervolted to hit 1600MHz stable on a 56CU chip, or 1500MHz on a 64CU.
Another factor is yields: as they improve, you need less voltage to hit the same clock.
 
Last edited:

CrustyBritches

Gold Member
Full chip i believe
2 full chips at 40CU and 36CU with no option for salvaged chips? We'll see. That would be odd compared to AMD's normal modus operandi.

The PS4 Pro didn't sacrifice performance for BC, and even if it did, it's a mid-gen refresh meant to make enhancements as costless as possible for devs, not a next-gen machine.
I was talking about their approach in contrast to MS. Responding to this...
They are more likely to dump 5 million into an emulation team to create a software layer that assists hw based emulation.
See... like PS4 Pro. Not hardware BC as in putting old hardware in the system, but hybrid BC, where a certain hardware config and clocks are required to accommodate BC. We're talking about the same thing. Xbox is further abstracted and virtualized. They used, and had to use, a different approach with Xbox.

I could ask you the same question lol how you plan to go from 1.6GHz 36CU 5700 Pro that consumes baseline 180W TBP to a 36CU system that consumes even less at 1.8Ghz?
You can't ask that question while pushing a config that consumes ~100W more than what I'm talking about. Hovis and power cap with "opportunistic" clock PR is how I would go about it. A simpler explanation is that Gonzalo's 1.8GHz is a "boost"/opportunistic/high-quality-binned-chip clock, and a lower clock is coming in the retail version.
 
Last edited:

SonGoku

Member
2 full chips at 40CU and 36CU with no option for salvaged chips? We'll see. That would be odd compared to AMD's normal modus operandi.
Look at my edited post. With 2 SEs you disable a DCU from each.
For bigger chips a 4SE config will be needed.
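The DCU/SE arithmetic being argued here, sketched out (the one-DCU-per-SE disable rule is the posters' assumption, not confirmed spec): each disabled dual-CU removes 2 CUs, and one DCU is disabled per shader engine for yield:

```python
def salvaged_cus(total_cus: int, shader_engines: int, dcus_per_se: int = 1) -> int:
    """CUs remaining after disabling dcus_per_se dual-CUs (2 CUs each) in every SE."""
    return total_cus - shader_engines * dcus_per_se * 2

print(salvaged_cus(40, 2))  # 36: the 5700 XT -> 5700-style salvage with 2 SEs
print(salvaged_cus(64, 4))  # 56: why a 4-SE chip loses 8 CUs
print(salvaged_cus(72, 4))  # 64
```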
See...like PS4 Pro. Not hardware BC like putting old hardware in the system, hybrid BC where certain hardware config and clocks are required to accommodate BC. We're talking about the same thing. Xbox is further abstracted and virtualized. They used, and had to use, a different approach with Xbox.
It does not apply in the slightest, because the Pro was meant to be a revision of the same console; Cerny even said their approach for next gen would be different.
And I repeat again... the Pro didn't sacrifice performance for BC.
You can't ask that question while pushing a config that consumes ~100W more than what I'm talking about. Hovis and power cap with "opportunistic" clock PR is how I would go about it. A more simple explanation is that Gonzalo's 1.8GHz is "boost"/opportunistic/high quality binned chip clock, and lower clock is coming to the retail version.
I asked because you are using 1.8GHz as part of your prediction.
A 36CU chip at 1.8GHz would consume the same or more than a 56CU chip at 1.6GHz.

I already explained how (which you chose to ignore?) without even getting into the Hovis method.
 
Last edited:

TLZ

Banned




Here we go

and he continues :messenger_grimmacing:




8tf, 8 times more powerful. I get it now. Gr8 b8 m8 I r8 8/8.

i rel8 str8 appreci8 nd congratul8. i r8 dis b8 an 8/8. plz no h8, i'm str8 ir8. cr8 more cant w8. we shood convers8 i wont ber8, my number is 8888888 ask for N8. no calls l8 or out of st8. if on a d8, ask K8 to loc8. even with a full pl8 i always hav time to communic8 so dont hesit8. dont forget to medit8 and particip8 and masturb8 to allevi8 ur ability to tabul8 the f8. we should meet up m8 and convers8 on how we can cre8 more gr8 b8, im sure everyone would appreci8 no h8. i dont mean to defl8 ur hopes, but itz hard to dict8 where the b8 will rel8 and we may end up with out being appreci8d, im sure u can rel8. we can cre8 b8 like alexander the gr8, stretch posts longer than the nile's str8s. well be the captains of b8 4chan our first m8s the growth r8 will spread to reddit and like reel est8 and be a flow r8 of gr8 b8 like a blind d8 well coll8 meet me upst8 where we can convers8 or ice sk8 or lose w8 infl8 our hot air baloons and fly tail g8. we cood land in kuw8, eat a soup pl8 followed by a dessert pl8 the payment r8 wont be too ir8 and hopefully our currency wont defl8. well head to the israeli-St8, taker over like herod the gr8 and b8 the jewish masses 8 million m8. we could interrel8 communism thought it's past it's maturity d8, a department of st8 volunteer st8. reduce the infant mortality r8, all in the name of making gr8 b8 m8.

*Shamelessly copypasta from Reddit for the occasion.

Before I waste time watching this, are they basing their speculation on a 40-48CU chip?
40 with 4 disabled, like the Pro, which again guarantees easy full BC.

Also, 8tf RDNA is ~1.25x Vega flops, so 10tf. They mention that both companies aren't talking TFs so as not to confuse people, since RDNA TFs are different.

That's what they said.
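DF's point in one line: if RDNA really does ~1.25x the work per flop of GCN (AMD's own IPC claim), raw TF numbers aren't comparable across architectures. A hypothetical conversion:

```python
RDNA_IPC_UPLIFT = 1.25  # AMD's claimed perf-per-flop gain over GCN/Vega

def gcn_equivalent_tf(rdna_tf: float) -> float:
    """Express an RDNA TF figure as a roughly comparable GCN/Vega TF figure."""
    return rdna_tf * RDNA_IPC_UPLIFT

print(gcn_equivalent_tf(8.0))  # 10.0, the "8tf RDNA ~ 10tf Vega" claim
```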
 

SonGoku

Member
40 with 4 disabled like the Pro which guarantees easy full BC like the Pro did again.
This is so dumb lol, might as well call it the PS4 U.
If that were the case they could just disable CUs for the PS4 emu.
Or go with a 72CU chip (2x36) ;)
80CU (total) dream redeemed
 
Last edited:
It will be interesting to see how the TF marketing war progresses.
  • MS thinks they have PS5's 12.9 beat
  • Sony are obviously packing more than 12.9
If MS decide against revealing the TF number in case it backfires then it's coming down to the games to land the knockout blows.

Some games will be hindered by cross-gen, if the communication is to be believed.

The entire reason for one platform to exist is it wearing the performance crown.

The other platform doesn't need performance to win the hearts and minds of consumers.

The specs may never be revealed. Won't that be hilarious?
 

CrustyBritches

Gold Member
Look at my edited post. 2SEs you disable a DCU from each
For bigger chips a 4SE config will be needed
So are you saying these are 2 full chips, or a 40CU XT with 4 disabled for a 36CU Pro?

It does not apply to the slightest
It's precedent, of course it applies.
And i repeat again... Pro didn't sacrifice performance for bc
This is your strawman, don't involve me.
I asked because you are using 1.8Ghz as part of your prediction
This is what I'm waiting for you to explain. I'm really not following, sorry. I've written literal pages explaining how I power cap, OC, and undervolt my RX 480 to get X1X's results under 160W, even without the doubling of cache and the 326GB/s memory bandwidth compared to ~277GB/s on the RX 480.

Meanwhile, you've always used your misunderstanding of that dynamic as the fulcrum of your belief that Xbox One X got a 200W+ GPU. This is where your wider with slightly lower clocks = massive perf/watt jump theory comes from. That going from 36CU to 40CU and having like 200MHz lower core clock results in some fantastical perf/watt uplift.

Show me how your theory works in relation to 2016 RX 470/480 and 2017 RX 580 and Vega line when it comes to perf/watt. P.S.-I already looked.
 
Last edited:

joshwaan

Member
Funny thread interesting to say the least :p

I'm more excited to see what they do with the CPU. If both PS5 and X5 have 3.2GHz and 32MB of L3 cache, it'll be an absolute beast even if the GPU is 8 or 9 TF imo.
 

SonGoku

Member
If MS decide against revealing the TF number in case it backfires then it's coming down to the games to land the knockout blows.
I hope neither Sony nor MS go this route
Imagine waiting for SEM die shots to get an approximation lol
40CU XT with 4 disabled for 36CU Pro
This
It's precedent, of course it applies.
It's a flawed comparison... the Pro is a revision of the same console.
PS5 is a next-gen clean-slate console (Cerny's words).

Plus you can achieve the same result with
56CUs (20 disabled for PS4 emu)
64CUs (28 disabled for PS4 emu)
72CUs (half disabled for PS4 emu or 2x performance mode)
This is your strawman, don't involve me.
A 36CU chip would be compromising performance.
The 7nm node's strength is in the density increase, not clocks.
That going from 36CU to 40CU and having like 200MHz lower core clock results in some fantastical perf/watt uplift.
It is though because the X is undervolted to hit a lower stable clock
Show me how your theory works in relation to 2016 RX 470/480 and 2017 RX 580 and Vega line when it comes to perf/watt. P.S.-I already looked.
My theory doesn't apply to RX580 since it is a higher clocked version without increased CUs
If anything the RX 480-580 proves my point how a couple hundred Mhz can mean the world of difference for perf/watt once you push past the sweetspot and hit diminishing returns

edit:
Example of how much of a difference undervolting makes for perf/watt
 
Last edited:

CrustyBritches

Gold Member
It's a flawed comparison... the Pro is a revision of the same console.
PS5 is a next-gen clean-slate console (Cerny's words).
Still precedent because Sony has to deal with PS4 BC. 36CU for BC would just be 1 of many factors converging: BC, Power Consumption, die size, CUs disabled for yields.

A 36CU chip would be compromising performance
Remember my "how about lower clocks and fewer CUs?" comment. I provided precedent with Polaris, where the perf/watt sweet spot was year 1, and moving toward fewer CUs gave better perf/watt, compared to the following year, when the 570/580's higher clocks and Vega's wider/slower approach plus architectural advantages both either only matched the 470/480's perf/watt or were worse.

My theory doesn't apply to RX580 since it is a higher clocked version without increased CUs
If anything the RX 480-580 proves my point how a couple hundred Mhz can mean the world of difference for perf/watt once you push past the sweetspot and hit diminishing returns
This example fails because you claim the node maturing(RX 580) + wider/slower(Vega) will result in better perf/watt. That's to be seen, especially in regards to my above example.

Still confident in my prediction that 2060/1080/Vega64 class GPU will beat PS5. AMD's own benchies seem to agree. I can only work off what we actually know and historical precedent, not just hope for the future.
 
Last edited:

SonGoku

Member
Still precedent because Sony has to deal with PS4 BC. 36CU for BC would just be 1 of many factors converging: BC, Power Consumption, die size, CUs disabled for yields.
Cerny already went on record that PS5 will be a clean slate, and you can emulate 36CUs by disabling CUs in the emulator anyway.
Power consumption would be better due to lower clocks and voltage.
Cost I concede.
Remember my, "how about lower clocks and less CUs?" comment.
So 6tf then? lol
Remember my "how about more CUs and even lower clocks" comment? :p
Whatever your target is (6tf, 8tf, 10tf, 12tf, 14tf, or anything in between), you'll get better perf/watt with a bigger chip.
570/580's higher clock and Vega's wider/slower + architectural advantages both either only matching 470/480's perf/watt, or being worse.
An undervolted Vega kicks Polaris' ass; AMD clocked those cards way past their comfort zone.
This example fails because you claim the node maturing(RX 580) + wider/slower(Vega) will result in better perf/watt.
The RX 580 is clocked higher... for it to apply to my claim it would have to be undervolted to match RX 480 clocks.
 
Last edited:
I actually now think that >10TF is unlikely for these machines. AMD's recent RX 5700 XT and RX 5700 cards are doing up to 9.75 TFLOPs and 7.95 TFLOPs.

9-10TF for both PS5 and Scarlett seem more realistic.

Unless I've missed something?
 

bitbydeath

Member
I actually now think that >10TF is unlikely for these machines. AMD's recent RX 5700 XT and RX 5700 cards are doing up to 9.75 TFLOPs and 7.95 TFLOPs.

9-10TF for both PS5 and Scarlett seem more realistic.

Unless I've missed something?

It was confirmed those GPU’s don’t support Ray-Tracing so they won’t be used on PS5 / Scarlett.


But wait, it was only confirmed that PS5 is supporting Ray-Tracing via the GPU.

I’m kidding, I’m sure they both will.
 
Last edited:

SonGoku

Member
I actually now think that >10TF is unlikely for these machines. AMD's recent RX 5700 XT and RX 5700 cards are doing up to 9.75 TFLOPs and 7.95 TFLOPs.

9-10TF for both PS5 and Scarlett seem more realistic.

Unless I've missed something?
Those are small chips 36-40CU
Consoles will use 56-64CU
 

Lort

Banned
It was confirmed those GPU’s don’t support Ray-Tracing so they won’t be used on PS5 / Scarlett.


But wait, it was only confirmed that PS5 is supporting Ray-Tracing via the GPU.

I’m kidding, I’m sure they both will.

The PlayStation GPU "will support" raytracing, while the next Xbox has "hardware accelerated raytracing."

You could read that as the Xbox GPU having dedicated hardware and the Sony GPU using shaders... but not really the other way around, sorry.

Most likely they both have the same thing.
 

Snake29

Banned
Also, people are misreading the whole "co-engineered with Microsoft and using our next-generation RDNA bla bla bla". People need to know that she's referring to GCN. RDNA is the next-gen architecture after GCN.

So there's no way Scarlett is using RDNA 2.0. Nowhere does she refer to RDNA 2, not even on AMD's own website.
 
Last edited:

pawel86ck

Banned
The Xbox One X is already 50% more powerful than the PS4 Pro and Sony fans are fine with that
What's interesting is that both the PS4 Pro and Xbox One X are based on the Polaris architecture, but the Xbox One X GPU was clearly more capable thanks to MS customization. The PS4 Pro has 4.2TF, yet many games on the Xbox One X render 2x as many pixels with just 1.8TF more. In games like Wolfenstein 2 you can't match Xbox One X results even on an RX 580: even at dynamic resolution and MINIMUM settings you get 45-55 fps, while the Xbox One X runs the same game at 55-60 fps at even higher settings. If MS customizes the Xbox Scarlett GPU in a similar way, their GPU will be clearly faster than the PS5's even if both consoles use the same 12TF GPU. CPU performance on Xbox Scarlett should also be better if MS uses the same DX12 tech as on the Xbox One X (according to MS, their DX12 tech reduced draw calls drastically, and as a result the Xbox One X CPU was up to 50% faster). Maybe PS5 will have a faster SSD, but from a pure CPU and GPU power perspective I really think MS will customize their console with better results.
 
Last edited:

THE:MILKMAN

Member
Those are small chips 36-40CU
Consoles will use 56-64CU

I think the info released at E3 clearly shows the next consoles will be based on these Navi 10 40CU parts.

I always guessed next-gen would be in the 10-12TF GCN/Vega 56-Vega 64 range, and that is exactly where the 5700 XT seems to fall performance-wise. It also seems completely in line with die size/power when you look at the history, with the PS4 getting a 212mm^2 GPU ~18 months after.

56-64 CUs of RDNA seems highly unlikely given the die area and power that would likely require. Either console maker is quite welcome to prove me wrong on this.
 

Imtjnotu

Member
What's interesting is that both the PS4 Pro and Xbox One X are based on the Polaris architecture, but the Xbox One X GPU was clearly more capable thanks to MS customization. The PS4 Pro has 4.2TF, yet many games on the Xbox One X render 2x as many pixels with just 1.8TF more. In games like Wolfenstein 2 you can't match Xbox One X results even on an RX 580: even at dynamic resolution and MINIMUM settings you get 45-55 fps, while the Xbox One X runs the same game at 55-60 fps at even higher settings. If MS customizes the Xbox Scarlett GPU in a similar way, their GPU will be clearly faster than the PS5's even if both consoles use the same 12TF GPU.
Pro is bandwidth limited. The X is not. Hence the higher pixel count
 

Snake29

Banned
What's interesting is that both the PS4 Pro and Xbox One X are based on the Polaris architecture, but the Xbox One X GPU was clearly more capable thanks to MS customization. The PS4 Pro has 4.2TF, yet many games on the Xbox One X render 2x as many pixels with just 1.8TF more. In games like Wolfenstein 2 you can't match Xbox One X results even on an RX 580: even at dynamic resolution and MINIMUM settings you get 45-55 fps, while the Xbox One X runs the same game at 55-60 fps at even higher settings. If MS customizes the Xbox Scarlett GPU in a similar way, their GPU will be clearly faster than the PS5's even if both consoles use the same 12TF GPU.

Sony always said it was not their intention to make big changes for the Pro console. The 6TF was not the only thing that changed in the X; it also got more RAM and higher bandwidth. So no, it doesn't say anything about the PS5 and next Xbox. The Pro had a CPU upclock, a bit higher total bandwidth, and the same RAM; only the GPU was more customized, with Vega features and dedicated checkerboard hardware. Also, the secondary ARM SoC was better, with more RAM. The Pro with more bandwidth alone would do much more than it does now. Microsoft had to make those big changes because the original Xbox One didn't have GDDR5 and had a way slower GPU.

It would be weird if Microsoft came up with a refresh that wasn't more powerful; that would have been the very end of Xbox. Microsoft had to come up with something better, Sony didn't have to.
 
Last edited:

bitbydeath

Member
The PlayStation GPU "will support" raytracing, while the next Xbox has "hardware accelerated raytracing."

You could read that as the Xbox GPU having dedicated hardware and the Sony GPU using shaders... but not really the other way around, sorry.

Most likely they both have the same thing.

It means the GPU has dedicated hardware.
Xbox not mentioning it could mean the ray tracing hardware is external to the GPU.
 