DF covered the same question "why test console APU in 3DMark on Windows?" about 2 months ago now...
> Still equivalent to 10TF of GCN for rasterization performance, nowhere close to the 13TF found in devkits.

Your theory sucks ass, mate, because Navi XT sucks up 225W with 8GB of GDDR6.
This theory has a bunch of holes in it and needs a bunch of assumptions to make it work:
- Is it real?
- Is it not a Chinese gaming chip?
- How is it linked to the PS5, other than via PS4 chips? What's to say this isn't cashing in on that to make it believable?
- April devkits weren't Navi because...? Didn't Gonzalo appear before that?
- It performs like 10TF of GCN even though devkits report 13TF of an unknown arch.
- Why would Sony use a Latin last name for the APU codename?
Too many holes and assumptions for it to be treated as anything other than wild speculation.
> We are talking of Navi, not Turing.

8TF of Navi would perform similarly to 6TF of Turing.
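To make that conversion concrete, here is a minimal sketch in Python. The ~0.75x Navi-to-Turing "effective TFLOPS" ratio is just the ratio implied by the post above (8 TF ≈ 6 TF), not a measured constant, and the example part and helper are hypothetical:

```python
def peak_tflops(cus: int, clock_ghz: float, alus_per_cu: int = 64) -> float:
    """Peak FP32 TFLOPS: CUs x ALUs x 2 ops per clock (FMA) x clock."""
    return cus * alus_per_cu * 2 * clock_ghz / 1000

NAVI_TO_TURING = 6 / 8  # assumed effective-performance ratio from the post

navi_tf = peak_tflops(cus=40, clock_ghz=1.9)  # e.g. a 5700 XT-like part
print(f"{navi_tf:.1f} TF Navi ~ {navi_tf * NAVI_TO_TURING:.1f} TF Turing-equivalent")
```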
> I believe it will be a killer move if they do that.
> There are only pros in this move:
> - No holiday stock issues
> - Two big selling periods in the first year

It would be a reckless move, because hardware component prices (7nm wafers, GDDR6, NAND chips) need to drop first (Phil said that, not me). That's why both companies will release them in late 2020; the BF/Xmas period is always the best.
> I believe there is no issue installing Windows on a machine with the PS4's APU because it is fully compatible.

No, it's not IBM PC compatible in any shape or form (for example, legacy BIOS functions are totally missing and the southbridge is rather exotic compared to PC chipsets).
> What if that were to happen but it was only 8.5-9 TFLOPS at $399? Game over for the Xbox division?

$349-399 would make sense for specs like that.
> I suspect going near 400mm² on 7nm will be a no-no. They probably don't want to trigger EPA regulations/tariffs or something as far as CE devices go. Times have changed from the PS3 days.

PS4 Pro and XB1X increased gaming power consumption (up to 155-175W), and yet their idle power consumption is lower than the OG models' (50W vs 70W).
That is why I asked... it seems weird to see 3DMark benchmarks for a PS4.
But maybe, for testing only, before they had the full PS4 hardware, they used the APU in a PC with Windows and ran the benchmarks that way.
I believe there is no issue installing Windows on a machine with the PS4's APU because it is fully compatible.
Of course, the final PS4 machine can perform better than that 3DMark Frankenstein test.
> EPA regulations/tariffs or something as far as CE devices go

Not familiar with those; can you elaborate why they would regulate a 380-400mm² chip?
> I thought the Apisak etc. work looked reliable - but that doesn't make it guaranteed, right?

Many leaks banked on PS4/XB-type leak designs to make them look more believable.
> But the simple explanation is: we have 13TF devkits that exceed even the ~8TF RDNA>GCN TF conversion because/if devkits are/were running more powerful hardware to ease development before optimization.

Sure, we can come up with all sorts of explanations; that doesn't make it any less of a stretch.
> Your theory sucks ass, mate, because Navi XT sucks up 225W with 8GB of GDDR6.

And running 36CUs at 1.8GHz won't be power hungry because...? Let alone 18Gbps chips.
> So tell me, how will they fit it in a console first, along with an additional 8GB of RAM (~16W), RT hardware (?W) and an 8-core Zen 2 (~40W)? Because they had to downclock Pitcairn, which was:

Easy with a ~250W total system TDP and the hobbit method.
It would be a reckless move, because hardware component prices (7nm wafers, GDDR6, NAND chips) need to drop first (Phil said that, not me). That's why both companies will release them in late 2020; the BF/Xmas period is always the best.
Sony has also confirmed that they're not releasing the PS5 in early 2020.
No, it's not IBM PC compatible in any shape or form (for example, legacy BIOS functions are totally missing and the southbridge is rather exotic compared to PC chipsets).
Don't take my word for it, watch this presentation (PS4 hackers):
Console Hacking 2016 (media.ccc.de): "Last year, we demonstrated Linux running on the PS4 in a lightning talk - presented on the PS4 itself. But how did we do it? In a departu..."
$349-399 would make sense for specs like that.
$499? Hell no!
PS4 Pro and XB1X increased gaming power consumption (up to 155-175W), and yet their idle power consumption is lower than the OG models' (50W vs 70W).
Modern chips with modern lithography support more advanced power gating and aggressive clock scaling (CPU/GPU/DRAM clocks), so I don't see this being an issue.
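As a rough, hedged illustration of why idle draw dominates the picture painted by the regulations discussed later in the thread, here is a back-of-the-envelope daily-energy calculation; the usage split is an assumption, and the wattages are just the Pro/X-era figures cited above:

```python
# Assumed usage pattern: illustrative numbers, not measured console data.
hours_gaming, hours_idle = 2, 6     # assumed daily split
p_gaming, p_idle = 170, 50          # W, roughly the Pro/X-era figures above

daily_wh = hours_gaming * p_gaming + hours_idle * p_idle
print(daily_wh, "Wh/day")              # 640 Wh/day with these assumptions
print(hours_idle * p_idle / daily_wh)  # ~0.47: idle is nearly half the total
```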
Yeah, well, all past APISAK console leaks rely on the 3DMark database, and the PS4 one was a perfect fit (as can be read in the DF article). So somehow it must be possible.
> Not knowing any of these regulations, the Pro and X seem very reasonable at sub-200W. But where do you see a theoretical 400mm² 7nm PS5 power-consumption-wise? If it's well above 200W, which it seems it would be, then it's unlikely. That's why going EUV will be the key.

You can read the regulation here:
> What if that were to happen but it was only 8.5-9 TFLOPS at $399? Game over for the Xbox division?

I'll bite for $299.
> PS4 had a smaller, 212mm² chip from PC, in a 348mm² die.

I'm suggesting a 380-390mm² SoC, big difference.
> You want them to increase the die by 15% on a more expensive process with worse yields and then to shrink it to the original PS4 level in a year's time.

The difference is there was no 6nm equivalent on the horizon back when the PS4 launched.
> Look, you are not getting an 11-13TF part because that is a >2080 Ti beater.

11TF would be roughly equal to the RTX 2080, nowhere close to the Ti.
> But where do you see a theoretical 400mm² 7nm PS5 power-consumption-wise? If it's well above 200W...

A big chip will net the best perf/watt, so around ~250W total.
> ...That's why going EUV will be the key.

Indeed, but in case 7nm EUV isn't ready, the fallback option is a big die on 7nm and a 6nm shrink a year later.
Not familiar with those; can you elaborate why they would regulate a 380-400mm² chip?
Many leaks banked on PS4/XB-type leak designs to make them look more believable.
One of the first PS5 leaks used the PS4 PDF presentation style, lol.
Sure, we can come up with all sorts of explanations; that doesn't make it any less of a stretch.
And running 36CUs at 1.8GHz won't be power hungry because...? Let alone 18Gbps chips.
Easy with a ~250W total system TDP and the hobbit method:
- 56CUs undervolted to maintain a stable 1540MHz (compared to the 1950MHz 5700 XT)
- 380-390mm² SoC late 2020, shrunk to 320-340mm² late 2021
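For reference, here is the peak-FP32 math behind that 56CU/1540MHz proposal as a standalone sketch; the CU counts and clocks are the post's assumptions, not confirmed specs:

```python
def peak_tflops(cus: int, clock_mhz: int) -> float:
    """Peak FP32 TFLOPS: CUs x 64 ALUs x 2 ops per clock (FMA) x clock."""
    return cus * 64 * 2 * clock_mhz / 1e6

print(peak_tflops(56, 1540))  # ~11.0 TF for the proposed undervolted part
print(peak_tflops(40, 1950))  # ~10.0 TF for a 40CU part at 5700 XT-like clocks
```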
> The sad thing is 250W is probably way too much. I don't know the exact power regulations, but I remember posts from Rigby going on and on about them. I think the EU has even stricter limits. They can't just launch a 300W console or something without significant financial penalties.

There is no such limit, otherwise the 9900K and high-end 300W GPUs would be banned as well. What matters is having decent power/clock scaling, since most of the time a computer/console is idle, not operating at full throttle.
> The sad thing is 250W is probably way too much. I don't know the exact power regulations, but I remember posts from Rigby going on and on about them. I think the EU has even stricter limits. They can't just launch a 300W console or something without significant financial penalties.

From the N Negotiator post: they don't regulate gaming power usage, but navigation/media playback/idle modes.
Interesting: "The Subor Z+ console team has disbanded - but it's not game over yet" (www.eurogamer.net): "Last September we took an early look at the Z+, a Windows 10 games console from Chinese manufacturer Zhongshan Subor. A..."
> However, this may not be the last attempt that Zhongshan Subor makes to enter the Chinese games console market, according to a statement by the company's CEO, Wu Song: "While the Shanghai office has been closed, the project is still ongoing and we will have a new announcement to make regarding its progress in the next few months."

Subor evolved into Gonzalo, perhaps?
> Shit is getting boring... It has been a long time since we had legit info.

We had one today. No way any Chinese console is packing an 8-core Zen 2 and Navi 10 with 20K+ on 3DMark. Just not happening. It's a completely different matter that some expect 2080 perf in a console when last time around we got 60% fewer FLOPS than AMD gave in the PC space.
You can read the regulation here:
http://efficientgaming.eu/fileadmin..._Games_Console_ACR_2018_Final_report_V1.0.pdf (page 24)
TL;DR: they don't regulate gaming power usage, but navigation/media playback/idle modes.
Imagine if they enforced double-digit wattage numbers for gaming usage. Consoles would be rather underpowered.
I mean, what's next? Banning expensive AAA games because they don't promote "green computing" (unlike low-power pixelized indies)? It would be ridiculous!
> I believe it will be a killer move if they do that.
> There are only pros in this move:
> - No holiday stock issues
> - Two big selling periods in the first year

Then they would have done this before, if that was the case.
> We had one today. No way any Chinese console is packing an 8-core Zen 2 and Navi 10 with 20K+ on 3DMark.

Why not? Subor was targeting 4TF + a 4-core Zen.
> Your 60CU + RT CANNOT fit when a 40CU part is already 251mm². I don't know why you ignore the PC > console conversion from last gen, when the PS4 had a smaller, 212mm² chip from PC, in a 348mm² die.

Everything shrinks when you go to 7nm. Yes, the PS4 had a 212mm² GPU with a 70mm² Jaguar in a ~350mm² die, but those I/O parts, memory controllers etc. all get a major size reduction as well: 4x compared to 28nm. So 212+70=282, and 350-282 leaves ~70mm² reserved for everything else on the 28nm die; 70/4=17.5mm². That's how much all that extra stuff should take up on a 7nm die.
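A minimal sketch that just re-runs that shrink arithmetic (and the 338mm² sum it leads to a few posts below). All inputs are the posters' rounded estimates, and the 4x density gain for I/O blocks is their claim, not die-shot data:

```python
# PS4-era areas in mm2 on 28nm, rounded as in the post:
# 212mm2 GPU + 70mm2 Jaguar CPU inside a ~350mm2 die.
other_28nm = 70        # post rounds 350 - (212 + 70) = 68 up to ~70mm2
other_7nm = other_28nm / 4   # assumed 4x density gain for I/O etc. at 7nm

navi10, zen2 = 251, 70       # 7nm building blocks cited in the thread (mm2)
print(other_7nm)             # 17.5mm2 for "everything else" after the shrink
print(zen2 + other_7nm + navi10)  # ~338mm2 baseline APU before extra CUs/RT
```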
You want them to increase the die by 15% on a more expensive process with worse yields, and then to shrink it to the original PS4 level in a year's time.
Look, you are not getting an 11-13TF part, because that is a >2080 Ti beater. You just aren't gonna get that in a console.
We had one today. No way any Chinese console is packing an 8-core Zen 2 and Navi 10 with 20K+ on 3DMark. Just not happening.
> 70mm² Zen 2 chip + 17.5mm² controllers + 251mm² = 338mm². The Scarlett die is in the 380-400mm² range. That's a whole lot of space for extra CUs AND RT cores.

5700 DCUs weigh 3.37mm² each.
> acts like an Nvidia card

Pascal or Turing?
A big chip will net the best perf/watt, so around ~250W total.
Indeed, but in case 7nm EUV isn't ready, the fallback option is a big die on 7nm and a 6nm shrink a year later.
> It's probably best to avoid having to do a redesigned model so quickly after launch. It's like double the work.

It's not a redesign, it's design-compatible: minimal retooling required.
> Why?

Because I am pretty sure you would have heard about their next console with an 8-core Zen 2 and Navi 10 on the die already. In fact, I am pretty sure they would yell about it from their balcony.
It's not their talent that's in question here, it's what they ask for from AMD's semicustom division. If they were happy to make one, why not another?
They were also at 4 TFLOPS on 14nm with the previous attempt. The extra density afforded by 7nm, plus Navi's IPC gains, would seem awfully close.
Being on 14nm the first time, they did pay for a beefy die.
> Everything shrinks when you go to 7nm. Yes, the PS4 had a 212mm² GPU with a 70mm² Jaguar in a ~350mm² die, but those I/O parts, memory controllers etc. all get a major size reduction as well: 4x compared to 28nm. So 212+70=282, and 350-282 leaves ~70mm² on the 28nm die; 70/4=17.5mm² on a 7nm die.
> 70mm² Zen 2 chip + 17.5mm² controllers + 251mm² = 338mm². The Scarlett die is in the 380-400mm² range. That's a whole lot of space for extra CUs AND RT cores.

The Scarlett die is not 380-400. It can be anything from 338-400, and you are forgetting the RT hardware + wider bus on Scarlett. So no, not a lot of space, just a lot of room for wishful thinking.
> 5700 DCUs weigh 3.37mm² each.

Yep, that fits perfectly within the 380-400mm² range.
Add 12 DCUs (24 CUs, for a total of 64CUs, or 32 DCUs):
- 12 DCUs + 15% RT = 46.5mm²
- 12 DCUs + 10% RT = 44.48mm²
Pascal or Turing?
Because I am pretty sure you would have heard about their next console with an 8-core Zen 2 and Navi 10 on the die already. In fact, I am pretty sure they would yell about it from their balcony.
They are not in the game against MS and Sony; they would get out with that info far before the chip ends up in QS, especially seeing as they went bankrupt.
> The Scarlett die is not 380-400. It can be anything from 338-400, and you are forgetting the RT hardware + wider bus on Scarlett. So no, not a lot of space, just a lot of room for wishful thinking.

I am?
That's a whole lot of space for extra CUs AND RT cores.
> The Scarlett die is not 380-400. It can be anything from 338-400, and you are forgetting the RT hardware + wider bus on Scarlett. So no, not a lot of space, just a lot of room for wishful thinking.

10% extra RT silicon doesn't take that much space, and a 320-384 bit bus is tiny on 7nm:
- 75mm² for the CPU
- 45mm² for 10 GDDR6 controllers (20GB GDDR6)
- 8.8mm² for ROPs
- 140mm² for buses, caches, ACEs, geometry processors, shape etc. (overestimating this part, as the 5700 seems to have lots of "empty" areas)
- 118.6mm² for 64CUs + RT silicon, making the CUs 10% bigger compared to Navi 10
- 124mm² for 64CUs + RT silicon, making the CUs 15% bigger compared to Navi 10
(3.37mm² is the DCU size.)
Total size: 387-393mm².
Tensor/RT silicon takes 22% of the space on 12nm for Nvidia cards, and 7nm has a 3.2x density increase (or 0.31x area) over 12nm, so 10% or less is a safe bet for RT silicon on 7nm.
396mm² with 24GB GDDR6 (12 GDDR6 memory controllers).
For a 64CU APU I'd expect anywhere between 380-390mm².
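A minimal sketch that just re-runs the arithmetic in that breakdown. Every input is the poster's estimate (the DCU size from the 5700 discussion above, the rest rough guesses), and the apu_area helper is hypothetical, not measured die-shot data:

```python
DCU_MM2 = 3.37                   # one Navi dual-CU on 7nm, per the 5700 estimate
CPU, ROPS, MISC = 75, 8.8, 140   # mm2; MISC = buses/caches/ACEs/geometry
GDDR6_CTRL = 4.5                 # mm2 per controller (45mm2 for 10 controllers)

def apu_area(dcus: int, rt_overhead: float, controllers: int) -> float:
    cu_array = dcus * DCU_MM2 * (1 + rt_overhead)  # CU array grown for RT hardware
    return CPU + ROPS + MISC + controllers * GDDR6_CTRL + cu_array

print(apu_area(32, 0.10, 10))  # ~387mm2: 64 CUs, +10% RT, 20GB GDDR6
print(apu_area(32, 0.15, 10))  # ~393mm2: 64 CUs, +15% RT
print(apu_area(32, 0.10, 12))  # ~396mm2: 24GB GDDR6 configuration
```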
> If a 40CU part is taking up 150W at 1.9GHz, how can a 56-60CU part be less than that, even at 1.5GHz?

400MHz is a huge difference for TDP in GPUs, especially when hitting diminishing returns.
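A rough, hedged illustration of why wide-and-slow can draw less than narrow-and-fast: dynamic power scales roughly with frequency times voltage squared, and voltage has to climb to hold clocks past the efficiency knee. The voltages below are invented for illustration, not measured 5700 figures:

```python
def rel_dynamic_power(cus: int, clock_ghz: float, volts: float) -> float:
    # P ~ C * f * V^2; switched capacitance folded into the CU count.
    return cus * clock_ghz * volts ** 2

narrow = rel_dynamic_power(40, 1.9, 1.20)  # assumed voltage to hold 1.9GHz
wide   = rel_dynamic_power(56, 1.5, 0.95)  # assumed undervolt near the knee
print(wide / narrow)  # ~0.69: 40% more CUs, yet ~30% less dynamic power
```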
> Even for a 200W console?

Yeah, ~250W seems more realistic.
> SonGoku, I still don't understand how you managed to put in so many CUs.

Look at the 5700 size estimates that were posted earlier in this thread.
Interesting point by one user in reply to Komachi.
(Should check the entire tweet chain.)
It seems there is a definite difference in codename numbering between Sony and MS.
A semicustom part in the same vein as that is what I suggested, not that exact part or company.
I mean, what's the explanation for this APU running a Windows benchmark when PlayStation is FreeBSD-based? The closest plausible explanation, if you squint, would be that they also run Linux on it for testing and ran the benchmark through Wine; but then you'd expect exactly such a performance drop, even landing between the Vegas...
And why would they need to do that anyway? FreeBSD is what they have known, customized, and run on PS hardware for years.
There's so much about this that you need to squint really hard to make it seem plausible, versus it just being a semicustom part for another obscure Windows box.
> SonGoku, one question regarding this theory of yours... Why didn't AMD just release a 64CU version of Navi @ 1600MHz at ~300mm², and literally slaughter Nvidia in the price segment? They could even take the performance crown, ffs, with a 13TF+ Navi.

Because of yields and the design not being ready yet; and I'm thinking AMD is designing RDNA2 (big Navi) on 7nm EUV.
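For what it's worth, the quoted 13TF+ figure checks out under the usual peak-FP32 formula (assuming 64 ALUs per CU; the part itself is the poster's hypothetical):

```python
print(64 * 64 * 2 * 1.6e9 / 1e12)  # 64 CUs @ 1.6GHz -> ~13.1 peak FP32 TF
```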
> Look at the 5700 size estimates that were posted earlier in this thread.
> This is how I came up with 64CUs.
> Zen 2 is 70mm².

Your 140mm² part and the size of the CUs are made-up estimates based on what? Do you have some reference?
> Because of yields and the design not being ready yet; and I'm thinking AMD is designing RDNA2 (big Navi) on 7nm EUV.

But a 50% difference in CUs is small, as you said; certainly a ~$600 card price would overcome the small price you pay for worse yields.
> Your 140mm² part is a made-up estimate based on what?

It's from Proelite's estimates (he knows his stuff). For the record, that 140mm² is overcompensating for empty spaces that won't necessarily be on a console chip.
> But a 50% difference in CUs is small, as you said; certainly a ~$600 card price would overcome the small price you pay for worse yields.

I can only speculate that the design isn't ready yet, and maybe AMD doesn't consider it worth the investment to port their 7nm EUV design to 7nm.
> Do we know yet if a whole dual-CU has to be disabled on Navi? Or is it possible to just disable a single CU (half of a dual-CU)? Hoping it's the latter, because Navi is supposed to be scalable.

Yes, 99.9% sure.

The cache and some units are shared between the 2 CUs, so I don't think it is possible to disable half of a DCU.
> Because I am pretty sure you would have heard about their next console with an 8-core Zen 2 and Navi 10 on the die already. In fact, I am pretty sure they would yell about it from their balcony.

Not saying Gonzalo is certainly them, it could be from another dime-a-dozen Chinese company, but this statement is certainly suspect:

"However, this may not be the last attempt that Zhongshan Subor makes to enter the Chinese games console market, according to a statement by the company's CEO, Wu Song: 'While the Shanghai office has been closed, the project is still ongoing and we will have a new announcement to make regarding its progress in the next few months.'"
> With the Zen 2 taking up 45W by itself

45W? Why?
> I can't find any realistic estimates of the Ryzen 3000 sizes, but... the last-gen 12nm 8c/16t Ryzen 2700X had a 213mm² die. 7nm would bring it down to 124mm². There could be some small architectural upgrades, but I don't see a 7nm Ryzen 7 going anywhere near 100mm².

Where did you get those numbers from?
> SonGoku, one question regarding this theory of yours... Why didn't AMD just release a 64CU version of Navi @ 1600MHz at ~300mm², and literally slaughter Nvidia in the price segment? They could even take the performance crown, ffs, with a 13TF+ Navi.

Why didn't Nvidia release Ampere GPUs this year?
> I wonder when we'll find out that Apisak is a Twitter bot run by the green or blue team? 2018 isn't exactly a long-standing Twitter account. Easier to poison the well when you own the well.

I love your cynicism, haha!
> Not saying Gonzalo is certainly them, it could be from another dime-a-dozen Chinese company, but this statement is certainly suspect:

I'm going with Occam's razor on this one and maintain that a bankrupt firm is not releasing a not-yet-announced console with the best-performing APU you can currently find, especially since its codename tells us it's much more likely to be Sony's console, and not from a no-name Chinese manufacturer that AMD never mentioned on their quarterly call as a new semicustom customer.
Why didn't Nvidia release Ampere GPUs this year?
Apple has been using 7nm chips since 2018, earlier than anyone else. Why?
Apple A12 - Wikipedia (en.wikipedia.org)
Ask those questions and you'll get your answer.