[MLiD] PS6 Full Specs Leak: RTX 5090 Ray Tracing & Next-Gen AI w/ AMD Orion!

Don't feel like getting quoted by wccftech and all the other tech rags today

"LADIES AND GENTLEMEN, MY NAME IS PAUL, AND WELCOME TO ANOTHER REDGAMINGTECH.COM VIDEO!"

Mocking Spongebob Squarepants GIF
 
If Canis has 16 CUs, then that means 8 WGPs, right? So 8 vs 27 (1 disabled, so 26 active): the PS6 would have 3.25 times the GPU compute of the PS6 handheld, before clocks. Series X was 3 times the Series S, but with almost equivalent CPUs.
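That 3.25x is just the active WGP ratio; a minimal check, assuming the standard RDNA 2-CUs-per-WGP grouping and taking the rumored counts at face value:

```python
# Quick ratio check. The WGP/CU counts are the rumor's numbers, not confirmed specs.
CUS_PER_WGP = 2                     # RDNA groups CUs into workgroup processors in pairs

handheld_wgps = 16 // CUS_PER_WGP   # 16 CUs -> 8 WGPs
ps6_wgps = 27 - 1                   # 27 WGPs with 1 disabled -> 26 active

print(f"{handheld_wgps} vs {ps6_wgps} WGPs -> {ps6_wgps / handheld_wgps:.2f}x GPU width")  # 3.25x
```

Clocks would shift the real performance gap either way, of course.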
 
Oh, I see what has happened: he sees some x-times RT performance figure over the base PS5 and then just multiplies the FPS from a benchmark by it. He did the same with the PS5 Pro. That's not really how it works; an improved RT pipeline speeds up the ray tracing portion of the frame, it doesn't multiply the entire FPS number by a fixed amount. That's how people vastly inflated the PS5 Pro's RT performance as well. Assuming this information is even correct.
 
Oh, I see what has happened: he sees some x-times RT performance figure over the base PS5 and then just multiplies the FPS from a benchmark by it. He did the same with the PS5 Pro. That's not really how it works; an improved RT pipeline speeds up the ray tracing portion of the frame, it doesn't multiply the entire FPS number by a fixed amount. That's how people vastly inflated the PS5 Pro's RT performance as well. Assuming this information is even correct.
That's exactly what he does, but I think, according to him, this (12x) would play out mostly in fully path-traced games. Otherwise 6x.
 
I'm just curious from a tech perspective: when Sony/MS do their CU thing and keep a few to spare in case some go down, what happens if it loses so many CUs that the backups can't cover it?

Does the system and game totally bomb out? Or can the game still work, like on a gimped PC, just at shittier performance?
 
Think about it like this: TSMC's claimed transistor density for N3P is 224 Mtr/mm², and Apple SoCs are also in this range. So if the chip is 280 mm², it should land somewhere around 224 Mtr/mm² × 280 mm² ≈ 62.7B transistors. A 9070 XT is 53.9B transistors, but that is for the GPU alone. This chip will include the entire CPU and SoC as well, which is, say, 30-40%(?) of the die area if you eyeball it. IMO, if you expect this to massively outperform some ~200 mm² 3nm RTX 6060 or 9060 XT successor, you're setting yourself up for disappointment.
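A back-of-the-envelope version of that budget, using the same assumed density, die size, and eyeballed CPU/SoC share (none of these are confirmed figures):

```python
# Rough transistor budget from the numbers in the post above.
# Density, die size, and the CPU/SoC share are all assumptions, not confirmed specs.

density_mtr_per_mm2 = 224      # TSMC's claimed N3P density, Mtr/mm^2
die_area_mm2 = 280             # rumored die size
cpu_soc_share = 0.35           # eyeballed ~30-40% of the die for CPU + SoC blocks

total_transistors_b = density_mtr_per_mm2 * die_area_mm2 / 1000   # in billions
gpu_budget_b = total_transistors_b * (1 - cpu_soc_share)

print(f"Whole die: ~{total_transistors_b:.1f}B transistors")   # ~62.7B
print(f"GPU share: ~{gpu_budget_b:.1f}B transistors")          # ~40.8B vs 53.9B for a 9070 XT
```

Even granting the full marketing density, the GPU slice of that budget comes out noticeably smaller than a 9070 XT's transistor count, which is the poster's point.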
 
Again, what matters is full exclusives, just like Nintendo. It is pointless to buy a PS6 if you will just see its exclusives get ported to PC and other consoles. And bring back Sony Japan's AA and A exclusives. SRP also matters; it must not be too expensive.
 
Well, obviously the clocks will determine the final result, but the smaller the gap, the better for the devs technically.
It's a >10x difference (hopefully 20x...) in power draw. That's really all you need to know.

(Not that power draw vs. performance is anywhere near a linear relationship, but the difference will be huge, no doubt about that. Good thing most graphics-related tasks are highly scalable, then.)
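One hedged way to see why that relationship is so non-linear: dynamic power scales roughly with frequency times voltage squared, and voltage has to climb with clocks, so as a rule of thumb power grows close to cubically with clock speed. A toy sketch (the exponent and clock ratios are illustrative assumptions, not anything from the leak):

```python
# Toy illustration: why a 10-20x power gap doesn't mean a 10-20x performance gap.
# Assumption: dynamic power ~ f * V^2 and V rises roughly with f, so power ~ f^3
# in the upper clock range. The exponent is a rule of thumb, not measured data.

def relative_power(clock_ratio: float, exponent: float = 3.0) -> float:
    """Power needed relative to baseline for a given clock (~ per-CU perf) ratio."""
    return clock_ratio ** exponent

for perf in (1.0, 1.2, 1.5):
    print(f"{perf:.1f}x clock -> ~{relative_power(perf):.2f}x power")
# 1.2x clock costs ~1.73x power, 1.5x costs ~3.38x, so most of a large power budget
# gap gets spent on a wider GPU rather than proportionally higher clocks.
```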
 
I'm just curious from a tech perspective: when Sony/MS do their CU thing and keep a few to spare in case some go down, what happens if it loses so many CUs that the backups can't cover it?

Does the system and game totally bomb out? Or can the game still work, like on a gimped PC, just at shittier performance?
They are nonfunctional (and possibly defective), but included to increase manufacturing yield and reduce production costs. For all practical purposes, pretend the disabled CUs don't exist. A chip with no CUs disabled typically means a super-high-quality die and, as a result, the highest cost to produce; those end up being the ultimate/flagship models in a GPU series.

Post-manufacturing, no CUs should go down. There are no spares/backups.
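A toy way to see the yield argument, if it helps: when a couple of CUs are allowed to be fused off, a random defect that lands inside a CU no longer kills the whole die. Every number below (die size, CU area, defect probability) is made up purely for illustration:

```python
import random

# Toy Monte Carlo: how much does allowing some disabled CUs improve usable yield?
# All constants are made-up illustrative values, not real process or PS6 data.
DIE_AREA_MM2 = 280
CU_AREA_MM2 = 2.0
NUM_CUS = 54
DEFECT_PROB_PER_MM2 = 0.002
TRIALS = 20_000

def sellable_fraction(max_disabled_cus: int) -> float:
    """Fraction of dies still sellable when up to max_disabled_cus CUs may be fused off."""
    good = 0
    for _ in range(TRIALS):
        killed_cus, fatal = set(), False
        # One independent defect chance per mm^2 of die area
        for _ in range(int(DIE_AREA_MM2)):
            if random.random() < DEFECT_PROB_PER_MM2:
                # Did the defect land inside CU area or in non-redundant logic?
                if random.random() < (NUM_CUS * CU_AREA_MM2) / DIE_AREA_MM2:
                    killed_cus.add(random.randrange(NUM_CUS))
                else:
                    fatal = True  # defect outside the redundant CUs kills the die
        if not fatal and len(killed_cus) <= max_disabled_cus:
            good += 1
    return good / TRIALS

print(f"Yield with 0 CUs disabled: {sellable_fraction(0):.0%}")
print(f"Yield with 2 CUs disabled: {sellable_fraction(2):.0%}")
```

The fully-enabled bin is the rarer one, which is exactly why it ends up as the expensive flagship part.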
 
Is it just me or does a 9070/5080 in almost 3 years, for an 8-year gen, seem really fucking low?
Were you expecting 6090 level or something?
I'll bet anything a 4090 will still eat it for breakfast raw-power-wise, probably a 4080 too.
Which, for a console, would still be good.

As with every new console gen coming up, keep your expectations in check, and don't believe everything you read or watch from clueless YouTubers.
 
5090 Performance?
In 2027 for 160W?



If Nvidia announced this we would laugh at them.
160W for RTX 5090 performance from a 2026 next-gen card, let's call it an RTX 6070, would be called bullshit straight up; we wouldn't even entertain it.
Why are we gonna believe a console could do that?
 
The 160W TDP still seems low, same as in his last video. The CU count got a little bump; I think it was 40-48 CUs before. 2.5-3x PS5 raster performance is definitely believable. I don't believe his RT or memory figures either.
 
3 shader engines and 2 deactivated CUs don't go together. The TDP should be higher and the bandwidth is way too low.

Fake or highly modified.

Logically it's going to have 2 shader engines and 4 deactivated CUs.
 
I have a hunch only half this information is true, the sketchy stuff being the TDP and RAM: the TDP is likely going to be similar to the PS5's, and the RAM is likely going to be 24 GB.

I expect the RT uplift to be huge, and on 54 CUs as well - this looks very promising.
 
I probably would have changed it to:

10 high-power X3D CPU cores - 1 LP core set aside for the OS, and good for CPU-demanding games

320-bit bus @ 720 GB/s, GDDR7

58-CU-capable GPU, 2 disabled, with 20 MB of L2 cache, and 40-50 GB of RAM

40-50 TF of performance

Probably around 200-300W TDP

Should be enough, at least between 5080 and 5090 level. And it'd definitely be more powerful than a 6090 XT and an RTX 4090.
 
Yeah. For such huge step-ups while being 'efficient', some of this doesn't sound right; things like bandwidth and power seem off, at least compared to the levels and direction the last few PlayStation consoles took, and they did work on efficiency with those. Especially the PS5.

During the summer they were saying 40-48 CUs for the PS6. Now it's in the 50s? As a response to the next Xbox chip? I'm not sure that's how Sony or AMD design a chip, with a quick 'add more X' so as not to look weaker against the opposition.

Doesn't history say the next PlayStation will be about the X700 or X700 XT level, with some Sony tweaks and bits? E.g. the PS5 being roughly a 6700, then a 7700 for the Pro, as the equivalent AMD card.

But perhaps with boosted machine learning and ray tracing parts?
 
25% seems like a huge number compared to all the ones he's been using up to that point. He could have very easily put down
'Huge' is a weird term to use here in any context.
It'd be the second-smallest delta we've had between two same-gen consoles in the last 30 years (the smallest being the current gen), and that's without normalizing for diminishing returns (i.e. 25% in the PS1 era was significantly more noticeable than it is today).

I mean, I get that as the differences shrink we narrow the focus of how we talk about numbers in enthusiast circles - but in practice, hardware differences might as well not exist at this point for the amount of impact they have on the end-user experience.

Now the number is obviously pulled out of thin air - so it's about as believable as all the FUD that happens every generation a few years out - but that's 'if' it actually materialized.
 
I probably would have changed it to:

10 high-power X3D CPU cores - 1 LP core set aside for the OS, and good for CPU-demanding games

320-bit bus @ 720 GB/s, GDDR7

58-CU-capable GPU, 2 disabled, with 20 MB of L2 cache, and 40-50 GB of RAM

40-50 TF of performance

Probably around 200-300W TDP

Should be enough, at least between 5080 and 5090 level. And it'd definitely be more powerful than a 6090 XT and an RTX 4090.
And for $499 right?
 
The video is silly on so many levels, but one thing worth mentioning is that TDP figures are kind of pointless if your next sentence is "clocks are not decided".

The TFLOPS figure is dumb as well, since it includes dual-issue and he then compares it to single-issue PS5 Pro numbers. "Wow, wow, 3 times higher." One would think we had left that stupidity behind by now...
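For reference, the arithmetic behind that inflation looks roughly like this (the CU count is from the rumor, the clock is a placeholder since, per the video, clocks aren't decided):

```python
# How quoting dual-issue inflates a TFLOPS figure. The clock is a placeholder.
CUS = 54                   # rumored active CU count
SHADERS_PER_CU = 64        # stream processors per CU
OPS_PER_CLOCK = 2          # one FMA counts as 2 floating-point ops
CLOCK_GHZ = 2.5            # made-up clock, since "clocks are not decided"

single_issue = CUS * SHADERS_PER_CU * OPS_PER_CLOCK * CLOCK_GHZ / 1000
dual_issue = single_issue * 2   # RDNA 3+ can dual-issue some FP32 instructions

print(f"Single-issue: {single_issue:.1f} TFLOPS")  # the way PS5 Pro's figure is usually quoted
print(f"Dual-issue:   {dual_issue:.1f} TFLOPS")    # the bigger number, not comparable to the above
```

Comparing the dual-issue number against a single-issue PS5 Pro figure doubles the apparent gap for free.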

I think MLiD has proven that he has access to some valid information/sources. The problem is that he is not satisfied with that; he just has to add his own, often less-than-stellar, analysis and speculation, and even outright hype.

The sad thing is that he seems unaware that his reputation would be a lot better if he were just that "reliable leaker without the bullshit" guy instead. But hey, maybe "entertainment" is his main priority, and if so he's doing a pretty good job.
 
The 160W TDP still seems low, same as in his last video. The CU count got a little bump; I think it was 40-48 CUs before. 2.5-3x PS5 raster performance is definitely believable. I don't believe his RT or memory figures either.
Bandwidth and TDP are just terrible. Well, as predicted; time to move to PC. I'll buy this weakling but will only play one first-party game on it per year.


That 160-bit memory bus and that 160W TDP...
Kepler said a few weeks ago that the 160W TDP wasn't accurate, so seeing MLID continue to use it makes you wonder what else he got wrong.
 