Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.
Look at Sea of Thieves, for example: released on two machines with a 4x delta. Or a multiplatform title that runs on a 2080 Ti and on a 960.
The game was designed to run on an HD 7770 (~650 Ti), so of course it runs beautifully on a 960, which is multiple times more powerful than the GPU it was designed for.
If devs have to do the potato-PC version, they can use that work to make the Lockhart version as well.
I didn't say Lockhart wouldn't work, though, just that low-end PCs won't be on developers' minds when designing next-gen-only games. They'll downgrade and scale down settings, but if they can't guarantee performance, minimum requirements will just be raised.

Provided Lockhart has the same CPU and SSD speeds, I don't see why games wouldn't scale down. The important question is which console they prioritize, XSX or XSS?
  1. Prioritize XSX: spend the most time optimizing code on that console to make the most of it, then do a cheap down-port to Lockhart, possibly with other downgrades on top of resolution.
  2. Prioritize XSS: spend the most time on it to get the best possible results, then do a cheap up-port to XSX, scaling up resolution and settings (when applicable), similar to XB1S->XB1X ports.
 
When the PS5 Pro and XSX Pro come out in 2025 with 8K 30-60fps as the norm, what will be the storage medium for games, dudez?

I wanna predict xsx pro and ps5 pro specs for shitz and giggles:

SSD: Has ReRAM technology, so SSD speed will be close to 10-15 GB/sec
RAM: 25 GB of GDDR7 at close to 800 GB/sec-1 TB/sec (so it will be double the current bandwidth)



SSD: Has ReRAM technology, so SSD speed will be close to 20-30 GB/sec
RAM: 40-50 GB of GDDR8 at close to 2 TB/sec

DRAM, as well as NAND, has stopped scaling below 10 nm.

RAM density improvement from now on will be similar to today's battery improvement (until solid-state batteries), which means we will be stuck with roughly the prices we have now per capacity, with probably incremental improvements yearly.

I don't expect the PS6 to have more than 24-32 GB of hopefully stacked DRAM by 2027/28 (for the budget consoles).

Again, it will be the SSD that compensates for the lack of RAM. Hopefully, Sony's ReRAM is ready by then.

NAND has already stopped scaling, while ReRAM is confirmed to be able to scale below 10 nm. NAND's saving grace is its ability to stack in multiple layers, which will probably let NAND continue to drop in price. But ReRAM is stackable too, and more importantly, scalable to smaller processes.
 
Really good catch!

Unreal Engine 4 and Quixel will fly on the PS5's SSD and I/O speed.

Combined with the Geometry Engine, which ensures that geometry outside your view is not rendered.

The PS5's 2.23 GHz GPU is also capable of more geometry per second than xsex's wide-and-slow approach.

Texel density approaching CGI quality will be everywhere. No more cutting details just because the RAM cannot fit them all. Now the SSD can stream them in just at the right time.

HIGH RESOLUTION MEGASCANS. DATA NEEDS TO BE STREAMED JUST WHEN IT IS NEEDED.
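The culling-plus-streaming idea above can be sketched as a few lines of toy code. Everything here is illustrative (hypothetical names, a simplified view-cone test instead of a real frustum, raw SSD speed taken from Cerny's talk), not any actual console API: render only what the camera can see, and fetch an asset from disk only when the player could reach it before the fetch would finish.

```python
import math

# Toy just-in-time streamer. All names and numbers are illustrative,
# not a real PS5 API; 5.5 GB/s is the stated raw SSD read speed.
SSD_SPEED_GBPS = 5.5

def fetch_time_s(asset_size_gb):
    """Seconds needed to stream an asset at raw SSD speed."""
    return asset_size_gb / SSD_SPEED_GBPS

def in_view(camera_pos, camera_dir, obj_pos, fov_cos=0.5):
    """Crude 'frustum' test: is the object inside the view cone?"""
    dx = [o - c for o, c in zip(obj_pos, camera_pos)]
    dist = math.sqrt(sum(d * d for d in dx)) or 1e-9
    cos_angle = sum(d * v for d, v in zip(dx, camera_dir)) / dist
    return cos_angle >= fov_cos  # outside the cone -> cull, don't render

def assets_to_stream(assets, camera_pos, camera_speed):
    """Fetch an asset only if we'd reach it before the fetch finishes."""
    needed = []
    for name, pos, size_gb in assets:
        time_to_reach = math.dist(camera_pos, pos) / camera_speed
        if time_to_reach <= fetch_time_s(size_gb) + 1.0:  # 1 s safety margin
            needed.append(name)
    return needed
```

Real engines also factor in mip/LOD levels and camera turn rate, but the trade-off is the same: the faster the SSD, the later you can afford to issue the fetch, and the less RAM you need to hold "just in case" data.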

 
DRAM, as well as NAND, has stopped scaling below 10 nm.

RAM density improvement from now on will be similar to today's battery improvement (until solid-state batteries), which means we will be stuck with roughly the prices we have now per capacity, with probably incremental improvements yearly.

I don't expect the PS6 to have more than 24-32 GB of hopefully stacked DRAM by 2027/28 (for the budget consoles).

Again, it will be the SSD that compensates for the lack of RAM. Hopefully, Sony's ReRAM is ready by then.

NAND has already stopped scaling, while ReRAM is confirmed to be able to scale below 10 nm. NAND's saving grace is its ability to stack in multiple layers, which will probably let NAND continue to drop in price. But ReRAM is stackable too, and more importantly, scalable to smaller processes.
I thought you were full of crap on DRAM but I looked it up. Interesting problem.
 
I thought you were full of crap on DRAM but I looked it up. Interesting problem.

DRAM is going 3D. If HBM or something similar becomes ubiquitous in the future, then GDDR will be the more expensive option, because stacked DRAM should be cheaper by default. It's the packaging that makes it much more expensive for now, along with its rarity.

We are going 3D for the years to come.
 
Unreal Engine 4 and Quixel will fly on the PS5's SSD and I/O speed.

Combined with the Geometry Engine, which ensures that geometry outside your view is not rendered.

The PS5's 2.23 GHz GPU is also capable of more geometry per second than xsex's wide-and-slow approach.

Texel density approaching CGI quality will be everywhere. No more cutting details just because the RAM cannot fit them all. Now the SSD can stream them in just at the right time.

HIGH RESOLUTION MEGASCANS. DATA NEEDS TO BE STREAMED JUST WHEN IT IS NEEDED.



Imagine this on xsex with the Velocity Architecture and 12 TFLOPS of power :messenger_beaming:
 
The BOM for PS5 and XSX isn't that different (Zhug said the PS5 is around $20 cheaper), so I don't expect any price difference. If Sony comes in at $450, Microsoft will match.
The BOM for PS5 and Lockhart isn't that different either (SSD, smaller APU, 6 GDDR6 modules, disc drive), so I don't expect the price difference to be greater than $100.
Low-end pricing
  • Lockhart $300 to $350
  • PS5 $400 to $450
High-end pricing
  • Lockhart $350 to $450
  • PS5 $450 to $500
Indeed so. But the people throwing around prices, BOMs, and hardware components are the ones who are really into this stuff, while the average mom and dad on the street will have no idea why the less expensive console isn't the best for their kids at home, right? I mean, look at the price!
That's what I'm saying: Lockhart will sell significantly more, and XSX can't possibly subsidize it while selling a fraction of the units.
 
Maybe it is? Horizon sequel confirmed for the PS5 event?

 
Imagine this on xsex with the Velocity Architecture and 12 TFLOPS of power :messenger_beaming:

That I/O though.

10 seconds of loading in a LOADING DEMO SHOWCASE does not inspire confidence.

But I will give xsex the benefit of the doubt. Perhaps the Zen 2 CPU cores can make up for the lack of custom chips in the APU. But the CPU will take a big hit in performance.
 
GitHub isn't proof, but it is evidence. I don't think it's conclusive without additional evidence. If someone revealed the devkit speeds for that time period and they matched, then I think I would lean toward the current speed being reactionary.
Evidence of what? You need to add context to your statement.
My point is that GitHub isn't proof or evidence that Sony ramped up clocks as a reactionary measure to "be competitive". It just doesn't work that way, and it contradicts known information.
 
That I/O though.

10 seconds of loading in a LOADING DEMO SHOWCASE does not inspire confidence.

But I will give xsex the benefit of the doubt. Perhaps the Zen 2 CPU cores can make up for the lack of custom chips in the APU. But the CPU will take a big hit in performance.

Same game code as Xbone: Gears 5 apparently loads 4x faster on xsex vs. Xbone, again without any code changes. The reality is that we know much more about xsex than about PS5; we know the context of these demonstrations, etc. With PS5 we have Cerny's talk and the leaked Spider-Man demo, without any additional context, plus loads of wishful thinking. Don't get me wrong, I know the PS5 SSD is at least 2x faster than the xsex one, but it will be interesting to see how the PS5 loads PS4 games that are not optimised for the SSD.
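The gap being argued over here can be sanity-checked with back-of-the-envelope arithmetic using the officially stated throughput figures (PS5: 5.5 GB/s raw, ~8-9 GB/s typical with Kraken; XSX: 2.4 GB/s raw, 4.8 GB/s with BCPack). The scene size below is a made-up working set, and real loads also spend time on CPU-side work, so these are best-case floors, not predictions:

```python
# Best-case load times from stated I/O throughput alone. The 8 GB
# working set is hypothetical; real loads add CPU and setup overhead.
def load_seconds(data_gb, throughput_gbps):
    """Seconds to read data_gb at a sustained throughput in GB/s."""
    return data_gb / throughput_gbps

scene_gb = 8.0  # hypothetical per-scene working set
for name, gbps in [("PS5 raw", 5.5), ("PS5 compressed", 8.5),
                   ("XSX raw", 2.4), ("XSX compressed", 4.8)]:
    print(f"{name}: {load_seconds(scene_gb, gbps):.1f} s")
```

Even on these idealised numbers the PS5 floor is a bit over 2x lower, which matches the "at least 2x faster" concession above; anything beyond that comes down to software and how well the game uses each I/O stack.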
 
>TLOU 2 in June
>GOT in July
Until then finish FF7 and platinum Nioh 2.
That's how you end a generation.
Hopefully when those both sell great, devs will stop being scared of summer releases. I never got why so many summers are dry when big games have shown success there before, and it would help uncrowd the holiday season.
 
It's not really that hard to come to a conclusion on; otherwise you're insisting either that Arden was the XSX chip and Oberon isn't the PS5's (so what is the PS5's chip in that case?), or that neither chip is for its respective system, which is very difficult to believe, since we know that outside of some disabled CUs and a clock-speed adjustment, XSX and PS5 fit Arden and Oberon practically like a glove.
The best-case scenario for the GitHub leak being legit is that it's an outdated RDNA1 placeholder they used.
And it's also been mentioned that both chips are custom RDNA2; they won't be exactly equivalent to the PC RDNA2 GPUs coming out later this year.
Yes, but it'll keep the core features; "custom" in consoles tends to mean more, not less.
Regardless, the GitHub chip lacked hardware-accelerated ray tracing, so that rules out RDNA2.
 
I know Tidux is not very reliable, but is there any chance this is true? It looks like the cover of a Game Informer issue.
 
Same game code as Xbone: Gears 5 apparently loads 4x faster on xsex vs. Xbone, again without any code changes. The reality is that we know much more about xsex than about PS5; we know the context of these demonstrations, etc. With PS5 we have Cerny's talk and the leaked Spider-Man demo, without any additional context, plus loads of wishful thinking. Don't get me wrong, I know the PS5 SSD is at least 2x faster than the xsex one, but it will be interesting to see how the PS5 loads PS4 games that are not optimised for the SSD.
Maybe it is the people who want to close their eyes, because they don't want to see what is coming, who have the wishful thinking :messenger_tears_of_joy:

Mark Cerny:
"For PlayStation 5 our goal was not just that the SSD itself be a hundred times faster; it was that game loads and streaming would be a hundred times faster, so every single potential bottleneck needed to be addressed."



Source: (Full Road To PS5 transcription)
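That "hundred times faster" target checks out arithmetically against the PS4's hard drive. The ~50-100 MB/s PS4 HDD figure below is the commonly cited ballpark (an assumption here, not from the talk), while the PS5 numbers are the ones Cerny stated:

```python
# Sanity-checking the "hundred times faster" goal against the PS4 HDD.
ps4_hdd_mbps = 75        # assumed midpoint of the ~50-100 MB/s effective rate
ps5_raw_mbps = 5500      # 5.5 GB/s raw SSD read (stated figure)
ps5_kraken_mbps = 8500   # ~8-9 GB/s typical after Kraken decompression

raw_speedup = ps5_raw_mbps / ps4_hdd_mbps           # ~73x from the drive alone
effective_speedup = ps5_kraken_mbps / ps4_hdd_mbps  # ~113x end to end
print(f"raw: {raw_speedup:.0f}x, effective: {effective_speedup:.0f}x")
```

So the 100x claim is about the whole pipeline (drive plus decompression plus I/O stack), not the raw drive speed alone, which is exactly the point of the "every single potential bottleneck" line.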
 
The best-case scenario for the GitHub leak being legit is that it's an outdated RDNA1 placeholder they used.

Yes, but it'll keep the core features; "custom" in consoles tends to mean more, not less.
Regardless, the GitHub chip lacked hardware-accelerated ray tracing, so that rules out RDNA2.

The regression-testing data for the Ariel iGPU lacked hardware RT, but that doesn't mean Oberon didn't have RT on the chip. You wouldn't have needed RT enabled for PS4 and Pro regression testing, is all.

I don't necessarily understand the insistence that GitHub wasn't legit. Those were the PS5 and XSX chips. The final XSX chip turned off 4 CUs and clocked higher (clocks were never given for Arden, and Scorpio to the X did the same thing, i.e., Scorpio had the full chip on and the X disabled 4 CUs for retail), the PS5 chip turned out to be clocked higher, RT wouldn't need to be on for regression tests, and Navi 10 was a reference to the Ariel iGPU testing profile.

I mean, what other chips even exist that could be the XSX's and PS5's? There aren't any. That just leaves Arden and Oberon. The GitHub stuff was incomplete, but aside from Tommy's XSX stuff and one of Heisenberg's PS5 figures, it was by far the closest to giving an indication of some of the baseline specs for both systems (baseline as in general GPU stuff like CUs and SUs, specifically) 🤷‍♂️
 
Evidence of what? You need to add context to your statement.
My point is that GitHub isn't proof or evidence that Sony ramped up clocks as a reactionary measure to "be competitive". It just doesn't work that way, and it contradicts known information.
Then why does GitHub show lower clocks? There could be several reasons. That's why I say it is evidence, not proof. You're treating it as nothing at all. How did that work out for those who did that before Cerny's talk?
 
I don't necessarily understand the insistence that GitHub wasn't legit. Those were the PS5 and XSX chips. The final XSX chip turned off 4 CUs and clocked higher (clocks were never given for Arden, and Scorpio to the X did the same thing, i.e., Scorpio had the full chip on and the X disabled 4 CUs for retail), the PS5 chip turned out to be clocked higher, RT wouldn't need to be on for regression tests, and Navi 10 was a reference to the Ariel iGPU testing profile.
I'm not saying it's necessarily fake, just that it was outdated and used an older chip as a placeholder (RDNA1). The latest GitHub leak claimed RDNA1 without hardware RT, did it not?
About Ariel: what makes you think it was for BC testing? And if it was, wouldn't you find it reasonable to assume they were using an RDNA1 placeholder for testing BC?
Then why does GitHub show lower clocks? There could be several reasons. That's why I say it is evidence, not proof. You're treating it as nothing at all. How did that work out for those who did that before Cerny's talk?
And I ask you again: evidence of what? So we can take the discussion from there.
 
I'm not saying it's necessarily fake, just that it was outdated and used an older chip as a placeholder (RDNA1). The latest GitHub leak claimed RDNA1 without hardware RT, did it not?
About Ariel: what makes you think it was for BC testing? And if it was, wouldn't you find it reasonable to assume they were using an RDNA1 placeholder for testing BC?
He answered all your issues in his post. Did you even read it?
 