Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.
Question: wouldn't a drop from 4K to 1080p reduce the load on the GPU by (around) four times? In that sense, 4 TFs seems appropriate. We have seen Pro and X multiplying GPU power mainly for resolution.
From 4K 60 fps to 1080p 30 fps you would save a lot of power, and you can still cut other things, like using a different AA or fewer particles.
This could allow developing mainly for Series X and PS5, and then scaling back. I suppose it would be the opposite if Series S were the best-selling console, but we all know it will probably be PS5 instead.
I think I saw, some time ago, a dev talking about how they develop games targeting the top config for PCs, then scale back for lesser configs and consoles.
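The "four times" claim above is easy to sanity-check with pixel arithmetic; a quick sketch (bearing in mind that real GPU load does not scale purely with pixel count):

```python
# Pixel-count arithmetic behind the "4x" claim (illustrative only:
# real GPU workloads do not scale linearly with resolution).

def pixels(width, height):
    return width * height

# 4K pushes exactly four times the pixels of 1080p.
res_ratio = pixels(3840, 2160) / pixels(1920, 1080)
print(res_ratio)  # 4.0

# Dropping from 4K60 to 1080p30 also halves the frame rate,
# so raw pixel throughput falls by a factor of 8.
throughput_ratio = (pixels(3840, 2160) * 60) / (pixels(1920, 1080) * 30)
print(throughput_ratio)  # 8.0
```

This is why mid-gen refreshes could spend most of their extra GPU power on resolution alone.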
 
Why do you keep OS data in RAM? To have the data available for the OS to use at any time.
So what happens if you can stream data from SSD to RAM fast enough for the OS to use it? You don't need to keep that data in RAM anymore.

Basically that... with high-speed SSDs you eliminate the need for the OS to keep most of its data in RAM... even background apps can go to the SSD and come back to RAM when active.

That alone eliminates a lot of RAM usage by the OS.


So is this essentially what virtual RAM is? But since the SSD is a lot faster, a lot of the OS's functions can be put in virtual memory.
 
Sony has made no comments about how RT is implemented in their console (other than stressing how demanding RT is), and your assumption that it is the exact same architecture as in the XSX, just provided by fewer CUs, has not been verified yet. Let's wait and see. All we know for now is that one manufacturer was confident enough to showcase not just RT but PT in March, while the other's lead designer said he was flabbergasted that it is even successfully attempted on consoles. You can focus on the artificial, self-manufactured number of an 18% difference in RT HW acceleration, but it looks like 18% of quite a lot.
Sony said it is RDNA 2 RT implementation.

So is this essentially what virtual RAM is? But since the SSD is a lot faster, a lot of the OS's functions can be put in virtual memory.
Yes, but virtual memory on Windows is used when you are already close to using all the RAM (for obvious reasons), while on consoles they can optimize to use it at any time.
It is called Swap in Unix based systems.
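The paging idea described above can be sketched with a file-backed memory map: data lives in a file on fast storage and the OS faults pages into RAM only when they are touched. This is a generic illustration (the file name and size are made up); a console OS would use its own mechanism, not Python's `mmap`:

```python
# Sketch: keep state in a file on fast storage and let the OS page it
# into RAM on access, instead of holding it resident up front.
import mmap
import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "swapped_state.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)          # pretend this is idle background-app state

with open(path, "r+b") as f:
    view = mmap.mmap(f.fileno(), 0)  # file-backed: resident pages can be evicted
    view[0:5] = b"hello"             # touching a page faults it into RAM
    print(view[0:5])                 # b'hello'
    view.flush()                     # write dirty pages back to storage
    view.close()
```

With slow HDDs this round trip was too costly to use aggressively; a multi-GB/s SSD is what makes it viable as a routine OS strategy.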
 
Why do you keep OS data in RAM? To have the data available for the OS to use at any time.
So what happens if you can stream data from SSD to RAM fast enough for the OS to use it? You don't need to keep that data in RAM anymore.

Basically that... with high-speed SSDs you eliminate the need for the OS to keep most of its data in RAM... even background apps can go to the SSD and come back to RAM when active.

That alone eliminates a lot of RAM usage by the OS.
Hmmm… but what about the Share feature? How much RAM does it consume to record 4K @ 60fps? You can't keep writing that to the SSD without destroying it after a couple of years of use (and consuming those much-needed GB/s). Hence my argument again about this generation being short on RAM.
 
I think I saw, some time ago, a dev talking about how they develop games targeting the top config for PCs, then scale back for lesser configs and consoles.
Regardless, what could Pro do that the base PS4 couldn't, once we account for the extra resolution?
This could be a similar situation: two mostly identical setups, but one less powerful. I won't take Xbox One X as an example, because it came 3-4 years after the One. It seems that in terms of GPU we are looking at a 3x difference in the worst case; it may be 2x, more like the PS4-PS4 Pro difference than anything else, and it will likely be absorbed by the resolution drop.
I think the main point is RAM, especially because next gen has "only" 16 GB already, and I have the impression that memory is less scalable than GPU at this point. The devs of The Outer Worlds cut some parts not only because of time, but because of memory.
We will see, but my take is that screaming about 4-6 TFs like it means something is a bit too much.
 
Hmmm… but what about the Share feature? How much RAM does it consume to record 4K @ 60fps? You can't keep writing that to the SSD without destroying it after a couple of years of use (and consuming those much-needed GB/s). Hence my argument again about this generation being short on RAM.
If I'm not wrong, Share on PS4 didn't use RAM at all... it writes directly to the HDD.
You can do that with an SSD, and it will live longer than a typical HDD.
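The wear concern raised above can be sanity-checked with back-of-envelope arithmetic. Every number here is an assumption for illustration (capture bitrate, daily hours, drive endurance), not a PS5 spec:

```python
# Back-of-envelope SSD wear from continuously writing a 4K60 DVR stream.
# All figures below are illustrative assumptions, not console specs.

BITRATE_MBPS  = 100   # assumed 4K60 capture bitrate, megabits per second
HOURS_PER_DAY = 4     # assumed daily play time with the DVR running
DRIVE_TBW     = 600   # assumed endurance rating, terabytes written

bytes_per_day = BITRATE_MBPS / 8 * 1e6 * 3600 * HOURS_PER_DAY
tb_per_year   = bytes_per_day * 365 / 1e12
years_to_tbw  = DRIVE_TBW / tb_per_year

print(f"{tb_per_year:.1f} TB written per year")   # ~65.7 TB/year
print(f"~{years_to_tbw:.0f} years to reach TBW")  # ~9 years
```

Under these assumptions the drive outlasts a console generation, though the sustained writes would still eat into the bandwidth the games want.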
 
Also, if I remember, the PS4 uses a second small chip for secondary tasks. Does that work only for sleep mode and background downloads, or also for the Share feature?

https://www.ifixit.com/Teardown/PlayStation+4+Teardown/19493?

How does Xbox One do that?
They wanted to use one, but I believe they never achieved that.
It uses the APU for sleep mode, downloads and Share, but I could be wrong on that.

I have no idea how Xbox One does DVR share, because it is badly implemented, with a lot of issues and lag.
 
This user has been removed from the thread. Your passive-aggressive console war trolling has come to light, especially in this post.


Pretty clear.

I'm not re-watching someone suddenly glorifying dynamic clocks because they had to add them to get closer to their competitor. Unlike you, I resent my intelligence being insulted, even by Sony. Cerny didn't say what type of RT implementation is in the PS5. No analyst has discussed it since that conference, because there are no details on it other than "It generates billions of rays, but I've seen one game that's successfully using it without too much impact on the GPU".
 
I'm not re-watching someone suddenly glorifying dynamic clocks because they had to add them to get closer to their competitor. Unlike you, I resent my intelligence being insulted, even by Sony. Cerny didn't say what type of RT implementation is in the PS5. No analyst has discussed it since that conference, because there are no details on it other than "It generates billions of rays, but I've seen one game that's successfully using it without too much impact on the GPU".
So you are just a troll.
You asked for a link/source where the answer comes directly from Cerny/Sony, but you don't want to watch it :messenger_tears_of_joy:

And yes, Cerny even said how it works... the processing (intersection) units inside RDNA 2 that it uses, and how it is the same as the incoming AMD GPUs on PC lol
And yes, Digital Foundry talked about what Cerny said in the video.

BTW, dynamic clocks are a smart idea to get more from the system ;) MS could use the same to get more from the new Xbox if they wish (they will probably use it in the mid-gen refresh).
 
Sony has made no comments about how RT is implemented in their console (other than stressing how demanding RT is), and your assumption that it is the exact same architecture as in the XSX, just provided by fewer CUs, has not been verified yet. Let's wait and see. All we know for now is that one manufacturer was confident enough to showcase not just RT but PT in March, while the other's lead designer said he was flabbergasted that it is even successfully attempted on consoles. You can focus on the artificial, self-manufactured number of an 18% difference in RT HW acceleration, but it looks like 18% of quite a lot.
There is little reason to assume the RT implementation will be different. If they are both in-hardware, it will be the same implementation in both, since both are RDNA2. Expecting a different RT implementation is like expecting a different ROP implementation, or a different shader engine. It's not likely to happen. There isn't much else that can be done at this point with BVH, and Cerny confirmed that RT acceleration will be based on BVH.
 
I'm not re-watching someone suddenly glorifying dynamic clocks because they had to add them to get closer to their competitor. Unlike you, I resent my intelligence being insulted, even by Sony. Cerny didn't say what type of RT implementation is in the PS5. No analyst has discussed it since that conference, because there are no details on it other than "It generates billions of rays, but I've seen one game that's successfully using it without too much impact on the GPU".
This ain't it, fam.
 
There is little reason to assume the RT implementation will be different. If they are both in-hardware, it will be the same implementation in both, since both are RDNA2. Expecting a different RT implementation is like expecting a different ROP implementation, or a different shader engine. It's not likely to happen. There isn't much else that can be done at this point with BVH, and Cerny confirmed that RT acceleration will be based on BVH.
Cerny's own words.

"...a major new feature of our custom RDNA 2 based GPU on the same strategy as AMD's upcoming PC GPUs that contain a new specialized unit called Intersection Engine that can calculate the intersection of rays with boxes and triangles. To use the Intersection Engine..."

And go on.
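The ray-vs-box test that an "Intersection Engine" accelerates during BVH traversal can be sketched with the classic slab method. This is a software illustration of what the fixed-function unit computes, not Sony's or AMD's actual hardware logic, and the real unit also handles ray-triangle tests:

```python
# Sketch of a BVH traversal primitive: the ray-vs-AABB slab test.
# Illustrative only; dedicated RT hardware does this in fixed function.

def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    """Return True if a ray (origin, 1/direction per axis) hits the box."""
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        tmin = max(tmin, min(t1, t2))   # latest slab entry
        tmax = min(tmax, max(t1, t2))   # earliest slab exit
    return tmin <= tmax

# A ray from the origin along +X hits a box ahead of it...
# (1e9 stands in for 1/0 on the axes the ray doesn't travel along)
print(ray_hits_aabb((0, 0, 0), (1.0, 1e9, 1e9), (1, -1, -1), (2, 1, 1)))  # True
# ...but misses a box off to the side.
print(ray_hits_aabb((0, 0, 0), (1.0, 1e9, 1e9), (1, 2, 2), (2, 3, 3)))    # False
```

Because both consoles lean on BVH traversal with per-ray box/triangle tests, the arithmetic being accelerated is the same; what differs is how many of these tests each GPU can issue per clock.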
 
BTW, dynamic clocks are a smart idea to get more from the system ;) MS could use the same to get more from the new Xbox if they wish.
If MS had announced dynamic clocks and Sony fixed ones, you would be declaring the generation over before it even started, and we both know it.
Even Cerny doesn't say that it gets more from the system, just that it sets a fixed power budget, which they needed to tackle the congenital PlayStation loudness problems more easily. He's not quite dishonest enough to claim that it provides any sort of power advantage over more CUs, and every benchmark test has proved that it doesn't.
 
So you are just a troll.
You asked for a link/source where the answer comes directly from Cerny/Sony, but you don't want to watch it :messenger_tears_of_joy:

And yes, Cerny even said how it works... the processing (intersection) units inside RDNA 2 that it uses, and how it is the same as the incoming AMD GPUs on PC lol
And yes, Digital Foundry talked about what Cerny said in the video.

BTW, dynamic clocks are a smart idea to get more from the system ;) MS could use the same to get more from the new Xbox if they wish (they will probably use it in the mid-gen refresh).
This right here: "Still have to see confirmation that it's AMD's standard RT implementation or the same that's in the XSX." I think he is implying that Microsoft has special ray tracing, that they are not using just AMD's ray tracing but the PS5 is. So unless they come out and say the PS5 is using the same RT as Xbox, or Xbox says its RT is the same as what is built into the AMD chip, it is special and different, hence non-comparable. In other words, save your energy. Edit: None of the fine details matter in the long run; what matters is which has the best experiences.
 
I'm not re-watching someone suddenly glorifying dynamic clocks because they had to add them to get closer to their competitor. Unlike you, I resent my intelligence being insulted, even by Sony. Cerny didn't say what type of RT implementation is in the PS5. No analyst has discussed it since that conference, because there are no details on it other than "It generates billions of rays, but I've seen one game that's successfully using it without too much impact on the GPU".

Hahaha, someone here is so used to PR bullshit that he refuses to take detailed technical numbers from THE SOURCE.
 
This right here: "Still have to see confirmation that it's AMD's standard RT implementation or the same that's in the XSX." I think he is implying that Microsoft has special ray tracing, that they are not using just AMD's ray tracing but the PS5 is. So unless they come out and say the PS5 is using the same RT as Xbox, or Xbox says its RT is the same as what is built into the AMD chip, it is special and different, hence non-comparable. In other words, save your energy.
But Cerny did come out and say they are using exactly the same RT tech as RDNA 2.
The video says it pretty clearly; I even did a lazy transcript for Ascend.
 
MS probably think super-budget gamers will choose the Lockhart+Gamepass combo, but they could simply grab an even cheaper One S+Gamepass if they don't care about graphics. You think Lockhart will cannibalize the SX, but this no-next-gen-exclusives policy will make the One S cannibalize Lockhart.

That's not what they think, they KNOW it - Pro and 1X have been on the market for quite some time, and still the base models greatly outsell them. Most normal people just want to play the games, nothing more; the cheaper the better. And the so-called hardcore audience would still get their high-spec console anyway. It's a win-win scenario.
 
your assumption that it is the exact same architecture as in the XSX, just provided by fewer CUs, has not been verified yet.
No, my reasonable assumption is that it's the exact same architecture as featured on RDNA2 cards across the board. It makes no sense to waste R&D money on an alternative RT solution when RDNA2 already has one integrated into the design.
he was flabbergasted that it is even successfully attempted on consoles.
lol, that's not how it went at all; here's what Cerny had to say about the RT implementation in PS5 games:
Mark Cerny said:
How far can we go? I'm starting to get quite bullish; I've already seen a PS5 title that's successfully using RT-based reflections in complex animated scenes with only modest costs
 
No, my reasonable assumption is that it's the exact same architecture as featured on RDNA2 cards across the board. It makes no sense to waste R&D money on an alternative RT solution when RDNA2 already has one integrated into the design.

lol, that's not how it went at all; here's what Cerny had to say about the RT implementation in PS5 games
He did not watch Road to PS5, so he is basing this on what he read from FUD followers.
What he claims is basically the opposite of how the presentation happened.
 
No, my reasonable assumption is that it's the exact same architecture as featured on RDNA2 cards across the board. It makes no sense to waste R&D money on an alternative RT solution when RDNA2 already has one integrated into the design.

lol, that's not how it went at all; here's what Cerny had to say about the RT implementation in PS5 games
If Cerny started to "get quite bullish" only AFTER having designed the console, and because of ONE, by his assessment, "successful" implementation, it is fair to say that RT wasn't high on the list of performance requirements for the PS5.
Compare that to sending an actual demo of PATH TRACING to Digital Foundry 9 months before release, or to the statement "we provide the equivalent of 25 TF of compute power when executing RT", and that indicates better than anything which ballparkS (plural) those consoles are in.
 
That's not what they think, they KNOW it - Pro and 1X have been on the market for quite some time, and still the base models greatly outsell them. Most normal people just want to play the games, nothing more; the cheaper the better. And the so-called hardcore audience would still get their high-spec console anyway. It's a win-win scenario.
I don't think a mid-gen refresh is any reliable guide to a new-generation purchase.
I found all the mid-gen refreshes useless for me, so I avoided buying them, and my day-one PS4 Amateur works flawlessly even today without my missing anything from the PS4 Pro.

That said, I'm again day one on the PS5 Amateur and never on a PS5 Pro ;)

I would never buy Lockhart over Scarlett for a new generation, but that won't happen anyway.
 
He did not watch Road to PS5, so he is basing this on what he read from FUD followers.
What he claims is basically the opposite of how the presentation happened.
I said I didn't want to "re-watch" something, and according to you that means I didn't watch it. That's what I'm dealing with.
 
If Cerny started to "get quite bullish" only AFTER having designed the console, and because of ONE, by his assessment, "successful" implementation, it is fair to say that RT wasn't high on the list of performance requirements for the PS5.
Compare that to sending an actual demo of PATH TRACING to Digital Foundry 9 months before release, or to the statement "we provide the equivalent of 25 TF of compute power when executing RT", and that indicates better than anything which ballparkS (plural) those consoles are in.


 
If Cerny started to "get quite bullish" only AFTER having designed the console, and because of ONE, by his assessment, "successful" implementation, it is fair to say that RT wasn't high on the list of performance requirements for the PS5.
That's just baseless speculation. GPU theoretical numbers are just that, theoretical; you won't know how they apply to real gaming loads until they are implemented. RT implementations tank performance on even the mighty RTX 2080 Ti; being cautious with expectations speaks well of his judgment.

The second part of your statement, whether it was high on the list or not (the same can be asked of MS), is irrelevant: they didn't design the RT implementation, AMD did. It's a core RDNA2 feature that scales with CUs & frequency.
Compare that to sending an actual demo of PATH TRACING to Digital Foundry
Sony just hasn't shown any games yet; they are saving them for a later time.
 
What is wrong with these people? You cannot just decide that a feature or a specification given in an official presentation was a lie from company X because you don't like the company, or because, if it weren't a lie, it would mean your favorite box has worse hardware, when no third party has done a teardown to double-check what each machine actually has.

All your arguments are full of fallacies, like ad conditionallis (I am right just by guessing), ad consequentiam (if I like this brand, then it should be better), ad nauseam (if I make 10 posts per day with just my opinion, it is because I am right), ex populo (most people think TF is the only thing that matters for graphics) and ad ignorantiam (the console doesn't have the same RT as the other, and since you can't have access to both dev kits, I am right).

And I can continue all day. Please, when you comment on something, don't use fallacies; the moment you use one, you have lost the debate. It is okay to have a different opinion, but that doesn't make it true.
 
I'm not re-watching someone suddenly glorifying dynamic clocks because they had to add them to get closer to their competitor. Unlike you, I resent my intelligence being insulted, even by Sony. Cerny didn't say what type of RT implementation is in the PS5. No analyst has discussed it since that conference, because there are no details on it other than "It generates billions of rays, but I've seen one game that's successfully using it without too much impact on the GPU".



Damn you're dense
 
If MS had announced dynamic clocks and Sony fixed ones, you would be declaring the generation over before it even started, and we both know it.
Even Cerny doesn't say that it gets more from the system, just that it sets a fixed power budget, which they needed to tackle the congenital PlayStation loudness problems more easily. He's not quite dishonest enough to claim that it provides any sort of power advantage over more CUs, and every benchmark test has proved that it doesn't.

False.

Mark Cerny:
"While we're at it we also use AMD's "Smart Shift" Technology and send any unused power from the CPU to the GPU so it can squeeze out a few more pixels."

Source:

 
That's just baseless speculation. GPU theoretical numbers are just that, theoretical; you won't know how they apply to real gaming loads until they are implemented. RT implementations tank performance on even the mighty RTX 2080 Ti; being cautious with expectations speaks well of his judgment.
It wasn't just theoretical numbers. It was also an actual Minecraft demo that can be compared to the performance of the most powerful PC cards running Minecraft RTX. It's called transparency. You're refusing to acknowledge it where it is, and you excuse it where it's not.

The second part of your statement, whether it was high on the list or not (the same can be asked of MS), is irrelevant: they didn't design the RT implementation, AMD did. It's a core RDNA2 feature that scales with CUs & frequency.
You don't know how those components were designed, what is off the shelf and what is bespoke, and with what amount of customization. This is willful simple-mindedness.

Sony just hasn't shown any games yet; they are saving them for a later time.
They talked about one. One that "successfully" implements RT. And that's how they marketed their RT capabilities. I have every right to expect more when deciding whether to fork out money for a "next-gen" product when RT has been around for almost a year. The fact that you don't doesn't absolve Sony of its transparency duties to customers.
 
Let me get this right: SmartShift allows a GPU to squeeze out more pixels than its compute units and frequency clocks allow? If you believe that, you don't know what you're talking about.

The variable clocks allow the GPU to squeeze out more pixels because it can achieve a higher clock rate than it would achieve locked.

Not hard to understand
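The argument above rests on how power scales with clock speed: dynamic power grows roughly like frequency cubed (f·V², with voltage rising alongside frequency), so a fixed clock must be set low enough for the worst-case workload, while a variable clock can sit higher most of the time and shed a little frequency to shed a lot of power. A rough model (the cube law is an approximation; the 2.23 GHz baseline is the PS5's announced GPU cap, the other figures are illustrative):

```python
# Rough model of dynamic power vs clock speed: power ~ f * V^2, and V
# rises with f, giving approximately cubic scaling. Approximate only.

def relative_power(freq_ghz, baseline_ghz=2.23):
    """Dynamic power relative to running at the baseline frequency."""
    return (freq_ghz / baseline_ghz) ** 3

for f in (2.23, 2.10, 2.00):
    print(f"{f} GHz -> {relative_power(f):.0%} of peak power")
# A ~10% downclock (2.23 -> 2.00 GHz) cuts dynamic power by roughly 28%,
# which is why small, rare clock drops can absorb worst-case workloads.
```

This is the trade both sides of the argument are invoking: the open questions are how often the drops happen and how deep they go, which Sony had not detailed at the time of this thread.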
 
The variable clocks allow the GPU to squeeze out more pixels because it can achieve a higher clock rate than it would achieve locked.

Not hard to understand
It keeps the higher clock rate for how long? And what is the lowest clock rate it can drop to?
There's plenty of information Sony hasn't given out, because they prefer you to just think about SmartShift and the PS5's maximum achievable frequency.
 
I have a "Member" account. You know those are not handed out. Think before typing.

Pretty sure you're upgraded to Member automatically. And it's not difficult either.
It keeps the higher clock rate for how long? And what is the lowest clock rate it can drop to?
There's plenty of information Sony hasn't given out, because they prefer you to just think about SmartShift and the PS5's maximum achievable frequency.

The majority of the time it'll stay at its highest clocks. When it has to drop, the drops will be small.

This has been argued to death already, so just do a little bit of research
 
Let me get this right: SmartShift allows a GPU to squeeze out more pixels than its compute units and frequency clocks allow? If you believe that, you don't know what you're talking about.

Funny.

You did not say that. You said "Even Cerny doesn't say that it gets more from the system, just that it sets a fixed power budget"

FALSE.

He did, here:




Awkward times, I guess :messenger_tears_of_joy:
Move on!
 