Has there been confirmation that they won't support it? Or are people just assuming they won't because Cerny hasn't mentioned it?

And the absence of variable rate shading isn't looking good.
This way your fan is always quiet, and once a busy, heavy scene has passed and the code doesn't need much GPU, it drops the GPU clock so the temperature doesn't increase. So it adjusts the clock based on the game's needs, to keep the temperature always cool and the fan always quiet at the same level.

What I don't get is: what's the difference between this and the PS4? My PS4 constantly changes its fan speed depending on system load, which means the system isn't always running at 100% load, because then the fan speed would never change. Can someone explain this?
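The difference can be sketched with a toy model (all numbers and the power formula below are invented for illustration; this is not Sony's actual algorithm): the PS4 fixes the clock and varies the fan with load, while the PS5 approach fixes the power budget (and therefore the fan) and varies the clock with load.

```python
# Illustrative toy governor in the spirit of Cerny's description; the power
# model and every constant here are made up, not Sony's implementation.
POWER_BUDGET_W = 200.0     # fixed power budget -> fixed cooling, constant fan noise
MAX_CLOCK_MHZ = 2230.0
# Assume a pathological all-units-busy workload at max clock would draw 250 W,
# i.e. slightly more than the budget, so only the heaviest scenes downclock.
K = 250.0 / MAX_CLOCK_MHZ ** 2   # power ~= K * activity * clock^2

def gpu_clock_mhz(activity: float) -> float:
    """Highest clock whose modeled power fits the fixed budget.

    `activity` in (0, 1] is how hard the game code drives the GPU.
    """
    clock = (POWER_BUDGET_W / (K * activity)) ** 0.5
    return min(clock, MAX_CLOCK_MHZ)

print(round(gpu_clock_mhz(0.5)))   # typical scene: holds the 2230 MHz cap
print(round(gpu_clock_mhz(1.0)))   # worst-case scene: drops to ~1995 MHz
```

Because the budget is sized so that only near-worst-case activity exceeds it, typical scenes hold the maximum clock, which is consistent with Cerny's claim that any drops should be rare and minor.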
I mean, it's not like MS has never had a piece of hardware with major issues (the 360 RRoD). I suspect the XSX will be fine, but MS has a long history of hype and non-delivery, so they don't get the full benefit of the doubt.

Funny thing is, this may not even be guaranteed. The Xbox One X is more powerful than the PS4 Pro, and the Pro has had many instances where performance was better. Next gen should be a generation of options: let us dictate whether we want higher frame rate or higher resolution.
Dude, Oberon from the GitHub leak at 9.2 TF was the B0 stepping. The PS5 is now at E0. It has changed, get over it. The PS5 is 10.3 TF, and the variable clock is for when the game code doesn't need it, so the fan stays quiet and the system stays cool.

You're never going to get a straight answer.
Sony pinning the PS5 at the 10.3 max is important for the psychological double-digit threshold. He will never say it can dip under 10.
Cerny and gang could have simply skipped all this by designing a more powerful and balanced system from the start (the SeX is more powerful and its GPU only runs at 1.82 GHz), but this CPU/GPU tandem thing is what you get when the system is Oberon 9.2 with a GPU upclock to 2.23 GHz.
Probably because MS was more open about the SeX than Sony was with the PS5.

Weird. With the Xbox Series X, they had extensive coverage the same day. With the PS5, it took weeks to repeat what Cerny said in the video. And they praise GoW5 on every occasion, and I never felt GoW5 deserved it. Probably just a coincidence. Digital Foundry is great.
It does have VRS. It was not mentioned here. VRS is an AMD RDNA 2 feature; the DXR API is for MS.

I'll watch the video another time. Working from home doing double duty.
PS5 has no VRS?
I thought people have said VRS is a standard thing in RDNA 2??? Maybe I'm wrong.
Cerny also delivered PlayStation 4 - which he defined as 'super-charged PC architecture'
What a funny guy. A tablet CPU installed in the console, and awfully slow even in 2013.
Has there been confirmation that they won't support it? Or are people just assuming they won't because Cerny hasn't mentioned it?
If the architecture is capable of doing it, then all they'd have to do is support it in their APIs like Microsoft did with DirectX 12. If Sony really won't support it, then that would either mean that their programmers are too dumb to do it, or they're refusing to do it just for the hell of it. Neither of those seem particularly likely to me.
What you quoted in no way changes what was said about developers throttling back the CPU to ensure they always reach 2.23 GHz. Why? Because when developers use the profiles, they will automatically be programming to load the CPU less and the GPU more. This means the CPU might get a slight bump in performance compared to the profile when retail games run on the retail console, but since developers were already maxing out the GPU, GPU performance will stay the same. The additional performance from the slight CPU bump will not make much of a difference in practice. The main difference is that one is done manually by the developers to get the results they want, and the other is done automatically by the system with SmartShift, with a slight boost to only CPU performance in this case, IF there is any leeway.

"The fixed profiles that devs are talking about are only used in PS5 dev kits, and in retail games their code will tap into the variable clocks for extra performance."
Have you dumb cunts who are parroting "Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core" missed the bit in bold above?

Obviously not; just more cherry-picking to push something that has been explained to death many times.
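The profile-versus-SmartShift distinction being argued over can be sketched in a few lines (invented numbers and a deliberately simplified model, not AMD's actual implementation): under a fixed total budget, whatever power the CPU doesn't draw can be lent to the GPU.

```python
# Toy illustration of SmartShift-style budget sharing; the wattage figures
# and the nominal split are made up for the sake of the example.
TOTAL_BUDGET_W = 200.0
CPU_ALLOCATION_W = 60.0   # nominal CPU share; the GPU nominally gets the rest

def split_power(cpu_demand_w: float) -> tuple:
    """Return (cpu_w, gpu_w): the GPU receives any headroom the CPU leaves."""
    cpu_w = min(cpu_demand_w, CPU_ALLOCATION_W)
    gpu_w = TOTAL_BUDGET_W - cpu_w
    return cpu_w, gpu_w

print(split_power(60.0))  # CPU maxed out: (60.0, 140.0)
print(split_power(40.0))  # devs throttle the CPU: GPU gets the surplus, (40.0, 160.0)
```

This is why "throttling back the CPU" and "the system shifts power automatically" describe the same budget from two angles: doing it by hand in a dev-kit profile or letting the hardware do it at runtime ends in a similar split.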
Disagree. FFS:

Architecture, definition: "the complex or carefully designed structure of something." It doesn't mean power.

Supercharged, definition: "to make an engine more powerful by forcing in more air and fuel than usual." This also does not reference absolute power.

The PS4 was a carefully designed 1.8 TF Jaguar-based console that runs games better than any equivalent PC can, hence "supercharged architecture".

You may not like it, or understand it, but it is no lie.
It seems Sony wants to leverage a variant of mobile/laptop tech designed by AMD, as similar limitations apply when designing consoles, i.e., cost and a lower permitted TDP relative to PCs.

No offense, but SmartShift is not really a positive thing to have. It is used in mobile precisely because of those limitations.
We don't know if MS will or not. It's not a big deal that MS hasn't mentioned it.
Care to share an example of the PS4 Pro performing better than the One X at a comparable resolution? In the words of Alex Battaglia from DF, that would seriously put the developer's ability (or tools) in question.

Funny thing is, this may not even be guaranteed. The Xbox One X is more powerful than the PS4 Pro, and the Pro has had many instances where performance was better. Next gen should be a generation of options: let us dictate whether we want higher frame rate or higher resolution.
It seems Sony wants to leverage a variant of mobile/laptop tech designed by AMD, as similar limitations apply when designing consoles, i.e., cost and a lower permitted TDP relative to PCs.

Sony could have opted to go with the traditional design philosophy of the PS4, with a locked frequency, which would make the fans run like a "jet engine" whenever additional power was drawn from the mains. So I agree with you; it does seem like a huge gamble on Sony's part.

I am certain the variable frequency of the PS5 that is dominating the PS5 forum talk everywhere is only there because of the implementation of AMD's new SmartShift tech. Variable frequency is not a feature of the PS4/Xbox One, or even of the upcoming Xbox Series X.
If PS5 does have VRS, why didn't Sony mention it at the GDC digital event? Maybe it's because Microsoft helped AMD to develop VRS.

It does have VRS. It was not mentioned here. VRS is an AMD RDNA 2 feature; the DXR API is for MS.
What they should have done to appease the masses is use that one-hour show (longer if needed) and go over each key spec of the system concisely.

I'm sure the PS4 and even the PS3 have variable clocks. Variable clocks are for power savings, or for limitations requiring power savings. That's all there is to it. Nothing special.

The problem is that Sony hasn't been direct with the PS5 presentation. Just say: we built a cheaper console, à la GameCube. Reachable for the masses and still with good, efficient graphics. Power is not a concern. Done.

But it seems their PR team still wants to sell the idea that the PS5 is only 10% or so behind the Series X, that it still has great power! Let them dream in the forums and defend Team Blue!
Disagree.
That's like Toyota saying their Camry with all the options is like a supercharged Lexus because they share some of the same parts.
It's still a Camry no matter how much you adjust it. If you want a Lexus, get a Lexus.
You either have not read the article, which the video does a serious disservice to because it is packed with detail, or you are showing a serious lack of reading comprehension. Either way, please stop.

What you quoted in no way changes what was said about developers throttling back the CPU to ensure they always reach 2.23 GHz. Why? Because when developers use the profiles, they will automatically be programming to load the CPU less and the GPU more. This means the CPU might get a slight bump in performance compared to the profile when retail games run on the retail console, but since developers were already maxing out the GPU, GPU performance will stay the same. The additional performance from the slight CPU bump will not make much of a difference in practice. The main difference is that one is done manually by the developers to get the results they want, and the other is done automatically by the system with SmartShift, with a slight boost to only CPU performance in this case, IF there is any leeway.
"Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.
Wut

Disagree.
That's like Toyota saying their Camry with all the options is like a supercharged Lexus because they share some of the same parts.
It's still a Camry no matter how much you adjust it. If you want a Lexus, get a Lexus.
Not at all. It is like saying a supercharged Camry will have more power and acceleration than one that isn't. Here are some supercharged shitbangers:
He doesn't say that... the guy made it up.

I'll watch the video another time. Working from home doing double duty.
He said that because Xbox Phil said the same, so he believes the PS5 will follow suit.

Ahh, my bad guys, just got home and watched the beginning again. Richard said he suspects we will see a prolonged cross-gen period with games based around Jaguar; Mark didn't say it.
Cerny probably had to deal with the $400 top budget imposed by Sony, and that's why he tried to squeeze the most out of a weaker console with exotic stuff like this variable boost clock.

They should rename GAF to Daft, because some of you certainly are. It seems practice is better than theory, because Cerny speaks a lot of theory, while developers put a lot into practice. And based on what developers are saying, and what DF have determined so far, the variable speed exists for a reason. It's not there just because Cerny wants it there. There's a reason why there is a window of operation, min to max frequencies (and Cerny has never, to this day, said what the base clock speeds are). Putting a technical limitation on a design where there doesn't have to be one is wasted effort. When you complicate a design needlessly, you're a poor designer. That is not a matter of opinion; that is a fact, from engineering to graphic design to everything else. And I know Cerny didn't design these features without a need. The CPU and GPU don't operate at variable speeds because Cerny was just in the mood to make things more complicated. They operate at variable frequencies because they have to. Because, in order for this thing to perform efficiently, compromises had to be made, and that is exactly what this is: compromises!

This is a result of poor design, and of trying to squeeze a little extra juice out of something without revisiting the entire design from the ground up or increasing the cost of the machine. Cerny makes several claims, all of which go against computer science, practical experience, and developer experience. Even DF is skeptical; it's evident in the very video in the OP.

Cerny may make for a great system designer, but I think he'd make an even better politician.
They should rename GAF to Daft, because some of you certainly are. It seems practice is better than theory, because Cerny speaks a lot of theory, while developers put a lot into practice. And based on what developers are saying, and what DF have determined so far, the variable speed exists for a reason. It's not there just because Cerny wants it there. There's a reason why there is a window of operation, min to max frequencies (and Cerny has never, to this day, said what the base clock speeds are). Putting a technical limitation on a design where there doesn't have to be one is wasted effort. When you complicate a design needlessly, you're a poor designer. That is not a matter of opinion; that is a fact, from engineering to graphic design to everything else. And I know Cerny didn't design these features without a need. The CPU and GPU don't operate at variable speeds because Cerny was just in the mood to make things more complicated. They operate at variable frequencies because they have to. Because, in order for this thing to perform efficiently, compromises had to be made, and that is exactly what this is: compromises!

This is a result of poor design, and of trying to squeeze a little extra juice out of something without revisiting the entire design from the ground up or increasing the cost of the machine. Cerny makes several claims, all of which go against computer science, practical experience, and developer experience. Even DF is skeptical; it's evident in the very video in the OP.

Cerny may make for a great system designer, but I think he'd make an even better politician.
The second option.

Has there been confirmation that they won't support it? Or are people just assuming they won't because Cerny hasn't mentioned it?
If the architecture is capable of doing it, then all they'd have to do is support it in their APIs like Microsoft did with DirectX 12. If Sony really won't support it, then that would either mean that their programmers are too dumb to do it, or they're refusing to do it just for the hell of it. Neither of those seem particularly likely to me.
Lol, that's nowhere near the same analogy, brother. It would be like having a tuned Camry that can hit the horsepower of a stock Lexus.

Disagree.
That's like Toyota saying their Camry with all the options is like a supercharged Lexus because they share some of the same parts.
It's still a Camry no matter how much you adjust it. If you want a Lexus, get a Lexus.
Yeah, it's a pretty pointless video; we don't know a single new thing compared to their last video.

Uncovered, yet it provides nothing really new. Zero. Zilch. Nada.
The fact they hid a lot of info from the interview in the video tells you a lot.

It's really a shame that Digital Foundry changed from strictly analytical to clickbait hypothetical videos. They used to be pretty solid.
What they should have done to appease the masses is use that one-hour show (longer if needed) and go over each key spec of the system concisely.

And before anyone says it's a GDC show, that it's supposed to be detailed and boring: then Sony shouldn't have promoted the show to everyone.

Then back up those specs with some demos. It was already March at the time; there have got to be some slick PS5 first-party demos up and running that can show how a weaker system using an SSD will lead to awesome games.

They even did an SSD demo a year ago with Spider-Man. They could have expanded on that, showing what a wicked open-world game could look like, with huge landscapes and zero loading times, all working seamlessly.
I'm excited for the PS5, but this sounds like a horrible design decision. They essentially put a Zen 2 processor in there, but designing a game to properly leverage the very best of what that CPU can do risks leaving very little power for the GPU to perform at the stated boost clock speed.

It just plain seems like a better setup to be able to run the CPU as hard as you want for your game, without ever needing to worry that the GPU may suffer for it, the way it works on the Xbox Series X. And Cerny and Sony still can't seem to acknowledge where the baseline is on either the CPU or the GPU. There is almost certainly a baseline.
Read the article, people; it has way more information than the video.

Yeah, it's a pretty pointless video; we don't know a single new thing compared to their last video. People still don't have a clue about how this variable boost clock is going to work with heavy games in real-world applications. The fact they hid a lot of info from the interview in the video tells you a lot.
What did he say?

Just read Matt's post on ERA. Looks like some Xbox fans will be disappointed.
Pop it on here, chief?

Just read Matt's post on ERA. Looks like some Xbox fans will be disappointed.
From the 9-minute mark, Richard tried to get Mark Sony to answer on the PS5's base clocks, or worst-case clocks in next-gen games.

Mark Sony basically deflected the question by talking about PS4 overheating and shutdown.

WTF? So evasive.

I think Richard was pissed, basically calling Mark out for the overclocking, and he ended the video on Mark's non-committal answer about VRS, no timeline for a teardown, and even threw in that a KZ game was shown at this time last gen.

Disappointing that Sony is still hiding their cards.
There is no such agreement at all. We're free to do as we please with the information we gleaned during the visit. Same goes for Austin Evans who was on-site with us! It's literally just a normal kind of press visit.
I think it's just the difference between a Japanese and American company more than anything. Sony is just holding the cards close to their chest at the moment. Also, the virus situation hurt our plans for this (not that it was going to involve seeing the box or anything). The Xbox stuff just squeaked by.
I think there are just more limits here as Microsoft had already revealed a lot more before we saw anything. Sony is taking a very different approach and that's fine! We'll know more soon enough.
MS has been more reserved privately than Sony, but more open publicly.
They are just different strategies.
PlayStation 5 uncovered: the Mark Cerny tech deep dive [Digital Foundry]
Yeah, it was strange for DF to ignore the SSD part. At the beginning of the video they said they were going to talk about the small details, but they ignored one of the major parts of Cerny's presentation.
What did he say?
Is this the same Matt mod who, along with everyone else, was saying the PS5 was 12-13 TF?
For sure. I know a guy at work who had a shitty-looking VW Rabbit or Golf, and he messed with it to make it fast.

The thing is, even if it's fast, it's still a shit car.
The PS4, a "supercharged PC"... lol... pretty sure a decent PC in 2013 could run games well at 1080p or higher, at better settings.
Just read Matt's post on ERA. Looks like some Xbox fans will be disappointed.
Lol, that's not how any of this works. Variable frequencies are for when the code doesn't need the maximum power, so the system stays cool. It is always 10.3 TF when needed by the game code.

As expected: more CUs, better performance, and scaling with teraflops.

That means the Xbox Series X on paper is at worst 18% faster than the PS5's max boost; in reality, with CU scaling and variable PS5 clocks, the XSX will be around 25-30% faster on average during graphics-demanding workloads.
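For reference, the arithmetic behind the "at worst 18%" figure follows directly from the announced specs (52 CUs at 1.825 GHz versus 36 CUs at up to 2.23 GHz); the 25-30% scaling claim beyond that is the poster's speculation, not something this calculation shows.

```python
# Peak FP32 throughput for an RDNA 2 GPU: each CU has 64 shader ALUs,
# each doing 2 FLOPs (fused multiply-add) per clock.
def rdna_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

xsx = rdna_tflops(52, 1.825)   # ~12.15 TF (fixed clock)
ps5 = rdna_tflops(36, 2.23)    # ~10.28 TF (maximum boost clock)
print(f"XSX: {xsx:.2f} TF, PS5: {ps5:.2f} TF, gap: {(xsx / ps5 - 1) * 100:.0f}%")
```

Note the gap is measured against the PS5's maximum boost; any time the PS5 clock drops below 2.23 GHz, the on-paper gap widens.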