[Digital Foundry] PS5 uncovered

Care to share an example of PS4 Pro performing better than the One X at comparable resolution? In the words of Alex Battaglia from DF, that would seriously put the developer's ability (or tools) in question.
RE3 performs better than the Xbox One X version; the whole talk was about performance revolving around frame rate. That's MS's problem. Stop pushing the 4K 60 bullshit because it can't be done. Do checkerboarding so you can hit a solid frame rate. The Pro version runs better than the X version, fact.
 
Game developer to Digital Foundry: "I'm having to THROTTLE the cpu to stabilise the gpu"
GAF console warrior: "They're not throttling, they're isolating".

Sony couldn't pay some of you enough for all the spin and free damage control.
Provide the link to the quote.
 
Made up what?

He didn't confirm it in the presentation or in the DF interview with him.

The absence of VRS is odd.

Man, Sony needs to start giving some money to some of these PS fanboys; they're working day and night for their beloved plastic.
So he did not say what you made up.

Are you being paid to spread FUD?
 
You kidding me, mate? This is why people need to read the article; Rich just muddied the discussion for no reason with the video.

There is no manual throttling; it is simply there for profiling and optimization purposes. They are not throttling, they are restricting the GPU or CPU to isolate them and better understand the activity each frame. They do the same for both.

If a developer is targeting 60fps, the frame time has to stay at or under 16.67ms; if they go above that, the frame rate drops. In order to see the cause, they can lock the frequency of the GPU or CPU independently to see what is causing the spike in frame time. This essentially takes variable frequency and SmartShift out of the equation as possible causes of the performance issues. They essentially profile the game with variable frequency and with locked frequency to see any issues.

They are not programming the game to MAX the GPU or CPU as you put it. It is just a profiling tool for optimization.
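As a rough illustration of that workflow (the numbers, timings and structure below are hypothetical, not anything from an actual PS5 SDK): lock one side's clock, capture per-frame timings, and see which side blows the 16.67ms budget.

FRAME_BUDGET_MS = 1000.0 / 60.0  # 16.67 ms for a 60 fps target

# Made-up per-frame timings captured with the GPU clock locked at max.
frames = [
    {"cpu_ms": 9.1, "gpu_ms": 15.8},
    {"cpu_ms": 12.4, "gpu_ms": 16.9},  # GPU over budget this frame
    {"cpu_ms": 17.2, "gpu_ms": 14.0},  # CPU over budget this frame
]

for i, f in enumerate(frames):
    worst = max(f["cpu_ms"], f["gpu_ms"])
    if worst > FRAME_BUDGET_MS:
        culprit = "CPU" if f["cpu_ms"] >= f["gpu_ms"] else "GPU"
        print(f"frame {i}: {worst:.1f} ms, {culprit}-bound, misses the 60 fps budget")
    else:
        print(f"frame {i}: within budget")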

IT'S NOT THROTTLING. Then goes on to essentially define what throttling is in terms of a GPU/CPU and how Sony are doing that.
Also I tend to trust developers more than a random GAF member.
 
Are you not feeding more power to boost a frequency? I mean we may be arguing semantics here, but if SmartShift is changing frequencies it regulates power on some level I would assume?
No, power and frequency are not tied together. Voltage determines CPU or GPU clock speed. The GPU or CPU can draw any arbitrary amount of power independent of the frequency, as long as you can cool the chip.

A CPU running at 1GHz can draw the same amount of power as a CPU at 3GHz. What determines the power they draw is the work they are doing.
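For what it's worth, the usual first-order model for dynamic power is P ≈ activity × switched capacitance × V² × f, so with made-up numbers a heavily loaded core at 1GHz really can draw as much as a lightly loaded core at 3GHz; the point is that the workload (the activity factor) matters, not the clock alone:

# First-order CMOS dynamic power: P ~ activity * switched capacitance * V^2 * f.
# All numbers here are illustrative, not real PS5 figures.

def dynamic_power(activity, capacitance_f, voltage_v, freq_hz):
    return activity * capacitance_f * voltage_v ** 2 * freq_hz

C = 1e-9  # effective switched capacitance in farads, made up

light_load_3ghz = dynamic_power(0.2, C, 1.2, 3.0e9)
heavy_load_1ghz = dynamic_power(0.6, C, 1.2, 1.0e9)

print(f"3 GHz, light workload: {light_load_3ghz:.2f} W")  # ~0.86 W
print(f"1 GHz, heavy workload: {heavy_load_1ghz:.2f} W")  # ~0.86 W, same power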
 
So I'm guessing people didn't actually read the whole article or watch the video, and keep bringing up the developer quote taken out of context to push a narrative? Why am I not surprised? Some of you are transparent as shit, man.
 
IT'S NOT THROTTLING. Then goes on to essentially define what throttling is in terms of a GPU/CPU and how Sony are doing that.
Also I tend to trust developers more than a random GAF member.
The difference is throttling for programming purposes versus profiling purposes. That's why I chose the word "restricting". They are not making games for a specific profile where the CPU has to be at a low frequency in order for the GPU to be at max frequency; they can both be at max frequency.
 
Game developer to Digital Foundry: "I'm having to THROTTLE the cpu to stabilise the gpu"
GAF console warrior: "They're not throttling, they're isolating".

Sony couldn't pay some of you enough for all the spin and free damage control.
Dev didn't say that.
Stop the FUD.

The dev chose a profile for profiling/debugging that maxes the GPU and throttles the CPU (no matter the workload... it is a fixed profile even if you don't use that GPU power).
He chose that because his engine doesn't stress the CPU, due to being coded for the weak Jaguar workload.
There is no automatic change in frequencies.

The dev actually has no idea if or how much the CPU will throttle to maintain the GPU at max.
 
Cerny:

"There's enough power that both CPU and GPU can run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."


Can we put the 9.2 TF BS to bed now?

Of course not. Some people will continue to spread FUD because they have to keep the dream alive that MS might actually win a gen. Fourth time's the charm.
 
And when games push these devices to the edge in 5 years and we get downclocks and missed frames, then what? I guess holding back games to protect the PS5 is the answer? Don't max out graphics, physics, and other good stuff to avoid downclocks?
What if there ends up being a PS5 Pro and a cheaper, lower-powered Lockhart from Microsoft?
 
But in this case this is something that developers have to pay attention to, isn't it? That's a huge task if I understand this correctly. Just an example: let's say they have to test a certain scene where everything is blowing up, there are lots of characters, etc. They achieve their target fps with a certain number of characters, but if they added more, the framerate would be worse. And they have to test everything like this. They always have to keep the "budget" in mind, which is quite difficult in a dynamic open-world game. They can't let the system just take care of this problem, because that can mean that a certain number of characters is too much for the system, it downclocks itself and the framerate tanks. I hope it's clear what I mean; I'm not a developer, so it is easily possible that I'm talking nonsense. I just want to understand this whole thing.
Right, but they already have to do that to some extent when they budget for the framerate, so it's just an extra thing to track. With the developer tools they should be able to figure out what the worst case is and design around that. It's deterministic, so the same code will always produce the same effects.
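As a toy example of that kind of worst-case budgeting (the base scene cost and per-character cost below are invented, just to show the idea):

FRAME_BUDGET_MS = 1000.0 / 60.0  # 60 fps target

BASE_SCENE_MS = 11.0          # made-up cost of the everything-exploding stress scene
COST_PER_CHARACTER_MS = 0.4   # made-up per-character simulation/render cost

def frame_time(num_characters):
    return BASE_SCENE_MS + COST_PER_CHARACTER_MS * num_characters

# Find the biggest crowd that still fits the budget in the worst-case scene,
# then design content around that number.
max_characters = 0
while frame_time(max_characters + 1) <= FRAME_BUDGET_MS:
    max_characters += 1

print(f"worst-case scene holds 60 fps with up to {max_characters} characters")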
 
And when games push these devices to the edge in 5 years and we get downclocks and missed frames, then what? I guess holding back games to protect the PS5 is the answer? Don't max out graphics, physics, and other good stuff to avoid downclocks?
Ask yourself this then.

Are GPUs and CPUs locked this gen and the gen before that?

Are there missed frames and performance issues in games even though the CPU and GPU have locked frequencies?

What is actually funny is that this new system can address that slightly as a by-product. When the GPU is bound, power can be redirected to it, and vice versa.
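A crude sketch of that redirection idea (the total budget, the CPU cap and the split logic below are all hypothetical; this is just the SmartShift concept as described, not how the silicon actually works):

TOTAL_BUDGET_W = 200.0  # made-up total SoC power budget
CPU_CAP_W = 60.0        # made-up cap on the CPU's share of that budget

def split_budget(cpu_demand_w):
    # Whatever the CPU doesn't need out of its cap is handed to the GPU.
    cpu_w = min(cpu_demand_w, CPU_CAP_W)
    gpu_w = TOTAL_BUDGET_W - cpu_w
    return cpu_w, gpu_w

for demand in (30.0, 45.0, 60.0):
    cpu_w, gpu_w = split_budget(demand)
    print(f"CPU wants {demand:5.1f} W -> CPU gets {cpu_w:5.1f} W, GPU gets {gpu_w:5.1f} W")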
 
For anyone being stupid enough to try and bring up that quote again about throttling:

In short, the idea is that developers may learn to optimise in a different way, by achieving identical results from the GPU but doing it faster via increased clocks delivered by optimising for power consumption. "The CPU and GPU each have a power budget, of course the GPU power budget is the larger of the two," adds Cerny. "If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."

Now shut the fuck up with your concern trolling and bullshit.
 
Watch the DF video
The video says those profiles are only used in the dev kit; the article further clarifies that it is for debugging and profiling.

You said
Game developer to Digital Foundry: "I'm having to THROTTLE the cpu to stabilise the gpu"
GAF console warrior: "They're not throttling, they're isolating".

Sony couldn't pay some of you enough for all the spin and free damage control.
And again I say: post the link to the quote or give me the timestamp for that part of the video. You put that sentence in quotes, so surely Rich said that in the video, right?
 
I don't understand the thought process here. Why even have a Boost Mode if the hardware can just run constantly at that pace? Why even suggest that there is a mode that doesn't employ it, like they did at the hardware reveal?
 
Yeah, from
[image: 794846-killzone_1_helghast.jpeg]

to
[image: maxresdefault.jpg]

[image: giphy.gif]
 
I'm certain the PS5 is late. We had seen games at this point in the PS4 pre-launch. Right now we've seen nothing. Add to that, Sony wasn't going to be at E3, which means they were focusing elsewhere. I'm sure they would have been there if everything was ready to show.
 
I don't understand why people are having a hard time following that this is just SmartShift. Variable frequency is not something to cheer for. This is a clear reach to try and catch up to Xbox, and I think it is going to be a mess.
 
The video says those profiles are only used in the dev kit; the article further clarifies that it is for debugging and profiling.

You said

And again I say: post the link to the quote or give me the timestamp for that part of the video. You put that sentence in quotes, so surely Rich said that in the video, right?
Richard said throttling, you and other armchair developers here are explaining how throttling is not throttling and I'm the one who has to provide explanations?
And you shouldn't have skipped class the day when they explained what paraphrasing is. I'll dig out where Richard said that some developers are telling him that they have to throttle the CPU to make sure that they get max GPU, since you can neither read nor listen.
 
Right, but they already have to do that to some extent when they budget for the framerate, so it's just an extra thing to track. With the developer tools they should be able to figure out what the worst case is and design around that. It's deterministic, so the same code will always produce the same effects.
I see. Shame, the worst-case scenario always has to be taken into account, hindering what could be possible if the machine worked like the previous PlayStations. I don't like this; it's similar to the concept of Lockhart hindering next gen, which I was also bitching about. It's clear now that this was designed as a less powerful machine. I'd rather they left the GPU at 9.2 TF and had the machine work similarly to the PS4. I suppose this will be the first machine I buy anyway, but I'm quite disappointed. Thanks for clarifying.
 
I don't understand why people are having a hard time following that this is just SmartShift. Variable frequency is not something to cheer for. This is a clear reach to try and catch up to Xbox, and I think it is going to be a mess.
They would cheer for higher power bills and less stable frame rates if Sony told them that their console will be quiet. Oh wait...
 
Richard said throttling, you and other armchair developers here are explaining how throttling is not throttling and I'm the one who has to provide explanations?
And you shouldn't have skipped class the day when they explained what paraphrasing is. I'll dig out where Richard said that some developers are telling him that they have to throttle the CPU to make sure that they get max GPU, since you can neither read nor listen.
Don't even bother, you can't find it. It does not exist. Lmao
 
RE3 performs better than the Xbox One X version; the whole talk was about performance revolving around frame rate. That's MS's problem. Stop pushing the 4K 60 bullshit because it can't be done. Do checkerboarding so you can hit a solid frame rate. The Pro version runs better than the X version, fact.
But you didn't actually answer his question honestly.
 
This is how tech videos should be done, not like NXGamer, who comes up with hypotheticals which seem to favour the PS5 the majority of the time.
 
I don't understand why people are having a hard time following that this is just SmartShift. Variable frequency is not something to cheer for. This is a clear reach to try and catch up to Xbox, and I think it is going to be a mess.

"A mess" is a bit of an overstatement, but it's true that there is no practical advantage to this setup. On XSX, when the CPU and/or GPU run taxing instruction sets, the system will use more power and be forced to dissipate more heat. On PS5, the system will lower clocks in response to demanding instructions. It's not the end of the world, but certainly not some great advantage. Unless using less electricity is a pressing concern for you when it comes to consoles (not something I care about, LOL).
 
I really appreciate Richard covering this specifically, because there are a lot of delusional assertions about frequency identically scaling compute. It can approach identical performance, but when pressed hard, more physical hardware will always handle load better; this is even the case for like-for-like compute ceilings.

This isn't going to make up a nearly 2 Teraflop advantage coupled to 16 additional compute units. Not even close.

[image: szaNC8Q.jpg]


What I also appreciate is him showing how a memory bandwidth bottleneck can present itself even at a mere 9.67 Teraflops with 448GB/s of bandwidth available.

[image: 6ntRomv.jpg]


There are some hard realizations ahead for certain people, and it's best you come to grips with these things now, because when the actual hardware hits, the above is going to scale dramatically higher when compute variance of such a lopsided degree enters the fray.
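For reference, this is roughly where the "nearly 2 Teraflop / 16 extra CUs" figures come from, using the publicly stated specs (36 CUs at up to 2.23GHz vs 52 CUs at a fixed 1.825GHz):

# FP32 TFLOPS = CUs * 64 shaders per CU * 2 ops per clock (FMA) * clock in GHz / 1000

def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

ps5 = tflops(36, 2.23)   # ~10.28 TF at the maximum 2.23 GHz clock
xsx = tflops(52, 1.825)  # ~12.15 TF at a fixed 1.825 GHz clock

print(f"PS5: {ps5:.2f} TF, XSX: {xsx:.2f} TF")
print(f"difference: {xsx - ps5:.2f} TF, extra CUs: {52 - 36}")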
 
But you didn't actually answer his question honestly.
That defeats the purpose, because I originally stated that the PS4 Pro beats the Xbox One X version in terms of performance. Again, you can have all the high resolution you want, but I'm sure a lot of people would rather have checkerboarding and a consistent frame rate than have the most powerful current console dip into the 40s. Since the new Xbox is more powerful as well, I can see this being a theme for next gen too, as well as an edge for Sony. Performance matters more than resolution.
 
This is how tech videos should be done, not like NXGamer, who comes up with hypotheticals which seem to favour the PS5 the majority of the time.
There are dozens of hypotheticals in the video, to the point that people are using Richard's assumptions as confirmation of what Cerny said lol
 
I'm not saying otherwise. I'm just really annoyed when anybody who is trying to sell me something says things like "most of the time".

Because "most of the time" can be 80% of the time, 95% of the time, or 99% of the time.

And I'm not even talking about the fact that we still don't have minimum frequencies (and probably never will), just the info that lowering a frequency has a big influence on power requirements (no shit, Sherlock...).

You literally cannot specify a time; as clearly stated, developers will control the frequency. It depends on how much shit is on the screen. The same is true for PC, though you can lock a specific frequency if you want.
 
What in the fudge are you talking about.
Don't even bother, you can't find it. It does not exist. Lmao
I believe you need to watch again.
I was simply referring to the below but go ahead, say it's photoshopped. Group denial in all its splendor:

"Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. "


[image: ctputqT.jpg]
 
I was simply referring to the below but go ahead, say it's photoshopped. Group denial in all its splendor:

"Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. "


[image: ctputqT.jpg]
Keep reading where that is clarified.
 
Keep reading where that is clarified.
I thought it didn't exist. If you fought half as hard to better your life as you do to defend Sony's product, I guarantee you would be too happy to be trolling game forums to systematically deny the truth on behalf of a corporation.
 
Wasn't this an April Fools joke?

Never the less, if the real PS5 looks similar to this... typical rectangular shape, but keeping the razor blade shape without the slant, this system might be the most reasonable representation out there. Hard to tell how big it is in that pic. It could Xbox One VCR size or normal sized.

Compared to the Lets Go Digital mock-ups which some look slick but unrealistic.
 
RE3 performs better than the Xbox One X version; the whole talk was about performance revolving around frame rate. That's MS's problem. Stop pushing the 4K 60 bullshit because it can't be done. Do checkerboarding so you can hit a solid frame rate. The Pro version runs better than the X version, fact.

The Xbox version is running at a much higher resolution, with better shadows and depth of field. At the same settings as the PS4 Pro it would run much better than the Pro version, plus it is the only future-proof version: on Series X it will most likely run at 4K 60fps, while the PS4 version on PS5 will be stuck with lower res and settings. That's not denying that it runs like shit on the One X, though.
 
RE3 performs better than the Xbox One X version; the whole talk was about performance revolving around frame rate. That's MS's problem. Stop pushing the 4K 60 bullshit because it can't be done. Do checkerboarding so you can hit a solid frame rate. The Pro version runs better than the X version, fact.
But Microsoft didn't make Resident Evil 3.
 
The Xbox version is running at a much higher resolution, with better shadows and depth of field. At the same settings as the PS4 Pro it would run much better than the Pro version, plus it is the only future-proof version: on Series X it will most likely run at 4K 60fps, while the PS4 version on PS5 will be stuck with lower res and settings. That's not denying that it runs like shit on the One X, though.
We heard that 4K 60 rhetoric when the One X came out, and most times they can't hit it. That's great and all, but I, like most people, would rather have performance. If giving a game a checkerboard resolution locks the frame rate or gets me 60, I'd rather have that. If the PS5 takes a hit on resolution in favour of frame rate, that's a win for me and most people. Games are played, not stared at.
 