[Digital Foundry] PS5 uncovered

You guys can debate "which system is more powerful" to kingdom come, and it won't change anybody's mind about which system they'll pick up on day one. Let's wait until the exclusive games come out at launch. We all know that Sony has some amazing games coming out for the PS4 that will scale with the PS5. And Microsoft has 16 studios working on games as we speak, so surely they, too, will turn out some amazing games. If half of those studios' games turn out to be amazing, I'll be picking up both systems, and others will too. Peace!
 
Can someone with more knowledge of emulation tell me if the PS5 would have enough power to do the earlier systems? I can't think of a reason why it couldn't.
PS1 and PS2, easily.
PS3 is a bit tricky... it could be done with that power, but it would be a big project with a lot of work involved.

I don't think Sony will waste $$$ on PS3 emulation.
 
PS1? Easily. PS2? That's a harder task due to the Emotion Engine and its Vector Processing Units, but their work could simply be redirected to the GPU. PS3, now that's a basically impossible task due to the Cell and its SPEs: each game uses them for different purposes, some for CPU tasks, some for GPU tasks, some for sound processing, etc., so the emulation would have to be done on a game-by-game basis, with a profile for each game telling each SPE task where it should go on the PS5.
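The per-game profiling idea above could, very roughly, be pictured as a lookup table. This is a purely hypothetical sketch: the game IDs, task names, and target units below are invented for illustration, and no real emulator is claimed to work this way.

```python
# Hypothetical per-game profile telling a PS3 emulator where to run
# each SPE workload on PS5-class hardware. Purely illustrative.
SPE_TARGETS = {"cpu", "gpu", "audio"}

# Invented example profiles: game ID -> SPE index -> target unit.
PROFILES = {
    "BLUS30001": {0: "gpu", 1: "gpu", 2: "cpu", 3: "audio", 4: "cpu", 5: "cpu"},
    "BLUS30002": {0: "cpu", 1: "cpu", 2: "cpu", 3: "cpu", 4: "gpu", 5: "audio"},
}

def route_spe_task(game_id: str, spe_index: int) -> str:
    """Look up where a given SPE's workload should run; default to CPU."""
    target = PROFILES.get(game_id, {}).get(spe_index, "cpu")
    assert target in SPE_TARGETS
    return target
```

The point the table makes concrete: because every title uses the SPEs differently, there is no single mapping that works for all games, hence the per-title profiling effort.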
They already have a working PS2 emulator on PS4... it should run without any issue on PS5.

PS2 games on PS4 are all emulated.
 

Sooner or later they will have to, or else PSNow would need double the number of rack-mounted blades, with PS5-based units alongside PS3-based ones, for another 5-6 years, and that gets less and less practical as time goes on. It would also be a PR win if they could announce widespread BC from PS1/PS2/PS3 all the way up to PS4 Pro enhanced titles.
 
I wish they could do something like that, even if it was only PS1 and PS2 games. I would be over the moon.
 
This is what "race to idle" means: when a GPU is at max clocks, it is likely not being fully loaded from a transistor perspective.
Not to mention Cerny said both CPU and GPU would run at (or close to) peak frequency most of the time, without even taking race to idle into account.
Thanks for the input.
With race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running at or near peak frequency most of the time.
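The idea of a clock driven by a power budget rather than temperature can be sketched with a toy model. Every constant here (the budget, the scaling factor, the cubic power law) is invented for illustration; only the 2.23 GHz peak clock comes from the presentation.

```python
# Toy model of a power-budget "boost" clock, loosely inspired by the
# scheme Cerny described: clock follows workload activity, not
# temperature. All constants below are invented for illustration.
F_MAX_GHZ = 2.23          # advertised GPU peak clock
POWER_BUDGET_W = 180.0    # invented power budget
K = 18.0                  # invented scaling constant

def gpu_clock_ghz(activity: float) -> float:
    """activity in [0, 1]: fraction of transistors switching.
    Dynamic power is modeled as K * activity * f**3 (voltage assumed
    to track frequency), and the clock is capped so modeled power
    never exceeds the budget."""
    if activity <= 0:
        return F_MAX_GHZ
    f_budget = (POWER_BUDGET_W / (K * activity)) ** (1.0 / 3.0)
    return min(F_MAX_GHZ, f_budget)
```

In this model, at moderate activity the budget never binds and the clock sits at peak; only a pathological all-units-busy workload pulls it down by a few percent, which matches the "at or near peak most of the time" claim above.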
It would be awesome if they can use tempest engine to aid PS3 emulation down the line
 

It was interesting that they harkened all the way back to the PS3 to explain its design. Not just DSP-like, but SPU-like. Maybe it's a reverse of the bet the PS3 itself made: the PS3 had the hardware for BC at first but the software didn't pan out, so maybe this is underselling it at first and seeing if they can do some emulation through it later.


Or another possibility is to leave it free for developers to choose where to apply that power, if they don't use it for audio. I'm not gay, but 100 GFLOPS is 100 GFLOPS. Wait, what.
 
Cerny emphasized its main purpose is 3D audio processing, not only because it excels at audio workloads but also because it's a guaranteed resource, so devs don't have to sacrifice audio anymore.
I would assume extracting those 100 GFLOPS for visuals wouldn't be worth the effort and micromanagement required, or the sacrifice in audio.
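For scale, the 100 GFLOPS figure quoted in this thread is tiny next to the GPU's compute budget, which supports the point that diverting it to visuals probably isn't worth the hassle. A quick back-of-the-envelope check, using the thread's 100 GFLOPS number and the PS5 GPU's advertised 10.28 TFLOPS peak:

```python
tempest_gflops = 100.0   # figure quoted in this thread
gpu_tflops = 10.28       # PS5 GPU peak as presented by Cerny
share = tempest_gflops / (gpu_tflops * 1000.0)
print(f"Tempest is ~{share:.1%} of the GPU's peak compute")  # ~1.0%
```

Roughly one percent of peak GPU compute: meaningful as a dedicated, guaranteed audio resource, but marginal as extra graphics horsepower.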
 
I just wrote up a comment on the German PCGamesHardware.de forum, and I'll put it in here as well, translated of course. I use Google Translate for the most part and only fine-tune here and there.
Hope this works out. :messenger_sunglasses:


For one thing, the performance difference of ~19% is not really relevant in itself. Even less so if you take into account that the money that made the Xbox X's TF advantage possible is what the PS5 team used to eliminate system bottlenecks instead, with other positive side effects and Sony's own adjustments to the GPU.

And the tenor among developers is not that the Xbox X is harder to program than the standard, but that the PS5 is easier to program than the standard. Those are two fundamentally different statements.

And damage limitation? I hardly think Sony cares what is written in forums. They may have to admit that the effect of the Cerny presentation was unintended.
But the fact that they presented significant (and supposedly inferior) PS5 specs directly after the release of the Xbox X specs testifies to self-confidence.

Sony will have heard everything that is now being suggested by various devs. Don't forget, NDAs logically only apply to public affairs (and dev to dev)... For example, if Sony went to Ubisoft in mid-2019 and asked about the performance and programmability of the PS5, the dev could definitely tell them the truth ;)

And it was foreseeable that one of the two manufacturers would go into the gen with a little more TF, and now the difference on paper is even smaller than the difference between Xbox One and PS4 back then. In general, the comparisons being drawn between the PS5 and the Xbox One disaster are absolutely inappropriate.

Back then, MS launched a system that was hopelessly inferior to the PS4 on all fronts. There is nothing to deny there. On top of that, their tools were just shitty; just think about it, at the beginning the Xbox One was programmed with DX11...
while Sony started with a superb low-level API.
The Xbox X still has no real low-level API with this DX12 derivative, and consequently its hardware cannot be addressed as efficiently as that of the PS5, which will certainly use a further improved version of GNM.

And what Sony does with the improved I/O block is basically the low-level software principle brought to the metal. It's not just like a normal low-level API, which makes the bottlenecks easier to avoid and the performance reserves easier to exploit in a given system. No, an already refined low-level API was placed in a hardware environment built for it. It's like not only tuning your car in a street race to improve your chances of winning, but also replacing the driver with a Formula 1 driver.

The complete I/O throughput was increased a hundredfold. H U N D R E D F O L D! I think most people hardly realize yet what this does for the graphics of games... And the R&D cost that has flowed in there is certainly comparable to the additional cost MS carries for the extra 1.8 TF in the Xbox X.
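The "hundredfold" figure roughly checks out if you compare a PS4 hard drive against the PS5's SSD. The PS4 HDD number below is an approximate real-world throughput, not an official spec; the PS5 numbers are the officially stated 5.5 GB/s raw and the 8-9 GB/s typical compressed range.

```python
ps4_hdd_mb_s = 50          # approximate seek-bound PS4 HDD throughput
ps5_raw_mb_s = 5500        # officially stated raw SSD bandwidth
ps5_compressed_mb_s = 8500 # midpoint of the stated 8-9 GB/s typical range
print(ps5_raw_mb_s / ps4_hdd_mb_s)         # 110.0x raw
print(ps5_compressed_mb_s / ps4_hdd_mb_s)  # 170.0x with compression
```

So even against the raw figure the jump is on the order of 100x, and with typical compression it is well beyond that.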

Nobody needs to dream up a "surprised and shocked" Sony PlayStation division. Sony is not a beginner in this field and has been building hardware since before the founders of MS were born.
It was clear to them that if they went this route of optimization, with a price X on the packaging at the end, MS would probably come out with a little more TF. That was a flawless balancing act. Basically, MS should be glad it was "only" 10.3 TF, because if Sony had also arrived with 12 TF on top of all the optimizations, the game would already be over for MS. So it remains a bit more exciting, on paper at least.

I am sure that when Sony is ready, with the final PS5 presentation (which will hopefully take place after the Xbox X presentation), the first true next-gen titles in the form of PS5 exclusives will make all our jaws drop.

Don't forget: everything that appears on PC and on Xbox X (and of course third-party titles on PS5 too) must technically be designed around the 32 MiB ESRAM frame buffer of the Xbox One... go figure.
People are all acting like gangsters now because "it can be scaled." Yes, it can, sure. Xbox games will run at 30 fps at best on Xbox One, then on Xbox X in native 4K at 60-120 fps with better settings. On PC, if you have the spare power, with even more fps and even better settings. Hooray!!

The cool gangster act will suddenly turn into a frenzy when the final PS5 demonstration shows games that:
1. Actually use >10 TF for next-gen graphics
2. Utilize a CPU that is more than 500% as powerful as the Jaguar, and don't have to care about the latter
3. Require and use an SSD bandwidth of 9 GB/s
4. Have access to a hundredfold the I/O of the last gen

After that it will be clear that MS's Xbox One support is this year's clusterfuck, and you PC gamers should hope, and even help, that MS receives enough pressure to take back that brain fart called "Xbox One support."
 

What's your background? Even though English isn't your first language, this was a good read.
 

Hi there,
I don't have any professional knowledge in hardware.

But I'm really interested in the technology behind these consoles, and I like to connect the dots 🙂

I'm a field technician for high-tier environmental studies.

I'm German, but our work is done almost entirely in English, to make sure that authorities in several different countries can use the data to evaluate our findings.

So I'm always interested in improving my English. Going out there and writing stuff up is one way, I guess, haha.
 
Left gaming fora 7 years ago.

Suddenly remembered GAF this morning, after 22 days of lockdown.

Read the back and forth between rnlval and ethomaz.
Brings back fond memories of the console warz.

This all felt like home again ~ ❤️
 
Welcome back to our little descent into madness. Every time I enter the forum it is evident that the most zealous members are completely disconnected from the reality we are living in these days.

But it is such a nice place to forget our day-to-day problems.
 
Maybe... just maybe ;)
Overclocking an RX 5700 to prove a point... DF was very, very misleading.

Not to say unprofessional.

That comparison is very misleading.
Again, RDNA does not sustain 2100 MHz, plus its performance does not scale proportionally with the increase in clock.

Very, very misleading.

It is funny that the only examples some people post here are from RDNA cards at high clocks (which are not sustained and scale badly) without any clock timeline.

Why not show a comparison with clocks at 1500 MHz or so, where the clock is stable and performance is still optimal?
Just because AMD names N7, N7P and N7+ all as "7nm" doesn't mean RDNA and RDNA 2 use the same process.

That is all the more true when you look at the "50% increase in perf. per watt", which is only possible on a more advanced process.

I have a feeling 2.2 GHz will be the norm for Big Navi.
Yes, but we are comparing costs on the same process/tech.
Smaller dies are cheaper than big dies.
Avg. 1880 MHz.

The point is...

1. RDNA doesn't sustain 2100 MHz... the clock drops a lot, so the card is not running 36 CUs @ 2100 MHz: that means it is not running anywhere close to 9.9 TF.

2. RDNA performance doesn't scale proportionally with clock speed... at 2100 MHz the performance gain is not proportional to the increase in clock, because you are near the limit of RDNA's clocks.

So to avoid both issues, which are not present in RDNA 2, you run the test at lower clocks.

E.g.:

36 CUs @ 1800 MHz vs 40 CUs @ 1620 MHz
36 CUs @ 1500 MHz vs 40 CUs @ 1350 MHz

In both cases the 36 CU part will deliver better performance.

That is why the test at 2100 MHz is misleading and can't be used as evidence against RDNA 2's high clocks, which it can sustain without suffering and which scale performance better above 2000 MHz.
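The pairs above are not arbitrary: each one holds CUs times clock (and therefore peak TFLOPS) constant, so any measured performance gap isolates the narrow-and-fast versus wide-and-slow question. A quick check using the standard RDNA peak-compute formula (64 shaders per CU, 2 FLOPs per shader per clock):

```python
def tflops(cus: int, mhz: int) -> float:
    """Peak FP32 TFLOPS for an RDNA-style GPU: CUs * 64 shaders * 2 ops/clk."""
    return cus * 64 * 2 * mhz * 1e6 / 1e12

# Each proposed pair has identical peak compute:
assert tflops(36, 1800) == tflops(40, 1620)  # both ~8.29 TF
assert tflops(36, 1500) == tflops(40, 1350)  # both ~6.91 TF

# Sanity check: the same formula gives the PS5's advertised figure.
print(round(tflops(36, 2230), 2))  # ~10.28
```

With peak compute equalized, any win for the 36 CU configuration would come purely from the higher clock (faster front end, rasterization, and caches per CU), which is exactly the comparison being proposed.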
All PC cards actually run at variable clocks... you can't tell what clock it is running at either, just like the PS5.

The 970's crippled design is related to memory, not clocks... 0.5 GB of the VRAM uses a "half bus" compared with the other 3.5 GB, due to how the modules are set up.
That is TSMC talking about N7P and N7+... RDNA is N7.
I'm not sure why you are replying to me at all.
The comparison is very misleading.
And your reply has nothing to do with what I said.
It is not the same process.
That is why there is a 50% increase in perf. per watt.

BTW there are a lot of examples of big improvements on the exact same process.
What miracle? lol

The improvements are already confirmed... a 50% increase in perf. per watt.

You basically have no idea what you are talking about.

52 CUs at 1825 MHz is not possible with RDNA.
36 CUs at 2230 MHz is not possible with RDNA.

RDNA 2 already made it happen.
And that is in an APU package... an RDNA 2 chip on its own could probably reach even better clocks.
Again, your post has nothing to do with what I said lol



What miracle?

RDNA 2 is already a reality, and the clocks of the consoles show it is a big improvement over RDNA.

You keep posting the same nonsense lol
Hoping?

RDNA 2 is already confirmed with better clock scaling... the fact that Xbox and PS5 reach these clocks already shows that.

RDNA examples and that PEG that is unrelated to the discussion won't change that lol
In fact, if you didn't cap the power target on the PS5, the GPU could maintain 2230 MHz all the time, because it is RDNA 2... RDNA can't do that no matter how much power you supply to it.

Xbox needs 10 memory chips for a 320-bit bus, and to go cheaper they split the memory bandwidth by not using the same density chip for all modules.
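The split described above follows directly from the bus math: ten 14 Gbps GDDR6 chips on a 320-bit bus give the full bandwidth, but only the six higher-density chips back the upper memory region, so that region sees a 192-bit slice of the bus.

```python
gbps_per_pin = 14    # GDDR6 data rate per pin
full_bus_bits = 320  # 10 chips x 32-bit: the fast 10 GB region
slow_bus_bits = 192  # 6 chips x 32-bit: the slower 6 GB region
print(full_bus_bits * gbps_per_pin / 8)  # 560.0 GB/s
print(slow_bus_bits * gbps_per_pin / 8)  # 336.0 GB/s
```

Those are exactly the 560 GB/s and 336 GB/s figures being argued over in this thread.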
WTF? lol

Posting random charts doesn't make a point, you know.

RDNA, Turing or anything else is not related to what RDNA 2 can do... stop reaching for things unrelated to the conversation.

BTW you need more bandwidth for post-processing/textures/filters, but those are unrelated to CUs.
A wall of text with nothing substantial lol

RDNA doesn't scale at high clocks... that liquid-cooled RX 5700 is just there to break speed records; its performance does not scale, it is impractical, and the clock doesn't sustain.

About the Xbox memory setup: you can't access both pools at the same time, so you are either accessing the 560 GB/s part or the 336 GB/s part... your made-up math makes no sense.

And now you try to use an old GCN 1.2 card as an example for memory management in RDNA 2.

Your examples are very misleading.
 
Nor the Series X, for that matter. Any chip under max load will downclock to protect itself from overheating.

That's only going to happen with a failure. No console deals with thermal throttling. Even the Switch in handheld mode has consistent clocks, even if they are lower. Consoles are designed with a thermal envelope in mind, and stay inside it.
 
I agree... the issue was people jumping on board with the idea that the PS5's clocks were beyond its designed thermal envelope.
 
I should have been clearer. What ethomaz said is what I meant. There's still this idea that somehow the PS5 cannot handle those clocks.
 