
[Digital Foundry] Immortals of Aveum PS5/Xbox Series X/S: Unreal Engine 5 is Pushed Hard - And Image Quality Suffers

SlimySnake

Flashless at the Golden Globes
This thread exposes all the newbies who weren't around when the PS4 was first announced.

The PS4 has a massive async compute advantage because Cerny put in 8 ACEs with up to 64 compute queues; the Xbox One only had 2 ACEs with 16 queues. Cerny gave the PS4 the async compute capability of the high-end AMD GPUs of the time, the ones with 3-4 TFLOPs. The PS4 Pro was also an async compute monster, and there is no reason to believe they would skimp on async compute for the PS5 after their first-party devs, from ND to SSM, have made full use of it in their engines.
“The original AMD GCN architecture allowed for one source of graphics commands, and two sources of compute commands. For PS4, we’ve worked with AMD to increase the limit to 64 sources of compute commands — the idea is if you have some asynchronous compute you want to perform, you put commands in one of these 64 queues, and then there are multiple levels of arbitration in the hardware to determine what runs, how it runs, and when it runs, alongside the graphics that’s in the system.”

Besides, async compute does not scale with CUs. The queues handle instructions being sent to CUs.
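To make the "multiple compute queues" idea concrete, here's a rough D3D12-style sketch of putting compute work on its own queue so it can overlap with graphics. Illustrative only; these names are placeholders, not anything from the game:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// A dedicated compute queue lets compute command lists execute
// alongside the graphics queue, which is the whole point of the
// ACE/queue counts being argued about above.
ComPtr<ID3D12CommandQueue> CreateAsyncComputeQueue(ID3D12Device* device)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;          // compute-only queue
    desc.Priority = D3D12_COMMAND_QUEUE_PRIORITY_NORMAL;

    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));
    return computeQueue;
}
```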

[Image: PS5 vs Series X comparison screenshot]


The PS5 clearly looks better than the XSX here.
 

SmokedMeat

Gamer™
We need Pro consoles to run the games that they marketed for the current consoles lol.

Maybe so, but the majority of modern AAA developers suck. No matter how much power they’re given to work with, they’re going to deliver disappointing results.
 

Riky

$MSFT
And yet the developer is pretty cut and dried in the statement he made. You should email him to tell him he's wrong.

It's pretty scary that people think they know better than the devs. Of course, that math only works if the Series X were just using 36 compute units, and it isn't.

"verify the behavior across different GPU tiers. High-end GPUs have more SM units, so more potential for overlap."

This is what Nvidia says, and "overlap" is the important phrase; SMs are Nvidia's equivalent of what AMD calls Compute Units.
 
This thread exposes all the newbies who weren't around when the PS4 was first announced.

The PS4 has a massive async compute advantage because Cerny put in 8 ACEs with up to 64 compute queues; the Xbox One only had 2 ACEs with 16 queues. Cerny gave the PS4 the async compute capability of the high-end AMD GPUs of the time, the ones with 3-4 TFLOPs. The PS4 Pro was also an async compute monster, and there is no reason to believe they would skimp on async compute for the PS5 after their first-party devs, from ND to SSM, have made full use of it in their engines.


Besides, async compute does not scale with CUs. The queues handle instructions being sent to CUs.

[Image: PS5 vs Series X comparison screenshot]


The PS5 clearly looks better than the XSX here.
Yeah, I assumed he was referring to Xbox, but you might be right here. I wish they would just come out and say it.
 

Gaiff

SBI’s Resident Gaslighter
Yeah, I assumed he was referring to Xbox, but you might be right here. I wish they would just come out and say it.
Or more than likely, they didn't leverage the async compute of the PS5 properly, which is why they say "it doesn't work as well".

This game's performance profile was made on PC. Hell, the development was all on PC and they ported it to consoles. Seems it was easy enough to get DirectStorage and DX12 to work on Xbox, but it's a different story for the PS5, which explains the performance disparity. His words don't imply that the PS5 is worse at async compute; they imply that they didn't manage to get it to work as well.
 
This. I am a 60fps man, but if it had been 30fps and I'd seen the DF video, I'd play it at 30fps, no cap.
Mark Maratea: We spent the first two years chasing our visual target lock [rather than an fps target], with the art team leading that charge, etc. ... we had a machine running consistently at 45-50fps in dev mode - and Brett was playing that and said, "OK, no, I don't like 30 anymore. This game needs to be 60, combat doesn't feel right at 30." And so as a team, we pivoted.
 
Are you guys really gonna start wasting your time arguing about which of the two main versions is slightly better than the other for the sake of pathetic console wars noise?

Ooh one has 5 more frames than the other. Ooh one looks a bit sharper. Jesus Christ, go and play a game or something.

That's what happens when the two are close. Some people are mad at that.
 

Montauk

Member
That's what happens when the two are close. Some people are mad at that.

It’s just pathetic.

I was playing an amazing game yesterday which is more interesting, unique and involving than most games.

Go and try Rain World for 20 minutes instead of wasting your time acting like your chosen console is a beast because it gets a whopping 5 more FPS or is slightly sharper. Worse, it’s about a game you don’t want to play and looks like shit.

How many of you are in your 30s or getting into middle age?

I was embarrassed by console wars shit when I was 14.
 
It’s just pathetic.

I was playing an amazing game yesterday which is more interesting, unique and involving than most games.

Go and try Rain World for 20 minutes instead of wasting your time acting like your chosen console is a beast because it gets a whopping 5 more FPS or is slightly sharper.

How many of you are in your 30s or getting into middle age?

I was embarrassed by console wars shit when I was 14.

I don't let it bother me. I actually find it entertaining the lengths that some go with this.
 

SlimySnake

Flashless at the Golden Globes
[Image: the PC graphics settings menu running on a PS5, showing CPU and GPU benchmark scores]


From the interview:

"The PC graphics menu calculates a score for your CPU and GPU. What is this based on?

Mark Maratea: Epic built a synthetic benchmark program [that we use] and pull real [user-facing] numbers, so that's where that comes from. CPU is essentially single-core performance, GPU is full-bore everything GPU performance. A min-spec CPU is somewhere around 180 or 200, ultra is around 300; min-spec GPU is around 500, ultra starts from around 1200, which is where a 7900 XT or a 4080 shows up."
This is very interesting stuff. Can someone run this tool on PCs to see where their GPU ranks?

The Xbox GPU should be around 950 if they are going by TFLOPs alone. I would love to see what they rate it as. The CPU score is way higher than I thought it would be. The PS5 CPU is handicapped by its lack of cache.

What's interesting is that they set the 2080 and 5700 XT as the minimum in their PC requirements. They both should be around 870, unless they are bottlenecked by their 8GB of VRAM and the PS5's VRAM is making it rank higher in their tool.

[Image: official PC system requirements chart]



EDIT: Apparently the guy who wrote the tool replied in the reddit thread and said not to conflate this with PC ratings since the tool was mainly written for PCs and not consoles.

That isn't the rating (that would be the second number, which is -1); that's the rating of whatever the default tunings are for the PS5. Those are PC-tuned values, so I don't really think it matches up against the consoles very well.

I'm well aware of how the tool works - I wrote it.
The PS5 is basically a 6700 XT, not a 5700 XT. Both are RDNA2 running at around 2250MHz (2233 for the PS5 and 2321 for the 6700 XT stock) - but the 6700 XT is a 230W vs 180W setup, so you can see where the PS5 falls down - no boost clock.
The PS5 does have greater memory bandwidth (448GB/s vs 384GB/s), which helps a ton when we push async compute tasks over to it - which we do in IoA.
The PC version leans into the UE5 Scalability group system - what comes out in the cooked Engine.ini in the context binary isn't the actual settings. That's what the tool does - it runs a synthetic benchmark and then does a lookup against the scalability settings that we've created. Those defaults are an attempt to match the performance cost vs system capabilities.
But the PS5 actually has its own scalability settings. The PC groups don't map to it at all. Most of them don't exist on the PS5. And the costing isn't an exact match since it's an aggregate over a wide range of hardware and resolutions. The performance budget tool is still very much in its first pass and you'll be seeing vast improvements with it over the next few patches.
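As a rough illustration of that "run a benchmark, then look up settings" flow - purely hypothetical code, with the min/ultra thresholds taken from the ballpark numbers in the interview and the intermediate cut-off made up for the example:

```cpp
#include <string>

// Hypothetical: map a synthetic GPU benchmark score to a scalability tier.
// ~500 is the min-spec GPU score and ~1200 the ultra score per the interview;
// the 800 cut-off is invented purely for illustration.
std::string GpuTierFromScore(int gpuScore)
{
    if (gpuScore < 500)  return "Below minimum";
    if (gpuScore < 800)  return "Low";
    if (gpuScore < 1200) return "High";
    return "Ultra";
}
```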

He mentions async compute in the second response. If you guys want, you can ask him which console has better async compute. lol
 

Lysandros

Member
Wouldn't that benefit from the cache scrubbers which the PS5 has?
Cache scrubbers are all about maintaining a more continuous data flow inside the GPU by minimizing flushes, so they should undeniably influence compute throughput; async or not, it's all data. So yes, quite significantly, I would say.
 

Zathalus

Member
He mentions async compute in the second response. If you guys want, you can ask him which console has better async compute. lol
Personally, I don't think it matters; the two consoles are so close to each other that it's all rather academic anyway.

The only thing it brings to attention is that the optimization methods for the two consoles differ quite a bit, so if the game you are developing doesn't take advantage of async compute, it's likely going to run worse than it should on one of them.
 

SlimySnake

Flashless at the Golden Globes
I missed a bunch in the thread here. Has there been a conclusive explanation why the PS5 version looks so much better?

It kinda looks like the difference between FSR 2 and TSR... Are we absolutely sure they are both on FSR?
Devs confirmed that they are both FSR 2.1 on ultra performance which means they are both 720p.
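For context on why Ultra Performance works out to 720p: FSR 2's Ultra Performance preset renders at a 3.0x per-axis scale factor, so at a 4K output the internal resolution is

$$\frac{3840}{3.0} \times \frac{2160}{3.0} = 1280 \times 720.$$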

PS5 has better detail because of the better IO pipeline (cerny's secret sperm sauce) that seems to be the key to UE5 as mentioned by devs.
Mark Maratea: I would say in some ways it works better on consoles, weirdly enough. Nanite virtualised geometry is very stream-heavy, it is all about disk I/O. So UE5 is rewriting the I/O pipeline and the optimisations they made for NVMe SSDs. It's designed to work on consoles.

Some people on gaf noticed visual inconsistencies in the XSX version of the Matrix demo and Fortnite. The engine seems to favor PS5 at the moment.
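Since the I/O pipeline point keeps coming up: for anyone curious what a "rewritten for NVMe" read looks like on the PC/Xbox side, here is the rough shape of a DirectStorage request - a minimal sketch of the public PC API, not the game's actual code, with `queue`, `file` and `gpuBuffer` as placeholders:

```cpp
#include <d3d12.h>
#include <dstorage.h>
#include <cstdint>

// Minimal sketch: stream a chunk of an asset file straight into a GPU buffer.
// Assumes queue/file/gpuBuffer were created elsewhere via IDStorageFactory.
void StreamChunk(IDStorageQueue* queue, IDStorageFile* file,
                 ID3D12Resource* gpuBuffer, uint64_t offset, uint32_t size)
{
    DSTORAGE_REQUEST request = {};
    request.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source      = file;
    request.Source.File.Offset      = offset;
    request.Source.File.Size        = size;
    request.Destination.Buffer.Resource = gpuBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = size;
    request.UncompressedSize            = size; // no GPU decompression here

    queue->EnqueueRequest(&request);
    queue->Submit(); // in practice you'd batch many requests per submit
}
```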
 

jroc74

Phone reception is more important to me than human rights
So it seems like UE5's Nanite and Virtual Texturing systems are sensitive to data I/O bandwidth.

Lots of interesting nuggets in this article.



Well, this thread certainly got lively, lol.

And I'm here just looking at this:

"Mark Maratea: On consoles only, it does an adaptive upscale - so we look at what you connected from a monitor/TV standpoint... and there's a slot in the logic that says if a PS5 Pro comes out, it'll actually upscale to different quality levels - it'll be FSR 2 quality rather than standard FSR 2 performance."

Can't wait. Let's see what a PS5 Pro can do.
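A trivial sketch of what that kind of "slot in the logic" could look like - entirely hypothetical names, just to show picking an FSR 2 quality level from the detected hardware tier:

```cpp
// Hypothetical illustration of the adaptive-upscale decision described above.
// None of these names come from the game.
enum class Fsr2Mode { UltraPerformance, Performance, Balanced, Quality };

Fsr2Mode PickUpscaleMode(bool isProTierConsole)
{
    // Base consoles stay on a performance-style preset; a higher-end "Pro"
    // tier would get bumped to the Quality preset, per the interview quote.
    return isProTierConsole ? Fsr2Mode::Quality : Fsr2Mode::Performance;
}
```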
 

rofif

Can’t Git Gud
Well, this thread certainly got lively, lol.

And I'm here just looking at this:

"Mark Maratea: On consoles only, it does an adaptive upscale - so we look at what you connected from a monitor/TV standpoint... and there's a slot in the logic that says if a PS5 Pro comes out, it'll actually upscale to different quality levels - it'll be FSR 2 quality rather than standard FSR 2 performance."

Can't wait. Let's see what a PS5 Pro can do.
If this is true, this is a very good way of thinking.
 

Crayon

Member
Devs confirmed that they are both FSR 2.1 on ultra performance which means they are both 720p.

PS5 has better detail because of the better IO pipeline (cerny's secret sperm sauce) that seems to be the key to UE5 as mentioned by devs.


Some people on gaf noticed visual inconsistencies in the XSX version of the Matrix demo and Fortnite. The engine seems to favor PS5 at the moment.

I've seen the Fortnite comparisons and those look as expected - better LODs on the PS5. So I can see the PS5 streaming in better assets faster. But in this game we see a notable difference in image quality. I don't think that can be affected by the I/O, but I'm no expert.

I think SomeGit is closer. If they are both running FSR 2.1 on the same setting, there is something happening somewhere else in the pipe.

[Image: diagram of where FSR 2 sits in the frame pipeline]
 

SomeGit

Member
If it were I/O related, it would become sharper on XSX while standing still, and that doesn't happen. If I were a betting man, I'd bet on different CAS settings. Whatever it is, it's something that affects the whole screen.

Besides, Nanite doesn't work for animated meshes yet, so those wouldn't be affected.
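Worth noting that in the stock FidelityFX FSR 2 SDK the sharpening pass (RCAS) is just a per-dispatch toggle plus a 0-1 strength value, so two ports shipping with different sharpness is an easy divergence to imagine. A hedged sketch, assuming the public FSR 2 API:

```cpp
#include <ffx_fsr2.h>

// Assuming the public FidelityFX FSR 2 interface: the dispatch description
// carries an enableSharpening flag and a 0..1 sharpness (RCAS) value.
// Everything else about the dispatch is omitted here.
void SetSharpening(FfxFsr2DispatchDescription& dispatch, float strength)
{
    dispatch.enableSharpening = true;
    dispatch.sharpness = strength; // e.g. 0.8f on one port vs 0.5f on another
}
```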
 

Kataploom

Gold Member
If it were I/O related, it would become sharper on XSX while standing still, and that doesn't happen. If I were a betting man, I'd bet on different CAS settings. Whatever it is, it's something that affects the whole screen.

Besides, Nanite doesn't work for animated meshes yet, so those wouldn't be affected.
Well, the two ports don't seem to have been done by the same team, since there are inconsistencies that can be attributed to diverging ideas, like the main menu screen, so I would count on it.
 

SlimySnake

Flashless at the Golden Globes
And confirmed. Async is faster on PS5. Props to whoever asked this question. I know it was one of you. ;p


[–]Leather-Tomorrow4221 1 point an hour ago
I'd have to see a video of the exact issue because there are potentially a few things going on here.
We had some issues where the Xbox OS was a little more aggressive about taking memory. We worked with Microsoft to free as much as we could from that but at the end of the day the PS5 just has more usable ram for us. That translates into larger textures pools and the ability to have larger buffers.
Implementations for async compute are obviously platform dependent (i.e. we are calling console OS libraries), and that's generally been faster on PS5.
We aren't using the same tunings on the PS5 and Xbox X - despite being close in hardware specs, the PS5 is a little better so we pushed fidelity a little more. That is most of the performance delta. We are examining all of those choices and will hopefully have some future patches that address the fidelity gap as well as fix some of the performance hiccups.

Please ignore the next sentence where he says the PS5 is a little bit better, because that's going to create wars that will end up with people banned.
 

Thick Thighs Save Lives

NeoGAF's Physical Games Advocate Extraordinaire
Devs confirmed that they are both FSR 2.1 on ultra performance which means they are both 720p.

PS5 has better detail because of the better IO pipeline (cerny's secret sperm sauce) that seems to be the key to UE5 as mentioned by devs.


Some people on gaf noticed visual inconsistencies in the XSX version of the Matrix demo and Fortnite. The engine seems to favor PS5 at the moment.
Sorry to interject in this discussion, but you made me crack up real hard with that one. Almost spilled my coffee on the keyboard. 🤣

Anyway, carry on! 😂

Edit: Also, props to that dev for being so responsive on the game's subreddit.
 

Zathalus

Member
And confirmed. Async is faster on PS5. Props to whoever asked this question. I know it was one of you. ;p




Please ignore the next sentence where he says the PS5 is a little bit better, because that's going to create wars that will end up with people banned.
That clears that up. I'm wondering about his comment about the Xbox OS though, I thought it was a static allocation of 2.5GB? What's this about taking memory?
 
Bring on the PS5 Pro Cerny *enters bathtub*
Why, so we have to buy more shit in this inflation-ridden timeline? Also, that would just mean they would use more brute force instead of getting better. We didn't need Pro consoles and we still don't.

This game didn't need to be 60fps; it could have been 30, but you guys would bitch and moan.
Or they could use a different engine. There were tons of games on PS4 that looked better than this.
 

jroc74

Phone reception is more important to me than human rights
And confirmed. Async is faster on PS5. Props to whoever asked this question. I know it was one of you. ;p




Please ignore the next sentence where he says the PS5 is a little bit better, because that's going to create wars that will end up with people banned.
Honestly, the posts you, Lysandros, and others made about this were enough for me.

Cool to have confirmation tho.
 

Lysandros

Member
And confirmed. Async is faster on PS5. Props to whoever asked this question. I know it was one of you. ;p




Please ignore the next sentence where he says the PS5 is a little bit better, because that's going to create wars that will end up with people banned.
This is the first time I'll say this on these forums: "It feels good to be right from the beginning". Sorry for the flex everyone, I am a mere human. Never mind.
 
This is very interesting stuff. Can someone run this tool on PCs to see where their GPU ranks?

The Xbox GPU should be around 950 if they are going by TFLOPs alone. I would love to see what they rate it as. The CPU score is way higher than I thought it would be. The PS5 CPU is handicapped by its lack of cache.

What's interesting is that they set the 2080 and 5700 XT as the minimum in their PC requirements. They both should be around 870, unless they are bottlenecked by their 8GB of VRAM and the PS5's VRAM is making it rank higher in their tool.

[Image: official PC system requirements chart]



EDIT: Apparently the guy who wrote the tool replied in the reddit thread and said not to conflate this with PC ratings since the tool was mainly written for PCs and not consoles.





He mentions async compute in the second response. If you guys want, you can ask him which console has better async compute. lol
Wait... wtf... A Ryzen 7 just to run this? A 16-core CPU and a 2080 Super to run on low at 1080p...

Guess I am out... not that I wanted this. I have a Ryzen 5 3600 and a 3060 Ti, which can run every game out there at 1080p max settings at 60fps+ (and most at 1440p too).
These are insane specs for PC, let alone console.
Why even use this engine? Why not use what they made Metro with, or Kingdom Come: Deliverance, FF16, or AC Valhalla?
It's bad when last-gen games look better and these recommended specs are just nuts.
 

Jose92

Member
Right, and from that you would have no way of guessing what they are referring to. But earlier in the same article they kind of let it slip:



Not that it really is that significant; the developers already confirmed performance parity between the XSX and PS5. Arguing about async compute does not make one better if the end result is the same. It just sheds light on the fact that the two consoles require different optimization paths.


It's almost certainly a sharpening filter (likely CAS) as the entire image is sharper, not individual elements.

They're literally calling it a wonderful speed boost coupled with DirectStorage, which both the Xbox and PC versions use.

But one has 52 CUs and the other 36. Async benefits from more CUs, even if they run slower, as you can run more work in parallel.

52 CUs, with each one of them hampered by lower L1 cache bandwidth/amount, also affecting async throughput directly or indirectly. Both have the same number of ACEs, HWS and GCP going by the standard RDNA/2 design, being 4+1+1. Every single one of these components is 20% faster on PS5. Now divide those resources by the number of CUs. Which system has more of those resources available per CU? Which system can run more processes asynchronously per CU, and therefore has more to gain from an application using async compute extensively, in the context of CU saturation/compute efficiency?
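Rough numbers behind that per-CU argument, using the public clocks (PS5 up to 2233MHz across 36 CUs, Series X at 1825MHz across 52 CUs) - back-of-the-envelope only, not from the article:

$$\frac{2233}{1825} \approx 1.22, \qquad \frac{2233/36}{1825/52} \approx \frac{62.0}{35.1} \approx 1.77$$

So the shared front end runs roughly 22% faster on PS5, and per CU there is roughly 1.8x as much front-end throughput to go around.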
Lysandros, you were right.


This is from one of the devs on their subreddit
 

DenchDeckard

Moderated wildly
So dynamic resolution after all, as confirmed by the devs? Hmmm.

Also, I found something very interesting. Nothing like we've ever seen on a console.
Someone, by mistake or maybe due to a bug, accessed a PC settings menu in the PS5 version of the game.
And this is a very interesting menu, because it shows a GPU and CPU score calculated by Epic's synthetic benchmark.


[Image: the PC graphics settings menu running on a PS5, showing CPU and GPU benchmark scores]


From the interview:

"The PC graphics menu calculates a score for your CPU and GPU. What is this based on?

Mark Maratea: Epic built a synthetic benchmark program [that we use] and pull real [user-facing] numbers, so that's where that comes from. CPU is essentially single-core performance, GPU is full-bore everything GPU performance. A min-spec CPU is somewhere around 180 or 200, ultra is around 300; min-spec GPU is around 500, ultra starts from around 1200, which is where a 7900 XT or a 4080 shows up."


Also, another interesting tidbit from the interview:
"
How does Nanite and virtual shadow maps translate to console like PS5 and Series X/S?

Julia Lichtblau: On the art side, we haven't really had to adjust anything, but perhaps Mark has been doing stuff on the back end to make that work so us artists don't have to worry as much!

Mark Maratea: I would say in some ways it works better on consoles, weirdly enough. Nanite virtualised geometry is very stream-heavy, it is all about disk I/O. So UE5 is rewriting the I/O pipeline and the optimisations they made for NVMe SSDs. It's designed to work on consoles.
"

This is pretty huge. If only someone could get the Xbox to do that and benchmark them both; that would be interesting to see.
 

DenchDeckard

Moderated wildly
Thanks Jose, I only stuck to base hardware facts and basic logic. We also have confirmation of the higher fidelity on PS5 due to additional headroom now, maybe to be forwarded to Digital Foundry.

Yup, gonna be interesting to see if the dev responds to the question asked on there. We need to understand the performance difference between the consoles, but it looks like you could be right here.
 

LordOfChaos

Member
Devs confirmed that they are both FSR 2.1 on ultra performance which means they are both 720p.

PS5 has better detail because of the better IO pipeline (cerny's secret sperm sauce) that seems to be the key to UE5 as mentioned by devs.


Some people on gaf noticed visual inconsistencies in the XSX version of the Matrix demo and Fortnite. The engine seems to favor PS5 at the moment.

If this is the real reason, it's interesting that it's only starting to matter now; it must be years since that first UE5 PS5 demo was memed to high hell (remember another Linus controversy and apology video?).

Granted, the point where it starts to matter is also the point where both of them are, quite honestly, on the struggle bus with this title.
 

Bogroll

Likes moldy games
Wait... wtf... A Ryzen 7 just to run this? A 16-core CPU and a 2080 Super to run on low at 1080p...

Guess I am out... not that I wanted this. I have a Ryzen 5 3600 and a 3060 Ti, which can run every game out there at 1080p max settings at 60fps+ (and most at 1440p too).
These are insane specs for PC, let alone console.
Why even use this engine? Why not use what they made Metro with, or Kingdom Come: Deliverance, FF16, or AC Valhalla?
It's bad when last-gen games look better and these recommended specs are just nuts.
It's like inflation has hit our consoles and GPUs.
 