
Digital Foundry: Brothers: A Tale of Two Sons Remake - PS5/XSX/S Tech Review - UE5 Nanite/Lumen Come at a Heavy Cost

Kerotan

Member
454p on Series S. We're back to the N64 era, boys. I know people love remakes, but we didn't ask for the resolutions to be remade too.

Michael Jordan Lol GIF
MS were a fucking disgrace for releasing that console.
 

SlimySnake

Flashless at the Golden Globes
Just bought it. I was never a fan of Brothers, but I adore It Takes Two. It's good to go back and see Josef Fares plant the seeds for what was to come. I'm appreciating the little details of controlling the two brothers a lot more now than I did ten years ago. Go figure.

DF wasn't lying. The game is very heavy on the GPU. CPU usage is non-existent, but my 3080 runs this game between 30-50 fps maxed out at 4K DLSS Quality, so I just capped it to 30 fps. I don't know why people say VRR or G-Sync hides frame drops; I can notice every fucking frame drop on my LG CX even with G-Sync engaged. Capped at 30 fps, it's much smoother.

Not surprised consoles are running it at 1440p 30 fps. The game is beautiful and very pleasing to look at, but I'm not seeing why it's so expensive on the GPU. Other UE5 games I've played aren't this heavy.
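For the curious, the "smoother at a locked 30" point is just frame-time arithmetic: frame time is the reciprocal of frame rate, so an uncapped 30-50 fps swing means frame delivery bouncing between 20 ms and 33 ms, while a cap holds a steady cadence that VRR never has to chase. A minimal standalone sketch (my own illustration, nothing from the game):

```cpp
// Minimal sketch: convert frame rates to frame times to show why an
// uncapped 30-50 fps swing feels rougher than a locked 30 fps.
#include <cstdio>

int main() {
    const double uncappedFps[] = {50.0, 42.0, 31.0, 48.0, 30.0};
    for (double fps : uncappedFps)
        std::printf("%4.0f fps -> %4.1f ms/frame\n", fps, 1000.0 / fps);
    std::printf("capped 30 fps -> 33.3 ms/frame, every frame\n");
}
```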

 

ABnormal

Member
To be fair, the game is running at a higher resolution than both the Matrix demo and the first UE5 demo in its 30 fps mode.

The graphics might not look as good, but the tech behind the scenes is still very taxing and is performing as intended.

Lords of the Fallen and RoboCop also run at around 1440p in their 30 fps modes. 60 fps is just too much for these consoles when you're using next-gen tech like Lumen and Nanite. UE5 is performing fine in the 30 fps modes.
This. UE5 is simply a new version of UE, and it uses various techniques that need beefier hardware to run fully, but it makes things possible that simply weren't before, both from a real-time rendering standpoint and in data management. Next gen will probably have it as the new UE standard.
 

ABnormal

Member
Looks like UE5 feels really good on bigger GPUs, and this is another game where we can see the Series X hardware advantage.
We're still talking about an advantage that's only perceptible to analysis tools. This gen, there will never be a significant difference to the eye.
 

Elysium44

Banned
MS were a fucking disgrace for releasing that console.

But not the devs for forcing expensive techniques in a performance mode?



Nvidia are to blame for not making the 4090 more powerful, I guess. Nothing to do with the stupid devs of this game.

You can't blame the Series S on this occasion. It would be like if Capcom had forced RT and hair strands on the Series S performance mode for Resident Evil 4 - it would be stupid, which is why they didn't do it. If they had, performance and/or resolution would have suffered dramatically - just like they do in this game.
 

SKYF@ll

Member
DF wasn't lying. The game is very heavy on the GPU. CPU usage is non-existent, but my 3080 runs this game between 30-50 fps maxed out at 4K DLSS Quality, so I just capped it to 30 fps. I don't know why people say VRR or G-Sync hides frame drops; I can notice every fucking frame drop on my LG CX even with G-Sync engaged. Capped at 30 fps, it's much smoother.

Not surprised consoles are running it at 1440p 30 fps. The game is beautiful and very pleasing to look at, but I'm not seeing why it's so expensive on the GPU. Other UE5 games I've played aren't this heavy.
It seems that even 1080p is heavy on the GPU, and PCs are also struggling.
I'm starting to worry about future UE5 games.
[Image: Brothers: A Tale of Two Sons Remake PC benchmark chart]

 

SlimySnake

Flashless at the Golden Globes
But not the devs for forcing expensive techniques in a performance mode?

Nvidia are to blame for not making the 4090 more powerful, I guess. Nothing to do with the stupid devs of this game.

You can't blame the Series S on this occasion. It would be like if Capcom had forced RT and hair strands on the Series S performance mode for Resident Evil 4 - it would be stupid, which is why they didn't do it. If they had, performance and/or resolution would have suffered dramatically - just like they do in this game.
If Lumen is what's so taxing, then you can't expect devs to just take it out. If they take it out, they need to provide a baked lighting solution, which means shipping the game with lightmap textures. Probably a completely different workflow.

Lumen and other real-time lighting techniques help devs ship games faster and more efficiently. A lot of the games we've gotten this gen are basically baked lighting with RT slapped on top. It's possible that this game was designed from the ground up using Lumen and there is simply no fallback.
 

SlimySnake

Flashless at the Golden Globes
It seems that even 1080p is heavy on the GPU, and PCs are also struggling.
I'm starting to worry about future UE5 games.
[Image: Brothers: A Tale of Two Sons Remake PC benchmark chart]

What's frustrating is that Epic keeps touting how they've improved performance at every GDC presentation. They said they can now run the Matrix demo at 2x the framerate with the latest UE5 build, and said the same about the very first UE5 demo. Whatever they're doing just isn't translating into in-game performance.
 

Elysium44

Banned
If Lumen is what's so taxing, then you can't expect devs to just take it out. If they take it out, they need to provide a baked lighting solution, which means shipping the game with lightmap textures. Probably a completely different workflow.

Lumen and other real-time lighting techniques help devs ship games faster and more efficiently. A lot of the games we've gotten this gen are basically baked lighting with RT slapped on top. It's possible that this game was designed from the ground up using Lumen and there is simply no fallback.

I see what you mean. So perhaps they can't be blamed at this stage for not just taking it out, but they can certainly be blamed for designing the game around an excessively expensive method in the first place, if that is the case.
 

Magic Carpet

Gold Member
If Lumen is what's so taxing, then you can't expect devs to just take it out. If they take it out, they need to provide a baked lighting solution, which means shipping the game with lightmap textures. Probably a completely different workflow.

Lumen and other real-time lighting techniques help devs ship games faster and more efficiently. A lot of the games we've gotten this gen are basically baked lighting with RT slapped on top. It's possible that this game was designed from the ground up using Lumen and there is simply no fallback.
If it made developing the game quicker and cheaper, I'll give it a bit more leeway in my criticism. My 4070 huffs and puffs, but I didn't have to dial any visuals back.
 

SlimySnake

Flashless at the Golden Globes
You have to wonder how the 'recommended' GPU is an RTX 2060 Super, when even an RTX 3080 can't maintain 60 fps minimums at 1080p.
Don't look at the minimum framerate. That's just a 1% low, which could be a stutter or something rare. The average is well over 60 fps.

This is also Ultra settings, which are very taxing compared to even High. A 2060 should run this at 1080p 30 fps using some kind of DLSS upscaling and Medium settings.
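For anyone unfamiliar with the metric: a "1% low" is typically computed from the slowest 1% of frames, so a single stutter can drag it far below the average. A standalone sketch of the usual method (my own illustration, not DF's or the benchmark site's exact pipeline):

```cpp
// Sorts frame times, averages the slowest 1%, and converts back to fps.
// One 45 ms hitch among 99 smooth ~12.5 ms frames tanks the 1% low while
// the average barely moves.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <vector>

int main() {
    std::vector<double> ms(99, 12.5); // ~80 fps frames
    ms.push_back(45.0);               // a single stutter

    std::sort(ms.begin(), ms.end(), std::greater<double>());
    const size_t n = std::max<size_t>(1, ms.size() / 100); // slowest 1%

    double worst = 0.0, total = 0.0;
    for (size_t i = 0; i < n; ++i) worst += ms[i];
    for (double t : ms) total += t;

    std::printf("average: %.1f fps\n", 1000.0 * ms.size() / total); // ~78 fps
    std::printf("1%% low: %.1f fps\n", 1000.0 * n / worst);         // ~22 fps
}
```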
 

Elysium44

Banned
Don't look at the minimum framerate. That's just a 1% low, which could be a stutter or something rare. The average is well over 60 fps.

This is also Ultra settings, which are very taxing compared to even High. A 2060 should run this at 1080p 30 fps using some kind of DLSS upscaling and Medium settings.

You'd hope a PC version especially would list a 'recommended' card that can average 60 fps at 1080p, though, and preferably without upscaling.

How competent are these devs? I notice the Steam page cites the minimum GPU as a GTX 1650 8GB. To my knowledge that card doesn't come with 8GB, so this is a lazy mistake.
 

Gaiff

SBI’s Resident Gaslighter
Just bought it. I was never a fan of Brothers, but I adore It Takes Two. It's good to go back and see Josef Fares plant the seeds for what was to come. I'm appreciating the little details of controlling the two brothers a lot more now than I did ten years ago. Go figure.

DF wasn't lying. The game is very heavy on the GPU. CPU usage is non-existent, but my 3080 runs this game between 30-50 fps maxed out at 4K DLSS Quality, so I just capped it to 30 fps. I don't know why people say VRR or G-Sync hides frame drops; I can notice every fucking frame drop on my LG CX even with G-Sync engaged. Capped at 30 fps, it's much smoother.

Not surprised consoles are running it at 1440p 30 fps. The game is beautiful and very pleasing to look at, but I'm not seeing why it's so expensive on the GPU. Other UE5 games I've played aren't this heavy.
Fuck that’s heavy on the GPU.
 

Kataploom

Gold Member
This is a good thing. It simply means the Series S did not hold back the consoles. The only people affected by this are the poor sods who bought the Series S. Hopefully they have learned their lesson going forward. You get what you pay for.
This is what I've been saying for a while. Those consoles get limited by themselves way before the Series S is even a factor.

Hell, we even have developer testimony on how the Series S version actually helped them make better optimizations for the other consoles, so its existence has just benefited them. BG3 on XSX is the outlier, but IIRC the big consoles wouldn't even have a performance mode in AW2 if not for the XSS.
 

rofif

Can’t Git Gud
It's a graphical remake, so it's good they're pushing graphics. Why else remake a game if you're not changing the gameplay?
 

JackMcGunns

Member
454p on Series S. We're back to the N64 era, boys. I know people love remakes, but we didn't ask for the resolutions to be remade too.

Michael Jordan Lol GIF

But did it affect the other platforms? Imagine PC gamers with an RTX 4090 making fun of the framerate and performance of RTX 4060 users. You can't imagine it, because it's dumb and pointless. People who bought a Series S paid less and are still having fun. That's Jordan laughing at your pettiness.
 

Gaiff

SBI’s Resident Gaslighter
But did it affect the other platforms? Imagine PC gamers with an RTX 4090 making fun of the framerate and performance of RTX 4060 users. You can't imagine it, because it's dumb and pointless. People who bought a Series S paid less and are still having fun. That's Jordan laughing at your pettiness.
Emotion Reaction GIF
 

Bojji

Gold Member
Lumen and Nanite were a mistake.

What's the point of bringing Lumen and Nanite to consoles without optimization?

On consoles, devs should choose between:

- Lumen
- Nanite
- Virtual shadow maps

Using all three together is too taxing. They should evaluate what they need most.

For example, in this game, with this camera angle, Nanite and VSM are IMO not needed; you won't see many benefits. Lumen, though, is essential without baked lighting. Alternatively, they could have used none of them with good baked lighting, but that requires more manual work.
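To make that concrete, the three systems in the list above are independently switchable through stock UE5 console variables. A hypothetical sketch (the CVar names are standard UE5, but the function is my illustration, assumes an Unreal Engine module context, and a shipping game would normally set these per-platform in config files rather than in code):

```cpp
// Hypothetical illustration: turning off Lumen GI, Nanite, and virtual
// shadow maps via UE5's console-variable API, leaving a baked-lighting path.
#include "HAL/IConsoleManager.h"

static void SetCVarInt(const TCHAR* Name, int32 Value)
{
    if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
        CVar->Set(Value);
}

static void UseBakedLightingPreset()
{
    SetCVarInt(TEXT("r.DynamicGlobalIlluminationMethod"), 0); // 0 = none/baked, 1 = Lumen
    SetCVarInt(TEXT("r.Nanite"), 0);                          // render Nanite meshes via fallbacks
    SetCVarInt(TEXT("r.Shadow.Virtual.Enable"), 0);           // classic shadow maps instead of VSM
}
```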
 

Gaiff

SBI’s Resident Gaslighter
Got caught lying and now you're crying? He said 'the majority of the time the dynamic resolution rests at 720p', but you disingenuously tried to say 454p was somehow the norm.


shame-gameofthrones.gif

Nah, I think not enough care was put into squeezing the juice out of these consoles. 1260p to 1620p in Quality Mode? Come on. It should have no issue reaching 4K/30 in a game with these visuals.

Then it's 1080p in Performance Mode, and it still drops to the 30s on the PS5.

That's just a shit job all around.

Probably Lumen just murdering performance. That would explain why the fps is much higher in dimmer underground areas.
That's what I said. Now quit crying with your 'leave the Series S alone' nonsense. We're taking the piss.
 

JackMcGunns

Member
That's what I said. Now quit crying with your 'leave the Series S alone' nonsense. We're taking the piss.


It's not about me saying 'leave the Series S alone', it's about me calling out your console warring.

Get that nonsense out of here.
 

Gaiff

SBI’s Resident Gaslighter
It's not about me saying 'leave the Series S alone', it's about me calling out your console warring.

Get that nonsense out of here.
There’s no console warring. You got upset and cried foul because I laughed at the low resolution. Imagine being this triggered over this.
 

JackMcGunns

Member
There’s no console warring. You got upset and cried foul because I laughed at the low resolution. Imagine being this triggered over this.

So you didn’t single out a specific console and laugh at it for its lack of performance? You were also making fun of PS5 and Series X performance. My mistake then.
 

Gaiff

SBI’s Resident Gaslighter
So you didn’t single out a specific console and laugh at it for its lack of performance? You were also making fun of PS5 and Series X performance. My mistake then.
A poster blamed the weak Series S, and I said no, this isn't on the console, it's on the dev, and then I pointed out that the PS5's performance also tanks to the low 30s at 1080p.
 

Skifi28

Member
I see the usual scenario: when consoles struggle, it's not good news for the PC version either. Let's see about Dragon's Dogma next.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
This is a good thing. It simply means the Series S did not hold back the consoles. The only people affected by this are the poor sods who bought the Series S. Hopefully they have learned their lesson going forward. You get what you pay for.
If you sit far enough away with a small TV, it will look exactly the same. I don't see the problem.
 

Bojji

Gold Member
I think Lumen is overrated; it's not "one size fits all" and can't be used in all environments. Disco lights - when I first saw this, I thought my GPU was broken lol:



Even the hardware-accelerated version in Layers of Fear has these artifacts. RTXGI in Metro is definitely cleaner.

In the end, nothing beats good baked lighting; Lumen and other such techniques should only be used in open-world games with a dynamic time of day.
 

xrnzaaas

Member
I don't understand the push for making games on UE5. UE4 should be the sweet spot between graphics quality and the kind of performance current consoles (and cheaper PCs) can offer.
 

Astray

Member
I don't understand the push for making games on UE5. UE4 should be the sweet spot between graphics quality and the kind of performance current consoles (and cheaper PCs) can offer.
Better visuals for much less work?

UE is pitched at devs, not gamers. If it were up to us, things would still be on UE3 or whatever, because we don't know how the sausage is actually made.
 
I would have thought UE5 was designed specifically with the current gen in mind, and vice versa, that the current gen was designed with UE5 in mind. So why does it kinda kill consoles while also making top PC gear sweat?

Shouldn't there be some lower, console-specific settings and workflow proposals, like with RT (or like soft shadows after Doom 3 and F.E.A.R. pushed that stuff, or the several AA techniques)? I understand that you can't just turn it off, since that would require an entirely different approach, but if the methods used are too taxing, there have to be some slider values to crank up the framerate and reduce the resolution cost. What's the point of barely visible effects if the resolution is going back to the PS3 era?

Or is this tech just more convenient for the dev, so that not using it and doing things with the old PS4 pipeline is out of the question, and they really can't change anything in UE5 to make it run better or find a better compromise? Is it either on, using UE5 fully, or off, meaning they could just use UE4 for practically the same results?
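For what it's worth, UE5 does expose the kind of "slider values" asked about here through its scalability groups, which step systems like Lumen and VSM down to cheaper paths rather than being all-or-nothing. A hypothetical sketch (stock UE5 scalability CVar names, engine context assumed; nothing from this remake):

```cpp
// Hypothetical illustration: UE5 scalability groups as coarse quality
// sliders. 0 = low ... 3 = epic; low GI quality typically swaps Lumen
// for cheaper fallbacks.
#include "HAL/IConsoleManager.h"

static void ApplyPerformancePreset()
{
    const TCHAR* Groups[] = {
        TEXT("sg.GlobalIlluminationQuality"),
        TEXT("sg.ShadowQuality"),
        TEXT("sg.ReflectionQuality"),
    };
    for (const TCHAR* Name : Groups)
        if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
            CVar->Set(1); // step down from the default to claw back framerate
}
```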
 

xrnzaaas

Member
Better visuals for much less work?

UE is pitched at devs, not gamers. If it were up to us, things would still be on UE3 or whatever, because we don't know how the sausage is actually made.
You mean less work when using UE5? (FYI, I'm a noob when it comes to technical stuff.)

I think it's a double-edged sword, because if you release a game that runs like ass or has high base requirements on PC, more people will avoid it or buy it later at a discounted price.
 

Astray

Member
You mean less work when using UE5? (FYI, I'm a noob when it comes to technical stuff.)

I think it's a double-edged sword, because if you release a game that runs like ass or has high base requirements on PC, more people will avoid it or buy it later at a discounted price.
Yep, less work = less time spent building the game = lower costs in most cases.

If you have lower costs, then you might be able to turn a profit on fewer sales.
 

yamaci17

Member
But did it affect the other platforms? Imagine PC gamers with an RTX 4090 making fun of the framerate and performance of RTX 4060 users. You can't imagine it, because it's dumb and pointless. People who bought a Series S paid less and are still having fun. That's Jordan laughing at your pettiness.
The RTX case is different.

You have DLAA, which makes native 1080p look decent.
You have DLSS, which makes 900-1200p look amazing on 1440p screens.
And with a 1200-1400p input, DLSS simply looks stellar.

Almost the entire RTX lineup can take advantage of DLSS. The reason these games look so awful in terms of clarity is how bad the regular upscalers are: DLSS at a 480p input can match FSR at a 960p input.

Now imagine a game that upscales from 1200p to 4K with a regular old upscaler, and then a game that upscales from 960p to 1440p or from 1080p to 4K with DLSS.

It just looks leagues different. The experience is simply different. As long as you're within an 800-1080p input with DLSS, games look great. With regular upscalers, it just leaves a bad taste.

Since DLSS at a 480p input can match other upscalers at a 960-1080p input, you can understand how good it is at 900p and above.

If consoles had decent, dedicated, hardware-based upscaling like DLSS, most of these concerns wouldn't even be concerns.
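To put rough numbers on those inputs, here's a small standalone sketch using Nvidia's published per-axis DLSS scale factors (Quality ≈ 66.7%, Balanced ≈ 58%, Performance = 50%, Ultra Performance ≈ 33.3%); the code is my own illustration, not part of DLSS:

```cpp
// Computes DLSS internal render resolutions from the output resolution
// using Nvidia's published per-axis scale factors.
#include <cstdio>

struct Mode { const char* name; double scale; };

int main() {
    const Mode modes[] = {
        {"Quality",           2.0 / 3.0},
        {"Balanced",          0.58},
        {"Performance",       0.50},
        {"Ultra Performance", 1.0 / 3.0},
    };
    const int outW = 3840, outH = 2160; // 4K output
    for (const Mode& m : modes)
        std::printf("%-17s -> %4d x %4d\n", m.name,
                    (int)(outW * m.scale + 0.5), (int)(outH * m.scale + 0.5));
}
```

At 4K output that works out to 2560x1440 for Quality and 1920x1080 for Performance, which is the "1080p to 4K" case above.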
 

midnightAI

Member
Nanite and Lumen aren't the only issue (they are heavy, but that comes down to implementation); other devs can use them without much issue, and for much more complex games. It's easy to blame Nanite and Lumen, but it's the devs' choice to use them or not. UE5 is certainly not to blame here.
 