
Has Unreal Engine 5 (Lumen, Nanite, MetaHuman) really helped reduce development time/costs?

Has Unreal Engine 5 really reduced development time/cost?

  • Yes

    Votes: 29 19.0%
  • No

    Votes: 124 81.0%

  • Total voters: 153
When Unreal Engine 5 was announced, Epic were talking about how its features would really drive down the costs and time of development:

Lumen - Removes the need for artists to bake lighting and redo their work every time part of a level is redesigned

Nanite - Removes the need for artists to create several models of each asset based on its distance from the camera

MetaHuman - Removes the need for artists to create facial/body models, rigging, and animation from scratch

So, has this actually panned out? Are developers now able to create games at the level of last generation much more quickly?

Or are these features causing too many headaches because current-gen consoles aren't powerful enough, resulting in the need for yet another console generation?
 

M1chl

Currently Gif and Meme Champion
Well, certainly not on the compute side of things:



I highly recommend this video, given it's probably a reason UE5 isn't used as much, and it certainly isn't what the initial hype might have suggested.

TLDR: Nanite introduces a lot of overdraw, which means you are wasting GPU cycles; LODs, even auto-generated ones, are far better for optimisation. Given what's said in this video, and by my own account from playing with it extensively, I think Nanite will end up something like id Tech's MegaTexture down the line. Not to mention the strain on the CPU, which isn't handled the way it is in id Tech 7, for example; instead there's one director core and other auxiliary ones, which ties performance to single-core speed, which is especially a problem on consoles.
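To make the LOD contrast concrete, here's a minimal sketch (my own illustration under simple pinhole-camera assumptions, not Epic's or id's code) of the traditional path: one cheap CPU-side test per object picks a pre-authored mesh, so dense geometry never reaches the rasteriser for distant objects. Nanite replaces this per-object choice with per-cluster selection plus a software rasteriser, which is where the overdraw and CPU costs above come from.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// One pre-authored detail level: which mesh to draw, and the smallest
// projected size (fraction of screen height) at which it should be used.
struct Lod {
    int meshId;
    float minScreenHeight;
};

// Rough projected height of a bounding sphere as a fraction of the viewport,
// using a simple pinhole model (fovY in radians).
float projectedHeight(float radius, float distance, float fovY) {
    return radius / (distance * std::tan(fovY * 0.5f));
}

// Pick the first (most detailed) LOD whose threshold the object still meets;
// fall back to the coarsest mesh for far-away objects.
int pickLod(const std::vector<Lod>& lods, float screenHeight) {
    for (const Lod& lod : lods)
        if (screenHeight >= lod.minScreenHeight)
            return lod.meshId;
    return lods.back().meshId;
}

int main() {
    // LOD0..LOD3, ordered from most to least detailed (thresholds made up).
    std::vector<Lod> lods = {{0, 0.25f}, {1, 0.10f}, {2, 0.03f}, {3, 0.0f}};
    float h = projectedHeight(1.0f, 40.0f, 1.047f); // 1 m sphere, 40 m away, ~60° FOV
    std::printf("projected height %.3f of screen -> use LOD %d\n", h, pickLod(lods, h));
    return 0;
}
```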
 

Kenneth Haight

Gold Member


Still waiting on my PS5 to play this at 60fps... maybe on a Pro, but who knows. We need the devs to make games like this; just another reminder that tech demos should always be taken with a pinch of salt. Four-odd years into this gen and Demon's Souls, a day-one launch title, is still one of the best-looking games, ffs.
 

Cyborg

Member
Ofc not. The narrative is that games cost more because development time has increased. Games went to 70 dollars and will go to 80 in a few years.
 

The Cockatrice

Gold Member
It certainly made devs lazier about optimizing their games, that's for sure. And what was supposed to be a miracle for stuttering turned out to be a dud: UE5 games are still massive stutter-fests when it comes to loading assets. CDPR had massive issues with their own in-house engine, and I dread how Orion will run.
 
Well, certainly not on the compute side of things:



I highly recommend this video, given it's probably a reason UE5 isn't used as much, and it certainly isn't what the initial hype might have suggested.

TLDR: Nanite introduces a lot of overdraw, which means you are wasting GPU cycles; LODs, even auto-generated ones, are far better for optimisation. Given what's said in this video, and by my own account from playing with it extensively, I think Nanite will end up something like id Tech's MegaTexture down the line. Not to mention the strain on the CPU, which isn't handled the way it is in id Tech 7, for example; instead there's one director core and other auxiliary ones, which ties performance to single-core speed, which is especially a problem on consoles.


Excellent video

It seems like these effort-saving features are a bit too resource-heavy, resulting in a lot of the benefits being cancelled out on current-gen hardware.

Would a PS6 with a spec bump similar to the PS4-to-PS5 transition be enough to allow these kinds of features and engines to become ubiquitous?
 

M1chl

Currently Gif and Meme Champion
Excellent video

It seems like these effort-saving features are a bit too resource-heavy, resulting in a lot of the benefits being cancelled out on current-gen hardware.

Would a PS6 with a spec bump similar to the PS4-to-PS5 transition be enough to allow these kinds of features and engines to become ubiquitous?
The question is more about what other new features will be introduced. However, if we're talking about UE5 in its current form running on next-gen hardware, I don't think there's much of an issue. It's still probably not going to compete with old LODs when it comes to performance, though.
 
Obviously not. Nanite and Lumen mostly just make games run like shit.

I found a Digital Foundry vid on it, and that does seem to be the case.

If next-gen hardware solves this then I'm sold, though not because I want prettier graphics (I'm happy enough with The Last of Us 2's visuals; they never need to exceed that for me).

I just want it to be easier for developers to compete with that, as budgets like Naughty Dog's are just unrealistic and unsustainable.
 

Go_Ly_Dow

Member
I am skeptical, at least until next gen.


I have been torn, but I'm now leaning towards the side that says they should stick with their modified UE4 for the final FF7 game rather than switch to UE5.
 
I am skeptical, at least until next gen.


I have been torn, but I'm now leaning towards the side that says they should stick with their modified UE4 for the final FF7 game rather than switch to UE5.
Me too, but I'm hoping FF7-3 on UE5 will age better come the PS6, I reckon.
 
Even if new hardware solves these issues through brute force, Unreal Engine 6 will come along and introduce something new that will be just as bad.

What else big could they add at this point?

I mean, it's hard to see games looking better than something like Hellblade 2; that's as photorealistic as things need to get…

…right?
 
What else big could they add at this point?

I mean, it's hard to see games looking better than something like Hellblade 2; that's as photorealistic as things need to get…

…right?
Man, I remember when people were saying this about the original Unreal.
 

blastprocessor

The Amiga Brotherhood
From the comments of that video (in the 2nd post above); I believe he posts on B3D and worked on the 2.5D racing game Trials Fusion:

@sebbbi2

7 days ago (edited)
Nanite's software raster solves quad overdraw. The problem is that software raster doesn't have HiZ culling. Nanite must lean purely on cluster culling, and their clusters are over 100 triangles each. This results in significant overdraw to the V-buffer with kitbashed content (such as their own demos). But the V-buffer is just a 64-bit triangle+instance ID. Overdraw doesn't mean shading the pixel many times.

While the V-buffer is fast to write, it's slow to resolve. Each pixel shader invocation needs to load the triangle and run code equivalent to a full vertex shader 3 times. The material resolve pass also needs to calculate analytic derivatives, and material binning has complexities (which manifest in potential performance cliffs). It's definitely possible to beat Nanite with the traditional pipeline if your content doesn't suffer much from overdraw or quad-efficiency issues and you have good batching techniques for everything you render.

However, it's worth noting that GPU-driven rendering doesn't mandate a V-buffer, SW rasterizer or deferred material system like Nanite does. Those techniques have advantages, but they have big performance implications too. When I was working at Ubisoft (almost 10 years ago) we shipped several games with GPU-driven rendering (and virtual shadow mapping): Assassin's Creed Unity with massive crowds in big city streets, Rainbow Six Siege with fully destructible environments, etc. These techniques were already usable on last-gen consoles (1.8 TFLOP/s GPU). Nanite is quite heavy in comparison. But they are targeting single-pixel triangles; we weren't. I am glad that we are having this conversation.

Also, mesh shaders are a perfect fit for a GPU-driven render pipeline. AFAIK Nanite is using mesh shaders (primitive shaders) on consoles at least, unless they use SW raster for big triangles today too. It's been a long time since I last analyzed Nanite (UE5 preview). Back then their PC version was using non-indexed geometry for big triangles, which is slow.
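sebbbi's point that the V-buffer is "just a 64-bit triangle+instance ID" is easy to picture in code. A hedged sketch (the 32/32 field split and the names are my assumptions, not Nanite's actual layout) of why overdraw is cheap at raster time but the resolve pass is where the cost lands:

```cpp
#include <cstdint>
#include <cstdio>

// One visibility-buffer texel: instance ID in the high 32 bits, triangle ID
// in the low 32 (a made-up split; the real layout may differ).
uint64_t packVisibility(uint32_t instanceId, uint32_t triangleId) {
    return (static_cast<uint64_t>(instanceId) << 32) | triangleId;
}

void unpackVisibility(uint64_t texel, uint32_t& instanceId, uint32_t& triangleId) {
    instanceId = static_cast<uint32_t>(texel >> 32);
    triangleId = static_cast<uint32_t>(texel);
}

int main() {
    // "Overdraw" during software raster just overwrites this 64-bit value
    // per pixel; no material shading happens yet.
    uint64_t texel = packVisibility(1234u, 56789u);

    // The resolve pass is the expensive part: for every pixel you unpack the
    // IDs, fetch the triangle's three vertices, and re-run vertex-shader
    // equivalent work (sebbbi's "3x vertex transforms per pixel") before the
    // material pass can shade anything.
    uint32_t inst = 0, tri = 0;
    unpackVisibility(texel, inst, tri);
    std::printf("pixel references instance %u, triangle %u\n", inst, tri);
    return 0;
}
```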
 

Laptop1991

Member
I have not been impressed with Unreal Engine 5 so far; it hasn't had the impact of previous versions like 3 or 4 for me, in either performance or looks. Maybe that's down to the publishers and devs, but maybe it just isn't as good as it should be.
 
When Unreal Engine 5 was announced, Epic were talking about how its features would really drive down the costs and time of development:

Lumen - Removes the need for artists to bake lighting and redo their work every time part of a level is redesigned

Nanite - Removes the need for artists to create several models of each asset based on its distance from the camera

MetaHuman - Removes the need for artists to create facial/body models, rigging, and animation from scratch

So, has this actually panned out? Are developers now able to create games at the level of last generation much more quickly?

Or are these features causing too many headaches because current-gen consoles aren't powerful enough, resulting in the need for yet another console generation?
The problem with developing video games, with or without UE5's features, is that you're not designing last-gen games with current engines. Whatever supposed cost optimization there is gets nullified by the increase in demands that the new features enable. Sure, you save on model costs with Nanite, but then you increase the model count tenfold; if each asset costs half as much to make but the scene calls for ten times as many, the total asset cost still goes up fivefold. Every bit of saving means more to produce in the same time period. I don't know the term for it, but you're just chasing a running train with technology.
 

Hugare

Member


Still waiting on my PS5 to play this at 60fps... maybe on a Pro, but who knows. We need the devs to make games like this; just another reminder that tech demos should always be taken with a pinch of salt. Four-odd years into this gen and Demon's Souls, a day-one launch title, is still one of the best-looking games, ffs.

I always knew that third-party titles would be shackled by the Series S and that the best graphics would come from Sony (most of the time).

I wasn't expecting Sony's output to turn mediocre, though. So now my hopes are mainly on devs who ignore the S (like Wukong) or on unicorns like GTA VI.

Regarding the topic: kinda.

Not having to make tons of different assets for different LOD levels speeds things up. Though artists now have to model every asset to *perfection* due to the fidelity people demand from AAA games nowadays; same for textures.

There must be tons of copy-and-paste in Wukong, for example, but the quantity and fidelity of those assets is just crazy.

Lumen helps create more realistic-looking environments, but without it artists would just throw some real-time GI in there and call it a day anyway. Same effort, but better results with Lumen.

Bottom line: the effort is basically the same.
 

Killer8

Member
From the mouths of developers like the Immortals of Aveum team: yes, it does reduce development time and, in turn, costs. No need to create tons of different LODs with Nanite, and with Lumen they can re-light scenes in real time. I'd rather listen to them than to the impotent rage of a teenage YouTuber with an agenda who's never shipped a game.

Of course these features have flaws, like everything, and they can be heavier on the hardware. They are not magic. But evidently the choice to lean on the hardware more in order to ship a game much faster has been calculated as worthwhile. It's not like the games are unplayable either; they just aren't meeting the loftier expectations people seem to have of hardware this gen.

The reality is that there is no silver-bullet approach to development. Something will always have to give. The other reality is that people will moan about literally everything: if it's not the retail price, it will be development time, visuals, resolution, disk-space requirements, performance, or some combination of those.

I simply look at how an AA dev like Teyon is now able to ship a game that looks as good as RoboCop: Rogue City without taking 7 years or bankrupting the company:

I'm on the last mission and I really don't want this game to end, help.

[Screenshots of RoboCop: Rogue City]
 

poppabk

Cheeks Spread for Digital Only Future
I think it has for the people who use it for that purpose: Black Myth: Wukong, RoboCop: Rogue City.
 

Mister Wolf

Gold Member
Because they look awesome, git gud hardware.

True Story. People who spend the least expect the most.



The lighting in this looks old and unimpressive, which is why it can run at 60fps at a higher resolution. Lumen, and every other RTGI solution, is the best thing to happen to gaming graphics in recent years.
 

SlimySnake

Flashless at the Golden Globes


Still waiting on my PS5 to play this at 60fps... maybe on a Pro, but who knows. We need the devs to make games like this; just another reminder that tech demos should always be taken with a pinch of salt. Four-odd years into this gen and Demon's Souls, a day-one launch title, is still one of the best-looking games, ffs.

Why? They specifically said this was running at 1440p, 30fps.
And Wukong looks better than this at times.
 

StereoVsn

Member
UE5 certainly made games run like shit at low res. I don't think the extra features are worth it, so far at least.

Maybe once we get to the next console gen and more powerful CPUs/GPUs.
 

poppabk

Cheeks Spread for Digital Only Future
It was supposed to get rid of pop-in, but introduced even more pop-in. But I guess the hardware is not there yet.
It did? I feel like that's the one thing they truly delivered on; maybe not perfect, but a definite improvement.
 

Bartski

Gold Member
It did? I feel like that's the one thing they truly delivered on; maybe not perfect, but a definite improvement.
I think while it keeps many things at full LOD, it needs to compensate elsewhere, like with textures and large-scale geometry spawning in and out. Like Lords of the Fallen; that game looked horrid at times on PlayStation.
 


Still waiting on my PS5 to play this at 60fps... maybe on a Pro, but who knows. We need the devs to make games like this; just another reminder that tech demos should always be taken with a pinch of salt. Four-odd years into this gen and Demon's Souls, a day-one launch title, is still one of the best-looking games, ffs.

Yep. Still the best thing to come out of UE5 actually running on console. Maybe the I/O was the secret of that demo after all? We still haven't seen this demo running anywhere else.
 

T4keD0wN

Member
Because they look awesome, git gud hardware.
Nah, Lumen looks like shit in just about every game. Try comparing maxed-out Wukong with ray tracing on the lowest setting against Lumen at max settings: nearly the same performance, but Lumen just looks like shit.

It's just a quick, common industry tool, and thus a cheaper way to make games, one that sacrifices quality ambitions and performance. A good tradeoff, since Lumen's potential is "good enough", but the best-looking games will never be the ones that use Lumen.
 

Senua

Member
Nah, Lumen looks like shit in just about every game. Try comparing maxed-out Wukong with ray tracing on the lowest setting against Lumen at max settings: nearly the same performance, but Lumen just looks like shit. It's just a quick, easy and cheaper way to make games that sacrifices potential quality and performance.
Because that's software Lumen. Hardware Lumen is superior. "Shit" is also hyperbolic.
 
From the comments of that video (in the 2nd post above); I believe he posts on B3D and worked on the 2.5D racing game Trials Fusion:

@sebbbi2

7 days ago (edited)
Nanite's software raster solves quad overdraw. The problem is that software raster doesn't have HiZ culling. Nanite must lean purely on cluster culling, and their clusters are over 100 triangles each. This results in significant overdraw to the V-buffer with kitbashed content (such as their own demos). But the V-buffer is just a 64-bit triangle+instance ID. Overdraw doesn't mean shading the pixel many times.

While the V-buffer is fast to write, it's slow to resolve. Each pixel shader invocation needs to load the triangle and run code equivalent to a full vertex shader 3 times. The material resolve pass also needs to calculate analytic derivatives, and material binning has complexities (which manifest in potential performance cliffs). It's definitely possible to beat Nanite with the traditional pipeline if your content doesn't suffer much from overdraw or quad-efficiency issues and you have good batching techniques for everything you render.

However, it's worth noting that GPU-driven rendering doesn't mandate a V-buffer, SW rasterizer or deferred material system like Nanite does. Those techniques have advantages, but they have big performance implications too. When I was working at Ubisoft (almost 10 years ago) we shipped several games with GPU-driven rendering (and virtual shadow mapping): Assassin's Creed Unity with massive crowds in big city streets, Rainbow Six Siege with fully destructible environments, etc. These techniques were already usable on last-gen consoles (1.8 TFLOP/s GPU). Nanite is quite heavy in comparison. But they are targeting single-pixel triangles; we weren't. I am glad that we are having this conversation.

Also, mesh shaders are a perfect fit for a GPU-driven render pipeline. AFAIK Nanite is using mesh shaders (primitive shaders) on consoles at least, unless they use SW raster for big triangles today too. It's been a long time since I last analyzed Nanite (UE5 preview). Back then their PC version was using non-indexed geometry for big triangles, which is slow.
He also wrote:

Also, it's worth remembering why Nanite was built. Nanite allows developers (and users on UGC platforms) to create scenes with millions of objects of millions of triangles each, and these scenes render from any camera location at close to fixed cost. This dream required sacrifices. The constant cost per pixel is higher: there's the V-buffer resolve to G-buffer, and there are 3x vertex transforms for each pixel. But what we get is a pipeline capable of precise occlusion culling and an almost fixed triangle count per frame.

An object in front occludes N pixels (= 1-pixel triangles each). Thus adding an object adds N triangles but also removes N triangles, so depth complexity doesn't matter. Seamless LOD is very important here, as it guarantees roughly 1:1 triangles:pixels. These two techniques make it possible to render massive scenes at close to fixed cost.

But if you have hand-optimized mid-poly triangle meshes (with normal maps baked from high-poly), then Nanite can't really do anything magical to make those meshes more optimal to render; you just pay the added constant costs. If you have complex scenes with lots of objects and lots of partial occlusion, Nanite still helps.

The biggest problem right now is that the fixed costs, plus the cost of rendering 1 triangle per pixel (that's what Nanite's LOD solution does), are today too much for the PS5 to render at native res. Thus you need a temporal upscaler and quality trade-offs. But if we assume Nanite gets optimized over the next few years, it will be awesome when next gen ships. These added fixed runtime costs will be meaningless compared to the improvements it gives. But OP is right: today, Nanite is a compromise on PS5.
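The "roughly 1:1 triangles:pixels" guarantee he describes boils down to a screen-space cutoff on a cluster hierarchy: keep refining clusters until each triangle covers about one pixel, then stop. A rough sketch of that idea (the projection maths and thresholds are my simplifications, not Nanite's implementation):

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

struct Cluster {
    int triangleCount;               // Nanite-style clusters are ~100+ triangles
    float boundsRadius;              // world-space bounding-sphere radius
    std::vector<Cluster> children;   // finer clusters covering the same surface
};

// Crude projected area in pixels of a bounding sphere (pinhole model, fovY in radians).
float projectedAreaPx(float radius, float distance, float screenHeightPx, float fovY) {
    float radiusPx = radius / (distance * std::tan(fovY * 0.5f)) * (screenHeightPx * 0.5f);
    return 3.14159265f * radiusPx * radiusPx;
}

// Select clusters so each drawn triangle covers roughly one pixel: if this
// cluster is already at ~1 px/triangle (or has no finer children), draw it;
// otherwise descend into the finer clusters.
void selectClusters(const Cluster& c, float distance, float screenHeightPx,
                    float fovY, std::vector<const Cluster*>& out) {
    float pxPerTriangle = projectedAreaPx(c.boundsRadius, distance, screenHeightPx, fovY)
                          / static_cast<float>(c.triangleCount);
    if (pxPerTriangle <= 1.0f || c.children.empty()) {
        out.push_back(&c);
        return;
    }
    for (const Cluster& child : c.children)
        selectClusters(child, distance, screenHeightPx, fovY, out);
}

int main() {
    // A toy two-level hierarchy: one coarse cluster refining into two finer ones.
    Cluster root{128, 2.0f, {{128, 1.2f, {}}, {128, 1.2f, {}}}};
    std::vector<const Cluster*> drawList;
    selectClusters(root, 25.0f, 1080.0f, 1.047f, drawList);
    std::printf("selected %zu cluster(s)\n", drawList.size());
    return 0;
}
```

Because the cutoff is per pixel rather than per object, the triangle count per frame stays nearly constant regardless of scene complexity, which is exactly the fixed cost he says is still too high for the PS5 at native res.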
 

GymWolf

Gold Member
It's one hell of an engine for sure, and if you take into account that China has the lowest salaries, they made Wukong on a far lower budget than any AAA from Sony or Ubisoft, probably close to what Nintendo spends but with actual production values and top-of-the-class graphics.

So it could very well be a money saver if used by the right devs.

Also, RoboCop punches way above its weight class thanks to UE5, and the dev is about as AA as it can possibly be.

Gaiff, put that thing in a spoiler.
 

Gaiff

SBI’s Resident Gaslighter
It's one hell of an engine for sure, and if you account for the fact that China has the lowest salaries, they made Wukong on a far lower budget than any AAA from Sony or Ubisoft, probably close to what Nintendo spends but with actual production values and top-of-the-class graphics.

So it could very well be a money saver if used by the right devs.

Gaiff, put that thing in a spoiler.
Alright. I wanted everyone to see it...

:lollipop_sad_relieved:
 

E-Cat

Member
I am skeptical, at least until next gen.


I have been torn, but I'm now leaning towards the side that says they should stick with their modified UE4 for the final FF7 game rather than switch to UE5.
I think they should skip PS5 altogether and release FF7 Remake Part 3 exclusively on PS6 on UE5.
 