Unreal Engine 5.6 is more than 30% faster than UE5.4 in CPU bound scenarios

I disagree. Peish has the right idea; this should have been one of the use cases Epic considers whenever it creates version upgrades for Unreal Engine:
"As a developer, I would like to seamlessly update to the latest version of Unreal Engine without additional overhead testing the current implementation."
If it isn't, it's something they should strive for. Epic should want their customers to be able to use the latest version effortlessly.
It's astroturfing from someone who doesn't know how it works.
I don't really know how it is for UE, but I know how it is for a lot of business software, and I expect UE to be the same.
There are usually tons of "home customizations" of the software (and unlike Windows, this kind of software does allow you to customize and even rebuild it for your own needs). And the company certainly can't support the millions of customizations its customers have made across the globe (heck, even Windows has limited compatibility and requires some effort to migrate a business setup to the next version).
The moment the developer changes something, something will break. Because, unironically, those "home customizations" often use "hacks" to make things better in ways the software developer never intended, or "workarounds" just to make things work. And the moment the core code is fixed with the best of intentions, these adjustments simply break.
Big projects with lots of legacy code are very hard to migrate, even if the new version is better.
 
It's astroturfing from someone who doesn't know how it works.
I don't really know how it is for UE, but I know how it is for a lot of business software, and I expect UE to be the same.
There are usually tons of "home customizations" of the software (and unlike Windows, this kind of software does allow you to customize and even rebuild it for your own needs). And the company certainly can't support the millions of customizations its customers have made across the globe (heck, even Windows has limited compatibility and requires some effort to migrate a business setup to the next version).
The moment the developer changes something, something will break. Because, unironically, those "home customizations" often use "hacks" to make things better in ways the software developer never intended, or "workarounds" just to make things work. And the moment the core code is fixed with the best of intentions, these adjustments simply break.
Big projects with lots of legacy code are very hard to migrate, even if the new version is better.
Yup, you can't just recompile and hope it will just migrate. Even with super small projects you are bound to run into some issues.
 
Ah, you're referring to the console versions. Yeah, it's definitely too heavy for current gen and is best experienced on PC for now.
He's lying. 640p is nonsense. He might be referring to Stalker, which is an abomination even on PC.

UE5 games may drop to 720p, but only in 60 fps modes, and that's true for virtually every game engine this gen. Snowdrop's Avatar and Outlaws can both drop to 720p far more often than UE5 games ever do. They both use FSR, which is far worse than UE5's TSR. Remedy's Northlight engine runs at 864p and still drops frames in its 60 fps mode. Had they used 720p as a base like UE5 and Snowdrop games, they would not have had performance drops. Its image quality is far worse because they also use FSR. Dead Space, which was built on Frostbite, has a higher resolution, but it literally shipped with the worst image quality of any game ever due to sub-1080p resolution and an awful VRS implementation. Dragon Age, the other Frostbite game, runs at 864p, which is the average resolution for most UE5 games. Square Enix's Luminous engine in Forspoken also drops to 720p and looks like shit compared to UE5 games. Square Enix's FF16 engine literally locks to 720p during combat sections in the 60 fps mode.

That leaves AC Shadows' Anvil engine, which literally does not have RTGI or any RT features in its 60 fps mode. It looks last gen compared to the 30 fps mode, which uses next-gen lighting. Frostbite and Forspoken both remove RT in their performance modes too, leading to a very last-gen look.

Most of the recent games like Avowed and Expedition 33 run at 1080p 60 fps because they do use the newer versions that got Lumen upgrades. Fortnite recently upgraded to hardware Lumen because Epic recently improved hardware Lumen performance to match software Lumen, again at 1080p 60 fps. This whole notion that UE5 performance still sucks on consoles is nonsense.
 
How does this not create a trust issue? Imagine, as a dev, you start developing a game on Unreal Engine. You pour everything into it and get deep into development, and then bam! A new version comes out and it says, "We have huuuge performance enhancements with this new version." So how do you feel? Also, how come 5.1, 5.2, and 5.3 are so inferior to the new versions? There shouldn't be this much difference between small increments of the version number. If there is a huge performance difference, then brand the engine UE6, why don't cha?

I really don't understand this. It is not consistent or developer friendly. I shared in one of my previous messages that some developers are not happy about this, because their fanbase demands that devs use new versions of UE, and I don't believe that's simple to do. At least make it easy for devs to upgrade their games to the new version, then.
 
How does this not create a trust issue? Imagine, as a dev, you start developing a game on Unreal Engine. You pour everything into it and get deep into development, and then bam! A new version comes out and it says, "We have huuuge performance enhancements with this new version." So how do you feel? Also, how come 5.1, 5.2, and 5.3 are so inferior to the new versions? There shouldn't be this much difference between small increments of the version number. If there is a huge performance difference, then brand the engine UE6, why don't cha?

I really don't understand this. It is not consistent or developer friendly. I shared in one of my previous messages that some developers are not happy about this, because their fanbase demands that devs use new versions of UE, and I don't believe that's simple to do. At least make it easy for devs to upgrade their games to the new version, then.
They are not small upgrades, as much as the 5.x terminology may make you think. 5.6 effectively rewrote the engine's pipeline from scratch; if anything, it's closer to what a UE6 would be.
 
Even in the likes of Silent Hill 2, what is the gain of having strobe-like, high-refresh Lumen scene updates when almost everything is static?
SH2's entire map is literally covered in dynamic fog. You need a dynamic lighting solution for it.

Also, Lumen has far better AO and GI coverage than baked lighting. The Witcher 4 devs even talked about how big a difference hardware Lumen makes over software Lumen in terms of AO and GI, and why they decided not to settle even for software Lumen, which itself has far better AO and GI coverage than baked GI solutions.
 
They are not small upgrades, as much as the 5.x terminology may make you think. 5.6 effectively rewrote the engine's pipeline from scratch; if anything, it's closer to what a UE6 would be.
He's not wrong. Epic should never have shipped UE5 without multithreading the CPU work. We saw the cracks in the Matrix demo, where the CPU was the main reason the framerate was dipping. As soon as I removed traffic and NPCs, my game ran at a solid 30 fps. On PC I literally couldn't go above 45 fps on a machine 2x more powerful than the PS5.

I can understand adding new features like Nanite foliage, like they did with UE5.2, or MegaLights, which they added in UE5.5. But basic CPU multithreading should have been there since day one.

He's also right that Epic should be making it easier to upgrade versions. Nvidia literally sends out teams of graphics programmers to help build path-traced versions from scratch. Epic should be sending engineers to help build games on newer versions of the engine. The poor CPU performance is essentially a bug, and it's their obligation to provide support, especially now that they are literally making billions from Fortnite every month.
 
lol, what goalposts? You mean me answering literally the question that was posted? Which was just about when we might expect devs to utilize these performance upgrades: right now, or in 2-3 years.

I guess you forgot what you said:
only the Expedition 33 devs migrated mid project

... which is just dishonest. Just because it's not up to your preferred UE version doesn't mean the other devs are lazy, or didn't try during the time they had left.
 
He's not wrong. Epic should never have shipped UE5 without multithreading the CPU work. We saw the cracks in the Matrix demo, where the CPU was the main reason the framerate was dipping. As soon as I removed traffic and NPCs, my game ran at a solid 30 fps. On PC I literally couldn't go above 45 fps on a machine 2x more powerful than the PS5.
I can understand adding new features like Nanite foliage, like they did with UE5.2, or MegaLights, which they added in UE5.5. But basic CPU multithreading should have been there since day one.
That's bullshit.
Read something about it before spouting layman fantasies about traveling to Alpha Centauri next year.
Multithreading in a game engine is *extremely* complex, even as a math problem, never mind a real implementation. There are way too many execution and data dependencies, and you either have some super experienced genius guys who can handle this stuff through skill alone, or you try to "automate" it based on rules and heuristics, which will perform so-so.
Multithreading in a highly interdependent system is a very, very complex task. Even the optimization in the OP is not that much about multithreading: only core 1 and core 3 (basically the first two threads) see a visible improvement. The rest perform the same, with a visible drop (from 70% to 40%).
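To make the dependency point concrete, here is a minimal plain-C++ sketch (illustrative only, not Unreal's task system): the independent per-object work fans out across threads easily, but the stage that depends on all of it has to wait for a join, and that serial tail is exactly what makes "just multithread the engine" hard.

```cpp
// Minimal illustrative sketch (plain C++, not Unreal's task system): why data
// dependencies limit how much of a frame can actually run in parallel.
#include <cstddef>
#include <future>
#include <numeric>
#include <vector>

struct Object { float pos = 0.f; float vel = 1.f; };

// Independent per-object work: trivially parallel across chunks.
void SimulateChunk(std::vector<Object>& objs, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i)
        objs[i].pos += objs[i].vel * dt;
}

float BuildFrame(std::vector<Object>& objs, float dt) {
    // Fan the simulation out across worker threads (no dependency between chunks).
    const std::size_t half = objs.size() / 2;
    auto a = std::async(std::launch::async, SimulateChunk, std::ref(objs), std::size_t{0}, half, dt);
    auto b = std::async(std::launch::async, SimulateChunk, std::ref(objs), half, objs.size(), dt);
    a.get();
    b.get(); // ...but the next stage depends on *every* result, so we must join here.

    // The dependent stage runs after the join. This part stays serial no matter
    // how many cores you have, which is the whole problem with "just multithread it".
    return std::accumulate(objs.begin(), objs.end(), 0.f,
                           [](float sum, const Object& o) { return sum + o.pos; });
}
```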
 
I guess you forgot what you said:


... which is just dishonest. Just because it's not up to your preferred UE version doesn't mean the other devs are lazy, or didn't try during the time they had left.
It's not my preferred version; we are literally talking about the version that improved CPU performance. Look at the title of the thread. This is not that hard.
 
That's bullshit.
Read something about it before spouting layman fantasies about traveling to Alpha Centauri next year.
Multithreading in a game engine is *extremely* complex, even as a math problem, never mind a real implementation. There are way too many execution and data dependencies, and you either have some super experienced genius guys who can handle this stuff through skill alone, or you try to "automate" it based on rules and heuristics, which will perform so-so.
Multithreading in a highly interdependent system is a very, very complex task. Even the optimization in the OP is not that much about multithreading: only core 1 and core 3 (basically the first two threads) see a visible improvement. The rest perform the same, with a visible drop (from 70% to 40%).

Although what you say is true, the fact remains that UE5 and UE4, for several years, had terrible multithreading capabilities.
Now compare that to CryEngine in Crysis 3, still using DX11, back in 2013, and it speaks volumes about the incompetence on Epic's part.
And it's not like CryEngine is the only engine with much better CPU threading capabilities than UE4 and 5.
 
It's not my preferred version; we are literally talking about the version that improved CPU performance. Look at the title of the thread. This is not that hard.

I don't think you know what you initially replied to:
are developers gonna bother migrating mid-project

And then you listed developers other than Sandfall, who did bother migrating to a newer version of UE than what they started with... You're right, it isn't hard
 
That's bullshit.
Read something about it before spouting layman fantasies about traveling to Alpha Centauri next year.
Multithreading in a game engine is *extremely* complex, even as a math problem, never mind a real implementation. There are way too many execution and data dependencies, and you either have some super experienced genius guys who can handle this stuff through skill alone, or you try to "automate" it based on rules and heuristics, which will perform so-so.
Multithreading in a highly interdependent system is a very, very complex task. Even the optimization in the OP is not that much about multithreading: only core 1 and core 3 (basically the first two threads) see a visible improvement. The rest perform the same, with a visible drop (from 70% to 40%).
Take your own advice and read Epic's own release notes next time:

The overall PCG performance (Beta) has also been improved with multithreading support, enabling the system to distribute workloads more efficiently across multiple cores for faster processing, smoother interactions, and a more responsive experience, especially in complex or large-scale environments.


They literally mention the new CPU multithreading in the hour-long interview with the Witcher 4 devs.

And that's just UE5.6. They had already used multithreading to get a massive 50% increase in CPU-limited scenarios when they released UE5.4 last year. DF covered it in great detail.

Enter Unreal Engine 5.4. There are a fair few changes here, but the biggest and most important change is renderer parallelisation - essentially splitting up the render thread to better utilise multi-core CPUs. This change has a profound impact on performance when comparing Unreal Engine's city benchmark between versions 5.0 and 5.4. In a CPU-limited scenario on a Ryzen 7 7800X3D, the frame-rate is 42 percent higher in 5.4.

Put another way, you're going from having basically no chance to see 60fps on one of the best gaming CPUs - one that wasn't even available when the first UE5 demo launched! - to easily pushing frame-rates into the 80s. That's a huge improvement to overall performance, despite the content and settings remaining the same, and it's predominantly down to that better CPU utilisation.
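For anyone wondering what "splitting up the render thread" roughly means in practice, here is a rough C++ sketch of the general idea (my own illustration, not Epic's code): worker threads each record their own list of draw commands, and only the ordered submission at the end stays on a single thread.

```cpp
// Rough sketch of the idea behind "renderer parallelisation" (my illustration,
// not Epic's code): workers record chunks of the scene into their own command
// lists, and only the final, ordered submission stays on one thread.
#include <algorithm>
#include <cstddef>
#include <future>
#include <iostream>
#include <string>
#include <vector>

using CommandList = std::vector<std::string>; // stand-in for real GPU command buffers

// Recording a draw for one object does not depend on any other object...
CommandList RecordChunk(const std::vector<std::string>& objects, std::size_t begin, std::size_t end) {
    CommandList cmds;
    for (std::size_t i = begin; i < end; ++i)
        cmds.push_back("draw " + objects[i]);
    return cmds;
}

void RenderFrame(const std::vector<std::string>& objects, std::size_t workers) {
    const std::size_t chunk = (objects.size() + workers - 1) / workers;
    std::vector<std::future<CommandList>> jobs;
    for (std::size_t start = 0; start < objects.size(); start += chunk)
        jobs.push_back(std::async(std::launch::async, RecordChunk, std::cref(objects),
                                  start, std::min(start + chunk, objects.size())));

    // ...so the expensive recording fans out across cores, while the cheap,
    // order-sensitive submission below is the only part left on a single thread.
    for (auto& job : jobs)
        for (const auto& cmd : job.get())
            std::cout << cmd << '\n'; // stand-in for the actual GPU submit
}
```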
 
Although what you say is true, the fact remains that UE5 and UE4, for several years, had terrible multithreading capabilities.
Now compare that to CryEngine in Crysis 3, still using DX11, back in 2013, and it speaks volumes about the incompetence on Epic's part.
And it's not like CryEngine is the only engine with much better CPU threading capabilities than UE4 and 5.
It's exactly "have some some super experienced genius guys who can toss this stuff with skill alone"
For one game, from a tech nerds (Crytek came from demo scene, they are familiar with low level optimizations) - it's not a problem. For engine where most who buy it, buy it because they can't afford cost of low level optimization - results will be different.

They literally mention the new CPU multithreading in the hour-long interview with the Witcher 4 devs.
And you literally don't understand what it means.
Based on the per-thread performance shown, they (by my presumption) improved the scheduler on core 0. That also falls under "multithreading performance", but it's not really the performance of the multithreading itself or an improvement to it. It's just a core part of the multithreading machinery, one that is explicitly single-threaded, performing better. Not something advanced or a proper way to parallelize; just "2+2" working a bit faster.
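To make that distinction concrete, here is a toy Amdahl's-law style calculation in C++ with made-up numbers (assumed, not measured from UE): shrinking the serial "dispatcher" portion of the frame raises the framerate noticeably even though nothing becomes more parallel.

```cpp
// Toy Amdahl's-law numbers (assumed, not measured from UE): shrink the serial
// part of the frame (the task "dispatcher") and the framerate jumps, even
// though the parallel part and the core count are completely unchanged.
#include <cstdio>
#include <initializer_list>

double FrameTimeMs(double serial_ms, double parallel_ms, int cores) {
    return serial_ms + parallel_ms / cores; // classic Amdahl-style model
}

int main() {
    const double parallel_ms = 24.0; // work that already scales across cores
    const int    cores       = 8;
    for (double serial_ms : {10.0, 5.0}) { // "before" vs "after" a faster dispatcher
        double frame = FrameTimeMs(serial_ms, parallel_ms, cores);
        std::printf("serial %.0f ms -> frame %.1f ms (~%.0f fps)\n",
                    serial_ms, frame, 1000.0 / frame);
    }
    return 0;
}
```

In that toy model, halving the serial chunk takes you from roughly 77 fps to roughly 125 fps without the parallel work changing at all; that is the kind of gain the serial portion alone can deliver.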
 
I'm rather curious to know how Black Myth would perform on consoles and PC if ported from the Unreal 5.0 version it is on to 5.6. That 40fps mode on PS5 Pro might very well be 60fps.
 
It's exactly "have some some super experienced genius guys who can toss this stuff with skill alone"
For one game, from a tech nerds (Crytek came from demo scene, they are familiar with low level optimizations) - it's not a problem. For engine where most who buy it, buy it because they can't afford cost of low level optimization - results will be different.


And you literally don't understand what it means.
Based on the per-thread performance shown, they (by my presumption) improved the scheduler on core 0. That also falls under "multithreading performance", but it's not really the performance of the multithreading itself or an improvement to it. It's just a core part of the multithreading machinery, one that is explicitly single-threaded, performing better. Not something advanced or a proper way to parallelize; just "2+2" working a bit faster.
WTF are you talking about? We are talking about improving CPU performance. You are in a fucking thread that shows performance improvements. I linked you another article that shows they had already previously improved CPU performance. They themselves said they improved CPU performance by multithreading. Who gives a shit which route they took to do it. They did it. Just take the L and move on.
 
It's exactly "have some some super experienced genius guys who can toss this stuff with skill alone"
For one game, from a tech nerds (Crytek came from demo scene, they are familiar with low level optimizations) - it's not a problem. For engine where most who buy it, buy it because they can't afford cost of low level optimization - results will be different.

But CryEngine was also used as a third-party engine. If Crytek hadn't had financial issues and almost gone under, they would still be a competitor to UE5.
And a better option, as their engine was vastly better optimized than UE5 and 4.
 
I'm rather curious to know how Black Myth would perform on consoles and PC if ported from the Unreal 5.0 version it is on to 5.6. That 40fps mode on PS5 Pro might very well be 60fps.
Pretty sure Wukong is GPU-bound and, for whatever reason, they don't want to go below 1080p. I have zero issues hitting 60 fps in Wukong on PC.

Virtually every single UE5 game has run at 60 fps on consoles since day one. The drops are mostly GPU-related unless you go into towns, where NPCs and other hub-world logic start to creep up on the CPU. But aside from Stalker and Avowed, there aren't any UE5 games with NPCs or hub worlds, so it's not really an issue.
 
But CryEngine was also used as a third-party engine. If Crytek hadn't had financial issues and almost gone under, they would still be a competitor to UE5.
And a better option, as their engine was vastly better optimized than UE5 and 4.
Only because it's not pushing the tech that UE5 is pushing. They have their own software-based GI solution, similar to software Lumen, in KCD2, but no Nanite to speak of. Lighting is absolutely gorgeous in that game, but asset quality is last gen at best. Their big town looks dated and last gen, even if the big fields look great. The forests also look pretty meh because their GI solution can't compete with RTGI or hardware Lumen.

And yet it runs at 1440p 30 fps and 1080p 60 fps. Same resolutions as this year's UE5 games. Even comparing previous UE5 games like RoboCop, Wukong, and SH2, they run at similar 1440p resolutions at 30 fps, only dropping to 864p in the 60 fps modes, despite running both Nanite and Lumen.

Technically, Decima and other Sony engines run much better, at around 1440p 60 fps. Demon's Souls and Ratchet both run at 1440p 60 fps. HFW runs at 1800p CB or 1296p at 60 fps. Means nothing if they aren't pushing next-gen tech, does it?

P.S. Spider-Man 2 actually drops to 1008p because they forced RT in the performance mode this time around (Miles and Ratchet have non-RT modes running at 1440p). So as soon as you enable RT, and it only uses RT reflections, an actual next-gen feature, they start to perform just as badly as UE5. I will never understand the hate UE5 games get on these boards, because we used to have weekly existential threads about how graphics are not improving this gen; meanwhile, everyone is trashing the only engine that bothered to use next-gen features like Lumen and Nanite.
 
WTF are you talking about? We are talking about improving CPU performance. You are in a fucking thread that shows performance improvements. I linked you another article that shows they had already previously improved CPU performance. They themselves said they improved CPU performance by multithreading. Who gives a shit which route they took to do it. They did it. Just take the L and move on.
They improved multithreading support, not performance by multithreading.
There is a difference, but you need to understand a little bit about how these things work to see it.
It's the same single-thread performance optimization (at least the 5.6 update is); I see nothing that shows improved parallelization or better use of the multithreading. The quality of the multithreading is the same; the "little guy" handing out the tasks just works a bit faster.

But CryEngine was also used as a third-party engine. If Crytek hadn't had financial issues and almost gone under, they would still be a competitor to UE5.
And a better option, as their engine was vastly better optimized than UE5 and 4.
It's a problem with nerds: they prioritize "working best" over "features the customers want".
In reality, "nanite next" sells more copies than fixing multithreading, so the engine gets fixed only to the point where a reasonable number of PCs/consoles can run "nanite next".
 
The question is, are developers gonna bother migrating mid-project or are we going to start seeing the benefits in 2-3 years when there'll be an even more performant version of the engine out there?

Unreal 5.4 came out in April 2024.
Unreal 5.4.4 came out in August 2024.
Expedition 33 came out in April 2025, with the reveal being in June 2024.
Exp33 shipped on 5.4.4; they almost certainly migrated the project pretty late in development.


Unreal 5.5 came out in November 2024, so I'd think we will start seeing 5.5 games in the second half of the year.
I know Ark is migrating to 5.5 soon, if it hasn't already happened.

Unreal 5.6 came out two weeks ago.
So 5.6 games maybe early next year for bigger titles; smaller titles might even make it for the holiday season... no reason not to migrate if the engine revision brings enough to the project.
 
SH2's entire map is literally covered in dynamic fog. You need a dynamic lighting solution for it.

Not sure what you mean here; the fog is not affecting Lumen. It's not even in the reflections, and it's not in the SDF; you see things in the puddle reflections that are much further away than you can even see in front of you when there's fog. It has no impact on the surrounding lighting. The density of the fog is not sparsely blocking certain sections from the global illumination.

Also, Lumen has far better AO and GI coverage than baked lighting. The Witcher 4 devs even talked about how big a difference hardware Lumen makes over software Lumen in terms of AO and GI, and why they decided not to settle even for software Lumen, which itself has far better AO and GI coverage than baked GI solutions.

*Lumen has far better AO and GI coverage than lazy baked lighting for static games. There's no world where, with a tad of effort, Lumen is actually required for static games. It's an easy way out for artists, and I can understand that; Doom and Indiana Jones have done it, but there's no world where Lumen suddenly made SH2 impossible to make with baked lighting.

Witcher 4 is an actual massive open world; this helps the artists immensely.

I'm just not convinced Nanite and Lumen guaranteed the super high fidelity that would warrant the high cost of this unoptimized engine. Imo, not many games warranted the performance cost for what you actually see on screen.
 
Fortnite currently runs on Unreal 5.6; I got a major boost in performance when it switched. The game ran great even before the switch, but it's even better now.
 
"super high fidelity features like nanite and hardware lumen"

Or even just Lumen

(Meh GIF)


Not many UE5 games really pushed it to the point where I said, wow, this wasn't possible any other way. Wukong was probably the first one, imo.

Even in the likes of Silent Hill 2, what is the gain of having strobe-like, high-refresh Lumen scene updates when almost everything is static?

It's an easy tool for devs to get graphical quality out fast, but the results were not always super high fidelity, imo.

Epic is showcasing that it was heavy for no reason; the optimizations they talk about now for such core engine tech mean this engine was pushed out the door way too fucking fast. This gen has basically been a guinea pig test for Epic. And their stutter-fixing promises on PC I won't believe until I see them.

This is how it is with every release of UE. Do you remember Silicon Knights suing Epic over UE3? Every gen there's a new UE and every gen there are teething issues with it. Some devs, like The Coalition, do great things with it while others struggle.
 
This is how it is with every release of UE. Do you remember Silicon Knights suing Epic over UE3? Every gen there's a new UE and every gen there are teething issues with it. Some devs, like The Coalition, do great things with it while others struggle.

Yup, I agree. UE4 also only flexed late in the last gen, except in the hands of a few select developers.

I imagine as we go into the next gen again, they'll bring in something that gimps performance.
 
"super high fidelity features like nanite and hardware lumen"

Or even just Lumen

(Meh GIF)


Not many UE5 games really pushed it to the point where I said, wow, this wasn't possible any other way. Wukong was probably the first one, imo.

Even in the likes of Silent Hill 2, what is the gain of having strobe-like, high-refresh Lumen scene updates when almost everything is static?

It's an easy tool for devs to get graphical quality out fast, but the results were not always super high fidelity, imo.

Epic is showcasing that it was heavy for no reason; the optimizations they talk about now for such core engine tech mean this engine was pushed out the door way too fucking fast. This gen has basically been a guinea pig test for Epic. And their stutter-fixing promises on PC I won't believe until I see them.

I don't get why people say this shit.

With ongoing development, things will always get better... there is no such thing as "it was released too early" or whatever.
Was Blender 2.x pushed out too early because Blender 4.x has Vulkan support and renders much, much faster?
No, it's just new features they are implementing over time.

Software under ongoing development doesn't have a "too early" or "too fast" or whatever; when a build is production-ready, it's pushed.
Just because a newer build is better doesn't mean the prior build was faulty or too early or whatever.

Otherwise software would never come out; we would just be sitting around while devs keep adding more and more features and optimizations literally forever.


People say the same shit in the Stack community about 3D Studio Max: the yearly release is always insulted as coming out too early, then when 202x.1 comes out they say, damn, this should have been the true 202x release... No, no, that's not how shit works; otherwise they would never release builds at all, because there will constantly be something in development.


Just look at the Unreal Engine roadmap... even 5.6 is not the "final version" and there's more shit they are adding. Is 5.6 also pushed out the door too fast? Should they NOT release it and instead hold on till 2034, when they have implemented the entire roadmap, to release Unreal Engine 5.0F, so they can start working on Unreal Engine 6, which will only be "finished" in 2050?
Game devs would be stuck with Unreal Engine 4... which would have only actually released in 2021, so imagine every Unreal Engine 4 game prior to 2021 simply NOT coming out on Unreal Engine 4, as they would have still been working on getting to the "final version".

Come on, guys, think.
 
It's a problem with nerds: they prioritize "working best" over "features the customers want".
In reality, "nanite next" sells more copies than fixing multithreading, so the engine gets fixed only to the point where a reasonable number of PCs/consoles can run "nanite next".

It's not Nanite that sells Unreal. It's the dev-focused interface, tools, and workflow.
It's studios that buy Unreal, and what they want is productivity. That is the thing Unreal excels at, and it's what made UE4 and UE5 the leaders.
 
Only because it's not pushing the tech that UE5 is pushing. They have their own software-based GI solution, similar to software Lumen, in KCD2, but no Nanite to speak of. Lighting is absolutely gorgeous in that game, but asset quality is last gen at best. Their big town looks dated and last gen, even if the big fields look great. The forests also look pretty meh because their GI solution can't compete with RTGI or hardware Lumen.

And yet it runs at 1440p 30 fps and 1080p 60 fps. Same resolutions as this year's UE5 games. Even comparing previous UE5 games like RoboCop, Wukong, and SH2, they run at similar 1440p resolutions at 30 fps, only dropping to 864p in the 60 fps modes, despite running both Nanite and Lumen.

Technically, Decima and other Sony engines run much better, at around 1440p 60 fps. Demon's Souls and Ratchet both run at 1440p 60 fps. HFW runs at 1800p CB or 1296p at 60 fps. Means nothing if they aren't pushing next-gen tech, does it?

P.S. Spider-Man 2 actually drops to 1008p because they forced RT in the performance mode this time around (Miles and Ratchet have non-RT modes running at 1440p). So as soon as you enable RT, and it only uses RT reflections, an actual next-gen feature, they start to perform just as badly as UE5. I will never understand the hate UE5 games get on these boards, because we used to have weekly existential threads about how graphics are not improving this gen; meanwhile, everyone is trashing the only engine that bothered to use next-gen features like Lumen and Nanite.

You have to consider that Crytek was almost bankrupt a few years ago, due to several failed projects.
So they had to close studios and fire people. And because of that, CryEngine development ground to a halt.
Even things like support for DX12 took too long.
So while Epic had all that Fortnite money to fund UE5's new tech, Crytek had almost nothing.
 
Not sure what you mean here; the fog is not affecting Lumen. It's not even in the reflections, and it's not in the SDF; you see things in the puddle reflections that are much further away than you can even see in front of you when there's fog. It has no impact on the surrounding lighting. The density of the fog is not sparsely blocking certain sections from the global illumination.



*Lumen has far better AO and GI coverage than lazy baked lighting for static games. There's no world where, with a tad of effort, Lumen is actually required for static games. It's an easy way out for artists, and I can understand that; Doom and Indiana Jones have done it, but there's no world where Lumen suddenly made SH2 impossible to make with baked lighting.

Witcher 4 is an actual massive open world; this helps the artists immensely.

I'm just not convinced Nanite and Lumen guaranteed the super high fidelity that would warrant the high cost of this unoptimized engine. Imo, not many games warranted the performance cost for what you actually see on screen.
I think volumetric effects like fog need data from the lighting to look accurate. Volumetric effects in general are all realtime, and they are very heavy on the GPU precisely because they are realtime. This is the first game where the entire level is doused in volumetric fog.

As for whether or not it's lazy to use it in a linear game, we've done this before, and I don't wish to rehash another old argument. I will concede that it was a poor choice in Doom and Indy, because those games are too heavy and the visual fidelity is still last gen. But at least Wukong and SH2 look next gen. It looks like Death Stranding 2 and Ghost of Yotei are using baked lighting, so we will soon have some comparisons. Right now, though? AW2 is the only current-gen game with baked lighting that comes close to games with Lumen or RTGI. One game. So it's possible, but AW2 itself is a very heavy game on the GPU even with baked lighting, with the exact same performance profile as UE5 games. At that point, why not use realtime GI and go with faster dev times?
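For what it's worth, here is a simplified CPU-side C++ sketch of why volumetric fog wants live lighting data (illustrative only; this is not SH2's or Unreal's actual shader code): every step along the view ray has to ask the lighting system how much light reaches that bit of fog, which is why the effect is both realtime and expensive.

```cpp
// Simplified CPU-side sketch (illustrative; not SH2's or Unreal's shader code)
// of why volumetric fog wants live lighting data: each raymarch step asks the
// lighting system how much light reaches that bit of fog, so the whole thing
// must be re-evaluated per pixel, per frame, many steps deep.
#include <cmath>

struct Vec3 { float x, y, z; };

// Hypothetical stand-in for "query the lighting system": a single point light
// with a simple falloff. In a real engine this is where shadow maps, light
// grids, or the GI solution would be sampled.
float IncomingLight(const Vec3& p) {
    const Vec3 light{0.f, 5.f, 0.f};
    const float dx = p.x - light.x, dy = p.y - light.y, dz = p.z - light.z;
    return 1.0f / (1.0f + dx * dx + dy * dy + dz * dz);
}

float MarchFog(Vec3 origin, Vec3 dir, float maxDist, int steps, float density) {
    const float stepLen = maxDist / steps;
    float transmittance = 1.0f; // how much of the background still shows through
    float inscattered   = 0.0f; // light the fog itself adds toward the camera
    for (int i = 0; i < steps; ++i) {
        const Vec3 p{origin.x + dir.x * stepLen * i,
                     origin.y + dir.y * stepLen * i,
                     origin.z + dir.z * stepLen * i};
        const float absorb = std::exp(-density * stepLen); // Beer-Lambert attenuation
        inscattered   += transmittance * (1.0f - absorb) * IncomingLight(p);
        transmittance *= absorb;
    }
    return inscattered; // cost scales with steps * pixels * frames, all at runtime
}
```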
 
You have to consider that Crytek was almost bankrupt a few years ago, due to several failed projects.
So they had to close studios and fire people. And because of that, CryEngine development ground to a halt.
Even things like support for DX12 took too long.
So while Epic had all that Fortnite money to fund UE5's new tech, Crytek had almost nothing.
I understand. I am not criticizing CryEngine or the KCD2 devs. I am simply pointing out the performance profile of these games, which don't even utilize the CPU- or GPU-heavy UE5 features. Everything has a cost, but people want to have their cake and eat it too.
 
They already upgraded CPU performance in 5.4, so with this we finally have an almost optimized engine. Stutter fixes next (CDPR are working on that)...

Too bad we still see games launching on ancient UE5 versions because developers can't be bothered to migrate the project. Epic is probably also to blame; the process should be as easy as possible for devs.
 
Slimey will defend this crap engine no matter how poorly it runs. As long as it looks good in still images he doesn't care about FPS.
Nah, I just point out facts that show it's just as performant as other engines on consoles. If you disagree with the numbers I provided, feel free to correct me. I have zero problem admitting my mistakes.

And I have repeatedly said that Epic shouldn't have shipped the engine with CPU bottlenecks that cause framerate drops in hub worlds and with NPCs. But that affects only very specific games. Most games on UE5 are fine and consistently hit 60 fps at around the same resolution as Frostbite, Anvil, CryEngine, Snowdrop, and Northlight engine games.
 