UE5 has been the worst thing to happen to videogames since someone came up with the word "microtransaction".
Indiana Jones, using only RTGI, is sub-1800p (runs at 60, so it gets a pass).
Clair runs at 1080p60 and 1440p30 with a mix of RT features.
Avatar runs at 1000p60 and 1440p30 with some cool RT.
Final Fantasy XVI runs at 1080p60 and 1440p30.
Alan Wake 2 runs at 1300p30 and 900p60 with a bevy of RT features.
These are the custom engines that immediately come to mind when I think about feature sets; consider that XVI doesn't actually seem to be using any RT.
Does Unreal Engine really seem that far off?
Which custom engines are you talking about that have thrown the same amount of features at these consoles to be comparable?
The only one really is Northlight, and we've seen how low that engine goes.
Multithreading has been around for more than a decade, and now they are adding it?
This part here is interesting. I never thought Unreal Engine had this kind of limitation.
Seems like the only interesting thing that sets it apart from other game engines for me is Nanite.
First Unreal Engine 6 Info Shared by Epic's Tim Sweeney – Preview Versions in 2-3 Years, Goal Is to Go Multithreaded
"The biggest limitation that's built up over time is the single-threaded nature of game simulation on Unreal Engine. We run a single-threaded simulation. If you have a 16 core CPU, we're using one core for game simulation and running the rest of the complicated game logic because single-thread programming is orders of magnitude easier than multi-thread programming, and we didn't want to
burden either ourselves, our partners, or the community with the complications of multi-threading.
Over time, that becomes an increasing limitation, so we're really thinking about and working on the next generation of technology and that being Unreal Engine 6, that's the generation we're actually going to go and address a number of the core limitations that have been with us over the history of Unreal Engine and get those on a better foundation that the modern world deserves, given everything that's been learned in the field of computing in that timeframe."
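To make that concrete, here's a minimal toy sketch in plain C++ (nothing Epic-specific; the Entity struct and both tick functions are invented for illustration) of a "single-threaded simulation" tick versus the same work fanned out across cores. The loop itself is trivial to split; the "orders of magnitude harder" part Sweeney is alluding to is keeping game logic correct when actors read and write each other's state from multiple threads.

```cpp
// Toy sketch, not Unreal code: single-threaded vs. multithreaded game simulation.
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

struct Entity {
    float position = 0.0f;
    float velocity = 1.0f;
};

// Single-threaded tick: one core walks every entity, the other 15 sit idle.
void TickSingleThreaded(std::vector<Entity>& entities, float dt) {
    for (Entity& e : entities) {
        e.position += e.velocity * dt;
    }
}

// Multithreaded tick: split the entity list into chunks, one worker per chunk.
// This only works cleanly because each entity here is fully independent; real
// game logic (AI reading other actors, physics contacts, etc.) is the hard part.
void TickMultiThreaded(std::vector<Entity>& entities, float dt) {
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const size_t chunk = (entities.size() + workers - 1) / workers;

    std::vector<std::thread> threads;
    for (unsigned w = 0; w < workers; ++w) {
        const size_t begin = w * chunk;
        const size_t end = std::min(entities.size(), begin + chunk);
        if (begin >= end) break;
        threads.emplace_back([&entities, begin, end, dt] {
            for (size_t i = begin; i < end; ++i) {
                entities[i].position += entities[i].velocity * dt;
            }
        });
    }
    for (std::thread& t : threads) {
        t.join();
    }
}

int main() {
    std::vector<Entity> entities(100000);
    TickSingleThreaded(entities, 1.0f / 60.0f);
    TickMultiThreaded(entities, 1.0f / 60.0f);
    std::printf("entity 0 position: %f\n", entities[0].position);
    return 0;
}
```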
I'm convinced UE5 is just broken at the core and can't be brute-forced.
Just in time to overburden hardware that has just caught up with UE5, then.
You can't be serious? UE5 might have more features, but it runs like dogshit on all platforms.
Custom engines that look and run better and have more features than Unreal Engine 5?
Name them.
P.S. Isn't UE5 on console mostly fine? It's in the PC space where the real issues are.
Yet fortnite looks and plays great on a toaster with UE 5.5
The hardware isn't the problem. A game like KCD2 running on CryEngine looks and runs like a dream on my computer. Meanwhile the Oblivion remaster chugs like a fuck because of UE5.
They really need to optimize that shit, it's a cancer for videogames.
Almost every game on UE suffers from some degree of stutter. Are you suggesting the whole industry is incompetent?
Yet fortnite looks and plays great on a toaster with UE 5.5
But yeah, let's blame the engine instead of the developers
Since the RTX 3000/6XXX/Covid era, absolutely 100%
Almost every game on UE suffers from some degree of stutter. Are you suggesting the whole industry is incompetent?
39 guys made a great game. Shame it runs like shit.
Since the RTX 3000/6XXX/Covid era, absolutely 100%
30 guys just did a great game on UE5 that runs flawlessly, says a lot.
39 guys made a great game. Shame it runs like shit.
Oblivion vs KCD2.
No, it doesn't. Nice try finding a video from a small youtuber trying to prove some ridiculous point. It runs a steady 60fps on PS5/Xbox and on PC easily.
But hey, if you'd rather have lazy developers brute-forcing games instead of optimizing them, like they have been the last few years, while blaming UE5, go for it.
Can we get reliably stable UE5 usage across the board first?
And the hardware to overcome its inherent performance flaws is about 9-10 years away (optimistically).
UE5 didn't fix UE4's traversal stutter. I bet you UE6 will still have traversal stutter in some games.
Yeah, I'm sure it will fix everything…..
That's pretty much what UE X.0 has been for a while. It's like a World of Warcraft expansion.
Not really a new engine.
New UI.
UEFN built in.
Verse Programming language.
I'm guessing since it's expected in 2 years or so it's a name change as they might have exhausted the 5.x numbering.
We are already at 5.6p.
So it's kinda like CryEngine V and CryEngine 6
For those that don't know, CryEngine 5 was CryEngine's fourth iteration, and CryEngine 6 is meant to be what was going to be CryEngine 5.7.
Name forwarding without major changes under the hood.
I'll see your UE5 and raise you id Tech.
UE is the best engine available by a wide margin: the technology, the reach (it supports every device), the quality of the code, the documentation, the tools...
literal cartoon shooter made for mobile phones runs great, who could have guessed?
Yet fortnite looks and plays great on a toaster with UE 5.5
But yeah, let's blame the engine instead of the developers
Clair Obscur: Expedition 33
literal cartoon shooter made for mobile phones runs great, who could have guessed?
Now give an example for a game using an artstyle the engine actually branded itself on
Yet fortnite looks and plays great on a toaster with UE 5.5
But yeah, let's blame the engine instead of the developers
Clair Obscur: Expedition 33
Lumen is the worst real-time GI tech in the industry.
(watch in 4K)
Low ray count + low-precision BVH:
Lack of screen-space information when too far away, as well as obvious light leak and denoiser boiling.
Weird brightness changes due to the lack of screen-space information for secondary bounces, as well as specular highlights flaring up in the dark from reflecting a bright cubemap while Lumen tries to react to the changing camera to fill the gaps; that's where you see those white sparkles:
Another example of reflection ghosting.
One issue that happens from time to time with Software Lumen is that some scenes just go full darkness due to the lack of screen-space information for secondary bounces. You saw this partly in the third video above.
Once the screen-space information is gone, the entire GI just blacks out, as no primary light bounce is visible anymore for it to create ambient lighting.
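For anyone who wants that failure mode spelled out, here's a toy C++ sketch (this is not Lumen source; every type and function in it is made up for illustration) of why GI that leans on screen-space data can collapse to black once the bright surface leaves the view:

```cpp
// Toy illustration only, not Lumen code: why screen-space-dependent GI
// can go fully dark when the lit surface is no longer visible on screen.
#include <cstdio>
#include <optional>

struct Vec3 { float x, y, z; };

// Stand-in for "is this point visible in the current frame's buffers?"
// In this toy, only points in front of the camera plane (z > 0) are on screen.
std::optional<Vec3> SampleScreenSpaceLighting(const Vec3& worldPos) {
    if (worldPos.z > 0.0f) {
        return Vec3{1.0f, 0.9f, 0.8f};  // pretend bright, directly lit surface
    }
    return std::nullopt;  // off screen: no color/depth data to sample
}

// Indirect light at a shaded point comes from light bouncing off a bright
// surface elsewhere. A screen-space-only bounce can only pick that light up
// while the bright surface stays visible in the frame.
Vec3 IndirectLighting(const Vec3& brightSurface) {
    if (auto lit = SampleScreenSpaceLighting(brightSurface)) {
        return *lit;
    }
    // The camera turned away and the bright surface left the screen: the
    // primary bounce is gone, so the bounce contribution falls to black,
    // which is the "whole scene goes dark" symptom described above.
    return Vec3{0.0f, 0.0f, 0.0f};
}

int main() {
    const Vec3 visibleSource{0.0f, 0.0f, 5.0f};
    const Vec3 offScreenSource{0.0f, 0.0f, -5.0f};
    const Vec3 a = IndirectLighting(visibleSource);
    const Vec3 b = IndirectLighting(offScreenSource);
    std::printf("bounce from visible source:    %.1f %.1f %.1f\n", a.x, a.y, a.z);
    std::printf("bounce from off-screen source: %.1f %.1f %.1f\n", b.x, b.y, b.z);
    return 0;
}
```

Real Lumen does have world-space fallbacks (distance fields, the surface cache), so the blackout only shows up when those are too coarse to pick the bounce back up, but the symptom is the same as in the sketch.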
Fortnite needs to die.
Bruh, I play fortnite on pc every day. The first game, the shaders load; I get off at the end of the bus, then it runs flawless for the rest of the day. Sorry mate, it runs great.
play Fortnite on PC... I dare you.
After each driver update and each game update you have like 5 matches full of constant shader stutters. And every time a player has a skin you haven't seen before (or haven't seen since the last update) you'll also get a stutter.
THE LITERAL ENGINE DEVELOPERS' OWN GAME HAS INSANE SHADER STUTTERS! Let that sink in.
Also, if you play it on a toaster you won't use any of UE5's actual features and will basically run it with visuals that are equivalent to UE4.
And if you want to not have a competitive disadvantage, you run it in a mode that downgrades the visuals to literal mobile-game level (known as performance renderer mode).
Even in that mode it has shader stutters tho, btw, just not as extreme as in DX11 or DX12 mode.
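For context on why it behaves exactly like that (hitching right after driver/game updates, stutter on never-before-seen skins, then smooth): shaders get compiled into pipeline state objects, updates invalidate the cached ones, and a new skin means a material that has never been compiled on your machine. Here's a minimal toy sketch in C++ (not Unreal's PSO system; the names and the fake 80 ms compile cost are invented) of first-use compilation landing inside a frame versus precaching during a load screen:

```cpp
// Toy sketch, not Unreal's PSO code: why first-use shader/PSO compilation
// shows up as a hitch, and why precompiling during a load hides it.
#include <chrono>
#include <cstdio>
#include <string>
#include <thread>
#include <unordered_map>
#include <vector>

struct Pipeline { std::string key; };

std::unordered_map<std::string, Pipeline> g_pipelineCache;

// Stand-in for the driver compiling a pipeline state object: slow and blocking.
Pipeline CompilePipeline(const std::string& key) {
    std::this_thread::sleep_for(std::chrono::milliseconds(80));  // fake compile cost
    return Pipeline{key};
}

// On-demand path: the first frame that needs a new material/skin pays the
// whole compile inside that frame, which is the visible stutter.
const Pipeline& GetPipelineOnDemand(const std::string& key) {
    auto it = g_pipelineCache.find(key);
    if (it == g_pipelineCache.end()) {
        it = g_pipelineCache.emplace(key, CompilePipeline(key)).first;
    }
    return it->second;
}

// Precaching path: compile everything predictable during a load screen so the
// render loop only ever does cheap hash-map lookups.
void PrecachePipelines(const std::vector<std::string>& keys) {
    for (const std::string& k : keys) {
        g_pipelineCache.emplace(k, CompilePipeline(k));
    }
}

int main() {
    using clock = std::chrono::steady_clock;
    using ms = std::chrono::milliseconds;

    // A frame that meets an unseen skin with no precache: long frame (hitch).
    auto t0 = clock::now();
    GetPipelineOnDemand("skin_material_A");
    auto cold = std::chrono::duration_cast<ms>(clock::now() - t0).count();

    // Same lookup after precaching during a "load screen": fast frame.
    PrecachePipelines({"skin_material_B"});
    auto t1 = clock::now();
    GetPipelineOnDemand("skin_material_B");
    auto warm = std::chrono::duration_cast<ms>(clock::now() - t1).count();

    std::printf("first-use compile inside the frame: %lld ms\n", static_cast<long long>(cold));
    std::printf("precached lookup inside the frame:  %lld ms\n", static_cast<long long>(warm));
    return 0;
}
```

The catch in a real game is knowing which pipelines to precache: a cosmetic another player equips mid-match isn't on any load-screen list, which is exactly why the skin stutters keep coming back.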
Insanely big map with 100 players and a shitload of traversal, lots of physics going on, pretty much everything is breakable.
literal cartoon shooter made for mobile phones runs great, who could have guessed?
Now give an example for a game using an artstyle the engine actually branded itself on
That is correct, but even the PSO precaching won't solve the PC's underlying architectural problems, nor can it fix the Windows OS overhead.
The stuttering is a hardware architecture problem that software can only partly fix. This problem already existed before (remember the mess that id Software's Rage was on PC?).
Truth is that GPUs have evolved too fast; the current PC architecture is no longer suited for them, and the other components can't keep up anymore.
I don't know if that's awesome or just hilarious!
Asus adds an SSD slot to its RTX 4060 Ti graphics card, delivering up to 12GB/s of SSD performance via the GPU, and the M.2 port even allows using an RTX 4090 as an eGPU
An RTX 4090 piggybacking through an RTX 4060 Ti? Really?
www.tomshardware.com