Tim Sweeney says UE6 previews could be 2-3 years away

Truly one of the most important conversations of our time.

Lex's speech cadence hitches almost as much as an Unreal Engine game.
 
This part here is interesting. I never thought Unreal Engine had these kinds of limitations.

Seems like the only interesting thing that sets it apart from other game engines for me is Nanite.

First Unreal Engine 6 Info Shared by Epic's Tim Sweeney – Preview Versions in 2-3 Years, Goal Is to Go Multithreaded
"The biggest limitation that's built up over time is the single-threaded nature of game simulation on Unreal Engine. We run a single-threaded simulation. If you have a 16 core CPU, we're using one core for game simulation and running the rest of the complicated game logic because single-thread programming is orders of magnitude easier than multi-thread programming, and we didn't want to
burden either ourselves, our partners, or the community with the complications of multi-threading.

Over time, that becomes an increasing limitation, so we're really thinking about and working on the next generation of technology and that being Unreal Engine 6, that's the generation we're actually going to go and address a number of the core limitations that have been with us over the history of Unreal Engine and get those on a better foundation that the modern world deserves, given everything that's been learned in the field of computing in that timeframe."
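
For anyone wondering what "single-threaded game simulation" means in practice, here's a rough sketch (nothing to do with Unreal's real code, just plain std::async for illustration): the classic loop updates every actor on one core, while a naive parallel version fans the same work out across the available cores.

```cpp
#include <algorithm>
#include <future>
#include <thread>
#include <vector>

struct Actor { float pos = 0.f, vel = 1.f; };

// Classic single-threaded simulation: one core walks every actor per tick.
void TickSingleThreaded(std::vector<Actor>& actors, float dt) {
    for (Actor& a : actors)
        a.pos += a.vel * dt;              // all game logic runs serially here
}

// Naive task-parallel version: the same work split across hardware threads.
// A real engine needs far more care (job graphs, dependencies, determinism),
// which is exactly the complexity Sweeney says they avoided until now.
void TickMultiThreaded(std::vector<Actor>& actors, float dt) {
    const size_t workers = std::max(1u, std::thread::hardware_concurrency());
    const size_t chunk   = (actors.size() + workers - 1) / workers;
    std::vector<std::future<void>> jobs;
    for (size_t start = 0; start < actors.size(); start += chunk) {
        const size_t end = std::min(start + chunk, actors.size());
        jobs.push_back(std::async(std::launch::async, [&actors, dt, start, end] {
            for (size_t i = start; i < end; ++i)
                actors[i].pos += actors[i].vel * dt;
        }));
    }
    for (auto& j : jobs) j.wait();        // join before the next frame
}
```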
 
Indiana Jones using only RTGI is sub 1800p (runs at 60 so it gets a pass)
Clair runs at 1080p60 and 1440p30 with a mix of RT features.
Avatar runs at 1000p60 and 1440p30 with some cool RT.
Final Fantasy XVI runs at 1080p60 and 1440p30.
Alan Wake 2 runs at 1300p30 and 900p60 with a bevy of RT features.


These are the custom engines that immediately come to mind when I think about feature sets; consider that XVI doesn't actually seem to be using any RT.
Does Unreal Engine really seem that far off?


Which custom engines are you talking about that have thrown the same amount of features at these consoles to be comparable?
The only one really is Northlight, and we've seen how low that engine goes.

the big difference is that the features that make UE5 games run at low resolutions are also fundamentally low quality, in part due to the low resolution.
it's like a death spiral.

Lumen GI, especially software Lumen, is INSANELY low quality. it barely even qualifies as raytracing, because it's only just barely more capable than the screen space GI that The Coalition added to the Series X version of Gears 5, while looking worse in motion.
software Lumen reflections as well. they are so insanely low quality that they are designed to only be used as a fallback to screen space reflections. they are essentially super low quality planar reflections that even PS2 devs would have been ashamed to use.
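
to illustrate what that fallback relationship looks like, here's a simplified sketch of the general technique (my own stand-in logic, not Epic's actual code): screen space reflections are traced first, and only rays that leave the screen drop down to the coarser solution.

```cpp
struct Vec3  { float x = 0, y = 0, z = 0; };
struct Color { float r = 0, g = 0, b = 0; };

// Hypothetical stand-in for a screen space reflection trace: it can only
// return a hit if the reflected surface is currently visible on screen.
bool TraceScreenSpaceRay(const Vec3& /*origin*/, const Vec3& dir, Color& outHit) {
    if (dir.y > 0.5f)                       // pretend these rays leave the screen
        return false;
    outHit = Color{0.8f, 0.8f, 0.9f};       // crisp on-screen reflection
    return true;
}

// Hypothetical stand-in for the coarse fallback (low ray count, heavy blur).
Color TraceLowQualityFallback(const Vec3&, const Vec3&) {
    return Color{0.3f, 0.3f, 0.3f};
}

Color ShadeReflection(const Vec3& origin, const Vec3& dir) {
    Color ssr;
    if (TraceScreenSpaceRay(origin, dir, ssr))
        return ssr;                          // reflected object is on screen
    // everything the camera can't see drops to the coarse path, which is why
    // reflections of off-screen geometry look so much worse in motion
    return TraceLowQualityFallback(origin, dir);
}
```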

and software Lumen is barely performant enough to run ok-ish on console.

meanwhile Ubisoft is pushing way higher quality raytracing at 60fps in their games and at similar resolutions to UE5 games using software Lumen.
Indiana Jones has higher quality GI than any UE5 game while running at 1800p 60fps.


and then we're coming back to what I called a death spiral.
software Lumen is extremely low quality. this comes down to a low ray count, an extreme reliance on screen space information, and really shoddy denoising. all of this is the case for performance reasons.

so add an already low quality GI/Reflection solution to a game
👇
in order to make it run well on console you reduce the internal resolution to around 1080p or sometimes below that
👇
because the internal resolution is low the ray count of Lumen is reduced accordingly as well
👇
because the ray count is now this low it takes a long time for the lighting to react to changes as the denoiser has to work overtime in accumulating information
👇
and because the internal resolution is low the screen space information needed to fill the gaps of software lumen is also low, meaning artifacts get more obvious

the end result of this is a game that looks blurry, has constant artifacting during motion, has lighting that can look downright broken at times, and has reflections that look worse than a decent cubemap solution would.
so you are applying extremely GPU-intensive workloads to a game, and they actively make it look worse than it could look if more traditional solutions were applied.
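
some rough back-of-the-envelope numbers on the resolution point (the rays-per-pixel figure is a made-up placeholder, not Lumen's actual budget), just to show how fast the total ray count collapses when the internal resolution drops:

```cpp
#include <cstdio>

int main() {
    // Assume the GI pass traces roughly this many rays per pixel, so the
    // total budget scales directly with the internal pixel count.
    const double raysPerPixel = 0.5;        // hypothetical placeholder
    const double res[][2] = { {3840, 2160}, {1920, 1080}, {1280, 720} };

    for (const auto& r : res) {
        const double pixels = r[0] * r[1];
        std::printf("%4.0fx%-4.0f -> %4.2f M rays per frame\n",
                    r[0], r[1], pixels * raysPerPixel / 1e6);
    }
    // Prints roughly 4.15M at 4K, 1.04M at 1080p, 0.46M at 720p.
    // Fewer rays per frame means the denoiser has to accumulate over more
    // frames, which is the slow lighting reaction described above.
    return 0;
}
```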

Lumen can look good if used in the absolute best case scenarios. that would be running hardware Lumen at native 4k 60+fps... but who has the hardware for that? you can barely do that on the best PC GPUs (if even there)
in the meantime many UE5 games will look like ports of games that were barely able to be squeezed onto the hardware, similar to those "impossible Switch ports". not because they barely run, but because Lumen simply doesn't work well at the resolutions and quality settings that are possible with it enabled on these consoles. so it looks like something that should never have been ported to a console was forced onto it anyway, broken visuals and all... like said Switch ports often look

Case in point, NfS Unbound's planar reflections on wet ground surfaces are about 5x higher quality than software Lumen reflections are in any game they're used in, while the game also uses convincing looking cubemaps on windows. if you replaced this with Lumen, the game would look worse and run at half the resolution (if not lower). with Lumen the game would look like it wasn't designed for PS5 and Series X but forced onto them, with artifacts from Lumen and aggressive reconstruction through TSR... while it looks clean and runs well with their own lighting and reflection solutions.

we are literally at a point where Metal Gear Solid 2 on the fucking PS2 has better and higher fidelity reflections than the newest UE5 titles... and I just feel that the industry as a whole took a wrong turn at some point to get to where we are now.
 
This part here is interesting. I never thought Unreal Engine had these kinds of limitations.

Seems like the only interesting thing that sets it apart from other game engines for me is Nanite.

First Unreal Engine 6 Info Shared by Epic's Tim Sweeney – Preview Versions in 2-3 Years, Goal Is to Go Multithreaded
"The biggest limitation that's built up over time is the single-threaded nature of game simulation on Unreal Engine. We run a single-threaded simulation. If you have a 16 core CPU, we're using one core for game simulation and running the rest of the complicated game logic because single-thread programming is orders of magnitude easier than multi-thread programming, and we didn't want to
burden either ourselves, our partners, or the community with the complications of multi-threading.

Over time, that becomes an increasing limitation, so we're really thinking about and working on the next generation of technology and that being Unreal Engine 6, that's the generation we're actually going to go and address a number of the core limitations that have been with us over the history of Unreal Engine and get those on a better foundation that the modern world deserves, given everything that's been learned in the field of computing in that timeframe."
Multithreading has been around for more than a decade, and now they are adding it?

But the ue subreddit swears you don't need multithreading...

Pray for CDPR.
 
Custom engines that look and run better and have more features than Unreal Engine 5?


Name them.






P.S. Isn't UE5 on console mostly fine? It's in the PC space where the real issues are.
You can't be serious? UE5 might have more features but it runs like dogshit on all platforms.
 
The hardware isn't the problem. A game like KCD2 running on CryEngine looks and runs like a dream on my computer. Meanwhile the Oblivion remaster chugs like a fuck because of UE5.

They really need to optimize that shit, it's a cancer for videogames.
Yet fortnite looks and plays great on a toaster with UE 5.5

But yeah, let's blame the engine instead of the developers
 
UE5 was the most excited I had been for the game industry in a while when it was first announced. I watched the preliminary GDC talks on some of its tech and it seemed truly revolutionary.

And then it came out, lacking most of those features. And now, I dunno, like 8 years later, it has most of those features, but they don't really work as advertised, or they're just way too expensive for real-world application.

So, uh, maybe fully bake UE5 before you start on 6, Epic.
 
Almost every game on UE suffers from some degree of stutter. Are you suggesting the whole industry is incompetent?
Since the RTX 3000/6XXX/Covid era, absolutely 100%

30 guys just made a great game on UE5 that runs flawlessly, says a lot.
 
39 guys made a great game. Shame it runs like shit.


No, it doesn't. Nice try finding a video from a small youtuber trying to prove some ridiculous point. It runs a steady 60fps on ps5/xbox and pc easily

But hey, if you like having lazy developers brute force games instead of optimizing them these last few years while blaming UE5, go for it
 
No, it doesn't. Nice try finding a video from a small youtuber trying to prove some ridiculous point. It runs a steady 60fps on ps5/xbox and pc easily

But hey, if you like having lazy developers brute force games instead of optimizing them these last few years while blaming UE5, go for it
Oblivion vs KCD2.

It's not ridiculous to suggest most UE5 games suffer from traversal stutter.
 
What the gaming industry needs is not UE6. It needs UE5 optimization across all developers.

I really hate that every industry has adopted the "release a new phone every year" philosophy. Nearly every developer (American, Asian, European) has migrated to UE5. Maybe 80-90 percent of developers are using this engine (which is a bad thing, because monopoly is a bad thing). Why is it Epic wants to chase "new, new, new"? And that "new" isn't actually new, it's more like "more problems".

Look below for one developer's comment on reddit. Let's say UE6 comes out; what will players want from this developer then? The sector really doesn't need a new engine. It needs more time with the current one for more dependability, more familiarity, and fewer problems overall in UE5 games.

[screenshot of a developer's comment on reddit]
 
Not really a new engine.
New UI.
UEFN built in.
Verse programming language.
I'm guessing, since it's expected in 2 years or so, that it's a name change as they might have exhausted the 5.x numbering.
We are already at 5.6p.



So it's kinda like CryEngine V and CryEngine 6.
For those that don't know, CryEngine 5 was CryEngine's fourth iteration, and CryEngine 6 is meant to be what was going to be CryEngine 5.7.
Name forwarding without major changes under the hood.
That's pretty much what UE X.0 has been for a while. It's like a World of Warcraft expansion.
 
UE is the best engine available by a wide margin: the technology, the reach (supporting every device), the quality of the code, the documentation, the tools...
I'll see your UE5 and raise you id Tech.

Gotta echo the sentiments here: I don't care about UE6 while UE5 is fucking over an entire generation of games. A lot of this stuff will age like the original Crysis release, meaning they'll chug 20 years from now because the issues are engine-level and no amount of additional hardware can fix that.
 
Yet fortnite looks and plays great on a toaster with UE 5.5

But yeah, let's blame the engine instead of the developers
literal cartoon shooter made for mobile phones runs great, who could have guessed?

Now give an example of a game using an art style the engine actually branded itself on
 
Technology has to move on, but unlike what happened with UE3 and UE4, the feeling is that developers have only just started to adopt UE5 and the results are super mixed. It's not clear whether current hardware isn't powerful enough to run it without compromises, or whether the engine isn't optimized and developers are just trying to brute force things.
In any case 2-3 years seems to coincide with the next gen of consoles, shock!
 
Yet fortnite looks and plays great on a toaster with UE 5.5

But yeah, let's blame the engine instead of the developers

play Fortnite on PC... I dare you.

after each driver update and each game update you have like 5 matches full of constant shader stutters. and every time a player has a skin you haven't seen before (or haven't seen since the last update) you'll also get a stutter.

THE FUCKING ENGINE DEVELOPER'S OWN GAME HAS INSANE SHADER STUTTERS! let that sink in.

also if you play it on a toaster you won't use any of UE5's actual features and basically run it with visuals that are equivalent to UE4.
and if you want to not have a competitive disadvantage you run it in a mode that downgrades the visuals to literal mobile game level (known as performance renderer mode).

even in that mode it has shader stutters tho btw. just not as extreme as in DX11 or DX12 mode
 
Clair Obscur: Expedition 33

Lumen is the worst real time GI tech in the industry
(watch in 4K)

low ray count + low precision BVH:



Lack of Screen Space information when too far away as well as obvious light leak and denoiser boiling.



weird brightness changes due to lack of screen space information for secondary bounces, as well as specular highlights flaring up in the dark due to reflecting a bright cubemap while Lumen tries to react to the changing camera to fill the gaps where you see those white sparkles:



another example of reflection ghosting.

[screenshot: reflection ghosting example]



one issue that happens from time to time with software Lumen is that some scenes just go fully dark due to the lack of screen space information for secondary bounces. you saw this partly in the third video above.
Once the screen space information is gone, the entire GI just blacks out as no primary light bounce is visible anymore for it to create ambient lighting.
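
a tiny sketch of the failure mode being described here (my own simplified logic, not Lumen's actual implementation): if bounce light is gathered from pixels that are currently on screen and the bright surface scrolls out of view, the gather returns nothing and the ambient term simply collapses to black.

```cpp
#include <optional>

struct Color        { float r = 0, g = 0, b = 0; };
struct ScreenSample { int x = 0, y = 0; };

// Hypothetical screen-space gather: only pixels currently on screen can
// contribute bounce light (the "screen space information").
std::optional<Color> SampleOnScreenRadiance(const ScreenSample& s,
                                            int width, int height) {
    if (s.x < 0 || s.y < 0 || s.x >= width || s.y >= height)
        return std::nullopt;              // lit surface scrolled off screen
    return Color{1.0f, 0.9f, 0.8f};       // stand-in for a framebuffer read
}

Color GatherIndirect(const ScreenSample& hit, int width, int height) {
    if (auto radiance = SampleOnScreenRadiance(hit, width, height))
        return *radiance;                 // bounce light found on screen
    // No world-space fallback in this sketch: the ambient term just goes
    // black, which is the "scene blacks out" behaviour described above.
    return Color{};
}
```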

 
Multi-thread support might be the thing to remove stuttering, could be a bottleneck/pipeline issue.
 
Please have HDR by default! Mid-tier devs (not all) seem to have an issue implementing HDR in UE4 and UE5.
 
play Fortnite on PC... I dare you.

after each driver update and each game update you have like 5 matches full of constant shader stutters. and every time a player has a skin you haven't seen before (or haven't seen since the last update) you'll also get a stutter.

THE FUCKING ENGINE DEVELOPER'S OWN GAME HAS INSANE SHADER STUTTERS! let that sink in.

also if you play it on a toaster you won't use any of UE5's actual features and basically run it with visuals that are equivalent to UE4.
and if you want to not have a competitive disadvantage you run it in a mode that downgrades the visuals to literal mobile game level (known as performance renderer mode).

even in that mode it has shader stutters tho btw. just not as extreme as in DX11 or DX12 mode
Bruh, I play fortnite on pc every day. In the first game the shaders load; by the time I get off at the end of the bus it runs flawlessly for the rest of the day. Sorry mate, it runs great
 
literal cartoon shooter made for mobile phones runs great, who could have guessed?

Now give an example of a game using an art style the engine actually branded itself on
Insanely big map with 100 players and a shitload of traversal, lots of physics going on, pretty much everything is breakable.

cartoon shooter made for mobile, right, thanks for proving you're not worth the time
 
So you're telling me these 8C/16T CPUs were barely being used in UE5 games, which would explain the stutters. lmao, they're only now working on multi-threading
 
The stuttering is a hardware architecture problem that software can only fix in part. This problem already existed before (remember the mess that id Software's Rage was on PC?).
Truth is that GPUs have evolved too fast; the current PC architecture is no longer suited for them, and the other components can't keep up anymore.
That is correct, but even PSO precaching won't solve the PC's underlying architectural problems, nor can it fix the Windows OS overhead.
But as you can easily see on this forum and elsewhere, instead of blaming Nvidia or Microsoft for not getting their shit together, gamers attack engines and devs.
What a joke!
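
For anyone unfamiliar with the term, PSO precaching is roughly this idea (a generic sketch, not Unreal's actual API): compile the pipeline state objects you expect to need during a load screen, so they don't get compiled the first time an object shows up mid-frame. It can't help when the needed PSO genuinely isn't known in advance, like the never-before-seen skin example earlier in the thread.

```cpp
#include <string>
#include <unordered_map>
#include <vector>

// Stand-in types for a compiled pipeline state object and its description.
struct PSO     { /* compiled shaders + render state */ };
struct PSODesc { std::string vertexShader, pixelShader, renderState; };

// Placeholder for the slow driver-side compile that causes the hitch.
PSO CompilePSO(const PSODesc&) { return PSO{}; }

class PSOCache {
public:
    // Run during a load screen: the expensive compiles happen up front.
    void Precache(const std::vector<PSODesc>& expected) {
        for (const auto& d : expected)
            cache_.emplace(Key(d), CompilePSO(d));
    }

    // Run at draw time: a miss here means compiling mid-frame -> stutter.
    const PSO& Get(const PSODesc& d) {
        auto it = cache_.find(Key(d));
        if (it == cache_.end())
            it = cache_.emplace(Key(d), CompilePSO(d)).first;   // hitch!
        return it->second;
    }

private:
    static std::string Key(const PSODesc& d) {
        return d.vertexShader + "|" + d.pixelShader + "|" + d.renderState;
    }
    std::unordered_map<std::string, PSO> cache_;
};
```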
 
We'll get next gen consoles that will finally be able to brute force decent resolutions and frame rates out of UE5.....and then devs will move to UE6 and it's going to be sub 1080p with tons of frame drops all over again.

 
Can't wait for next gen console warriors to use the tech demo as argument points for things that never materialize.
 
here come the people shitting on UE that know nothing about game development. UE is one of the best things that has ever happened to game development. The bottleneck isn't UE; you can more or less do anything in UE, any style, any shader. The bottlenecks have been your playtendos and management not being able to decide what they want and where.
 
Not really a new engine.
New UI.
UEFN built in.
Verse programming language.
I'm guessing, since it's expected in 2 years or so, that it's a name change as they might have exhausted the 5.x numbering.
We are already at 5.6p.



So it's kinda like CryEngine V and CryEngine 6.
For those that don't know, CryEngine 5 was CryEngine's fourth iteration, and CryEngine 6 is meant to be what was going to be CryEngine 5.7.
Name forwarding without major changes under the hood.

Sounds like it's not coming out for another 3 to 4 years though.
 
People really went nuts over UE5 tech demos. I was in disbelief over how people couldn't fathom that tech demos in general don't represent what real games can pull off. Aside from that the UE5 docs that were available basically explained how bad the performance was for their features.

Made for a lot of fun threads lol
 