Is Unreal Engine 5 truly dogshit, or just used improperly?

Is Unreal Engine 5 truly dogshit?

  • Yes: 77 votes (44.3%)
  • No: 97 votes (55.7%)
  • Total voters: 174
Ragnarok looks great in cutscenes, which makes people think it's doing a lot more than it actually is. In the main levels, especially the open areas in Vanaheim and the Crater, the game looks last gen as hell.

Some linear levels do look really good, but in a 50-hour game, 2-3 levels looking decent isn't next gen when everything else, from the lighting to the asset quality, is last gen by design.
No it doesn't look great in cutscenes, not when you have those character models...

[God of War Ragnarok screenshots: Freya, Freyr, Atreus and Brok]

(If you played the game you know they look bad even in 4K by next-gen standards; hell, they are ugly even by PS4 standards.)
Even the best models, like Kratos, Thor, Odin, or Angrboda, are nowhere near the models of HFW, an open-world game with 20x the number of characters.

Unless you meant the direction of cutscenes.
 
It is. Two of the best-looking games this year, AC Shadows and Death Stranding 2, are not using it.
UE5 is low-res, stuttering garbage that never runs well enough and always has bad image quality.
 
It's a good engine, it's just very demanding, probably too demanding for current hardware. I'm with Digital Foundry on the premise that, because visuals weren't pushed for a long time, people got used to games running at high resolutions and high framerates. Back in the late 90s and early 2000s, though, it was normal for a new engine to come out and for your framerate and resolution to halve. I could run Quake 3 Arena at 100 fps at 1080p, but Half-Life 2 at 720p and 30 fps.
 
UE5 isn't well suited to current-gen consoles, but if you have a good PC, performance is solid, especially if you use DLSS. I've noticed stuttering in a few UE5 games, such as Silent Hill 2 and the Oblivion remaster. However, most of the UE5 games I've played don't stutter nearly as much, so I can't really complain. In RoboCop: Rogue City I haven't seen a single stutter.

Nanite makes a big difference in the games that use it. I was blown away when I played Hellblade 2, Black Myth: Wukong, or even RoboCop: Rogue City (an AA game). Even simple ground surfaces are extremely detailed in these UE5 games.

Digital Foundry said Death Stranding 2 has similar fidelity to Nanite, and the OP in this thread even believes that Decima looks better. These are bold claims, so I would like to see some screenshots of the ground surfaces in Death Stranding 2. Let's see how close the Decima engine really is to Nanite.


[Hellblade 2 screenshots]

[Black Myth: Wukong screenshots]


Thanks to UE5, even small developers can create good-looking games. RoboCop had a small budget, yet I was impressed by its graphics. I didn't see a single stutter while playing on my 7800X3D CPU.

[RoboCop: Rogue City screenshots]
Where'd you get the first one? Is that available through UE5?

Edit: oh wait, is that Hellblade? Lmao, I thought it was the girl from the UE5 demo.
 
It's a topic about engines; nitpicking graphics is part of the game. Nitpicking microdetails is what separates a great engine from the best engine, since all of them can guarantee at least good graphics nowadays, so the microdetails are what count most if you have to decide which engine is best.

Maybe posting in a topic about graphics on a hardcore video game forum to tell people they are going overboard is not the high-horse moment you think it is.
My comment, which you responded to, was a response to someone scoffing at Digital Foundry comparing Decima and DS2's usage of LOD bias favourably to Nanite, and the post was

"Durrr HB2 has real pebbles and DS2 has a low resolution mesh texture" (at times, by the way, very selective reasoning and examples going on with some posts here).

which to me showed a fundamental misunderstanding of what Nanite achieves as a technology. Lots of pebbles on a floor is not a sign of "Nanite" and it's not the best use case for Nanite either.

Nanite essentially exists to remove the overhead in cost, resources, hardware intensity, and dev time that LODs add during development, whilst also providing a seamless transition between detail levels rather than obvious LOD steps.

It's not technology designed to make ground textures better; we already have tessellation tricks that devs implement to do such things (when they choose to do so).
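To make that concrete, here is a rough, illustrative sketch (not Epic's code; the structures, numbers, and thresholds are made up) of the difference between hand-authored discrete LODs and an error-driven selection in the spirit of a cluster hierarchy:

```cpp
// Illustrative sketch only: discrete, artist-authored LODs vs. a continuous,
// error-driven refinement that scales detail with distance automatically.
#include <cmath>
#include <cstdio>
#include <vector>

struct Lod { float maxDistance; int triangles; };

// Traditional approach: an artist authors a handful of LODs per mesh and the
// engine snaps between them by distance, which is where visible "pops" come from.
int PickDiscreteLod(const std::vector<Lod>& lods, float distance) {
    for (size_t i = 0; i < lods.size(); ++i)
        if (distance <= lods[i].maxDistance) return static_cast<int>(i);
    return static_cast<int>(lods.size()) - 1;
}

// Virtualized-geometry style: keep refining until the projected geometric error
// falls below roughly one pixel, so detail scales continuously with distance and
// nobody has to author or tune LOD switch points by hand.
int PickClusterLevel(float baseErrorMeters, float distance,
                     float screenHeightPx, float fovTanHalf) {
    int level = 0;
    float error = baseErrorMeters;
    while (true) {
        float projectedPx = (error / (distance * fovTanHalf)) * (screenHeightPx * 0.5f);
        if (projectedPx <= 1.0f) return level;   // sub-pixel error: good enough
        error *= 0.5f;                           // refine to the next, denser level
        ++level;
    }
}

int main() {
    std::vector<Lod> lods = {{10.f, 50000}, {50.f, 10000}, {200.f, 1000}};
    for (float d : {5.f, 40.f, 150.f})
        std::printf("distance %5.1fm -> discrete LOD %d, cluster level %d\n",
                    d, PickDiscreteLod(lods, d),
                    PickClusterLevel(0.5f, d, 2160.f, 0.414f));
}
```

The second function never snaps between authored steps; it just keeps refining until the error is under a pixel, which is the "seamless transition" part described above.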

Fortnite was one of the first real UE5 conversions, which implemented both Lumen and Nanite, and that game is hardly the ultra mega pebble game when you zoom into the ground.

The overall point is that taking a screenshot of a highly detailed ground mesh is entirely missing the point of why Nanite is important in the UE toolset and why DF might be considering Decima's solutions to LOD bias favourable to it.

Whether you agree if it is or not is a different story, but understanding what we are comparing is important.
 
Lol, people fighting over the best engine, the one that truly shows the power of current-gen consoles. Probably those who hate it are Sony fanboys who like last-gen-looking games and think they look impressive, lol.
 
No it doesn't look great in cutscenes, not when you have those character models...

Unless you meant the direction of cutscenes.
It's a bit of both. Some of the linear levels have great lighting, and because the camera is always right up Kratos' ass you don't really see the environments, just the carefully choreographed action, which is still some of the best in the industry.

[God of War Ragnarok combat gifs]


It also has that last-gen syndrome where gifs look better than what you see on your screen, whereas gifs of UE5 games don't capture all the detail UE5 brings.
 
It's an amazing tool that, when paired with great programmers and brilliant artists, can make magnificent things.

It's also a tool that gives you everything and puts no restrictions on overdoing it or being stupid with your choices. And if you do not have a brilliant engineer on your team, you had better not try to make it go outside of its box (which all devs decided to do anyway).

Shader compilation issues are a developer issue: not properly utilizing the systems in place to mitigate them. I do blame UE (and Microsoft) for not making the systems for preventing compilation stutter mandatory and hard-baked into the engine, but I understand why they aren't.

I really thought we'd see a big push from them this year to address shader comp and give devs better training and more direct instruction on how to fix it, but alas, they are more interested in moving forward with new tools.
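For anyone wondering what "the systems in place" amount to in practice, the gist is compiling pipeline state up front (during a loading screen, or from a cache gathered ahead of time) instead of at first draw. A hedged sketch with made-up types; Device, PipelineDesc, and Pipeline stand in for whatever the renderer actually exposes, this is not UE's API:

```cpp
// Conceptual sketch of stutter-free vs. stutter-prone pipeline compilation.
#include <future>
#include <unordered_map>
#include <utility>
#include <vector>

struct PipelineDesc { size_t hash; /* shaders, vertex format, render state... */ };
struct Pipeline {};
struct Device { Pipeline* Compile(const PipelineDesc&) { return new Pipeline(); } };

class PipelineCache {
public:
    // Stutter-free pattern: compile every state combination you know you will
    // need during a loading screen, so the expensive driver work never lands
    // in the middle of gameplay.
    void Precompile(Device& dev, const std::vector<PipelineDesc>& knownDescs) {
        std::vector<std::future<std::pair<size_t, Pipeline*>>> jobs;
        for (const PipelineDesc& d : knownDescs)
            jobs.push_back(std::async(std::launch::async,
                [&dev, d] { return std::make_pair(d.hash, dev.Compile(d)); }));
        for (auto& j : jobs) { auto [h, p] = j.get(); cache_[h] = p; }
    }

    // Stutter-prone pattern: if a pipeline is first requested at draw time,
    // the frame blocks on the driver's compile and you get a visible hitch.
    Pipeline* GetOrCompile(Device& dev, const PipelineDesc& d) {
        auto it = cache_.find(d.hash);
        if (it != cache_.end()) return it->second;
        return cache_[d.hash] = dev.Compile(d);  // <- this is the hitch
    }

private:
    std::unordered_map<size_t, Pipeline*> cache_;
};
```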
 
oh, your favorite game that you anticipated runs and looks bad and it's UE5? fire up hellblade 2 and enjoy the true UE5 experience that you are destined for. don't forget to 5x zoom in on senua in camera mode to fully experience it. there are enough highly realistic-looking pebbles in hellblade 2 to cover for every UE5 game that runs badly.

when you get bored of that, you can always fire up silent hill 2 and swoon over how realistic the main character's hands look
 
It is a good engine for mid-size teams that want to achieve great graphical fidelity without having to spend a lot of money and time on an in-house engine. For the big corporations it is mostly about greed, cutting the cost of actual engine devs and following the widespread belief that "Unreal Engine 5 looks amazing and it fixes everything!" (just look at Halo Studios, a company willing to sacrifice the last DNA of Halo for some PR points).

It is bad because everything looks the same; they all share the same rendering and other techniques because, well, it is the same engine. And the problem with that is that, even if they try, Epic can't make an all-round engine: there is stuff this engine does incredibly well, and other stuff it doesn't. Some devs can get under the hood and try to solve those issues, but many will just ignore them.

Now, personally, it is a piece of shit: thanks to this and Fortnite we don't have Unreal Tournament... but now that I think about it, it might be for the best. We would probably only have gotten a T-rated, always-online, big-store, live-service hero-shooter arena with diverse characters.

It might be for the best.
 
On console it's shit I would say.

Even on the Pro on a 4K screen, the IQ is often goddamn awful, with huge ghosting due to Lumen, color banding issues, and smearing all across the image. Death Stranding 2 doesn't have as many next-gen technical features, but the image is so clean and pristine that it ends up looking better to the eye, imho.

Sure, the engine is competent and Nanite is absolutely amazing, but we might have to wait for next gen for it to truly deliver.
 
For now the problem is scalability. In STALKER 2, the Oblivion remaster, etc., you can go from Epic to Low, but there is mostly no significant difference in performance; it just doesn't really change until you seriously crank the resolution down.
I'd rather my games scale well between heavy, high-fidelity settings and much lighter, high-performance ones without needing to kill the resolution, even with DLSS. It's just not good when you get such small gains despite all the sacrifices.
 
UE5 is dogshit. Take away DLSS and you still have a badly running engine.

Nvidia shouldn't be responsible for getting this engine to run above 60 fps.
 
The engine is really good and capable of achieving amazing results, but it seems it is not friendly to devs that don't have a lot of experience on consoles. It's probably not that easy to optimize.
 
If Epic themselves are anything to go by, I would say it is a competent engine that is sometimes poorly optimised in some games, like any other engine. Epic get very good performance with Lumen and Nanite in their own games, like Fortnite.
 
Wait, you are telling me that DX12 has been the issue this entire time? I will nuke this shit everywhere I can, then.
most dx12 games only do shader caching once. they only need to do it again if you update or change your drivers. you won't usually see a shader compilation screen every time you launch a game

however, most games also need to revalidate shaders every time you run the game. it is not compiling shaders again, it is just revalidating them. most other dx12 games are really quick with this revalidation process (mere seconds) and you get to the main screen without waiting or seeing a "shaders are being compiled" screen. but with unreal engine 4 and unreal engine 5, shader revalidation can take unusually long compared to most other dx12 games, so you see those screens again (otherwise it would be a blank screen for 15-30 seconds)

so in a way, it is still a unique issue: revalidation taking unusually long compared to other dx12 games
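To illustrate the distinction (a conceptual sketch only; the type and function names are hypothetical, not the real D3D12 or UE API), the launch-time work roughly splits into a slow path and a quick path depending on whether a cached blob is still valid for the installed driver:

```cpp
// Why a launch-time "preparing shaders" step can exist even after the first run.
#include <cstdint>
#include <string>
#include <unordered_map>
#include <vector>

struct CachedPso { std::string driverVersion; std::vector<uint8_t> blob; };

// Disk cache of compiled pipeline blobs, keyed by a hash of shader + render
// state. Blobs are only valid for the driver that produced them.
using PsoDiskCache = std::unordered_map<uint64_t, CachedPso>;

enum class LaunchWork { FullCompile, QuickRevalidate };

LaunchWork WorkNeededAtLaunch(const PsoDiskCache& cache, uint64_t psoKey,
                              const std::string& currentDriver) {
    auto it = cache.find(psoKey);
    if (it == cache.end() || it->second.driverVersion != currentDriver)
        return LaunchWork::FullCompile;     // first run or driver update: slow path
    // Cache hit: the blob just has to be handed back and checked. That check is
    // cheap per PSO, but a title with tens of thousands of PSOs still spends
    // noticeable wall-clock time here, which is the "revalidation" screen the
    // post above describes.
    return LaunchWork::QuickRevalidate;
}
```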
 
What is a good-looking UE5 game on modern consoles? AI says that could include Alan Wake 2, a game in which most surfaces looked like they were melting.

This generation and UE5 are pretty close to being as unstable as PS1 polygons.
 
Yeah, I know this gets posted on a monthly basis, but we should keep posting it until the engine is fixed or, preferably, dropped into the sea from a helicopter.

There are entire articles and tech blogs covering the stuttering, straight from the source, and it's my main gripe with this engine. It seems to be a major architectural problem and not something they can fix after the fact.

I also find it quite hysterical that UE5 focuses so much on fancy marketing nonsense like "Nanite", "Lumen" and whatever, while Decima has none of that yet both looks and runs better in the games that use it. Just look at Death Stranding 2. It looks many times better than most current games and it runs perfectly fine. Also the hardware requirements for some UE5 games are beyond retarded. I mean look at Borderlands 4. We need to put a stop to this shit.

Unreal Engine 5 has devolved gaming into a stuttering mess of bad graphics with high system requirements. And I legitimately can't think of any example of this pre-UE5. I just can't. Nothing stuttered back in the 2000s; just get a mid-range rig and you're playing games, my friend. Well, except Crysis, of course, but that had problems of its own (I think the engine wasn't multi-threaded or something).

It seems nowadays developers go for Unreal Engine because there isn't anything comparable to it publicly available, so it's the "least worst" option, since apparently we lost any and all engine developers, and the budgets for engines, over the last decade or so.

Nobody should care about what engine a game uses, but Epic made us do that due to its stank. And that's very unfortunate.
Do you know any other multiplatform engine which gives access to the same graphics tech? It has notorious flaws, but it's not absolute shit.
 
I think you are misunderstanding what Nanite is and what they are comparing.

Nanite isn't "the small ground detail is really detailed"; it's about its dynamic ability to change and scale meshes depending on distance to retain their quality.

So a prop seen from 1m away can still maintain its quality at 100m, 200m, etc.

I don't know why people think small ground textures equal Nanite.
No, you're confused... it's both. Without virtualized geometry you couldn't render that many polygons, period. It's about memory management, and the other engines can't do it. Like, at all. Nanite's memory management is why it's able to have such extreme polygon counts. Try dropping a full ZBrush model all over a level in Decima and you would get 0.0005 fps before your computer was fried. Actually, not even that; you would most likely just get an out-of-memory message and a blue screen.
 
No, you're confused... it's both. Without virtualized geometry you couldn't render that many polygons, period. It's about memory management, and the other engines can't do it. Like, at all. Nanite's memory management is why it's able to have such extreme polygon counts. Try dropping a full ZBrush model all over a level in Decima and you would get 0.0005 fps before your computer was fried. Actually, not even that; you would most likely just get an out-of-memory message and a blue screen.
Just because Nanite can also be used to give us small details on ground meshes doesn't mean that is "literally impossible in Decima"; and, again, that is not the comparison DF is making, and it shows a fundamental misunderstanding of the reason for Nanite. I still don't know when the narrative became "look at all the little pebbles, other engines can't do small details without Nanite".

It's not just about "infinite triangles", because anyone with half a brain knows that's a fallacy.

People have become so obsessed with zooming into pixels, it's ridiculous, actually. To be fair, Digital Foundry themselves are largely to blame on that front.

I also think people see photogrammetry with a great global illumination implementation and confuse it with Nanite.
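On the memory-management point specifically, the "virtualized" part is the interesting bit. A loose sketch (illustrative names only, not Nanite's actual data structures) of the idea that full-detail geometry lives on disk in fixed-size pages, and only what the current view needs occupies a bounded GPU budget:

```cpp
// LRU-style residency for geometry pages: request what the view needs, evict
// the coldest page when the budget is full.
#include <cstdint>
#include <list>
#include <unordered_map>
#include <utility>

struct GeometryPage { uint32_t pageId; /* a few thousand triangles' worth of clusters */ };

class PageStreamer {
public:
    explicit PageStreamer(size_t maxResidentPages) : budget_(maxResidentPages) {}

    // Called for every page the cluster selection decided it needs this frame.
    // Resident pages are just touched; missing pages are loaded, and the least
    // recently used page is evicted if the budget is full.
    const GeometryPage& Request(uint32_t pageId) {
        auto it = resident_.find(pageId);
        if (it != resident_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second.second);  // mark as recently used
            return it->second.first;
        }
        if (resident_.size() >= budget_) {
            uint32_t victim = lru_.back();
            lru_.pop_back();
            resident_.erase(victim);               // evict coldest page
        }
        lru_.push_front(pageId);
        auto inserted = resident_.emplace(pageId,
            std::make_pair(LoadFromDisk(pageId), lru_.begin()));
        return inserted.first->second.first;
    }

private:
    static GeometryPage LoadFromDisk(uint32_t pageId) { return GeometryPage{pageId}; }

    size_t budget_;
    std::list<uint32_t> lru_;  // most recently used page at the front
    std::unordered_map<uint32_t,
        std::pair<GeometryPage, std::list<uint32_t>::iterator>> resident_;
};
```

Whether other engines could build something equivalent is the actual debate; the point of the sketch is only that the feature is about keeping a bounded working set resident, not about raw triangle throughput.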
 
It's gone beyond dogshit and started affecting games not even using it! Two of my most cherished series have been utterly disemboweled by UE5: STALKER and Forza Motorsport. Once proud, resolved, sharp kings. Now muddy, blurred, and temporally unstable.

I remember thinking UE3 was bad because all the games looked the same. Anyone who gamed enough to know the release date of any video game could ID a UE3 game without even seeing the logo. It was uncanny. In the end, only Gears and Arkham did it justice. I'm sure some UE5 games will do it right, I just haven't seen one yet. Maybe we need to wait for Atomic Heart 2; they were really the only devs to get the most out of UE4.

UE5 also has a radioactive poison pill: it's convincing pubs to gut their own technical divisions. Easy way to save money, and hey, UE5 is cutting edge, and it lets us hire temp workers from India! Bethesda, CDPR, and Halo are all moving to UE5. What's gonna suck is that six years from now, when we've all moved on, they're not going to have anything in their technical cupboard. Lord knows what ripples that will have.
 
Do you know any other multiplatform engine which gives access to the same graphics tech?
Ah yes - the ultimate proof argument - the bullet/feature list - you either match it point for point or you're not competing...
But fine - virtualized geometry (aka Nanite) 'is' available in other engines now (and even engine-agnostic implementations exist), and GI had been solved to varying extents before Lumen was even announced - some solutions give decidedly better results, whatever the arguments over performance may be.

Frankly, the actual advantages UE has are the same as in the 3.x era - i.e. productivity tooling - MetaHuman makes production easier, just like UE3 had well-functioning normal-map workflows well ahead of what the competition had at the time. And that's far from the only workflow advancement they have in 5.

The rendering pipeline is, well, expensive and unwieldy - but that's how UE has been for the last 25 years (see my previous post). It works great for PR - but ultimately the results are just - ok.


It's about memory management, and the other engines can't do it. Like, at all.
This is literally an open-source engine from 18 months ago.
 
Ah yes - the ultimate proof argument - the bullet/feature list - you either match it point for point or you're not competing...
But fine - virtualized geometry (aka Nanite) 'is' available in other engines now (and even engine-agnostic implementations exist), and GI had been solved to varying extents before Lumen was even announced - some solutions give decidedly better results, whatever the arguments over performance may be.

Frankly, the actual advantages UE has are the same as in the 3.x era - i.e. productivity tooling - MetaHuman makes production easier, just like UE3 had well-functioning normal-map workflows well ahead of what the competition had at the time. And that's far from the only workflow advancement they have in 5.

The rendering pipeline is, well, expensive and unwieldy - but that's how UE has been for the last 25 years (see my previous post). It works great for PR - but ultimately the results are just - ok.



This is literally an open-source engine from 18 months ago.
Wake me up when it's actually utilized in a released game with similar results. As if all the big studios would just be sitting on high-fidelity solutions like this without implementing them. They have VG "tech demos" going at all the major studios, I assume, but they're clearly not ready for showtime. Meanwhile, we've got AA studios releasing levels that make great use of the tech now via UE, while even tech favorite Naughty Dog showcased their recent game with blocky-ass asteroids in the "opening shot" five years later.
Showing me this GDC video is no different from showing people a video of the Brigade engine doing path tracing back in 2015. It's all tech-demo nonsense until it ships.
 
All those Hellblade 2 screencaps could 100% be done without Nanite. Nanite only makes sense in real-time rendering, not in screenshots.

You are complaining about the artists' work rather than the rendering pipeline.
I'm sure a tech demo could achieve graphical fidelity comparable to that of Hellblade 2 even without Nanite, but developers of real games have much more limited polygon budgets and aren't willing to allocate much of them to the ground surface. They would rather spend them on the main characters. But please tell me that the flat ground surface in DS2 was an artistic decision!

[Death Stranding 2 ground screenshot]


UE5 games can have extremely detailed character models and ground surfaces at the same time. Nanite can also be used to render buildings in extreme detail.



Personally, I have always noticed flat textures in games, even when I first got a PC. For this reason, games with bump mapping, such as Doom 3, greatly impressed me.

[Doom 3 screenshots]


Then games started using POM and tessellation techniques to add depth to otherwise flat-looking textures. Games that employed these techniques effectively, such as the Tomb Raider games (ROTTR/SOTTR) or Uncharted 4, featured highly detailed rock formations in their caves. These techniques, however, never looked photorealistic. Only Nanite has bridged that gap, and that's the reason why UE5 games that use Nanite impress me.
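For anyone curious what POM actually does under the hood, here is a simplified CPU-side sketch of the idea. In a real game this runs per pixel in a shader and SampleHeight would be a texture fetch; the heightfield and names here are made up. It fakes relief by marching the view ray through a heightmap, whereas Nanite simply puts real triangles where the bumps are.

```cpp
// Simplified parallax occlusion mapping: step a ray through a heightfield and
// return the offset UV where it first dips below the surface.
#include <cmath>

struct Vec2 { float x, y; };

// Placeholder heightfield in [0,1]; stands in for a height texture lookup.
float SampleHeight(Vec2 uv) {
    return 0.5f + 0.5f * std::sin(uv.x * 40.0f) * std::cos(uv.y * 40.0f);
}

// March along the view direction in texture space; shading with the returned
// UV makes a flat surface look like it has depth.
Vec2 ParallaxOcclusionUV(Vec2 uv, Vec2 viewDirTS, float heightScale, int steps = 32) {
    const float layerDepth = 1.0f / static_cast<float>(steps);
    Vec2 deltaUV = { viewDirTS.x * heightScale * layerDepth,
                     viewDirTS.y * heightScale * layerDepth };
    float rayDepth = 0.0f;
    float surfaceDepth = 1.0f - SampleHeight(uv);
    while (rayDepth < surfaceDepth && rayDepth < 1.0f) {
        uv.x -= deltaUV.x;                 // step the ray further "into" the surface
        uv.y -= deltaUV.y;
        rayDepth += layerDepth;
        surfaceDepth = 1.0f - SampleHeight(uv);
    }
    return uv;                             // UV at (approximately) the first hit
}
```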
 