
Digital Foundry: What Can Be Done About Unreal Engine 5 Games With Image Quality/Performance Issues?

Witcher 4 and Epic devs said they have resolved the traversal stutter with UE5.6, so you don't have to wait till next gen. Whenever that might come, now that it's been delayed.
I'll believe it when I see it in a full game rather than a tech demo, and if it's scalable for smaller teams without CD Projekt money + close Epic partnership.

I just know that back in 2025 Sweeney spoke about UE6 making the complete transition to multi-threading across the board in the engine. I'm optimistic about whatever improvement happens, though, and I'm also hoping there will be some way to improve older games in the future so we're not stuck with compromised games.
 
Yes, but to be fair you can't really expect tech know-how from the casual mainstream. They only see bad performance and IQ and are of course looking for a scapegoat, and Epic's marketing did their best pretending that the consoles could actually punch above their weight with UE5 out of the box, which is simply not true.
Epic was actually fairly honest with their marketing. They came out and said they were targeting 1440p 30 fps when they revealed the first PS5 UE5 demo. Then, when they revealed hardware Lumen and the Valley of the Ancient demo, they said XSX and PS5 would target 1080p 30 fps. And that's pretty much what they delivered with the Matrix demo; actually closer to 1440p at times.

The problem, like you said, is that people are just ignorant of the facts. The vast majority of UE5 games target 1440p 30 fps, just like Epic claimed. They just don't remember that. Or maybe they do, and they just don't remember that back in May of 2020, 30 fps was the standard everywhere and they were perfectly fine with it. Then came a long 2-3 year cross-gen cycle full of 1440p 60 fps games, and everyone's standards changed, but Epic's claims and the reality of UE5 software have remained the same.

The only time I've seen Epic talk nonsense is when Tim Sweeney placed all the blame on developers. Clearly UE5 had a lot of issues; otherwise it wouldn't have taken them till 5.4 to resolve CPU bottlenecks and till 5.6 to resolve traversal stutters.
 
I've never been more incensed by an engine than UE5. I don't care about the specifics, but 99% of these games run terribly AND look awful. Like, fundamentally bad. They're always blurry and riddled with weird artefacts in reflections and lighting. It's dog water. Furthermore, almost every UE5 game looks the same, so a vast amount of the industry is currently consolidated into one awful dogshit look.

In contrast, I played Death Stranding 2, with its engine shared with Horizon. Both are crisp, clean, beautiful and run incredibly well. I never got a headache. Fundamentally they looked unique, but what they shared was competence, much like an auteur whose unique works all echo the same craft. Meanwhile, you play 5 different UE5 games and they have the same fucked reflections, shadows and blurry image. God I hate UE5.
 
I've never been more incensed by an engine than UE5. I don't care about the specifics, but 99% of these games run terribly AND look awful. Like, fundamentally bad. They're always blurry and riddled with weird artefacts in reflections and lighting. It's dog water. Furthermore, almost every UE5 game looks the same, so a vast amount of the industry is currently consolidated into one awful dogshit look.

In contrast, I played Death Stranding 2, with its engine shared with Horizon. Both are crisp, clean, beautiful and run incredibly well. I never got a headache. Fundamentally they looked unique, but what they shared was competence, much like an auteur whose unique works all echo the same craft. Meanwhile, you play 5 different UE5 games and they have the same fucked reflections, shadows and blurry image. God I hate UE5.

There's always something with Unreal. For the first half of the PS360 generation we were cursed with pea-soup textures because of streaming issues; UE4 had a blurriness to it, etc.
 
I just tried setting MGS Delta and Mafia to 1440p with TSR Performance, which should be an internal 720p, to get the console experience, and it looked pretty stable. I didn't turn the settings down from Epic, but I was very surprised by the image quality.

My guess is that the consoles are getting low-quality Lumen, which is probably what's causing all the IQ issues, because TSR Performance looked pretty good on my 65-inch 4K screen.
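For anyone wanting to check the math on "TSR Performance at 1440p output = 720p internal": UE5's upscaler presets work as a per-axis screen percentage, and 50% is the commonly used Performance ratio (individual games can override it, so treat this as a sketch, not a guarantee):

```python
# Internal render resolution from output resolution and screen percentage.
# The 50% Performance ratio is UE5's usual preset; games may override it.
def internal_resolution(out_w: int, out_h: int, screen_pct: float) -> tuple[int, int]:
    # Screen percentage is applied per axis, not to the total pixel count.
    return round(out_w * screen_pct), round(out_h * screen_pct)

# 1440p output with TSR Performance (50% per axis) -> 720p internal
print(internal_resolution(2560, 1440, 0.50))  # (1280, 720)
```

Because the ratio is per axis, 50% means only a quarter of the output pixels are actually rendered, which is why the console-style settings feel so much lighter.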
 
I ain't reading all that. That isn't a summary… I keep scrolling and it keeps going Jesus Christ.

DF is absolutely not worth listening to unless you yourself own a 5090 and jerk off to the smallest of details to dismiss titles. Fart-huffing, coupled with overblown issues like Silent Hill f, where I never noticed what they were talking about because you had to be sitting still at very specific angles. Like, come on man…

As for the topic of UE5: yes, it sucks. The industry doesn't care. The cost to make an engine and a game is too high, hence the rush toward AI to keep trimming workloads down and make more profit.

The industry will fall off a cliff in likely another 5 years, and it'll be up to indies to pick up the pieces.
 
Unreal 6 will fix this.
And then Unreal 7 after that, but Unreal 8 for suuuuuuure, Sweeney promise.

laughing-dog-dog.gif
 
I've never been more incensed by an engine than UE5. I don't care about the specifics, but 99% of these games run terribly AND look awful. Like, fundamentally bad. They're always blurry and riddled with weird artefacts in reflections and lighting. It's dog water. Furthermore, almost every UE5 game looks the same, so a vast amount of the industry is currently consolidated into one awful dogshit look.

In contrast, I played Death Stranding 2, with its engine shared with Horizon. Both are crisp, clean, beautiful and run incredibly well. I never got a headache. Fundamentally they looked unique, but what they shared was competence, much like an auteur whose unique works all echo the same craft. Meanwhile, you play 5 different UE5 games and they have the same fucked reflections, shadows and blurry image. God I hate UE5.

You conveniently leave out that the Decima engine is only used by 2 studios, and they are top tier in the industry when it comes to tech competence. Let's see how the optimization fares when Decima is given to some indie studio. UE is used by many studios big and small, and not all of them are as talented as the folks at Guerrilla Games and Kojima Productions. We can see what UE5 is capable of with a game like Hellblade 2, and that was using an old version of UE5. Kojima's new game OD runs on UE5 and it looks good. Making games is hard, and a general-purpose game engine doesn't have solutions for every type of game; it's up to the developer to shape the engine to their needs.

Furthermore, the aesthetics of a game have little to do with the engine it's running on. Does Hunters Gathering look like a Horizon or Death Stranding game to you?
 
Last edited:
I wonder how much hardware companies pay them to push, in every video, the idea that games are not unoptimized and that the problem is your old/new hardware just not being powerful enough.
 
Some of the earlier UE5 games suffered from CPU bottlenecks due to UE5 being extremely single-threaded. This was resolved by UE5.4. Every game since then that has issues has them because these consoles are not equipped to run games at 60 fps while pushing next-gen tech.

Gamers can go to the 30 fps modes, which run at an average of 1440p, but they refuse to do so, and that is basically the core of the problem. People bought these cheap $500 consoles in 2020 and think they are going to run games at 4K 60 fps while pushing ray tracing and AI upscaling these consoles don't support.
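The "extremely single threaded" point can be made concrete with Amdahl's law: however many cores a console has, the serialized part of the frame caps the speedup. The 60% serial fraction below is purely illustrative, not a measured UE5 number:

```python
# Amdahl's law: if a fraction of each frame's work can only run on one
# thread, adding cores cannot speed that part up.
def speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Illustrative numbers only (not measured UE5 data): with 60% of the
# frame serialized, an 8-core CPU tops out at ~1.5x over a single core.
print(round(speedup(0.60, 8), 2))  # 1.54
```

That is why shrinking the serialized portion of the frame (what UE5.4's render-thread work targeted) matters far more than raw core counts.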


Some of the more recent UE5 games actually run at 1080p 60 fps. Expedition 33 and Mafia both target 1080p 60 fps. Mafia has some drops while driving through the open world, but during normal gameplay it's mostly 60 fps. UE5.6 is supposed to make hardware Lumen performant enough that it runs at the same performance profile as software Lumen. They already showed Witcher 4 running at 800-1080p 60 fps using hardware Lumen.


100 percent!

I don't think it's just down to us. I mean, thinking logically, what you say makes sense, but then we have marketing for these consoles that does nothing but talk about 4K, 60fps, ray tracing. Even 8K is sometimes mentioned. It's not great.

But I agree, consoles have always been built around 30 FPS since polygons were introduced, and we shouldn't be expecting more. It's a nice thing when games are 60fps with great IQ on consoles.
 
I thought this was one of their better discussions. They do blame the engine, in the sense that they say you've got to leave out Lumen in games like High on Life 2 that don't have any dynamic lighting or day/night cycles.
That's not blaming the engine. But I know better than to discuss UE5 with Epic's resident GAF advocate. We've been through this before. We agree to disagree.
 
Ah yes, the games with an RTGI resolution so low that it misses half the objects on screen... and when you are on PC and turn on PT, it suddenly is just as heavy as any other PT mode in other games. Almost as if there aren't any magic shortcuts with that tech beyond what is generally being used already, only compromises.
😂 This is some strong cope (every time you refer to the non-UE5 games, they somehow get worse and worse). UE5, beyond looking good in screenshots (Oblivion: Remastered is the poster child of customers falling for bullshots), has missed the mark in one of its biggest markets (consoles), and even its own tech demo showed issues that became hallmarks of UE5's temporal artefacts (panning the camera around the main character would leave a soup of pixels behind her).

No, there are no magic shortcuts, but custom engines made by competent devs for their own games can find compromises that look quite good. They did for RTGI; for PT it is quite likely they went a bit more brute force than for RT, which was the focus, and yet it was about a 39% penalty, not 2-3x.

Nobody is asking for native 4K and path tracing on consoles or PC; lower resolutions and upscalers are completely fine (well, depending on how badly you achieve that, I guess). You are 6-7 years in the past, back when we could keep harping on "why use custom engines? I want devs to use Unreal Engine, look how cool it is, and these days there is no advantage to a custom engine made for one game vs a generic jack-of-all-trades one?!?"… We have seen the results of CFOs and CTOs everywhere jumping on the money-saving promise: stutter problems on even the highest-end machines, plus all the other problems on top, and we can see the engine's attempt to break into Hollywood movies wasn't the best match either (DF testing on a 4K TV from afar rather than on a smaller PC monitor is finally getting a bit of needed spotlight).
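For context on penalty figures like "39% vs 2-3x": a render feature's cost is usually quoted as added frame time, and converting that into frame rate is simple division. A minimal sketch (the base 60 fps and the penalty values are illustrative, not measurements from any specific game):

```python
# Translate an added frame-time cost into the resulting frame rate.
# A +39% frame-time penalty means each frame takes 1.39x as long.
def fps_after_penalty(base_fps: float, frametime_penalty: float) -> float:
    return base_fps / (1.0 + frametime_penalty)

print(round(fps_after_penalty(60, 0.39), 1))  # 43.2  (a "+39%" penalty)
print(round(fps_after_penalty(60, 2.00), 1))  # 20.0  (a "3x frame time" hit)
```

So a +39% frame-time cost drops 60 fps to the low 40s, while a genuine 3x frame-time hit drops it to 20; that is the gap the post is arguing about.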
 
But I agree, consoles have always been built around 30 FPS since polygons were introduced and we shouldn't be expecting more.
???

What are you talking about?

Go and play some UE5 games like Oblivion: Remastered in the quality mode (30 FPS) and move the camera. Once you have stopped puking we can talk about it some more…
 
Recently started playing Spider-man 2 on PS5 Pro after playing Clair Obscur.
Back in the PS3 days, you'd expect a slow, super-linear, turn-based QTE-as-a-feature game to look a lot better than a superhero game where you zip around ray-traced skyscrapers at a 65+ VRR frame rate with no slowdowns or issues.
But alas, that's not the case! It's the slow, super-linear, turn-based QTE-as-a-feature game that both looks and renders a lot worse.
Yeah sorry for shitting on CO, but the entire industry declared this slow, super-linear, turn-based QTE-as-a-feature game the GOTY.
I know Spiderman 2 had probably 10x the budget and sold worse.
 
???

What are you talking about?

Go and play some UE5 games like Oblivion: Remastered in the quality mode (30 FPS) and move the camera. Once you have stopped puking we can talk about it some more…

I'm talking about how consoles have never been able to deliver 60FPS or above in big games, for generations now. What are you talking about?

Consoles delivering a wavering FPS from the 20s to the 50s is not what we used to mean by 60FPS. All of this goalpost-moving was done by console makers' marketing teams.

4k/60fps no longer means what it used to mean.
 
My immediate take is that the supposed ubiquity of Unreal 5 games is VERY exaggerated. It's mainly just smaller studios who can't afford their own engines, as far as I can see.

I might be wrong but I think the only Unreal 5 game I've actually bought has been Jusant, and that was on a deep discount. I've played Silent Hill 2 but that was "free" on PS+.

So I think it's a seriously overstated problem, first of all.

As for the actual engine, I don't think it's particularly great, tbh. I've seen lots of games this gen with no Nanite and no mesh shaders push much more detailed meshes just leveraging the SSDs. And LOD transitions aren't a visual blemish that have ever particularly bothered me, personally. So I think Nanite is very overrated as a technology right now. And other studios seem to get on fine just using regular RTGI instead of Lumen without killing performance much more, or even a bit less, so that also seems pretty unnecessary.

Dunno, to me it just seems like a very heavy engine for not that much gain.
 
I think YouTube forcing creators to do long form style videos to get monetization ruined things. So many creators really don't need to have their personal opinions heard if I'm being honest - just show me how to make a demi-glace and be done with it.
I also think DF are now, more and more, saying what they think people like to hear rather than giving an honest opinion, to get likes and subs.
 
Wuthering Waves is one of the most beautiful games out right now (scales incredibly well to high-end hardware) and runs on UE4.

Miss me with UE5. Sure, we get the occasional Arc Raiders, from an experienced dev team that knows how to leverage the engine's capabilities into their own pipeline/requirements and strip out all the fluff, but for the most part UE5 has been a bane on the industry. Shit's ass.
 
Trying to use full next-gen features on current-gen hardware and cutting corners will always be dogshit no matter what hardware you have...

PS6 will be more powerful, but if devs try to brute-force Lumen 2.0 or ultra path tracing or whatever they invent by then, it doesn't matter; it will run like shit as always.
......

Games should target 60 fps and devs should program for it according to the features available for the gen they are in, then scale up for 30 fps if they choose to. We have great-looking games running at 60 fps on consoles, so the dogshit ones are entirely the devs' fault for just pressing the "ray tracing" or whatever bullshit button in UE5 and running with it, only so their games have "the latest tech", or to cut corners, no matter how dogshit it runs or looks.

....

Consoles will never be powerful enough to push the latest tech meant for a high-end PC card with good performance, and no, 30 fps is not good performance in 2026; it's an optional mode for graphics aficionados who can still stand the awful fluidity of it.
 
I wonder how much hardware companies pay them to push, in every video, the idea that games are not unoptimized and that the problem is your old/new hardware just not being powerful enough.
Unlike typical angry gamers, who don't know much about technology and only see a low framerate on their console, the Digital Foundry guys see the reality. Lumen (real-time lighting) and Nanite (virtualized geometry) come at a cost and require a certain level of hardware power to run at 60 fps at a high resolution. Epic spent a lot of money optimising these techniques as much as possible.

Even without RT cores, Lumen lighting runs as fast as RT GI does on a GPU with RT cores (I tested that, so I know and can show you examples). People praised the id Tech 7 engine in Doom Eternal, but once id added RT GI to their next game, DOOM: The Dark Ages, performance matched Lumen in UE5 games. This shows how incredibly optimised Lumen is, not the other way around.

As for Nanite, try rendering a similar amount of geometry without Nanite in another engine and see how the framerate compares. I was able to test the UE5 tech demos on a GTX 1080, and enabling Nanite didn't destroy the framerate, so I actually think Nanite is incredibly optimised. Other games (engines) may run faster, but they aren't rendering half of what UE5 games render. Show me a game with a more detailed ground surface than Hellblade 2, where every single small rock is rendered with geometry, that runs at a much higher framerate. Only then will you have proof that UE5 is unoptimised.

The only problems UE5 had on PC were stutters, especially in the SH2 remake and TES4: Oblivion. Ark: Survival Ascended supposedly runs even worse, but I haven't played that game. In most of the UE5 games I have played, stuttering only happens from time to time and doesn't affect the experience. Here's my gameplay in one of the most demanding UE5 games, with a huge open world: the stutters happened on rare occasions and didn't ruin my experience, even with the launch version. The game has since been updated and runs even better.



Does this game look unplayable to you due to stutters?

There are even UE5 games with zero stuttering, such as MGS3: Delta (Alex Battaglia confirmed that in his MGS3 Delta PC analysis video). Most UE5 games have stutters on rare occasions (Borderlands 4, or Mafia: The Old Country), but that doesn't detract from the experience. UE5 is already in a good state on PC, and I believe the collaboration between Epic and CDPR will finally eliminate stuttering in open-world games completely, making this engine literally perfect. UE5 will look amazing on next-gen consoles, and those who are currently complaining about it will end up loving it.
 
Miss me with UE5. Sure, we get the occasional Arc Raiders, from an experienced dev team that knows how to leverage the engine's capabilities into their own pipeline/requirements and strip out all the fluff, but for the most part UE5 has been a bane on the industry. Shit's ass.
You said it yourself that you get good results with the engine if developers know what they are doing, then go on to say that it is somehow still the engine's fault if it isn't used well, half a sentence later.
Tim Ryan Incoherence GIF by GIPHY News



As if devs that allot zero time to configuration and optimization would be better off with any other engine, let alone be able to create their own with good results.
 
If the best an experienced team can do with UE5 is Arc Raiders, well…
Kyle Mooney Snl GIF by Saturday Night Live
The simple truth is that the consoles aren't strong enough for all the bells and whistles, at least not at the performance and resolution targets people have come to expect. It's a very different thing on PC.
 
The simple truth is that the consoles aren't strong enough for all the bells and whistles, at least not at the performance and resolution targets people have come to expect. It's a very different thing on PC.
The consoles not being powerful enough is not a revelation of a shitty engine. It's something everybody knows. It's something the customer knows and accepts when he pays 1/3 of the price. The thing is: the consoles are not powerful enough, period, with every engine. But there is one engine that consistently underperforms on consoles and mid-to-low-end PCs even when it works correctly. A shitty engine. Doing more with more power is not the mark of a good engine. We've been through this with Crysis almost 20 years ago, people.
 
The consoles not being powerful enough is not a revelation of a shitty engine. It's something everybody knows. It's something the customer knows and accepts when he pays 1/3 of the price. The thing is: the consoles are not powerful enough, period, with every engine. But there is one engine that consistently underperforms on consoles and mid-to-low-end PCs even when it works correctly. A shitty engine. Doing more with more power is not the mark of a good engine. We've been through this with Crysis almost 20 years ago, people.
The engine is not to blame when devs simply flip feature switches without a second look... Look at how far Ubisoft went in adapting their RTGI and Nanite-style solutions to the consoles. You could do the same in UE5; no one has done it. This isn't an engine issue (aside from the old traversal stutters), this is a dev issue.
 
Games look last-gen when you disable next-gen features instead of optimizing them, no shit Sherlock...
Yeah. Every game that runs OK on mid-range hardware looks like Arc Raiders.
Taye Diggs Wow GIF by Bounce

The mental gymnastics. After more than 20 years over here I'm used to them, but it used to be defending a machine, not a fucking engine.
 
Yeah. Every game that runs OK on mid-range hardware looks like Arc Raiders.
Taye Diggs Wow GIF by Bounce

The mental gymnastics. After more than 20 years over here I'm used to them, but it used to be defending a machine, not a fucking engine.
The incoherent jumps in your reasoning are baffling....
Stupidity Are You Stupid GIF
 
You said it yourself that you get good results with the engine if developers know what they are doing, then go on to say that it is somehow still the engine's fault if it isn't used well, half a sentence later.
Tim Ryan Incoherence GIF by GIPHY News



As if devs that allot zero time to configuration and optimization would be better off with any other engine, let alone be able to create their own with good results.
The issue is that UE5 comes packed with tons of features that used to be handled by third parties and are now "baked in". Let's say you want a volumetric cloud solution along with realistic lighting; in the past you'd perhaps enable it via a different vendor, and getting the plugin to work would require some time and effort.

However, if I as a dev can just tick a box for said feature and then rely on an upscaler as well, then I don't think it's weird that we're at the point we're at now. A lot of "good enough" mentality and reliance on solutions such as upscaling, frame-gen, etc. seem to be at play here. So a combination of 'lazy' devs and a lack of bespoke/specific tools, basically.

Is there some other cause or reason you would ascribe to the issue with modern gaming?
 
However, if I as a dev can just tick a box for said feature and then rely on an upscaler as well, then I don't think it's weird that we're at the point we're at now. A lot of "good enough" mentality and reliance on solutions such as upscaling, frame-gen, etc. seem to be at play here. So a combination of 'lazy' devs and a lack of bespoke/specific tools, basically.

Is there some other cause or reason you would ascribe to the issue with modern gaming?

Absolutely agree. Put "financial pressure" and "talent bleed" on that heap, too.

And yet you didn't rebut any.
Nothing to rebut in incoherent babble.
 
People in this thread praise Death Stranding 2 graphics and image quality, so I checked the console screenshot thread to see what I'm missing.


9nz7pfHT_o.jpg



1752036927-death-stranding-2-on-the-beach-20250703203533.jpg


kxjSnZQY_o.jpg


kspensnT_o.jpg


1752124167-death-stranding-2-on-the-beach-20250709085644.jpg


The DS2 graphics look okay, but they're not in the same league as UE5 games. The geometry isn't very complex, the textures are flat, and the image quality doesn't look 4K on my 4K monitor.

What's the most impressive game I saw in the console screenshot thread? The UE5 game Black Myth: Wukong.

Darsxx82's screenshots from the XSX version, probably in quality mode.

G0TfDIV8D6nLgxXB.jpg
vmEdfTxuW4rNwtsC.jpg
sMgSxsBAq0Ue4a7w.jpg
mLvmgxgfQV6Gy3QD.jpg
1WrUfAez4NiB9nNc.jpg
 
PeteBull, SlimySnake: you guys upgraded your PCs recently and now have an RTX 5080. Could you share your thoughts on the image quality and frame rate of your current PC in UE5 games? Are you happy with the results you're getting with UE5 games on your PC? Have you noticed any issues that make UE5 much worse compared to other engines?
 
UE5 is pure shit and clearly born under a bad star. On paper it can look awesome, but in reality most games look terrible because of devs trying to push features that are not meant for this gen's hardware.

Engines made for this gen's hardware look far better as a final result because, even lacking features that UE5 has, they show infinitely better IQ, smoothness and stability.

When Rockstar releases GTA 6 for PS5/Xbox Series, it'll look a thousand times better than any UE5 game because they targeted this gen's hardware at 30fps.
 
People in this thread praise Death Stranding 2 graphics and image quality, so I checked the console screenshot thread to see what I'm missing.


9nz7pfHT_o.jpg



1752036927-death-stranding-2-on-the-beach-20250703203533.jpg


kxjSnZQY_o.jpg


kspensnT_o.jpg


1752124167-death-stranding-2-on-the-beach-20250709085644.jpg


The DS2 graphics look okay, but they're not in the same league as UE5 games. The geometry isn't very complex, the textures are flat, and the image quality doesn't look 4K on my 4K monitor.

What's the most impressive game I saw in the console screenshot thread? The UE5 game Black Myth: Wukong.

Darsxx82's screenshots from the XSX version, probably in quality mode.

G0TfDIV8D6nLgxXB.jpg
vmEdfTxuW4rNwtsC.jpg
sMgSxsBAq0Ue4a7w.jpg
mLvmgxgfQV6Gy3QD.jpg
1WrUfAez4NiB9nNc.jpg

Death Stranding 2 looks like a last-gen game compared to Wukong 😂

But seriously, DS2's excellent art direction, along with photogrammetry textures, is what makes it stand out; otherwise Decima doesn't have the modern engine features that UE5 has, like a fully dynamic global illumination system, so you can see the limitations in the graphics if you know what to look for.
 
PeteBull, SlimySnake: you guys upgraded your PCs recently and now have an RTX 5080. Could you share your thoughts on the image quality and frame rate of your current PC in UE5 games? Are you happy with the results you're getting with UE5 games on your PC? Have you noticed any issues that make UE5 much worse compared to other engines?
It depends on the particular game. Even native 4K60 without RT is out of the question in some titles, but the DLSS transformer model makes a huge difference; you still gotta fiddle with settings, though, and for example use DLSS Balanced, so native 1080p DLSS'd to 4K is fine enough.
There are so many ways you can make it work, but again, true native 4K at a stable 60fps with everything maxed can put even the much stronger 5090 (base TDP of 575W) to its knees.

For comparison vs PS5 (or Pro): KCD recently got a next-gen patch and it obviously doesn't hold a stable 60 at those settings even on the Pro, while here on my 9800X3D and 5080 I max it out at 4K and easily get 4K60; most of the time GPU usage varies between 50 and 70% (there is one room where GPU usage spikes, but I don't know why; it's your own room in the inn near Rattay).

Forget about native 4K at maxed settings in current-gen games, but DLSS 4 (transformer model) makes it much lighter on the GPU, to the point that you can max or nearly max those games at 1080p AI-upscaled to 4K and IQ is still solid (not perfect, but way above what consoles can offer atm).

Here's a quick example with CP2077, just because everyone and their momma knows by now how it runs.


As you can see in the vid, DLSS Quality (so a 1440p-to-4K upscale using the transformer model) gives 70-90fps, still with zero RT.

Now let's check 4K Ultra with RT at Ultra (PT off).

You can see it's barely 30-35fps in this case.

Drop the res from native 4K to native 1440p (same settings, so ultra/max with RT Ultra but no PT) and suddenly it's 60-65fps.

TLDR: At that level of hardware, if we're talking impressive current-gen games, you still can't go 4K60 maxed, and lowering settings below High usually isn't worth it. Instead you take advantage of the literal black magic that DLSS 4 is (not frame-gen; fuck fake frames, they're only good if your starting fps is 50-60, otherwise a total bust; I mean the DLSS transformer model) and simply AI-upscale from native 1080p or 1440p, depending on the game, to still get those max/near-max settings and solid IQ.

Here's a video with many more games tested; the interesting point is that they check native 4K and then DLSS Quality for comparison:


TLDR: DLSS is your friend, and the transformer model is far beyond what the PS5/Pro can do. IQ is almost as good as native during gameplay, where you aren't hunting for pixels/artifacts DF-style in close-ups and slowed-down video. It's not as good as native 4K obviously, but the IQ degradation is relatively small, to the point that it doesn't distract you from enjoying the game like it does in the base PS5/Pro 60fps modes :)

It even makes sense if we look at the raw power of the hardware:
the PS5 Pro GPU is roughly similar to a 9060 XT (in actual game results); a 5080 is literally +99% over that, plus a superior AI upscaling method and better RT capabilities.
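As a footnote to the DLSS-mode talk above, the internal resolutions behind the presets can be sketched like this. The per-axis ratios (Quality ≈ 2/3, Balanced ≈ 0.58, Performance = 0.5) are the commonly cited defaults, and individual games can override them, so treat the output as ballpark figures:

```python
# Internal render resolution behind common DLSS presets at a 4K output.
# Ratios are applied per axis; games may override these defaults.
PRESETS = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}

def dlss_internal(out_w: int, out_h: int, ratio: float) -> tuple[int, int]:
    return round(out_w * ratio), round(out_h * ratio)

for name, ratio in PRESETS.items():
    w, h = dlss_internal(3840, 2160, ratio)
    print(f"{name}: {w}x{h}")
```

This matches the post's numbers: Quality at a 4K output renders internally at 2560x1440, and Performance at 1920x1080.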
 
People in this thread praise Death Stranding 2 graphics and image quality, so I checked the console screenshot thread to see what I'm missing.


9nz7pfHT_o.jpg



1752036927-death-stranding-2-on-the-beach-20250703203533.jpg


kxjSnZQY_o.jpg


kspensnT_o.jpg


1752124167-death-stranding-2-on-the-beach-20250709085644.jpg


The DS2 graphics look okay, but they're not in the same league as UE5 games. The geometry isn't very complex, the textures are flat, and the image quality doesn't look 4K on my 4K monitor.

What's the most impressive game I saw in the console screenshot thread? The UE5 game Black Myth: Wukong.

Darsxx82's screenshots from the XSX version, probably in quality mode.

G0TfDIV8D6nLgxXB.jpg
vmEdfTxuW4rNwtsC.jpg
sMgSxsBAq0Ue4a7w.jpg
mLvmgxgfQV6Gy3QD.jpg
1WrUfAez4NiB9nNc.jpg

Both DS2 and the Horizon games are not even close to the best-looking UE5 games.

Decima has a big advantage in performance, but that's no surprise when the game doesn't support any post-PS4 features (like mesh shaders, RT, etc.). UE5 with fixed stuttering will be a great engine; let's see if 5.6 actually delivers that.
 
Both DS2 and the Horizon games are not even close to the best-looking UE5 games.

Decima has a big advantage in performance, but that's no surprise when the game doesn't support any post-PS4 features (like mesh shaders, RT, etc.). UE5 with fixed stuttering will be a great engine; let's see if 5.6 actually delivers that.

We'll see what UE5 is truly capable of with The Witcher 4 since Epic is staking their reputation on making sure that game runs well and CDPR doesn't want another Cyberpunk situation.
 
Oh, you can be sure it will look amazing. Hell, we had that Witcher 4 demo where the devs themselves admitted the base PS5 runs it at 800p (and 60fps), so even with top talent and the many improvements we're going to get in W4, we still have to accept 800p on the base PS5 (and probably 1080p or thereabouts on the Pro).


We can all admit that demo looks gorgeous for a base PS5, especially considering 60fps, but let's not deceive ourselves: on a PS6, or even a midrange PC in 2028, it will look like a million bucks in comparison. Once we see it there, we will look at the PS5 version like:
what-bitch.gif
 
We'll see what UE5 is truly capable of with The Witcher 4 since Epic is staking their reputation on making sure that game runs well and CDPR doesn't want another Cyberpunk situation.

Cyberpunk's engine was very heavy (especially for last-gen consoles), but it's the opposite of UE5 when it comes to multithreading (it's excellent in CP) and lack of stuttering (in a fully open-world engine).

I'm really interested in how they will adapt/fix UE5 to their needs.
 