Graphical Fidelity I Expect This Gen

DS2 looks good enough, but image quality and a stable framerate are much more important to me than some RT effects. So I prefer DS2 over any UE5 game on console that runs at 800p with heavy drops. The IQ here is PRISTINE, and sadly that's not a common thing this generation.

This is my main issue with UE5. It can look good, but the image quality is always poor; even if you use TSR the ghosting is disgusting, and the performance in general is trash. I say this as a guy with a 4090 and one of the best AMD CPUs on the market.

The engine has potential, but it's not good enough. I prefer playing a game like DS2, which looks and feels great in comparison.
 
A lot of UE5's flaws were fixed with UE 5.6 and the upcoming 5.7 (Nanite foliage).

But most studios don't bother updating their engine mid-project.
And considering how long it takes to develop games nowadays, it's going to take years until we see these improvements in our games.
 
This is my main issue with UE5. It can look good, but the image quality is always poor; even if you use TSR the ghosting is disgusting, and the performance in general is trash. I say this as a guy with a 4090 and one of the best AMD CPUs on the market.

The engine has potential, but it's not good enough. I prefer playing a game like DS2, which looks and feels great in comparison.
Image quality is not poor using DLSS
 
Cyberpunk 2077 still stands out.

[Cyberpunk 2077 screenshots]
Yup, replaying it currently with normal RT on Ultra (no Psycho, no path tracing), and it still looks great on my 3080 Ti thanks to the transformer DLSS 4 model; upscaling artifacts are substantially reduced versus the previous method.
Kind of a shame the new Xbox is going to be made in cooperation with AMD, since going Nvidia could have been one objective advantage it could get over an AMD-based PS6...
 
Got any examples to post of what you see?

These types of visual issues are mostly visible in motion; they're not something you can see in static screenshots. When you're playing the game, you get to see artifacts and problems that only happen in this engine. Ray tracing in combination with upscalers seems to exacerbate the issue even more. Another engine that has this problem is the RED Engine: in Cyberpunk there is a lot of temporal ghosting too, even with DLAA and such, but luckily it's not as bad as in an Unreal Engine game, and at least Cyberpunk performs well without stuttering or other issues.

If you want to see real ghosting, go watch gameplay of Wukong on PlayStation 5. It's almost like having glaucoma.
 
Image quality is not poor using DLSS

A side note that affects all temporal upscalers: frame rate actually affects the quality of the upscaling.
The reason is that temporal upscalers rely on accumulating data from several frames, so if a temporal upscaler requires 32 samples, a game running at a low frame rate will take much longer to accumulate the necessary data.
This is from one of the developers who made FSR4. He is comparing the RTX 5090 and the 9070 XT, but the same logic applies to any GPUs in the same performance range.
This means that FSR4 or DLSS4 running on a console at 30 fps will look inferior to FSR4 or DLSS4 running on a PC at 100+ fps, even if the base resolution and settings are the same.

For a 2x upscale, the sequence length recommended for FSR and XeSS is 32. That means 32 frames are required to complete the sequence and fully converge to a super-sampled output pixel. You may think this sounds very long, and it is. The algorithm utilizes motion vectors and other tricks to keep samples valid so previous data can be used throughout the convergence.
Finding benchmark data for FSR 4 vs DLSS 4 at the same modes has proven tricky. However, if we grab 1440p average game data for the GPUs in question from Hardware Unboxed as a ballpark figure, we arrive at the Radeon RX 9070 XT at 119 fps versus the GeForce RTX 5090 at 192 fps.

If our jitter sequence is 32 frames for a 2x upscale, that means 3.71 full accumulations per second are possible on the Radeon RX 9070 XT, or 6 on the GeForce RTX 5090.

In other terms, at these frame rates it takes 269 ms to fully accumulate on the Radeon RX 9070 XT, or 166 ms on the GeForce RTX 5090. That is a huge 100+ ms difference in convergence rate. With animation, particles and camera motion in play during these quality comparisons, that is a lot of time to display lower-quality pixels.

These days we are all being told that anti-lag improvements of 15-20 ms are groundbreaking and that frame generation latency hits of 30 ms are unacceptable. So should a graphical convergence difference of 2-3 times that be acceptable during a quality review?
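To make the quoted arithmetic concrete, here is a quick back-of-the-envelope sketch in Python (not actual FSR/DLSS code, just the math from the quote, assuming the 32-frame jitter sequence and the 30 fps console case mentioned above):

```python
# Back-of-the-envelope convergence math for a temporal upscaler.
# SEQUENCE_LENGTH is the 32-frame jitter sequence quoted above for a 2x upscale;
# real upscalers reuse history via motion vectors, so this is the worst case for
# freshly disoccluded pixels, not a per-frame cost.
SEQUENCE_LENGTH = 32

def convergence_time_ms(fps: float, sequence_length: int = SEQUENCE_LENGTH) -> float:
    """Milliseconds needed to accumulate one full jitter sequence."""
    return sequence_length / fps * 1000.0

def accumulations_per_second(fps: float, sequence_length: int = SEQUENCE_LENGTH) -> float:
    """Full accumulations that fit into one second at this frame rate."""
    return fps / sequence_length

for label, fps in [("Radeon RX 9070 XT", 119),
                   ("GeForce RTX 5090", 192),
                   ("console at 30 fps", 30)]:
    print(f"{label}: {convergence_time_ms(fps):.0f} ms to converge, "
          f"{accumulations_per_second(fps):.2f} accumulations/s")
```

Run as written, this reproduces the quoted figures (roughly 269 ms vs 167 ms, 3.72 vs 6 full accumulations per second) and shows that at 30 fps the same sequence needs over a second to fully converge, which is exactly why the same upscaler looks worse on a 30 fps console than on a high-frame-rate PC.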


 
These types of visual issues are mostly visible in motion; they're not something you can see in static screenshots. When you're playing the game, you get to see artifacts and problems that only happen in this engine. Ray tracing in combination with upscalers seems to exacerbate the issue even more. Another engine that has this problem is the RED Engine: in Cyberpunk there is a lot of temporal ghosting too, even with DLAA and such, but luckily it's not as bad as in an Unreal Engine game, and at least Cyberpunk performs well without stuttering or other issues.

If you want to see real ghosting, go watch gameplay of Wukong on PlayStation 5. It's almost like having glaucoma.

You are probably talking about Lumen artifacts and the noise created by a poor denoiser. There are also motion trails with Lumen that are not related to DLSS or any other reconstruction technique.
 
These types of visual issues are mostly visible in motion; they're not something you can see in static screenshots. When you're playing the game, you get to see artifacts and problems that only happen in this engine. Ray tracing in combination with upscalers seems to exacerbate the issue even more. Another engine that has this problem is the RED Engine: in Cyberpunk there is a lot of temporal ghosting too, even with DLAA and such, but luckily it's not as bad as in an Unreal Engine game, and at least Cyberpunk performs well without stuttering or other issues.

If you want to see real ghosting, go watch gameplay of Wukong on PlayStation 5. It's almost like having glaucoma.
You are probably talking about Lumen artifacts and the noise created by a poor denoiser. There are also motion trails with Lumen that are not related to DLSS or any other reconstruction technique.

I will "play" some Hellblade 2 and look out for these artifacts because I did not notice them but now I know what to look for I'm interested. Will definitely agree though that UE5 is lacking on consoles
 
This is my main issue with UE5. It can look good, but the image quality is always poor; even if you use TSR the ghosting is disgusting, and the performance in general is trash. I say this as a guy with a 4090 and one of the best AMD CPUs on the market.

The engine has potential, but it's not good enough. I prefer playing a game like DS2, which looks and feels great in comparison.
lol, you have a 4090 and are complaining about image quality? Dude, I play on my 3080 and nearly every game runs at 4K DLSS Quality, 60 fps. All I have to do is set graphics to High instead of Ultra. There is no way you aren't able to max out these games at 4K DLSS Quality.

Absolute nonsense.
 
lol, you have a 4090 and are complaining about image quality? Dude, I play on my 3080 and nearly every game runs at 4K DLSS Quality, 60 fps. All I have to do is set graphics to High instead of Ultra. There is no way you aren't able to max out these games at 4K DLSS Quality.

Absolute nonsense.

Do you know how to read?
 
Speaking of IQ, I don't think they are using GG's new upscaling system that we saw in HFW's Pro update. There is a lot of breakup and shimmering in foliage and water of the kind I typically see with PSSR, or in the older version of HFW. It doesn't always happen, but I see it every now and then when rain or other alpha effects come into play. Horizon fixed this 3 years ago, and the Pro's scaler made it even better.

I wouldn't be surprised if KojiPro basically went with the very first version of checkerboarding here, the one GG launched with 3.5 years ago.
 
I admit my ability to read nonsense is a bit limited.


Man, your fanaticism blinds you sometimes; you can't even read between the lines.

Just because I'm calling the performance trash doesn't mean my PC isn't pulling high frame rates. Of course it is. I'm talking about stuttering, frame time inconsistencies, and general instability, the kind of issues that ruin the experience while playing, regardless of how powerful your hardware is.

This has become common with Unreal Engine 5 games lately. Just look at Oblivion Remastered, one of the worst offenders. No matter how high-end your setup is, the performance is baffling.

So yes, UE5 can deliver impressive visuals, but that means nothing if the performance is a mess. Add to that the smearing, ghosting, and poor image quality, and the end result just isn't worth it.

That said, I'll give credit where it's due: as Senua mentioned, Hellblade 2 is one of the few titles that show UE5's potential. But overall, this engine has been more of a pain than anything else, no matter how hard that is for you to accept.
 
That said, I'll give credit where it's due: as Senua mentioned, Hellblade 2 is one of the few titles that show UE5's potential. But overall, this engine has been more of a pain than anything else, no matter how hard that is for you to accept.
Add RoboCop to the list. It's the only UE5 game with no performance issues in my experience.
 
Time/memory is a funny thing, because it's crazy to think many of us were gushing over the bottom pic in late 2019. Clearly a generational upgrade that I initially undervalued.

[comparison image]
 
Man, your fanaticism blinds you sometimes; you can't even read between the lines.

Just because I'm calling the performance trash doesn't mean my PC isn't pulling high frame rates. Of course it is. I'm talking about stuttering, frame time inconsistencies, and general instability, the kind of issues that ruin the experience while playing, regardless of how powerful your hardware is.

This has become common with Unreal Engine 5 games lately. Just look at Oblivion Remastered, one of the worst offenders. No matter how high-end your setup is, the performance is baffling.

So yes, UE5 can deliver impressive visuals, but that means nothing if the performance is a mess. Add to that the smearing, ghosting, and poor image quality, and the end result just isn't worth it.

That said, I'll give credit where it's due: as Senua mentioned, Hellblade 2 is one of the few titles that show UE5's potential. But overall, this engine has been more of a pain than anything else, no matter how hard that is for you to accept.
Yes, UE5 has problems (and they're trying to solve them with each upgrade, like the huge performance improvements in 5.6), but at least they're trying to push visuals, unlike those Sony fuc.ers who just lie down on their as.es. This entire generation I've been truly disappointed by them.
 
Getting real tired of seeing cutscene models and photo mode bullshots as a way of demonstrating current-gen graphics. Let's see some actual in-game screenshots!

Photo mode =/= what you see when actually playing the game.
 
Yeah, cutscenes shouldn't be used to judge generational differences when cutscenes look like this and gameplay looks like this.

[comparison screenshots]


Jk. Or maybe half kidding. The whole snake level with the fireworks was legit jaw-dropping. Really cool lighting and environments. Sadly, the open world continues to underwhelm.
 
Yeah, cutscenes shouldn't be used to judge generational differences when cutscenes look like this and gameplay looks like this.

[comparison screenshots]


Jk. Or maybe half kidding. The whole snake level with the fireworks was legit jaw-dropping. Really cool lighting and environments. Sadly, the open world continues to underwhelm.

The gameplay shots don't even look that bad, but at least they're representative of what we're playing. A cutscene... I don't care if it's in-engine or CGI, it's essentially a non-interactive video. Being rendered in real time is impressive for sure, but this is an interactive medium; show me how good a game looks during the parts where I'm interacting with it.
 
I have no idea wtf KojiPro was doing for 6 years.
[GIF: Hololive nod] + [GIF] = [GIF]


He spent 6 years slapping vtubers' asses. The last GIF is his post-nut clarity realization that he will have to ship DS2 without any meaningful improvement, thus breaking his "most innovative game director" streak, which had been going since 1987.

It is kind of sad that we can't really know for sure if the bad graphics of this gen are mostly due to incompetent devs or weak hardware... I guess it is a little bit of both but still.
 
The gameplay shots don't even look that bad, but at least they're representative of what we're playing. A cutscene... I don't care if it's in-engine or CGI, it's essentially a non-interactive video. Being rendered in real time is impressive for sure, but this is an interactive medium; show me how good a game looks during the parts where I'm interacting with it.
And even the best parts of the game are basically non-interactive. I've seen the whole playthrough (sped up during traversal and the boring fights).

The story is Kojima levels of stupid and awesome at the same time. The cutscene quality, especially indoors, is really something to behold (even though only certain scenes look genuinely generationally different from the first game).

We should still commend the effort, at least for the character models and cutscene quality, considering they're real-time and are not using a proprietary engine.
 
The number of people having their minds blown by this blatantly up-rezzed PS4 game is puzzling.

Kojima went from showing games like MGS2, which didn't even look possible in real time, to DS2, which doesn't look meaningfully better than games from 6 years ago.
Yo, for real, the MGS2 reveal trailer looked impossible, and even if the final game was downgraded, it still looked like a dream (mostly).

I bought a magazine that came with the trailer on a physical videotape, and I was showing that thing to everyone: my mother, my father, my sister, her boyfriend, the dude who brings gas cylinders to my home. I never did that before or after. It was that fucking incredible; you wanted to show it to everyone, even people who couldn't give two fucks about video games.
 
Indiana with path tracing is on another level entirely, and it is definitely next-gen (even if some choices may be less striking).

The delusions come from the same side: PlayStation cross-gen titles or remasters, with people saying Horizon Forbidden West or The Last of Us Part 1 remake are the best-looking games ever, while they're all cleaned-up versions of PS4 games.
Indiana with PT is the most bipolar-looking game I've ever played. I still can't believe DF gave it graphics of the year. Some moments look great, others look far worse, early-PS4 level. The characters look like Uncharted games on the PS3 a lot of the time.
 
Come on ND don't let us down! (I've already dropped expectations for another Uncharted 4 type of moment)
This thread in a nutshell:
- GG will save us.
- PD will save us.
- SSM will save us.
- Insomniac will save us.
- KojiPro will save us.
- Sucker Punch will save us.

Sorry bud, but people targeting native 4K 60 fps are not going to deliver anything more than last-gen visuals. DS2 is the perfect example of this, and they were targeting 1440p 60 fps. Essentially a last-gen game.

You can see from the main thread, Twitter, and Era that most people are simply OK with last-gen visuals at 1440p 60 fps. The game is being called a graphics masterpiece everywhere except this thread. I bet other devs who put in the effort to implement ray tracing and mesh shaders along with numerous next-gen effects are on suicide watch right now. All they had to do was put zero effort into improving the graphics from last gen, fire all their graphics programmers, and they could've had a graphics masterpiece.

Consumers have spoken. There is a saying: easy does it. Well, nowadays Sony studios have figured out that lazy does it. And everyone loves it. Anyone seeing this cum guzzling who is not tearing up their games to remove all RT and next-gen assets is literally wasting their time, all to please the 20 people in this thread while the millions eat up DS2, GOW Ragnarok, and other COD and Fortnite slop.
 
It counts. I could never call Tsushima ugly

What they need to fix is the combat and the cutscenes, which were horrendous
The combat? That was the best thing in the first game lmao. They need to greatly improve their open world, one of the most boring, bland worlds I've ever explored (despite its beauty).
 
Twitter is straight up trolling me now. I have liked a total of zero of these DS2 posts, yet they keep recommending them to me for some reason.

 


It is kind of sad that we can't really know for sure if the bad graphics of this gen are mostly due to incompetent devs or weak hardware... I guess it is a little bit of both but still.

He was juggling a bunch of other projects this time around: Physint, OD, the DS anime.

And the hardware is fine. Devs just had other priorities this time around. Sony went GaaS, some devs were busy pushing agendas, some were lazy, some all of the above. And many publishers probably feared general economic problems after the pandemic, so they took no risks.
 
The combat in GoT horrendous? 🤣 You can't be serious.

The combat? That was the best thing in the first game lmao. They need to greatly improve their open world, one of the most boring, bland worlds I've ever explored (despite its beauty).
It was terrible. The same exact death animations for every kill in the game automatically make it bad. Just lazy. Damn near unplayable after TLOU 2. AC Shadows' combat was far superior. The open world was abysmal too.
 
It was terrible. The same exact death animations for every kill in the game automatically make it bad. Just lazy. Damn near unplayable after TLOU 2. AC Shadows' combat was far superior. The open world was abysmal too.

I agree with you on the open world; it was repetitive, and I got tired of it pretty quickly. But calling the combat "horrendous" makes absolutely no sense. That feels like hyperbole, especially considering it's one of the most praised aspects of the game and a major reason for its commercial success. The melee sword combat was incredible, both mechanically and stylistically, supported by strong art direction and animation.

Saying AC Shadows is better in that regard doesn't really help your case either. While the combat in Shadows is good, it has plenty of issues: repetitive finishers and laughable enemy hit reactions. You slash an enemy and they barely flinch, as if you're using a feather instead of a blade.

But hey, if that's genuinely how you feel, fair enough; it's okay to be wrong sometimes.
 
It was terrible. The same exact death animations for every kill in the game automatically make it bad. Just lazy. Damn near unplayable after TLOU 2. AC Shadows' combat was far superior. The open world was abysmal too.
Funny. Playing GoT after playing TLOU2 for a month made me enjoy video games again. TLOU2 was so dreary and miserable that it left me in a shit mood every time I played it. GoT was like a breath of fresh air with its beautiful vistas, likeable protagonists, and a somewhat hopeful story despite some really dark events in the plot.

The animations were canned, but the combat system, with its three stances and a host of different stealth moves, was fantastic. Above average for the genre. Open-world exploration was trash though: fairly basic Ubisoft slop, but that's pretty much every open world out there.
 