Graphical Fidelity I Expect This Gen

Naaah, DS2's hair and beards don't look as good as H2's, and those are cutscenes, while H2's in-game graphics always look like that.
 
"Kojima san" and his team has write access to Decima engine. Kojima san literally chose the name of the engine. Kojima san had 6 years to add ray tracing either alone or by partnering up with GG, and failed miserably.

There were around 30 games released in the first year of PS5 that had ray tracing support. Mostly RT reflections or shadows, but nowadays in 2025, real-time GI, whether software or hardware RT, is pretty much a standard and baked lighting is on its way out. Callisto added three RT effects in 2022; Star Wars Jedi: Survivor added three RT effects, including RTGI, 4 months later. Both are UE4 games that still hold up next to UE5 games.

No one should get a pass 5 years into this gen.

The biggest visual difference between Decima DS2 and, say, Hellblade 2 is down to the geometry detail.

UE5 can throw up tons of polygons, and that gives visuals a sizeable, weighty feel.

Until GG create a new engine that can use mega-geometry and such, I don't think Kojima-san can do much besides designing the gameplay aspects and maxing out their artwork and game performance.
 
They don't need to create a new engine, they just need to update their current engine. For example, RAGE 9, UE5, Snowdrop and Anvil are just updated versions of previous-generation engines with new features like RTGI, virtualized geometry, etc.
 
Don't blame Kojima san. His team did their best with character models and superb artwork to hide the clearly dated Decima engine.

Spider-Man 2, HFW, Wolverine, Intergalactic: they all look weak.

You just can't make a UE4 game look better than UE5 as a developer. FF7RB looks dated no matter how much artwork is poured into it.
DS2, like Horizon, uses MetaHuman models for the characters...
If anything, the lucid/waxy skin he adds to his characters makes them look slightly worse than some of the best MetaHumans out there.

Using real famous actors only helps trick the brain into thinking they look more real than they actually do.

Look at how good Takeshi Kitano looks in a freakin' AA Yakuza game on PS4.

[Yakuza 6 screenshot]
 
Did any games use it on Vega? It seems to have been undercooked when AMD dropped the driver path. I guess people expected AMD to hijack a game and squeeze performance out of it with this? That would have been a monumental task to develop on a game-by-game basis and then keep supporting, but apparently AMD engineers claimed they would?

There were no games using Vega's Primitive Shaders, mostly because of the complexity of the driver stack.
It was one of the reasons why Raja Koduri left AMD soon after.
 
Nextgen, buddy, nextgen.
That is gonna be the baseline for all decent devs.

Maybe even some Sony studios, if they don't completely turn into Nintendo 2: The Revenge.
No, it won't be a baseline.
Just as Uncharted 4 is not a baseline for this gen.
It all depends on the devs, and I hope each game's own look still remains. As I've said, the universally good, CGI-looking UE5 realistic look is not what everyone should be going after.
We are speeding towards a boring, every-game-looks-the-same future. Let them be games for a little bit longer.
I already miss clever graphics and tricks... idk... like RE4 or the aforementioned UC4.
 
That's why I said any decent dev; I should have added any decent dev that wants to push graphics.

Once again I remind you to check the title of the topic before posting.

If you want to open a topic about Nintendo graphics or low-fidelity graphics, be my guest.
 
That's why I said any decent dev; I should have added any decent dev that wants to push graphics.
fair enough.
But it's the same promise, every gen. New tools and more power should make these graphics easier and faster to achieve.
And it never happens. At least not to the point you would imagine.
 
No, it won't be a baseline.
Just as Uncharted 4 is not a baseline for this gen.
It all depends on the devs, and I hope each game's own look still remains. As I've said, the universally good, CGI-looking UE5 realistic look is not what everyone should be going after.
We are speeding towards a boring, every-game-looks-the-same future. Let them be games for a little bit longer.
I already miss clever graphics and tricks... idk... like RE4 or the aforementioned UC4.
Dude, my main point isn't about hyper-realism. I'm talking about graphics tech like RTGI and virtualized geometry, which can also be used in Pixar-looking titles and give them in-gameplay graphics comparable to Pixar movies, which aren't hyper-realistic. We're talking about graphics tech, dude, which I praise whenever I see it, like when I praised DS2's cutscene character models in previous comments.
 
No, it won't be a baseline.
Just as Uncharted 4 is not a baseline for this gen.
It all depends on the devs, and I hope each game's own look still remains. As I've said, the universally good, CGI-looking UE5 realistic look is not what everyone should be going after.
We are speeding towards a boring, every-game-looks-the-same future. Let them be games for a little bit longer.
I already miss clever graphics and tricks... idk... like RE4 or the aforementioned UC4.

Better graphics ≠ everything looking the same

I don't know why you keep bringing this up as if it's a thing. Game visuals are more diverse than they've ever been.
 
We are speeding towards a boring, every-game-looks-the-same future.
We are, but not in the way you meant it.
Here, we are being echo-chambered in graphics-whore communities into thinking that this is the direction the industry is going, with some of us bemoaning that "OOOH-ALL-GAMES-WILL-LOOK-AMAZING-AND-REALISTIC-SO-BORING!!!" But let me offer an anecdote. I watched Sony's latest State of Play with my brother, who last played video games sometime in the early-to-mid 2010s and who remembers how all the devs back then were trying to beat each other with better graphics and physics. While watching the State of Play, he was dumbfounded by EVERY game having some cheapo stylized/artsy look instead of good realistic graphics. He hadn't followed the gaming industry for the last 15 years and expected that all games would look similarly amazing and realistic by now.
The majority of games do indeed look pretty similar now. Like stylized cartoon crap.
 
Noted. It didn't cross my mind, but they are indeed part of the geometry pipeline's evolution, coming right after geometry shaders I guess.

I can't say I know much about their implementation, but this post seems to cover some of the hurdles that differentiate the primitive shader tech from mesh shaders' later implementation.

Yeah that talk with the AMD VP was very enlightening, we now know that Mesh Shaders on AMD cards compile down into Primitive Shaders.

Did any games use it on Vega? It seems to have been undercooked when AMD dropped the driver path. I guess people expected AMD to hijack a game and squeeze performance out of it with this? That would have been a monumental task to develop on a game-by-game basis and then keep supporting, but apparently AMD engineers claimed they would?

As others have pointed out, Vega and even RDNA 1 supported this on a hardware level, but the driver stack was extremely poor. Mesh Shaders are the future, but there are also competing solutions, most notably Epic's Nanite.

If you want to find out more, you can dig into LeviathanGamer2's old tweets as he covered Mesh/Primitive Shaders extensively over the past 5 years.


 
idk. I'm having this reappreciation for older custom-engine games' graphics.
Yeah, but with the older engines they still strove to create realistic graphics; it's just that everyone had their own take and their own cheats, and thus different results in creating realistic graphics.

But not everyone wants that realistic take, and there are games even in UE5 that are stylized or somewhere in between; you are focusing too much on the negatives and only on the games that want to be realistic.

And even when some games take the realistic approach, I don't see it as being as bad as others make it out to be. To me Hellblade looks different than Silent Hill 2, for example. Hellblade is more hyperrealism combined with all sorts of cinematic effects and abstract stuff; SH2 is more photorealism with some surrealism. They even have different color palettes.
So even within the realistic approach, there is more than one style, and also substyles, from which to choose.
 
most notably Epic's Nanite.

This is a common misconception about Nanite: that it's about geometry. Which it is, but not in the same way as mesh shaders.
Nanite is about how geometry is rasterized, not how it is calculated.
Nanite replaces the hardware 2x2-quad rasterization on Nvidia and AMD GPUs with a software-based rasterization model that uses compute.
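
To make that concrete, here is a minimal sketch, in CUDA, of what "rasterizing micro-triangles in compute" can look like: one thread walks a tiny triangle's bounding box and resolves depth with a 64-bit atomic into a visibility buffer, instead of feeding the hardware rasterizer. This is only an illustration of the idea, not Epic's implementation; the struct layout, the depth+ID packing, and the assumption that vertices are already in screen space are all made up for this example.

```cuda
// Toy sketch of "rasterizing micro-triangles in compute". NOT Epic's code:
// struct layout, the depth+ID packing, and the assumption that vertices are
// already in screen space (x, y in pixels, z in [0,1], smaller = closer) are
// all illustrative. visBuffer must be cleared to 0 before launch.
#include <cstdint>

struct Tri { float3 v0, v1, v2; uint32_t id; };

// Signed area of triangle (a, b, c); its sign tells which side of edge ab the
// point c lies on, which doubles as an inside/outside test.
__device__ float edgeFn(float2 a, float2 b, float2 c) {
    return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
}

// One thread per tiny triangle: scan its bounding box and resolve depth with a
// single 64-bit atomic into a visibility buffer (depth in the high bits,
// triangle ID in the low bits), bypassing the hardware rasterizer that would
// otherwise shade wasteful 2x2 quads for sub-pixel triangles.
__global__ void rasterizeMicroTris(const Tri* tris, int numTris,
                                   unsigned long long* visBuffer,
                                   int width, int height)
{
    int t = blockIdx.x * blockDim.x + threadIdx.x;
    if (t >= numTris) return;
    Tri tri = tris[t];

    float2 a = make_float2(tri.v0.x, tri.v0.y);
    float2 b = make_float2(tri.v1.x, tri.v1.y);
    float2 c = make_float2(tri.v2.x, tri.v2.y);

    float area = edgeFn(a, b, c);
    if (area <= 0.0f) return;                      // backfacing or degenerate

    int x0 = max(0,          (int)floorf(fminf(a.x, fminf(b.x, c.x))));
    int x1 = min(width  - 1, (int)ceilf(fmaxf(a.x, fmaxf(b.x, c.x))));
    int y0 = max(0,          (int)floorf(fminf(a.y, fminf(b.y, c.y))));
    int y1 = min(height - 1, (int)ceilf(fmaxf(a.y, fmaxf(b.y, c.y))));

    for (int y = y0; y <= y1; ++y) {
        for (int x = x0; x <= x1; ++x) {
            float2 p = make_float2(x + 0.5f, y + 0.5f);
            float w0 = edgeFn(b, c, p);            // barycentric weight of v0
            float w1 = edgeFn(c, a, p);            // barycentric weight of v1
            float w2 = edgeFn(a, b, p);            // barycentric weight of v2
            if (w0 < 0.0f || w1 < 0.0f || w2 < 0.0f) continue;  // outside

            float z = (w0 * tri.v0.z + w1 * tri.v1.z + w2 * tri.v2.z) / area;
            // 1-z so that "closer" becomes "larger"; atomicMax then acts as a
            // combined depth test + ID write, with no depth-buffer races.
            unsigned long long packed =
                ((unsigned long long)__float_as_uint(1.0f - z) << 32) | tri.id;
            atomicMax(&visBuffer[y * width + x], packed);
        }
    }
}
```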
 
what?
takes the color from environment to paint the textures. seems like a pretty usual method to cheat RT
There is a difference between ambient and specular GI. There's no directionality to ambient probes. They paint the environment with a flat bounce with zero direction to it, no matter how dense they appear.

With real per-pixel RT you get effectively infinite resolution based on your distance to the rendered geometry. For instance, if you get close up on a wooden box full of beer bottles in CP, the bottom of the box is darker, both on the wood and on the bottles themselves, the deeper into the box they go, even though they take up almost zero screen real estate in usual play. The number of probes necessary to create anything similar would be fucking astronomical in density, not to mention how much more memory a non-virtualized approach would devour. Probes are a dead end from the memory perspective alone. We need to virtualize everything visually.
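
A toy way to see the directionality point: an ambient probe hands back one pre-baked value for a whole region of space, while a per-pixel ray can notice that the bottom of the box is blocked by nearby geometry and darken it. The CUDA sketch below is purely illustrative; the probe layout, the single-plane "trace" stand-in, and all function names are assumptions, not how any real engine does GI.

```cuda
// Toy illustration of "probes have no directionality", not any engine's GI.
// All names and the single-plane "trace" stand-in are made up for the example.

struct Ray { float3 origin, dir; };

// Stand-in for real BVH traversal: intersect one plane at y = 0 (think: the
// bottom of the wooden box). Returns hit distance, or a huge value on miss.
__device__ float traceNearestHit(Ray r) {
    if (fabsf(r.dir.y) < 1e-6f) return 1e30f;
    float t = -r.origin.y / r.dir.y;
    return (t > 0.0f) ? t : 1e30f;
}

// Flat ambient probe: one pre-baked bounce value for a whole cell of space.
// Every pixel inside that cell gets the same answer, no matter how dense the
// probe grid is -- there is no notion of "deeper into the box".
__device__ float3 sampleAmbientProbe(const float3* probeGrid, int probeIndex) {
    return probeGrid[probeIndex];
}

// Per-pixel RT-style ambient term: shoot a short ray along the normal and
// darken by how quickly it hits nearby geometry. Detail scales with how close
// the camera gets, because the answer is computed per pixel.
__device__ float tracedOcclusion(float3 pos, float3 normal, float maxDist) {
    Ray r{ pos, normal };
    float hitDist = traceNearestHit(r);
    return fminf(hitDist / maxDist, 1.0f);   // 0 = blocked right away, 1 = open
}

// Same surface points, two very different answers.
__global__ void compareGI(const float3* probeGrid, int probeIndex,
                          const float3* positions, const float3* normals,
                          float3* outProbe, float* outTraced, int numPixels)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numPixels) return;
    outProbe[i]  = sampleAmbientProbe(probeGrid, probeIndex);       // flat per cell
    outTraced[i] = tracedOcclusion(positions[i], normals[i], 1.0f); // varies per pixel
}
```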
 
Yeah that talk with the AMD VP was very enlightening, we now know that Mesh Shaders on AMD cards compile down into Primitive Shaders.

As others have pointed out, Vega and even RDNA 1 supported this on a hardware level, but the driver stack was extremely poor. Mesh Shaders are the future, but there are also competing solutions, most notably Epic's Nanite.

Nanite is Epic's application of mesh shaders before they were a thing, built on the iteration just before: compute shaders. It's so similar that throughout the internet you'll find articles and people at GDC talks saying that Nanite is basically using mesh shaders. They assemble and test the polygons in a compute shader and bypass the raster hardware with their software solution, because HW raster cannot manage this sea of micro-triangles, and also because compute shaders by themselves cannot output triangles. Well, their solution is actually hybrid, depending on the performance benefits and the size of the triangles. They have hurdles of their own with decompression, though, that mesh shaders (AMD believes, at least) are better at, since they stay in the GPU pipeline, along with much, much more aggressive culling than Nanite's solution. Mesh shaders avoid the round-trip through memory.

I think Epic picked a solution that is broader and supported by almost any hardware, and one that probably started before mesh shaders were even a thing; the idea likely started on 8th-gen consoles, as everything on the AMD side has had compute shaders since GCN. But eventually I think they'll lean even more on mesh shaders; in the next-gen Unreal Engine I would bet they move from compute shaders + software rasterizer to just GPU mesh shaders, along with the fact that they are more programmable.
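
For what the "round-trip through memory" means in practice, here's a rough CUDA sketch of the compute-pass side: cull clusters, append the survivors to a buffer in VRAM, and let a later pass read that buffer back. A mesh/primitive shader path can skip that intermediate buffer and emit primitives straight into the pipeline. The structures and the simple sphere-vs-frustum test are simplified assumptions for illustration, not any engine's real data layout.

```cuda
// Rough sketch of the "round-trip through memory": a compute pass culls
// clusters and appends survivors to a buffer in VRAM that a later pass (or a
// software rasterizer) reads back. A mesh/primitive shader path can emit
// primitives straight into the pipeline without this intermediate buffer.
#include <cstdint>

struct Cluster {
    float3   boundsCenter;
    float    boundsRadius;
    uint32_t firstTriangle;
    uint32_t triangleCount;
};

// Frustum planes given as (nx, ny, nz, d); reject clusters fully outside any.
__device__ bool sphereInFrustum(const float4* planes, float3 c, float r) {
    for (int i = 0; i < 6; ++i) {
        float dist = planes[i].x * c.x + planes[i].y * c.y +
                     planes[i].z * c.z + planes[i].w;
        if (dist < -r) return false;
    }
    return true;
}

// One thread per cluster. The visibleIndices/visibleCount buffers are exactly
// the intermediate data that has to be written out to memory and read again
// by the next dispatch/draw -- the round-trip mesh shaders avoid.
__global__ void cullClusters(const Cluster* clusters, int numClusters,
                             const float4* frustumPlanes,
                             uint32_t* visibleIndices, uint32_t* visibleCount)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numClusters) return;

    Cluster cl = clusters[i];
    if (!sphereInFrustum(frustumPlanes, cl.boundsCenter, cl.boundsRadius))
        return;

    uint32_t slot = atomicAdd(visibleCount, 1u);  // compact the survivor list
    visibleIndices[slot] = i;
}
```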

If you want to find out more, you can dig into LeviathanGamer2's old tweets as he covered Mesh/Primitive Shaders extensively over the past 5 years.




Gonna check
 