
[Threat Interactive] Dynamic lighting was better nine years ago | A warning about 9th gen neglect.

SlimySnake

Flashless at the Golden Globes
I'm glad you posted that vid rofif. We can all agree now that baked lighting looks dated, even in its best examples. Thankfully Epic and other developers have pushed real-time lighting tech forward.
That open world level in Uncharted 4 looks really off. At times it can look great, but you can see just how dated it can look at other times. Even some of the other bigger levels towards the end of the game show this behavior, as you can see in the last screenshot. It looks like it's from another game compared to some of the screenshots posted here. It's a matter of consistency.

GDFiEE5WQAAH6wx


GFbDVO5WwAApJ9Y


F_d_nfnXAAEY8Zs


If you told me the screenshots below are from the same game, I wouldn't have believed you, but this is where the devs clearly sat there baking in lighting and making it look the best it can look.


GLu_s0qWIAE5fM-

GLu_ll-XIAAZbeH
 

ap_puff

Banned
I think the problem right now with RT is that devs still have to do the raster version, since Jensen hasn't seen fit to make a decent RT-capable card under $500; all the cards below that need significant compromises, which makes RT pointless. We really need the market to move to 4070-level cards instead of still being on 1060-3060 level cards if RT is going to catch on for good (and not just as an afterthought), and for that to happen the 4070 needs to be a $250-300 card for at least 3-4 years. Otherwise the only good RT games are going to be the ones that Nvidia sponsors to sell their next gen of cards.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Here is a good example of turning off Lumen in the Matrix Awakens demo.

It's not just the added shadows and AO, but you can see the whole level take on the orange tint of the sunlight. So everyone saying just turn off lumen in linear games is probably not understanding just how much you can lose in each scene. He actually turned off Lumen in his Silent Hill 2 video and the game went from looking amazing to last gen as fuck.

LVeChT3.gif
 

Vick

Gold Member
The thing about Uncharted 4's open Madagascar level, other than sheer size (and driving mechanics, car physics, outpost AI, etc.), is the inclusion of volumetric shadows in every body of water (something still missing in 99.9% of games 10 years later):



VuSNpAs.gif


But especially the fact that all shadows across the entire massive level are of the extremely heavy PCSS (on PS4 in 2016!) variety:

LWDDLwX.png
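For a sense of what sets PCSS apart from ordinary shadow maps, here is a hedged Python sketch of the similar-triangles penumbra estimate the technique is built on (function and variable names are mine, purely illustrative; the per-pixel blocker search and filtering passes that make PCSS so expensive are omitted):

```python
def penumbra_width(d_receiver: float, d_blocker: float, light_size: float) -> float:
    """Similar-triangles estimate used by Percentage-Closer Soft Shadows:
    the farther a shaded point sits behind its occluder, the wider (softer)
    the shadow edge, scaled by the size of the area light."""
    return (d_receiver - d_blocker) * light_size / d_blocker

# A point 10 units from the light, occluded at 2 units, gets a wide, soft penumbra:
print(penumbra_width(10.0, 2.0, 1.0))   # 4.0
# A contact shadow (blocker nearly touching the receiver) stays sharp:
print(penumbra_width(10.0, 9.5, 1.0))   # roughly 0.05
```

That per-pixel blocker search the estimate depends on is what makes PCSS so heavy, which is why running it across an entire open level on a base PS4 was remarkable.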


Not to mention the amount of red bounce GI everywhere. It is a level that, combined with ND's standards in animation:

H9arRiP.gif


The desire to provide the very best IQ ever seen in a console game, with a TAA implementation that puzzled DF as to how it was even achieved:

Nz0So4f.png



All the other things ND imposed on themselves to deliver at the same time, in the same package, obviously raped the PS4 hardware like never before.

Maybe something more could have been achieved by dropping resolution on base PS4 to Alan Wake 2 2024 PS5 Pro levels.

On PC with PT however I haven't seen a single scene where the lighting would look bad.
Well, it would be moronic if that was the case... although I'm pretty sure that if I started to mess with brightness levels and gamma (like those Sebmugi U4 screenshots you posted) it wouldn't really hold up that well either.

As I've said many times, Cyberpunk PT is still the best lighting currently in the industry by a mile. It still presents sporadic issues:



And it runs like crap even on thousand-dollar graphics cards, where Uncharted 4, a 2016 PS4 game running on $399 console hardware from 2013, would run several hundred times better. But it is unquestionably the most advanced thing in the industry for the time being.

You found two places with directional light on character faces, but I played this game so I know how character lighting looks most of the time. I will post my Uncharted 4 screenshots tomorrow instead of using other people's screenshots.
I admit this is quite amusing after you literally just scrolled through more than ten pages of stunning Uncharted 4 screenshots only to post the worst examples, mostly by user Sebmugi with Photo Mode values like gamma and brightness completely altered, and some with filters on top, to drive your point home.
I was really surprised to see what you ignored in order to post those screenshots, often in the same post, as I wasn't really expecting that from you. Maybe I did misjudge after all.

I'm confident you won't cherry-pick screenshots by ignoring all the hours in which the game's lighting looks consistently impressive (the first 5 chapters straight), and that you'll take gameplay screenshots instead of fabricated Photo Mode ones, from the downgraded PC version of Uncharted 4.

As said already, no one here denies real-time lighting's advantages for dynamic objects, especially when it comes to ND games, which are known for this.
But saying: "Character lighting in Uncharted 4 looks good, but only during cutscenes, where the lighting artist has manually placed and adjusted light sources. During cutscenes we can see directional shadows that give dimension to the characters' faces. During gameplay, however, character models have only flat lighting that lacks directionality and self-shadowing." is objectively false, as already proven by these examples (and I hope you don't really believe these are the only instances in this colossal game) of directional light during gameplay:

76lFFLx.gif


Y93WIXT.jpg



22ZfBuL.png



7SoKPz0.jpg



1bWkZnK.jpg


WYuKV75.jpg


JCcmJx9.jpg


And some nice ones from the same Thread where you selected your screenshots:

604Uncharted4AThiefsEnd.png


26374135153_c8b038292b_o.jpg


be0Uncharted4AThiefsEnd.jpg


33236263161_bb17b52eb6_o.png


And simple ambient light:

FaxNbxK.jpg


lC5Vbfy.jpg


LPAeNPB.jpg


3I4ZI6r.jpg


And some nice ones found in that same thread that were ignored during that selection:

da3oaha-22a05edd-71c5-45de-a797-0c153121e9de.png


w4vVRZE.png


mHaRcJk.png


9e3Uncharted4AThiefsEnd.png


da3oadk-383f4e10-7a8e-44bc-935e-bdb404ca85bb.png


215Uncharted4AThiefsEnd.png


And some less nice ones, but still serviceable to the point.

yAEkH5S.png


1470562913-uncharted-tm-4-a-thief-s-end-20160807111555.png


And posting screenshots from open world levels won't change much, especially when even those always feature GI bounce on Drake, like here:

iMk6vWe.jpg


Or here:

hR4zcpO.png


da3oacs-9b37a31a-5593-4089-9188-2fe91fe88043.png


Uncharted 4, as I've said multiple times:

Graphically, it's still the most inconsistent game I've ever played. The HDR, tone mapping, and color grading all make for a visually stunning presentation that made me appreciate my plasma more than ever. But while most of the time it's a damn straight pre-rendered movie (especially thanks to their GI solution and God-tier PBR and SSS), and therefore the best looking game I've seen to this day along with The Order, TLOUII and P.T., too many times in the open areas it shamelessly shows borderline N64 geometry, with awful use of normal maps, or textures in general, to mitigate it.
It is known that ND textures are all hand painted, but while the artistry and resolution are more than commendable, they were overconfident here. It's like they didn't care: they had their perfect lighting, characters, animations, vegetation, AO, temporal AA and all the little technical feats, and thought they could afford old-gen shapes with cartoony textures on top (climbable edges, for instance) because the thing would have looked great regardless due to the "painterly" style and their tech pipeline. And they were wrong, because it's super jarring to have photorealistic characters, vegetation and lighting in the same frame as sharp edges and low-poly mountains on the horizon.
Luckily this is mostly true for the more open areas, which are not common, but playing The Lost Legacy made it clear that they received this same feedback from more than one person, because most of these issues are completely solved in the "DLC". A night and day difference.

It is the most inconsistent game I've ever played, for a multitude of factors, including a very fast development, as it was restarted from scratch mid-development due to Druckmann and Straley taking over. But its highs are unmatched.
Lost Legacy doesn't suffer from this inconsistency, and manages to look virtually always great, even though some slight concessions needed to be made in the massive open world level.
I could post a literal hundred Lost Legacy screenshots looking absolutely insane.
 
Last edited:

Jinzo Prime

Member
Here is a good example of turning off Lumen in the Matrix Awakens demo.

It's not just the added shadows and AO, but you can see the whole level take on the orange tint of the sunlight. So everyone saying just turn off lumen in linear games is probably not understanding just how much you can lose in each scene. He actually turned off Lumen in his Silent Hill 2 video and the game went from looking amazing to last gen as fuck.

LVeChT3.gif
Yeah, but is the performance penalty worth it? Can you simulate the bounce lighting in a more efficient way? I think that is what Threat Interactive is trying to say.
 

kevboard

Member
And whatever the hell obscure solution ND is using for the Ellie reflection in that pool:

PsIATpE.gif

that's a cubemap for the background (you can even clearly see the edges of the cubemap) and, I assume, a mirrored mesh of Ellie, both with a filter on them to make them dark and muddy.

similar to how Mario Sunshine reflects Mario in puddles you create with FLUDD
mnFLHur.png


or how Mario Galaxy does it in the hub world, on the glass surface here:
5a3fTSV.png
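The mirrored-mesh half of that trick boils down to reflecting the character's vertices across the water plane before rendering the copy. A minimal Python sketch (the plane height and coordinates are made-up assumptions, not taken from any of these games):

```python
def mirror_across_plane_y(vertex, plane_y):
    """Reflect a vertex (x, y, z) across the horizontal plane y = plane_y.
    This is how a 'fake reflection' copy of a mesh is positioned under
    a flat water surface."""
    x, y, z = vertex
    return (x, 2.0 * plane_y - y, z)

# A head 1.7 units above a pool surface at y = 0 lands 1.7 units below it,
# so the copy lines up with the real mesh exactly at the waterline:
print(mirror_across_plane_y((0.5, 1.7, -2.0), 0.0))  # (0.5, -1.7, -2.0)
```

Rendering that mirrored copy darkened and blurred, over a static cubemap for everything that doesn't move, is far cheaper than any real-time reflection technique, which is why the trick shows up everywhere from Mario Sunshine's puddles onward.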
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Rasterization
It's not RT but they have a full realtime global illumination lighting system.

timestamped:



My guess is something like Software Lumen, which doesn't utilize RT cores and thus isn't as accurate, but is far more accurate than baking lighting data into textures. It's too technical for me to understand, but they go into detail here.

Timestamped:
 

Vick

Gold Member
Yeah, but is the performance penalty worth it?
Well, it depends on who you ask.

Can you simulate the bounce lighting in a more efficient way? I think that is what Threat Interactive is trying to say.
You can, and he does make a good example in the video.

Looks great to me.
Yep.

y2UZ79q.gif


50644102042_5a178d88f5_o.png


Enn7Cj9XIAM5j8P


Enn7CVZXIAEkKUK


EneqAZRW4AIQKhR


EoL7QsxXIAI_zPa


Demons-Souls_20201123011119.png


that's a cubemap for the background (you can clearly see the edges of the cubemap even) and I assume a mirrored mesh of Ellie, both with a filter on it to make it dark and muddy.
Nice.
Considering 90% of players won't ever even see what's contained in that pool (I don't think there's a single walkthrough on YouTube showing it), it's a cool example of what makes ND games special, and why they take so long to make them nowadays.

 

kevboard

Member
Nice.
Considering 90% of players won't ever even see what's contained in that pool (I don't think there's a single walkthrough on YouTube showing it), it's a cool example of what makes ND games special, and why they take so long to make them nowadays.



while I can't stand their game design, it is undeniable that Naughty Dog are crazy when it comes to polish
 

mrqs

Member
That guy from the video clearly knows his stuff, but he's misguided in his assertions.

GI is here to stay, and we all benefit from it. We're in a transition phase, and it's always painful, but gaming has become extremely static since pre-baked rasterization became the norm.

95% of the time, you can't break objects in the scene because it would mess up the baked lighting. That's terrible, and it turned gaming into static theater.

Real-time lighting helps devs make games faster, and gives us more interactivity and more simulated environments.
 

Vick

Gold Member
That guy from the video clearly knows his stuff, but he's misguided in his assertions.

GI is here to stay, and we all benefit from it. We're in a transition phase, and it's always painful, but gaming has become extremely static since pre-baked rasterization became the norm.

95% of the time, you can't break objects in the scene because it would mess up the baked lighting. That's terrible, and it turned gaming into static theater.

Real-time lighting helps devs make games faster, and gives us more interactivity and more simulated environments.
You are generally right, but I admit reading this post after all that Uncharted 4 talk gave me a good chuckle.



The level of interactivity and destruction in the game is absolutely insane for huge chunks of it. Not only where it's expected due to shootouts and fights, but also in peaceful levels where you aren't supposed to draw weapons, like when you first find Libertalia: a very surprising amount of assets are completely interactive and destructible. Not just big ones like chairs, but especially little decorative stuff like these glasses, for instance, or the bottle in the foreground:

5aMVwzf.png


But I also don't think it's fair to blame RT for it, as the game didn't have RT before the Pro and looked worse.
At the end of the day, Remedy are the same devs that shipped Alan Wake on 360 running at 544p with an awful framerate and literally pixelated textures straight out of a GameCube game, on the same machine GTA V later ran on...

They have a good reputation on PC, which is mostly deserved, but when it comes to consoles or optimization...
 
Last edited:

TGO

Hype Train conductor. Works harder than it steams.
I get the point you are trying to make, but how is this even a fair comparison?
It's an 8 year old game....
So unless they suddenly improve in the next two years, you'll be saying it's not fair for a modern game to be compared to a game from a decade ago.
But IQ is Alan Wake 2's problem; it's definitely more advanced and looks better at its best.
But I also don't think it's fair to blame RT for it, as the game didn't have RT before the Pro and looked worse.
 

Buggy Loop

Member
It's not RT but they have a full realtime global illumination lighting system.

timestamped:

snipped - Digital foundry demon soul interview - to remove videos collage in my big reply..

This is like every probe-grid-based solution, going as far back as 2015 if not older. It's 100% raster.



The whole video is interesting, so I suggest a watch. Very smart people finding tricks with raster. If you want to jump to the probe solution, it's at around 12 minutes.

Same for Killzone Shadow Fall in 2013, so the tech predates even that, of course.



@1:03:50 roughly. For most engines the light probes used to be cube maps, so for Killzone it's the same language; it's just a precursor to light-probe terminology.

Very recently, Spider-Man:



As time went on, the algorithms were refined to get better coverage out of those light probe grids: where probes used to be widely spaced, they now pack more of them, with smarter placement on bigger maps, refined grids (though not inside buildings), and so on. They raised cube map resolutions, improved PBR algorithms, etc.

Light probes bridge the gap between static pre-baked lighting and dynamic lighting. The grid already stores the pre-baked light at each point, and as the character (or anything dynamic) moves through the grid, the probes containing the lighting for their "region" blend with the ones around and behind the character, so whatever is moving in the scene is lit in a way that makes sense.
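That blending step is typically just trilinear interpolation between the eight probes at the corners of the grid cell the character is standing in. A hedged Python sketch (the data layout and names are mine, not from any particular engine):

```python
def lerp(a, b, t):
    """Linearly interpolate two RGB tuples."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def blend_probes(cell, tx, ty, tz):
    """Trilinearly blend the 8 corner probes of one grid cell.
    cell[z][y][x] holds an RGB irradiance sample per corner;
    (tx, ty, tz) in [0, 1] is the fractional position inside the cell."""
    # Interpolate along x for each of the four probe pairs...
    c00 = lerp(cell[0][0][0], cell[0][0][1], tx)
    c10 = lerp(cell[0][1][0], cell[0][1][1], tx)
    c01 = lerp(cell[1][0][0], cell[1][0][1], tx)
    c11 = lerp(cell[1][1][0], cell[1][1][1], tx)
    # ...then along y, then along z.
    return lerp(lerp(c00, c10, ty), lerp(c01, c11, ty), tz)

# A character halfway between a dark probe column and a bright one picks up
# exactly half the light, with no visible pop as it keeps walking:
dark, bright = (0.0, 0.0, 0.0), (1.0, 1.0, 1.0)
cell = [[[dark, bright], [dark, bright]], [[dark, bright], [dark, bright]]]
print(blend_probes(cell, 0.5, 0.3, 0.7))  # (0.5, 0.5, 0.5)
```

This is also why the lookup is so cheap at runtime: a handful of multiply-adds per dynamic object, with all the expensive lighting already baked into the probes.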

This is a video that explains it all very well at a basic level for a simple scene.



Of course these solutions have problems as a baseline; if they were perfect, nobody would chase anything beyond them. They can't resolve visibility and occlusion for dynamic scenes; if not done properly you get light bleeding, and it becomes difficult to find those leaking problems in massive open worlds, etc. Again, there are a ton of "tricks" devs have developed throughout the years, as you'll see in the videos above, but it's tricky. (Fucking smart people, though.)

Now you say: Buggy Loop, WTF? Probe grids? That's starting to look like Metro Exodus EE's probe grids for ray tracing? Well, yes.

Nearly all the initial RT games were probe-based. RTXGI is probe-based. The difference is that they are not pre-baked, and that's where ray tracing differentiates itself, of course.

Ray tracing per pixel is of course out of the fucking question for gaming. That's something people don't understand: games don't work like offline render farms, that's simply impossible with the hardware we have now. Everything is ALWAYS a trick in real-time gaming, and smart people figure out interesting things to make it work. But there's always a pro and a con. So, to make decent performance possible, they are again interpolating between probes rather than going for per-pixel accuracy.



BUT, because probes are spaced apart and depend on placement, they can still fail in some places; the volumes are very important.

Then came other solutions like ReSTIR, which rely on reservoir-based resampling to share samples with neighboring pixels. But again, very heavy; imagine the equivalent of a probe grid at near full pixel resolution. (Nvidia later found a solution and tweaked it into their Cyberpunk 2077 implementation.)
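The "reservoir" part of that idea fits in a few lines: each pixel streams through many candidate light samples but keeps only one, chosen with probability proportional to its weight. A simplified, hedged Python illustration (single-sample reservoir only; real ReSTIR adds spatial/temporal reuse and unbiased weighting on top):

```python
import random

class Reservoir:
    """Single-sample weighted reservoir: picks one candidate from a stream
    with probability proportional to its weight, without storing the stream."""
    def __init__(self):
        self.sample = None
        self.w_sum = 0.0

    def update(self, candidate, weight):
        self.w_sum += weight
        # Replace the kept sample with probability weight / total weight so far.
        if weight > 0.0 and random.random() < weight / self.w_sum:
            self.sample = candidate

# Over many pixels, a light with 3x the weight of another should be the
# kept sample roughly 75% of the time:
random.seed(0)
kept_a = 0
for _ in range(10000):
    r = Reservoir()
    r.update("light_A", 3.0)
    r.update("light_B", 1.0)
    kept_a += r.sample == "light_A"
print(kept_a / 10000)  # roughly 0.75
```

The appeal is memory: no matter how many candidate lights a pixel considers, it stores one sample and one running weight, which is what makes the "probe grid at pixel resolution" comparison workable at all.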

AMD also came up with a solution, AMD GI-1.0.

Smart probes trimmed down to the scene.

MGTvwNx.jpeg


Again, we're still on probes. A long fucking way from the Killzone Shadow Fall solution in 2013.

Star Citizen based their upcoming RT solution on AMD GI. 25k freaking probes for that scene! To unify their RT solution with software lighting, and to avoid making their artists place probes across a very big solar system with all kinds of dynamic terrain (base building later), they use the same grid for raster. Later in the video you can also see their software solution based on that, but heavily simplified (it looks like a PSX game when you view the probes). For the software solution it's environment probes rather than ray tracing; they are lit/relit to keep the lighting information close to what ray tracing provided (but not for dynamic objects, a bit like Lumen's limitation).




So again, this AMD paper for ray tracing inspired these devs to make a raster solution that gets DAMN fucking close. Almost Software Lumen, but really, no rays. I would say this will breathe new life into raster solutions for a while to come. It looks really amazing in their comparison video, honestly.

I think at best, if the Demon's Souls devs made some crazy raster approach with probes and it's that good looking, it would be close to the above solution. But crazy enough: raster.


WOW. WTF am I doing at 23:23 with work tomorrow morning making this reply to you dude. Wasted my night :messenger_tears_of_joy:. I hope you will check around the videos a bit at least.

All in all, the raster solutions these devs were developing all tried to get as close as possible to the results ray tracing would give. Dynamic lighting (not dynamic scenes!), indirect lighting, etc. Brilliant solutions. Very, very fast, as the probes don't have to bake any new information at runtime.
I honestly think it was VERY close to overlapping with ray tracing, especially since DDGI/RTXGI used probe grids. Quantum Break's video is an amazing watch for the tech they made for that lighting engine.

Cherry on the sundae: a more advanced one for Unity, from Pierre Yves Donzallaz, who worked on Crysis 1-2-3, Ryse, GTA 5 & RDR 2 (so yeah... quite the experience). He goes from the baseline, an ugly scene, to lightmaps, to light probe volumes, to ray tracing and path tracing.

 
Last edited:

SlimySnake

Flashless at the Golden Globes
This is like every probe grid based solutions that go as far back as 2015 if not older. It's raster 100%.

[...]

WOW. WTF am I doing at 23:23 with work tomorrow morning making this reply to you dude. Wasted my night :messenger_tears_of_joy:. I hope you will check around the videos a bit at least.

lol ive been there. in fact, im here right now. i need to get up at 7:30 tomorrow to get my kids ready for school but fuck it. graphics come first.

I can't watch all the videos since they are not timestamped, but I watched the Quantum Break GI part and some of the Spider-Man presentation. Even if it's probe-based, it's way different from the baked lighting solutions we are talking about here. I will be happy if devs forgo Lumen or RTGI in favor of these lighting solutions, especially if they are this inexpensive. Demon's Souls looks stunning to me, and at 1440p 60 fps it's definitely cheaper than Lumen, which is 1440p 30 fps, though they did say they have improved the performance in the latest UE5 versions.

Honestly, all I want is a next-gen version of KZSF, so if they can do that with this probe-based approach, only with more probes, then great. If they can do that with Lumen, fantastic. RTGI, even better. All I care about are the results. Alan Wake 2 is probably using the next-gen version of Quantum Break's tech, and it looks stunning. But Spider-Man 2 apparently had 33GB of lighting textures on disc, a third of the disc size. It took them 4 days to bake lighting (it used to take over a week on Spider-Man 1), so whatever probe-based technique they are using must not be fully realtime like Demon's Souls'.

If TLOU Part 1 looked like this, I would have no issues with whether it's baked or realtime or path traced. Bake it all you want. Just make it look like this.

EO6xSMg.gif


mIaYwSD.gif
 

SlimySnake

Flashless at the Golden Globes
BTW, since Rockstar is using RTGI for GTA6, Kojima is using RTGI for DS2, and even Insomniac is switching to full realtime GI with Wolverine, I think it's safe to say that devs have made up their minds.

I wouldn't be surprised if ND and GG follow suit. GG also teased using PS5's RT hardware in one of their post-Horizon Burning Shores interviews.

Also, Kingmakers released a new trailer today showing a lot of destruction, and they are not using Lumen despite using UE5. They made their own custom realtime GI solution to calculate lighting in realtime after all this destruction. So it's not like devs HAVE to use Lumen.

 

SlimySnake

Flashless at the Golden Globes
Holy shit. This new UE5 Black Hawk Down shooter is nuts. The SP is on UE5 while the MP is on UE4 and looks a generation behind. It literally takes a shit on all 5 CoD games this gen. Infinity Ward, Treyarch, Raven and Sledgehammer should just resign and go flip burgers if they can't produce results like this without Lumen and Nanite.

ecZZo8X.gif


iXWALVu.gif
 
The thing about Uncharted 4's open Madagascar level, other than sheer size (and driving mechanics, car physics, outpost AI, etc.), is the inclusion of volumetric shadows in every body of water (something still missing in 99.9% of games 10 years later):

[...]

It is the most inconsistent game I've ever played, for a multitude of factors. Including a very fast development, as it was restarted from scratch mid-development due to Druckmann and Straley taking over. But its highs are unmatched.
Lost Legacy doesn't suffer from this inconsistency, and manages to look virtually always great, even though some slight concessions needed to be made in the massive open world level.
I could post a literal hundred Lost Legacy screenshots looking absolutely insane.

Even with PT I can get better performance and image quality compared to PS5 version, and you want to tell me that Cyberpunk is unplayable on PC with PT. Also keep in mind that even "Psycho RT" looks absolutely stunning in this game, while running at twice the framerate of PT. The PS5 version runs at 30fps with minimal RT (even lower than the low RT settings on the PC), but I have not seen you complain about this situation. My PC can run this game at 60fps even with PT. Of course, I have to use 4K DLSS performance to get the locked 60fps, but even DLSS performance in this game offers better image quality than the PS5 version running at 1440p (4K DLSS performance offers much better picture quality than 1440p). Maybe one day the game will be patched to make full use of the PS5Pro hardware and run RT mode at 60fps, and maybe Sony will sort out the PSSR issues so that the image quality will be comparable to DLSS image, but for now this game looks terrible on the PS5 compared to PC version.

As for Uncharted 4, like I have said, some of your screenshots look great, especially indoors where lighting artists carefully adjusted the lighting (placed light sources manually, prebaked GI, etc.). Small indoor locations (especially Drake's house) still look good in this game, but the lighting in open levels often looks flat. I will post my screenshots today, and I wonder what you will say then? Maybe you'll say something along the lines of the PC version being downgraded, or try to suggest that the lighting is flat because of the HDR-to-SDR conversion (btw, I don't think any of the screenshots I posted from the PS4 screenshot thread used this dynamic range conversion, because in 2016 not many people had HDR TVs; the lighting looked flat and washed out on those PS4 screenshots simply because that's how the lighting can look in this game. I know that because I played the PC version two weeks ago).

The asset quality in U4 still looks very good, and this game also has beautiful locations, but the lighting in this game is dated. I hadn't noticed it when I played this game on my old GTX 1080, but when I started playing RT games on my new PC I started noticing flat lighting in older games. RT lighting would benefit this game a lot. Even imprecise Lumen (UE5) is on a totally different level compared to Uncharted 4's lighting.

 
Last edited:

Mister Wolf

Member
Taa > native.

TAA solved shimmering, which is the absolute worst thing you can look at. A bit more blur as a result is fine by me. I absolutely loved TAA in AC games before DLSS became a thing, because the world wasn't a shimmering mess.

True. Look how much people cried about the shimmering in Metaphor begging for the developers to add TAA.
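The shimmer-vs-blur trade-off these posts describe comes from TAA's core operation: blending each new frame into an accumulated history buffer. Here's a minimal illustrative sketch in Python (real TAA also reprojects the history with per-pixel motion vectors and clamps it against the current frame's local color range; the names here are made up for the example):

```python
def taa_resolve(history, current, alpha=0.1):
    """Blend the current frame into the accumulated history.

    A small alpha keeps most of the history, which averages out
    frame-to-frame flicker (shimmering) of undersampled detail,
    at the cost of some blur and ghosting in motion.
    """
    return [(1 - alpha) * h + alpha * c for h, c in zip(history, current)]

# A pixel that strobes between 0.0 and 1.0 every frame settles
# near a stable in-between value instead of flickering.
dark, bright = [0.0], [1.0]
history = dark
for _ in range(200):
    history = taa_resolve(history, bright)
    history = taa_resolve(history, dark)
print(round(history[0], 2))  # 0.47
```

When the history clamp fails or alpha is pushed too low, you get exactly the smearing and ghosting people complain about; tune alpha up and the shimmer comes back. That tension is the whole debate in miniature.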
 
This is like every probe-grid-based solution going as far back as 2015, if not older. It's raster, 100%.



The whole video is interesting, so I suggest a watch. Very smart people finding tricks with raster. But if you want to jump to the probe solution, it's at around the 12-minute mark.

Same in Killzone Shadow Fall (2013), so the tech predates even that, of course.



@1:03:50 roughly. For most engines, light probes used to be cube maps, so for Killzone it's the same language; it's just a precursor to light probe terminology.

Very recently, Spider-man



As time advanced, the algorithms were refined to give better coverage with those light probe grids: where probes used to be widely spaced, they now pack more of them, with smarter placement on bigger maps, refined grids, and so on, not just inside buildings. They raised cube map resolutions, improved PBR algorithms, etc.

Light probes bridge the gap between static pre-baked lighting and dynamic lighting. The grid already has the pre-baked light at each point, and as the character (or anything dynamic) moves through the grid, the probes containing the lighting for their "region" blend together (with the one behind the character, for example), so whatever is moving through the scene is lit in a way that makes sense.
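The blending described above is typically just trilinear interpolation across the eight probes at the corners of the grid cell the object is standing in. A toy sketch, assuming a single scalar irradiance per probe (a real engine stores SH coefficients or a small cube map per probe, as mentioned earlier):

```python
def lerp(a, b, t):
    return a + (b - a) * t

def sample_probe_grid(probes, x, y, z):
    """Trilinearly blend the eight probes at the corners of the
    unit grid cell containing the point (x, y, z)."""
    i, j, k = int(x), int(y), int(z)
    fx, fy, fz = x - i, y - j, z - k
    # Blend along x on four cell edges, then along y, then along z.
    c00 = lerp(probes[(i, j, k)],     probes[(i + 1, j, k)],     fx)
    c10 = lerp(probes[(i, j + 1, k)], probes[(i + 1, j + 1, k)], fx)
    c01 = lerp(probes[(i, j, k + 1)], probes[(i + 1, j, k + 1)], fx)
    c11 = lerp(probes[(i, j + 1, k + 1)], probes[(i + 1, j + 1, k + 1)], fx)
    return lerp(lerp(c00, c10, fy), lerp(c01, c11, fy), fz)

# One dark corner in an otherwise bright cell: the lighting ramps
# smoothly across the cell instead of popping at probe boundaries.
grid = {(i, j, k): 1.0 for i in range(2) for j in range(2) for k in range(2)}
grid[(0, 0, 0)] = 0.0
print(sample_probe_grid(grid, 0.5, 0.5, 0.5))  # 0.875
```

This smooth interpolation is also why probes leak: the weights only depend on distance, not on whether a wall sits between the point and a probe, which is the visibility problem discussed below.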

This is a video that explains it all very well at a basic level for a simple scene.



Now, you get problems with these solutions as a baseline, of course; if they were perfect, nobody would chase anything beyond them. They can't resolve visibility and occlusion in dynamic scenes; if not done properly you get light bleeding, and it becomes difficult to track down those leaking problems in massive open worlds, etc. There's again a ton of "tricks" devs have developed throughout the years, as you've seen in the videos above, but it's tricky. (Fucking smart people, though.)

Now you say: Buggy Loop, WTF? Probe grids? That's starting to look like Metro Exodus EE's probe grids for ray tracing? Well, yes.

Nearly all the initial RT games were probe-based. RTXGI is probe-based. The difference is that the probes are not pre-baked; that's where ray tracing differentiates itself, of course.

Ray tracing per pixel is of course out of the fucking question for gaming. That's something people don't understand: games are not rendering like offline render farms, that's simply impossible with the hardware we have now. Everything is ALWAYS a trick in real-time gaming, and smart people figure out interesting ways to make it work. But there's always a pro and a con. So to get decent performance, they are again interpolating between probes rather than going for per-pixel accuracy.



BUT, with probes, because they are spaced out and their placement matters, they can still fail in some places; the volumes are very important.

So then came other solutions like ReSTIR, which relies on reservoir-based resampling to share samples with neighboring pixels (and across frames). But again, very heavy: imagine the equivalent of a probe grid at near full pixel resolution. (Nvidia later refined that into their Cyberpunk 2077 solution.)
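For illustration, the reservoir trick ReSTIR builds on can be sketched in a few lines: each pixel streams many candidate light samples but keeps only one survivor plus a running weight sum, so merging a neighboring pixel's reservoir is an O(1) operation. This is a toy weighted-reservoir sketch, not CD Projekt's or Nvidia's actual implementation:

```python
import random

class Reservoir:
    """Streaming weighted reservoir, the building block ReSTIR uses.

    Candidates are fed in one at a time; only one survivor and a
    running weight sum are kept, so a pixel can consider hundreds
    of light samples in O(1) memory.
    """
    def __init__(self):
        self.sample = None
        self.w_sum = 0.0

    def update(self, sample, weight, rng=random):
        self.w_sum += weight
        # Replace the survivor with probability weight / w_sum, which
        # leaves every candidate alive with chance weight / total.
        if self.w_sum > 0 and rng.random() < weight / self.w_sum:
            self.sample = sample

    def merge(self, other, rng=random):
        # A neighbor's survivor is treated as one candidate carrying
        # its reservoir's whole accumulated weight.
        if other.sample is not None:
            self.update(other.sample, other.w_sum, rng)

random.seed(1)
# Two candidate lights, weights 3:1. Over many trials the brighter
# one should survive about 75% of the time.
trials, hits = 2000, 0
for _ in range(trials):
    r = Reservoir()
    r.update("bright", 3.0)
    r.update("dim", 1.0)
    hits += r.sample == "bright"
print(hits / trials)  # ~0.75
```

The point is that each surviving sample stands in for its whole candidate stream, which is what makes sharing across neighbors and frames cheap enough for real time.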

AMD also came up with a solution, AMD GI-1.0.

Smart probes trimmed down to the scene.

MGTvwNx.jpeg


Again, we're still with probes. A long fucking way from the Killzone Shadow Fall solution in 2013.

Star Citizen based their upcoming RT solution on AMD GI-1.0. 25k freaking probes for that scene! So to unify their RT solution with software lighting, and to not make their artists place probes across a very big solar system with all kinds of dynamic terrain (base building later), they used the same grid for raster. You can also see later in the video their software solution based on that grid, but heavily simplified (it looks like a PSX game when visualizing the probes). For the software solution, it's environment probes and not ray tracing; they are lit/relit to keep the lighting information close to what ray tracing provided (but not for dynamic objects, kind of like Lumen's limitation).




So again, this AMD paper for ray tracing inspired these devs to make a raster solution that gets DAMN fucking close. Almost software Lumen, but really, no rays. I would say this could breathe new life into raster solutions for a while to come. It looks really amazing in their comparison video, honestly.

I think at best, if the Demon's Souls devs made some crazy raster approach with probes and it's that good looking, it would be close to the above solution. But crazy enough, it's raster.


WOW. WTF am I doing at 23:23 with work tomorrow morning, making this reply to you, dude. Wasted my night :messenger_tears_of_joy:. I hope you will at least check out the videos a bit.

All in all, the raster solutions these devs were developing all tried to get as close as possible to the results of ray tracing. Dynamic lighting (not dynamic scenes!), indirect lighting, etc. Brilliant solutions. Very, very fast, as the probes don't have to bake any new information at runtime.
I honestly think it was VERY close to overlapping with ray tracing, especially since DDGI/RTXGI used probe grids. Quantum Break's video is an amazing watch for the tech they made for that lighting engine.

Cherry on the sundae: a more advanced one for Unity, from Pierre-Yves Donzallaz, who worked on Crysis 1-2-3, Ryse, GTA 5 & RDR 2 (so yeah... quite the experience). He goes from the baseline, an ugly scene, to lightmaps, to light probe volumes, to ray tracing and path tracing.



BTW, since Rockstar is using RTGI for GTA6, Kojima is using RTGI for DS2, and even Insomniac is switching to full realtime GI with Wolverine, I think it's safe to say that devs have made up their minds.

I wouldn't be surprised if ND and GG follow suit. GG also teased using PS5's RT hardware in one of their post Horizon Burning Whores interview.

Also, Kingsmaker released a new trailer today showing a lot of destruction, and they are not using Lumen despite being on UE5. They made their own custom realtime GI solution to calculate lighting in realtime after all that destruction. So it's not like devs HAVE to use Lumen.



Holy shit. This new UE5 Black Hawk Down shooter is nuts. The SP is on UE5 while the MP is on UE4 and looks a generation behind. It literally takes a shit on all 5 CoD games this gen. Infinity Ward, Treyarch, Raven and Sledgehammer should just resign and go flip burgers if they can't produce results like this without Lumen and Nanite.

ecZZo8X.gif


iXWALVu.gif
Thanks for this, my dudes, will check that stuff when I'm home. Some of this looks like it will be a pretty nice watch.

I'm glad the thread took off like it did.

Also, it's I guess nice that someone pointed out some stuff about Threat Interactive; nevertheless, the vids are definitely informative despite his tone. And I haven't checked how his fork is progressing. But it would be nice if this worked well.

Honestly, I didn't get a "he knows better" vibe from his tone, but rather frustration with the current state of the industry, I guess. But I'm not a native speaker.

But I don't read the UE5 forums, so I wouldn't really know whether he posted there.

Despite that, his vids are pretty informative, so there's value in that.

At least he presents what is wrong and how stuff could be fixed. Maybe he is young and inexperienced in working with big titles, but I can see he has pretty big knowledge of the subject. For me at least, it's a better watch than some vague vids about a game's bad performance and hitches with no know-how about what could be causing them, where you're watching a dude talk to himself about how he doesn't know why performance is the way it is.
 
Last edited:

Vick

Gold Member
Even with PT I can get better performance and image quality compared to PS5 version, and you want to tell me that Cyberpunk is unplayable on PC with PT.
I didn't say it's unplayable; I said it would be insane if Path Tracing lighting "looked bad", given how it runs (objectively crap, compared to non-PT software) on thousand-dollar hardware, and how it will run on a 5090.
I'm not even sure you understood the nature of this Thread when coming in here praising heavy tech that's impossible to even run on console in a playable state, a Thread about optimization and alternatives to RT and PT that can be delivered at acceptable performance/resolution in software targeting Series X/PS5 hardware.

He even said in the video that 4070 owners should shut up about how the tech he criticizes runs on their hardware, given they're "way above target specs". And yet here you are, above even that, nitpicking last-gen raster and being defensive about Path Tracing.

Also keep in mind that even "Psycho RT" looks absolutely stunning in this game, while running at twice the framerate of PT. The PS5 version runs at 30fps with minimal RT (even lower than the low RT settings on the PC), but I have not seen you complain about this situation.
I haven't ever even bought the PS5 version, let alone positively commented on it. These are, I believe, my only words ever in my GAF history on Cyberpunk on PS5:

Cyberpunk on a 4090 PT is a different game altogether.



If you know what to look for, there's just no going back. You can push it even to this kind of absurd extents you'll only see next-gen:



There's countless games with little to no difference with PC, some even better on Pro (and few even way better like DMC5 or Dead Rising Deluxe), but you picked a game that's a literal generation apart.

It was never going to get even close to path traced PC version anyway.





avGqUTf.gif


Maybe they'll release a free "Next-Gen Update" on PS6 along with the native release.

What could they even do on Pro, adding those subtle RT shadows at 60fps? Bring another, out of many, RT feature at 30fps, providing something still years behind the full package that requires insane hardware to properly run on?

This is the second time in the span of few days you imply something about me that's far from reality, and it would be appreciated at this point if you could stop already because it's starting to get on my nerves a little.

My PC can run this game at 60fps even with PT. Of course, I have to use 4K DLSS performance to get the locked 60fps, but even DLSS performance in this game offers better image quality than the PS5 version running at 1440p (4K DLSS performance offers much better picture quality than 1440p). Maybe one day the game will be patched to make full use of the PS5Pro hardware and run RT mode at 60fps, and maybe Sony will sort out the PSSR issues so that the image quality will be comparable to DLSS image, but for now this game looks terrible on the PS5 compared to PC version.
I couldn't care any less about Cyberpunk on PS5. Absolute zero, and I've never mentioned it to begin with so I have trouble understanding the reason behind these ramblings.

As for uncharted 4, like I have said some of your screenshots look great, especially indoors where lighting artist carefuly adjusted lighting (placed light sources manualy and prebaked GI etc.). Small indoor locations (especially Drake house) still look good in this game, but the lighing in open levels look often flat. I will post my screenshots today and I wonder what you will say then?
I will say exactly what I've said already plenty of times in this Thread: that maybe in your attempts to discredit a 10-year-old PS4 game, whose undeniably mindblowing heights were brought up because relevant to the topic, or in doing whatever you're doing in this Thread, you failed to read:

All this being said, I'm not arguing against RT lighting's advantages over baked lighting when it comes to dynamic objects. I'm arguing over the use of it over careful baking if the former results in 800p and the latter in native 4K.
As said already, no one here denies real-time lighting's advantages for dynamic objects, especially when it comes to ND games, which are known for this.
I too like RT. A lot actually, and especially Path Tracing, which is like the ultimate dream and final destination of real-time rendering.
It is just not worth the cost most of the time, at least on consoles. Even though I find some RT to be almost necessary, like RT reflections, because SSR just sucks so bad.
And RT shadows, because shadows also often suck so bad, and the only alternative for truly realistic shadows is PCSS, which is also heavy and only implemented in console games I can count on one hand.
RT is the future; it's just simply sure as shit not worth it when it means I have to play at PS3-like internal resolutions and framerates.

Or how your upcoming cherry picked screenshots of said ten years old PS4 game are part of what I consider to be:

"Graphically, it's still the most inconsistent game I've ever played. The HDR, tone mapping, color grading, all make for a visually stunning presentation that made me appreciate my plasma more than ever. But if most of the time it's a damn straight pre-rendered movie (especially thanks to their GI solution and God-tier PBR and SSS) and therefore the best looking game I've seen to this day along with The Order, TLOUII and P.T., too many times in the open areas it shamelessly shows borderline N64 geometry, with awful use of normal maps or textures in general to mitigate it.

It is known that ND textures are all hand painted, but while the artistry and resolution are more than commendable, they were overconfident here. It's like they didn't care; they had their perfect lighting, characters, animations, vegetation, AO, temporal AA and all the little technical feats, and thought they could afford old-gen shapes with cartoony textures on top (climbable edges, for instance) because the thing would have looked great regardless due to the "painterly" style and their tech pipeline. And they were wrong, because it's super jarring to have photorealistic characters, vegetation and lighting in the same frame as sharp edges and low-poly mountains on the horizon.

Luckily this is mostly true for the more open areas, which are not common, but playing The Lost Legacy made it clear that they received this same feedback from more than one person because most of these issues are completely solved in the "DLC". A night and day difference."


What even is your point? The obvious fact, which everyone knows and agrees on, that lighting on characters in Uncharted 4, a 2016 game running on $399 2013 hardware, isn't as good as it could be with RT?

Maybe you'll say something along the lines of the PC version being downgraded
The PC version being downgraded is no news. It is downgraded in comparison with the PS5 version:



Let alone PS4 Pro version.

5J3gW63.gif


6EqguyI.gif


qArok0O.gif


MJ0zkKr.gif


p9HJyjb.gif


yevlf8y.gif


AReXoSE.gif


cvOVtOi.gif


JOQkL6E.gif


UmWRfGk.gif


GPYraKB.gif


Kgi2Zid.gif


JbbfZr0.gif


4zk9qSv.gif


QlUXbpr.gif


6jlf79n.gif


ZHYyDVc.gif


quKIVJK.gif


Y8zMvsW.gif


Missing shadows, missing reflections, missing VFX, downgraded GI, missing shading on vegetation, missing shaders on clothes are just some of the more than 400 visual issues found in the Legacy of Thieves release.

or try to suggest that the lighting is flat because of the HDR to SDR conversion (btw, I don't think any of the screenshots I posted from the PS4 screenshot thread used this dynamic range conversion because in 2016 not many people had HDR TVs, the lighting looked flat and washed out on these PS4 screenshots simply because that's how the lighting can look in this game, I know that becasue I played PC version two weeks ago).
I assumed it was about HDR to SDR because of the completely incorrect values in many of your puke-inducing screenshots.

Just because I wouldn't have guessed you were intentionally selecting Photo Mode-altered shots with adjusted brightness or gamma, some even using filters, to prove a point about outdoor lighting.

This is what you posted as a representation of the game's lighting:

1463155676-uncharted-tm-4-a-thief-s-end-20160513170523.png

This is how that (absolutely unremarkable) spot really looks in the game with unaltered levels, most noticeable on the whole right side of the picture:

Kmi3iyA.png


qazuxdx.gif


Does it make a huge difference? Perhaps not, but you were trying to make a point about lighting being bad, and this is how that exact spot looks in regards to lighting:



p1LkJt4.gif


8ZotemP.gif


Then this is what you posted:

1463070956-uncharted-tm-4-a-thief-s-end-20160512144146.png

When this is the lighting present in that exact spot:



5pwomE9.gif


AoZakyH.gif


eJPiGlr.gif


Once more, this is what you posted:

owbZRTK.jpeg

And this is the (stunning) lighting on display in that precise spot:



TWzzabb.gif


DQlvTgW.png


I believe many wouldn't hesitate to call this sort of behaviour "disingenuous", and at this point I have little doubt that this is the exact scenario with your own screenshots.

RT lighting would benefit this game a lot.
No, it would very obviously not, because if you scrap all the baking, then to achieve a comparable result with RT (and you really need at least PT to match its heights) on the same hardware where the game runs at 4K 60fps and 1440p 120fps, the game would run at 540p 30fps at the very best, and not even that in its best locations or interiors, as we are basically talking offline rendering in real time.
For extremely marginal improvements, only impactful in cherry-picked scenarios.

Would the game benefit from additional RT on top, maybe on more performant hardware to reduce the sacrifices? Of fucking course. It is exactly what I asked for years and desired was going to happen with the PC port instead of the crap Iron Galaxy ultimately released.
But this is completely irrelevant here, because we're talking about improving a game with phenomenal baking already, instead of doing everything from scratch in real time trying to match what it does with alternative methods.

Again, I'm not sure what your point in this entire Thread is. We agree on most things, minus the fact that you want to force onto consoles tech that translates into abysmal performance and resolutions. You went so defensive about Cyberpunk that you mentioned how shit it already runs on PS5, and yet you're here clamoring for tech that would make it run in an even drastically shittier way.

It's beyond bizarre.
 
Last edited:

TheStam

Member
I am a believer in ray tracing, especially after playing Cyberpunk and Alan Wake 2 with path tracing. It definitely makes the world look more believable, and I feel it's worth it if you have the hardware. I can only do that at 1440p and not yet 4K with decent quality, even with DLSS on a 4080, but I feel like it's the future. Slight RT effects like whatever is present in Elden Ring don't seem worth the cost, really.

Having said that, I am also very frustrated with fuzzy graphics these days. I played Hellblade 2 recently and while it looks amazing, even at 4K it looks like 1080p at best. I got Star Wars Outlaws in a sale recently, mainly to check out the graphics, and it looked awful maxed out. Turning off a couple of ray tracing effects made it look better, but still not very good at 4K with DLSS Quality. Black Myth: Wukong also had weird rendering issues; it's a bit infuriating how we seem to be going backwards in some respects.

Some games, and older games at native 4K, look so much crisper. I want super-sharp games with high-res textures most of all.
 

Gaiff

SBI’s Resident Gaslighter
The problem is not just TAA, though TAA is a major reason for it.
Things like motion blur, chromatic aberration and film grain also contribute to reducing image quality.
Quite ironic that those going on and on about how past games had such better IQ love using ND stuff and other Sony first-party games that are full of post-processing garbage.
 

winjer

Gold Member
Quite ironic that those going on and on about how past games had such better IQ love using ND stuff and other Sony first-party games that are full of post-processing garbage.

On PC, we can disable all that crap.
And games look much better.
 

Filben

Member
I told Epic Games during the preview phase of 5.5...
Something's off here, is my feeling. The way (and the speed with which) they drop all these arguments and examples is not only hard to follow, it feels like they're reading off someone's bachelor thesis instead of giving their own opinion. However, what really made me wonder is this, from another video by them:
They claim they're developing a game using UE5 but paused their production because of these UE issues, what they call flaws in the engine. Really? Because of some smearing and not-optimal TAA solutions, you pause production of your, as they call it, "groundbreaking gameplay" and "epic and original story", AND ask for crowdfunding support? Naw man, this smells fishy. Even the guy himself looks AI-generated, or at least altered, in some other videos on this channel.

Not saying they're outright making up facts, and they do have a point to a degree. But something's not right here. Also, TAA wasn't always as bad as it is now. I never had this smearing and these noise artefacts with TAA in Dishonored 2 or Assassin's Creed Syndicate. This is a newer problem, because TAA is just a broad term that many solutions fall under.

Seems like some dudes went down a rabbit hole and think they can unravel a whole industry-wide conspiracy.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Something's off here, is my feeling. The way (and the speed with which) they drop all these arguments and examples is not only hard to follow, it feels like they're reading off someone's bachelor thesis instead of giving their own opinion. However, what really made me wonder is this, from another video by them:
They claim they're developing a game using UE5 but paused their production because of these UE issues, what they call flaws in the engine. Really? Because of some smearing and not-optimal TAA solutions, you pause production of your, as they call it, "groundbreaking gameplay" and "epic and original story", AND ask for crowdfunding support? Naw man, this smells fishy. Even the guy himself looks AI-generated, or at least altered, in some other videos on this channel.

Not saying they're outright making up facts, and they do have a point to a degree. But something's not right here. Also, TAA wasn't always as bad as it is now. I never had this smearing and these noise artefacts with TAA in Dishonored 2 or Assassin's Creed Syndicate. This is a newer problem, because TAA is just a broad term that many solutions fall under.

Seems like some dudes went down a rabbit hole and think they can unravel a whole industry-wide conspiracy.
He's reading off a script. Probably wrote it himself. Definitely seems to be on the spectrum which is why he might come off as a bit off.
 

SlimySnake

Flashless at the Golden Globes
Even with PT I can get better performance and image quality compared to the PS5 version, and you want to tell me that Cyberpunk is unplayable on PC with PT? Also keep in mind that even "Psycho RT" looks absolutely stunning in this game, while running at twice the framerate of PT. The PS5 version runs at 30fps with minimal RT (even lower than the low RT settings on PC), but I have not seen you complain about that situation. My PC can run this game at 60fps even with PT. Of course, I have to use 4K DLSS Performance to get a locked 60fps, but even DLSS Performance in this game offers better image quality than the PS5 version running at 1440p (4K DLSS Performance looks much better than 1440p). Maybe one day the game will be patched to make full use of the PS5 Pro hardware and run the RT mode at 60fps, and maybe Sony will sort out the PSSR issues so that the image quality becomes comparable to the DLSS image, but for now this game looks terrible on the PS5 compared to the PC version.

As for Uncharted 4, like I have said, some of your screenshots look great, especially indoors where lighting artists carefully adjusted the lighting (placed light sources manually, prebaked GI, etc.). Small indoor locations (especially Drake's house) still look good in this game, but the lighting in open levels often looks flat. I will post my screenshots today, and I wonder what you will say then? Maybe you'll say something along the lines of the PC version being downgraded, or try to suggest that the lighting is flat because of the HDR-to-SDR conversion (btw, I don't think any of the screenshots I posted from the PS4 screenshot thread used this dynamic range conversion, because in 2016 not many people had HDR TVs; the lighting looked flat and washed out on those PS4 screenshots simply because that's how the lighting can look in this game. I know that because I played the PC version two weeks ago).

The asset quality in U4 still looks very good, and this game also has beautiful locations, but the lighting in this game is dated. I hadn't noticed it when I played this game on my old GTX 1080, but when I started playing RT games on my new PC I started noticing flat lighting in older games. RT lighting would benefit this game a lot. Even imprecise Lumen (UE5) is on a totally different level compared to Uncharted 4's lighting.


The guy starts off the video talking about how people with their 4070s and 4090s are not the target audience for this video. And he's right. Games are developed against console specs and should ship in a solid state, i.e., with decent image quality and performance on the base consoles. And UE5 right now is not offering that.

You said it yourself: Cyberpunk on PS5 is nothing like what we play on PCs. So at that point, is it worth it when only 10-15% of the userbase can play these games at that setting? And an even smaller portion of those have the 3080s and 4070s needed to run these games with RTGI.

He does have a point. If you can't hit 1080p 60fps with these settings on consoles, then maybe you shouldn't be using these techniques. We have to have standards. I think a lot of the early issues with UE5 have been resolved in the latest releases, regardless of what this guy is saying. So going forward, 1080p 60fps should be possible with software Lumen. They are already doing 1440p 30fps with TSR upscaling, so clearly the real issue was the CPU bottleneck that was in UE5.1 and has since been resolved.
 

CamHostage

Member
Thanks for this, my dudes, will check that stuff when I'm home. Some of this looks like it will be a pretty nice watch.

I'm glad the thread took off like it did...

No, "taking off" isn't good IMO if we just turn this thread into another killdozer thread like Graphical Fidelity I Expect This Gen.

It's nice that you've pulled out a topic which is getting some experienced developers back into conversation. Encourage that chain of talk. (GFIETG pushed everybody who's ever ridden the Moscone Center elevator to run for the hills from having their professional opinions spit on by those proudly proclaiming their credits being that they'd 'played both TLoU2 and Cyberpunk'. A pretty place for pictures and fresh finds of YT demos/trailer reels, but discussion is a goddamned lot of goddamn.) There's a lot to talk about with what this Threat Interactive kid is bringing up (and a lot of answers we gamers are desperate for with the troubling state of the game market and the frustrating end results we're finding ourselves with in our $70 purchases,) but if this thread just turns into a long stream of posts showing some new UE5 demo that looks so very NG or some screengrabs of a Naughty Dog game that is supposed to prove every other developer is making games wrong, it won't be a good thing.

At least he presents what is wrong and how stuff could be fixed. Maybe he is young and inexperienced in working with big titles, but I can see he has pretty big knowledge of the subject. For me at least, it's a better watch than some vague vids about a game's bad performance and hitches with no know-how about what could be causing them, where you're watching a dude talk to himself about how he doesn't know why performance is the way it is.

I'm assuming you're taking a swipe at DF (or maybe just the DF clones out there), which is fair, those dudes are big boys and can afford to take a shot. That said though, DF have fully read the docs on subjects they spout about, and more importantly, they shut up and listen when they can create an auditorium for developers to come and talk about the real work of game development. I'd rather have a mechanic than a manual writer tell me what's wrong with my car, but the world's not a bad place with both.

And just in general, there are a lot of YTers on a bully pulpit condemning stuttering and upscaling artifacts, but gamers live in the finished product domain not the development world, so if what we get when we plug in a game looks hinky to our eyes, all the tech talk and bulletpoint features are meaningless until complaints go away.

Threat Interactive's enthusiasm and promise to back up his bluster with a product is admirable. He seems like a handful, but he's a solo developer; nobody needs to sit through a sprint review with him to hear his opinions of what everybody else is doing wrong. If he actually comes out with something, great. If not, he's just another guy with a channel on Youtube...
 
Last edited:
I didn't say it's unplayable, I said would be insane if Path Tracing lighting would "look bad" given how its runs (objectively crap, compared to non-PT software) on thousand dollars hardware, and how it will run on a 5090.
runs like crap even on thousand dollars graphic cards where Uncharted 4, which is a 2016 PS4 game running on $399 console hardware from 2013,
I'm learning English, so maybe I misunderstood you, but your comments imply that Cyberpunk is unplayable on PC. If I'm wrong, I hope native English speakers can correct me.

On my PC Cyberpunk runs at 70-80fps at 1440p DLSS Quality (the same image quality as native TAA) with PT, and I don't consider that a crappy experience, let alone 100-120fps with FG. That's a far more playable experience than 30fps on consoles. As for artifacts and ghosting, that's a real problem in all PT games I've played (many UE5 games have this issue too); however, in Cyberpunk that noise and ghosting are amplified when ray reconstruction is enabled, and in your video RR was enabled for sure. I know this because RR makes the whole image look like an oil painting, and this can be seen in the video you linked to. Without RR, image quality looks good most of the time; I only notice the PT noise in very dark locations lit only by indirect lighting. If people don't want to see that noise, I recommend playing with RT. Psycho RT does not have this noise problem and still offers wayyyyy better lighting quality compared to raster (and it also runs 2x faster than PT; I get 120fps with Psycho RT even without FG).

This is the second time in the span of a few days that you've implied something about me that's far from reality, and I would appreciate it at this point if you could stop already, because it's starting to get on my nerves a little.
I said you were misleading people by insisting that RE4R runs at native 4K on the PS5Pro. Do I need to remind you how that debate ended? I was correct, you were wrong. This discussion will also not end in your favor.

You annoy me too, dude. You tag me in your posts when you argue with other people because I wrote positive comments about the PS5 Pro, but when I mention some inconvenient facts you paint me as a disingenuous person who posts nothing but ramblings.

When it comes to the Uncharted 4 screenshots in my post, I said very clearly that they were not my screenshots. Even if some of them had altered color and gamma settings (the photo mode in this game allows that), they still show poor lighting quality and missing shadows, and that's why I selected them.

Based on what you wrote, I think we both have different expectations when it comes to the quality of lighting in modern games.

ZGL7t7N.jpeg


You are impressed by this screenshot, and I try to understand your perspective, because not so long ago I would have thought this screenshot looked good too. I even posted many screenshots from this game here on NeoGAF back in 2017, thinking the graphics looked photorealistic. Now, however, after playing so many RT games, my perspective has changed and I immediately notice flat lighting (missing indirect shadows and AO). To make my point, I will only use my own screenshots from now on.


This is the PC version with maxed-out settings. Indirect shadows are clearly missing, and as a result the whole scene looks flat.

U4-1.jpg


In the next screenshot I added the MXAO (ReShade) filter. Do you notice the change? The lighting is still not realistic, but it looks less flat.


U4-1-MXAA.jpg


More screenshots are in the spoiler, because I don't want to spam this thread with so many images.

I can see that flat lighting very often in both Uncharted 4 and The Lost Legacy.

tll-2024-12-05-17-13-52-061.jpg

tll-2024-12-05-17-15-21-954.jpg


tll-2024-12-05-17-24-58-676.jpg


tll-2024-12-05-17-26-48-456.jpg


u4-2024-12-05-14-51-19-701.jpg


u4-2024-12-05-14-51-39-788.jpg


u4-2024-12-05-14-33-43-091.jpg


u4-2024-12-05-15-11-52-800.jpg


u4-2024-12-05-15-38-38-590.jpg


Now let's talk about character lighting. This is what character lighting looks like in a cutscene.


u4-2024-12-05-13-59-43-960.jpg


u4-2024-12-05-15-11-52-800.jpg


And here's what it looks like during gameplay. Can't you see the difference? Something is missing, don't you think?

u4-2024-12-05-14-00-27-043.jpg


u4-2024-12-05-14-00-59-588.jpg


u4-2024-12-05-14-06-45-707.jpg


Characters can stand in shadow, but they are still fully lit and look like they have been pasted into the scene.

u4-2024-12-05-14-59-33-018.jpg


This character model is not grounded in the scene. And you said that U4 would not benefit from RT; good joke, dude.

u4-2024-12-05-15-26-48-957.jpg


I tried to find good lighting on character models during gameplay, but I couldn't.

tll-2024-12-05-17-12-26-665.jpg


u4-2024-12-05-13-51-51-067.jpg


u4-2024-12-05-14-27-26-526.jpg


u4-2024-12-05-14-28-51-536.jpg


u4-2024-12-05-14-32-30-945.jpg


u4-2024-12-05-14-50-53-206.jpg


u4-2024-12-05-14-55-55-231.jpg


u4-2024-12-05-14-58-29-725.jpg


u4-2024-12-05-15-10-49-267.jpg


u4-2024-12-05-15-14-58-294.jpg


u4-2024-12-05-15-15-51-649.jpg


u4-2024-12-05-15-19-16-913.jpg


u4-2024-12-05-15-20-25-043.jpg


u4-2024-12-05-15-34-27-610.jpg


The best ones I found:

u4-2024-12-05-15-28-26-499.jpg


u4-2024-12-05-15-30-30-433.jpg


And here are a few PS4 Pro screenshots; nothing changes:

Uncharted-4-Kres-z-odzieja-20241205175510.jpg


Uncharted-4-Kres-z-odzieja-20241205175721.jpg


Uncharted-4-Kres-z-odzieja-20241205180104.jpg



Uncharted-4-Kres-z-odzieja-20241205180725.jpg


Uncharted-4-Kres-z-odzieja-20241205180328.jpg
 
Last edited:

rofif

Can’t Git Gud
His other vid. This guy PERFECTLY explains the problems with Epic and UE5. Stupid Nanite and other nonsense like this destroyed optimization and smart design.
 
Last edited:

Mister Wolf

Member
I'm learning English, so maybe I misunderstood you, but your comments imply that Cyberpunk is unplayable on PC. [...]

Based on what you wrote, I think we both have different expectations when it comes to the quality of lighting in modern games. [...]

Great post and exactly why baked lighting is now a joke. Inconsistency.
 

Vick

Gold Member
Some examples of what was possible to achieve 8 years ago on inexpensive 2013 console hardware thanks to passion, care and talent, as a mere indication of what could be potentially achieved today on hardware orders of magnitude more performant.

rGwrmMA.gif


WlF5lBV.gif


LcPrJ3X.gif


19fzCJi.gif


cMVfS6D.gif


IGUmhQW.gif


9dWLyTd.gif


UCr8zvW.gif


C7AycsN.gif


DQ5JJkJ.gif


azyYvne.gif


QkF6b9r.gif


EvEvriB.gif


r7F2avd.gif


V8mop4y.gif


RExiws6.gif


Ml8xPcu.gif


4dSjWMx.gif


ejWnojs.gif


PNacmjc.gif


3h91Ov2.gif


IO7iu2a.gif


pa6bzTp.gif


hsmCB5x.gif


ZLKLdVQ.gif


fxAyuwD.gif


bI9GlrB.gif


JvGKuLP.gif


eDGW5Tk.gif


IyvbhpD.gif


ogz7M3C.gif


W1rczKm.gif


70U1tuC.gif


J7z4ISQ.gif


nh5A79M.gif


pkbJtmb.gif


ix11QrG.gif


36553144892_09f4ff33b3_o.png


Great post and exactly why baked lighting is now a joke. Inconsistency.
The only jokes here are his posts:

And you said that U4 would not benefit from RT, good joke dude.
Would the game benefit from additional RT on top, maybe on more performant hardware to reduce the sacrifices? Of fucking course.

And your "now", implying that the worst examples from 2016 software running on $399 2013 console hardware are a good representation of what can be achieved now, and has been achieved already, when it comes to occlusion and self-shadowing on dynamic objects such as characters.

As is you guys' presence in this thread to begin with, given that the inconsistency you Nvidia shareholders make such a big deal out of couldn't be more insignificant compared to a shimmering, awfully denoised 800p mess at an unstable 60fps, on the same hardware where a pristine native 4K/60fps and 1440p/120fps artifact-free, denoise-free rasterized presentation is possible. And it would be, for 99.99% of the userbase of such a hardware target.
If you guys desire so hard to brag about the currently available real-time version of path tracing via FrameGen and AI upscaling on 40 series, or the talentless hacks developing games via automation for your hardware, feel free to open a thread about it.
 
Last edited:
The guy starts off the video talking about how people with their 4070s and 4090s are not the target audience for this video. And he's right. Games are developed around console specs and should ship in a solid state, i.e. with decent image quality and performance on the base consoles. And UE5 right now is not offering that.

You said it yourself: Cyberpunk on PS5 is nothing like what we play on PCs. So at that point, is it worth it when only 10-15% of the userbase can play these games at those settings? And an even smaller portion of that has the 3080s and 4070s needed to run these games with RTGI.

He does have a point. If you can't hit 1080p 60fps with these settings on consoles, then maybe you shouldn't be using these techniques. We have to have standards. I think a lot of the early issues with UE5 have been resolved in the latest releases, regardless of what this guy is saying. So going forward, 1080p 60fps should be possible with software Lumen. They are already doing 1440p 30fps with TSR upscaling, so clearly the real issue was the CPU bottleneck in UE5.1, which has since been resolved.
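To put rough numbers on that pixel-budget claim (a back-of-the-envelope sketch that ignores fixed per-frame costs and upscaling overhead):

```python
# Compare raw shaded-pixel throughput of the two targets mentioned above.

def pixels_per_second(w: int, h: int, fps: int) -> int:
    """Raw pixel throughput: pixels per frame times frames per second."""
    return w * h * fps

budget_1080p60 = pixels_per_second(1920, 1080, 60)  # 124,416,000 px/s
budget_1440p30 = pixels_per_second(2560, 1440, 30)  # 110,592,000 px/s

# 1080p/60 needs slightly MORE raw pixel throughput than 1440p/30 (1.125x),
# so if a console already handles 1440p/30, a 60fps cap points at the CPU,
# not the GPU's pixel budget.
print(budget_1080p60 / budget_1440p30)
```

Per-pixel cost isn't perfectly constant across resolutions, so treat this as an order-of-magnitude argument, not a benchmark.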
Current gen consoles can't run PT, but you don't need PT to improve lighting quality over the PS4. Crysis 2 Remastered has dynamic GI and runs on consoles. Some games like Metro Exodus have a fairly light RT GI implementation that runs at 60fps even on consoles. I played that game on PC, and characters and objects blend well into the scene (that's the biggest issue I have with rasterized graphics). Performance on PC is also great; I had over 80fps at native 4K with RT, and over 140fps with DLSS. That's comparable to the Uncharted 4 PC port, which doesn't use RT and has flat lighting.

Indiana Jones' RT GI is also well optimized. Even an RTX 2080 Ti can run the game at 60fps, and the graphics look spectacular. Just look at these screenshots.


If this game runs so well on PC, I think the PS5 Pro will also run it and look amazing, as it has an even better GPU than the 2080 Ti.
 
Last edited:
Some examples of what was possible to achieve 8 years ago on inexpensive 2013 console hardware thanks to passion, care and talent, as a mere indication of what could be potentially achieved today on hardware orders of magnitude more performant. [...]
I have to admit that some of your screenshots/GIFs look good. Either you have a good eye for finding good-looking places in this game, or maybe the PS5 version really does look a lot better than the PS4 Pro / PC version I played. You may think I was cherry-picking, but when I played this game it was very easy to find places with flat-looking lighting (flat only compared to the best PT/RT games; if you don't realize what's missing, the lighting in U4 can still look impressive, especially indoors). My screenshots show real gameplay and have not been doctored in any way.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
I have to admit that some of your screenshots / gif's look good.
That's because last gen games look way better in GIFs than modern games.

This is because modern games push a lot of detail that is hidden by YouTube and GIF compression. You can still create good-looking next gen GIFs, but a lot of those tiny yet crucial details are lost. Meanwhile, PS3 and PS4 era games lack the volumetric effects that are destroyed by YouTube compression, and tiny GIFs hide their poor texture work. The lighting gaps are hidden too, whereas on giant 4K screens you can easily see where the AO and shadow coverage fails.

For example, here is kz3 on PS3:

killzone-3-pic-1.gif


gsye310kz3part14_1.gif


Does it look good? Yes. But we all know it won't be able to compete with modern shooters.

PS4 era games have a lot more detail than these PS3 games. They use PBR materials that will always stand the test of time, unlike PS2 and PS3 era games, which is partly why they still look great. However, there are things that RT and other modern techniques can add to give these materials the proper light bounce, AO, and shadows that make them look far more realistic. But that's harder to show off in tiny GIFs. I make GIFs of every game I play, and my heart sinks when I see the final result. All the detail is lost. The HDR lighting is gone. The compression removes all the subtle reflections off non-reflective surfaces. You lose almost everything that made the game great, retaining just the scale and the art.
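One concrete reason GIFs flatter older games: the GIF format caps each frame at a 256-color palette, far below what the games actually output on screen. A small illustration (the SDR/HDR figures are the usual 8- and 10-bit-per-channel depths):

```python
# Color-depth gap between a GIF frame and a real display.

gif_colors = 2 ** 8      # GIF: one 8-bit palette index per pixel -> 256 colors max
sdr_colors = 2 ** 24     # SDR: 8 bits per RGB channel -> ~16.7M colors
hdr10_colors = 2 ** 30   # HDR10: 10 bits per channel -> ~1.07B colors

print(sdr_colors // gif_colors)    # 65536x fewer representable colors than SDR
print(hdr10_colors // gif_colors)  # the gap vs HDR output is even larger
```

Subtle gradients, bounce lighting, and HDR highlights simply can't survive a 256-color quantization, which is exactly the detail the post says gets lost.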

Lastly, Uncharted 4 targeted 30fps at 1080p on last gen consoles. It's one of the best looking games on that platform not just because ND is talented, but because they targeted 30fps without targeting 4K. This gen, ND targeted 4K 30fps and 1440p 60fps, wasting literally half of the GPU just rendering more pixels. They also didn't bother optimizing. The game runs at 1440p 90fps at times on the base PS5 if you unlock the framerate. That means these geniuses left 50% of the GPU on the table, on top of the 100% they wasted running the game at 4K/30 and 1440p/60.

The blame lies with these developers, who have forgotten what made them great last gen. They are no longer maximizing the hardware, and we have fans refusing to admit this who continue to stan for their laziness.

Mark my words, they will all come around the moment ND embraces RT in their next game and shows a stunning next gen game that makes TLOU1 and Uncharted 4 look like the last gen games they are. This might happen literally in the next week or so if they show up at the VGAs, which will make the 180s in this thread even more hilarious.
 
Last edited:

intbal

Member
That's because last gen games look way better in GIFs than modern games. [...]
This whole thread is one big mess.
But this point about GIFs is absolutely correct. You can make 7th gen games look amazing when they're in GIF form.
I kinda wish we had a thread solely about that: "Make a 6th/7th gen game look great as a GIF."
 

rofif

Can’t Git Gud
That's because last gen games look way better in GIFs than modern games. [...]
The same was said for Driveclub GIFs. I bought the game and it's simply just as stunning, the only drawback being the lack of anti-aliasing (photo mode supersamples it, and that's the best looking 1080p image quality ever).
Yeah, Killzone 2 and 3 look great because the art direction is perfect.
Last gen, targeting 1080p30 was like games today targeting 4K60. Different times, different goals.
The point being that back then, many games reached 1080p30 (that gen's goal) and looked stunning doing it. Nowadays what do we have? Demon's Souls hitting this gen's target and looking amazing?

Anyway, Driveclub in-game vs photo mode... seriously, it's still stunning. Note: not a small GIF, no rain.
That was taken on a base PS5. Maybe the Pro's edge enhancement is doing something.
j96nNUA.jpeg

7Ld21i9.jpeg

MFOkgbQ.jpeg

8kAFgC8.png



Or Death Stranding, my original PS4 Slim shots... PS4 SLIM. These are legit PS4 Slim shots. Not PS5 or anything, not PC. Look at that image quality.
I wish my mind was this blown again at 1080p 30fps.
Seriously, all the 30fps whining ruined this gen. I like 60fps as much as the next guy, but it ruined this gen.
Back then we had 30fps, everyone played these games, and each one felt more groundbreaking than the last.
I don't remember these games being unresponsive or feeling like 30fps... All I remember is being in awe for 50 hours of Death Stranding, and it played great. Now devs can't even do 30fps right.
hDtVvkp.png

wk8baRC.png

avLU8pO.png
 

Vick

Gold Member
I thought that person brought up the RE4 resolution ordeal as a sort of low blow, when I was simply going off of information read around:

9JH8xiE.png


NQ5FTWM.png


And pixel counts on screenshots that same person described as:

I believed you too at first because your screenshots could have passed for 4K
your screenshots looked like 4K to my eyes. Some games use such good upscaling technology that you can barely tell the difference unless you pixel peep using 200% zoom.

YKJuMwq.png


evx1yRG.png


HmRB8lS.png


And it was ultimately settled only due to something I personally brought up as an indication that the game was probably interlaced instead:

Just watched the opening cutscene from PC version maxed out 4K FXAA + TAA because it just occurred to me that, in the opening scene, the first sacrificial victim's hair do actually shimmer on PS5 Pro in Resolution Mode.

Lo and behold, that doesn't happen at all on PC with those settings.

I guess this really settles it then, and virtually confirms a kind of Interlacing being used on PS5 Pro, just at higher resolution than the examples posted and probably carefully tailored around the console.
I owe an apology to you, and to the other users misinformed by my claim of RE4 being native 4K.

That was the absolute lowest point in the thread.

But I was very clearly wrong, because the claim that 15-20fps GIFs with a third of the source color gamut (which is why I also posted videos before), with entire bounces and shades disappearing, every color desaturated, fine detail like dust particles gone, on top of raised gamma and black levels, only look good because they hide flaws, and that those same places would look worse, instead of considerably better, in the native 1440p source at 60/120fps in HDR running live in front of you...

Disney River GIF


Takes the cake by a matter of miles.
I would say I regret wasting time taking videos instead of pictures, which I did only to preempt the usual claims that screenshots are posted because videos would show flaws, but now I'm happy I didn't waste a single second more taking screenshots along with the moving captures, in yet another graphics thread completely stained by the same exact person. I'll just have to post these ones uploaded last year, before gladly signing off for good.

zvCrkZX.png


UeaEtOW.png


kL3gkQ8.png


YXugCyQ.png


UMFkOL3.png


svsiBv2.png


ZNJZZLc.png


GBH4hjs.png


pwqAno3.png


OJeqJ8G.png


1nHIGqz.png


v0PoHxA.png


36553144892_09f4ff33b3_o.png
 
Last edited:
Holy shit. This new UE5 Black Hawk Down shooter is nuts. The SP is on UE5 while the MP is on UE4 and looks a generation behind. It literally takes a shit on all five CoD games this gen. Infinity Ward, Treyarch, Raven and Sledgehammer should just resign and go flip burgers if they can't produce results like this without Lumen and Nanite.

ecZZo8X.gif


iXWALVu.gif
This looks exactly like how I remember my first moments with Modern Warfare on 360 a decade ago! xD
 

SlimySnake

Flashless at the Golden Globes
The same was said for Driveclub GIFs. [...]
Not all Driveclub levels have issues with AA. That particular level, especially in that lighting condition, looks fine. I took a video last year that showed just how clean it looked.

However, try some of the Japan DLC levels, the forest levels in India, and the grassy levels of Scotland. The aliasing is atrocious.

BTW, Kojima recently said that he wanted DS1 to be 60fps and had to be talked out of it by his hardware guys. So expect DS2 to be 60fps now that there's no hardware bottleneck in the way. He will target 60fps just like the Indiana Jones devs did. I have a bad feeling about DS2.
 
Vick, you seem to blame me for this RE4 Remake resolution debate, so I will show how our discussion began, and just maybe you will realize something.


You said "Resident Evil 4 is native 4K on Pro", and your post sparked my curiosity, because on PC you need a 6900 XT or 4070 Ti to run this game at native 4K at 60fps with RT and hair strands. The PS5 Pro GPU obviously isn't as strong given its specs, so I asked if you could confirm this extremely impressive information.

fXb9QMt.jpeg


CuNQ2ZL.jpeg


You once again confirmed that RE4R runs at native 4K, and even added that the game runs at an average of 79fps in the village test. I found it hard to believe that the PS5 Pro could run this game at native 4K at 60fps, let alone at 79fps; that would mean the PS5 Pro is only 12% slower than an RTX 4080 Super. Your information was unbelievable from my perspective, so I asked other people to verify the resolution situation in RE4 Remake on the PS5 Pro.

MisterXDTV was kind enough to show me a resolution chart proving the game was not running at native 4K on the PS5 Pro as you wanted people to believe. When you saw MisterXDTV's post with that RE4R resolution chart, you decided to reiterate that RE4 Remake must run at native 4K.


EA07oB7.jpeg


I realized that I had to do my own research to find the truth, and I quickly found some interesting information. NXGamer, Digital Foundry, ElAnalistaDeBits: all of these well-known sources said RE4R on the base PS5 wasn't running at native 4K. I also found this comparison, which shows a relative difference of 43% between the base PS5 and the PS5 Pro in the RE4 remake. This made it clear to me that the PS5 Pro could not run this game at native 4K, simply because all of the extra GPU power on the PS5 Pro was used to boost the framerate, leaving no power for a resolution increase.

Screenshot-20241201-212911-You-Tube.jpg
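The reasoning above reduces to a two-line calculation. The fps values below are hypothetical stand-ins, chosen only to show how a ~43% relative difference falls out when the Pro's extra compute goes entirely to framerate at an unchanged resolution:

```python
# Hypothetical averages at identical settings and identical resolution.
base_ps5_fps = 42.0
pro_fps = 60.0

rel_gain = pro_fps / base_ps5_fps - 1.0  # extra throughput spent on framerate
print(f"{rel_gain:.0%}")  # prints "43%"
```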


Based on this information I said "it seems that the game does not run at 4K native, as you want people to believe", and your response was:

QnO0Ae8.jpeg


Most people would just call you a liar in this situation, but because I want to be a nice person to other people, I tried to politely communicate that your information was obviously false. You were however still offended, even though I had every right to say what I said.

You also still refused to believe that RE4R does not run at native 4K, so you started bombarding me with your "pixel counts" to prove the game must be running at native 4K.


kDUOcV4.jpeg


Because I mentioned that well-known YT experts like NXGamer said RE4R runs at interlaced 4K on the base PS5, you asked me what "interlaced" means, so I showed you my screenshot comparison. Earlier you said that if the game were rendered at a lower resolution, the internal aliasing would look different. My comparison screenshots, however, showed the same internal aliasing despite the different resolutions.

lhMpZ1n.jpeg

You finally realized how wrong you were all along. You even apologized for your misinformation, and I really thought that was a sincere apology.

Now, however, when we started talking about a different topic, you decided to return to our previous argument, trying to blame me for your misinformation.

This is the second time in the span of a few days that you imply something about me that's far from reality, and it would be appreciated at this point if you could stop, because it's starting to get on my nerves a little.

As for our Uncharted 4 discussion, I just realized that you are misleading people once again.

That's your comparison.


euNZ7x3.gif


You are telling people that the PC port has been downgraded compared to the original version.

So now let's look at how the original game looks on the PS4 Pro (BTW, the image quality isn't as bad on my TV; the PS4 Pro compresses screenshots too much).

Uncharted-4-Kres-z-odzieja-20241207031640.jpg


And here's the PC version

a1.jpg


As anyone can see, the original version has exactly the same reflections as the PC version. It seems that the PS5 version was not a simple port but a remaster that improved reflections and revamped the lighting (a new AO system, more dynamic light sources, higher shadow resolution). When you show people your PS5 screenshots from Uncharted 4, you should not tell them that they are looking at the PS4 2016 graphics, because that is false information.

I have only played the original PS4Pro and PC versions, so my comments do not reflect the reality on PS5. In the PC and PS4 versions, the lighting is however certainly not up to today's standards and my screenshots prove it. The lighting can still look good, especially in Drake's house, where Naughty Dog's lighting artists tried to fake realistic lighting as best they could to WOW people. Unfortunately, it's very easy to find locations with flat-looking lighting (especially outdoors). Indirect lighting is often missing on the PS4 and PC:

tll-2024-12-05-17-15-21-954.jpg


tll-2024-12-05-17-24-58-676.jpg


And character lighting rarely matches the scene lighting.

u4-2024-12-05-14-59-33-018.jpg


u4-2024-12-05-15-26-48-957.jpg


There's also a glaring difference between cutscenes (where the lighting artists faked the lighting) and gameplay. What I said about missing self-shadows / lighting on character models during gameplay was absolutely true.

Cutscene:

u4-2024-12-05-13-59-43-960.jpg


u4-2024-12-05-15-11-52-800.jpg


Gameplay

u4-2024-12-05-14-00-27-043.jpg


u4-2024-12-05-14-00-59-588.jpg


u4-2024-12-05-14-06-45-707.jpg


u4-2024-12-05-13-51-51-067.jpg


u4-2024-12-05-14-58-29-725.jpg


u4-2024-12-05-15-34-27-610.jpg


u4-2024-12-05-14-59-33-018.jpg


u4-2024-12-05-15-26-48-957.jpg


As I already said, this game would look much better with modern lighting techniques. As Indiana Jones and Metro Exodus prove, even RT GI can run well on current gen consoles, while Crysis 2 remastered with (SVOGI) dynamic GI could run well even on the PS4. Uncharted 4 would look noticeably better even with cheap SVOGI, as characters and objects would finally blend into the scene.
 

Vick

Gold Member
I'm not going to say I regret stepping in your defense when people were calling you a clown in that Thread, Corporal.Hicks, as that wouldn't be very mature or true to who I am as a person, but I would be lying if I said I don't now lean toward that opinion myself.

I am not surprised at all to see you completely omitted these from your reconstruction of the events.

I'm honestly interested in other people opinions on this matter because I'd obviously like to know if I'm wrong about something
1:1

UaZfRKV.png


Full 4K detail on clothes and stitches, no dithering on hair in any cutscene or instance that I could notice (replaced instead by native-res aliasing), single strands of Luis' facial hair always precisely resolved, every single aliased edge always counting out as 3840 x 2160.

If this is not native 4K, then what are we looking at?

It's a genuine question: reconstructed using what? I don't know of a single reconstruction technique that doesn't produce artifacts with fast movements while turning the camera, and that also presents native 4K aliased edges and fully preserved texture detail.
And this ain't a hallway; there's a lot of stuff like explosions and fire and destruction and enemies happening in this place. If there were dynamic resolution in place, this is the exact kind of situation where it would intervene. Does this look like 1800p or "a more realistic 1620p" to you?
Material that could also easily disprove what I'm saying, in case more well-versed users want to join the discussion.
Evidence anyone here could corroborate in seconds by pixel counting or testing it themselves.
f0a44825fc5594ea2fb316d10c3a65a5.gif
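The pixel-counting method referenced here works roughly like this (a generic sketch of the technique, not any particular analyst's tool): count the stair-steps along an aliased near-diagonal edge and the output pixels they span; their ratio approximates the internal-to-output resolution scale.

```python
def implied_internal_height(steps, span_px, output_height=2160):
    """Estimate internal rendering height from an aliased-edge step count."""
    return round(output_height * steps / span_px)

print(implied_internal_height(3, 4))  # prints 1620 -> 1620p upscaled to 4K
print(implied_internal_height(4, 4))  # prints 2160 -> native 4K, one step per pixel
```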


It's not "my Pro"; you're in a Thread about a console most participants here bought and would be interested in this kind of info.

As acknowledging the situation for what it was, an obvious, genuine attempt at discovering the truth, wouldn't fit the narrative you're attempting to fabricate.
Or the way you phrased this:

You once again confirmed that RE4R runs at native 4K, and even added that the game runs at an average of 79fps in the village test. I found it hard to believe that the PS5 Pro could run this game at 60fps native 4K, let alone at 79fps. That would mean the PS5 Pro is only 12% slower than an RTX 4080S. Your information was unbelievable from my perspective, so I asked other people to verify the resolution situation in RE4 Remake on the PS5 Pro.

As if, yet again, I wasn't simply reporting what was available online:



Or that my own framerate assessments on the game, in that same message, turned later on to be completely accurate:

However even if the game is 99.9% of the time above 60fps maxed out, I did notice it drop when in the middle of the village at night during the storm with Ashley where I'd say it feels like 57-60, even though I was expecting it because that same area used to go below 40fps with same settings on base PS5.
I've also noticed some slight drops in the maze when fighting dogs, and I can tell it's struggling a little to keep locked 60fps in the Grand Hall. Honestly found both instances bizarre because the whole open area in the castle with the armored Giant causing mayhem with tons of fire particles and destruction everywhere and enemies and miles of terrain and water around, is 100% locked no matter what.

Or that you bring up this chart:

MisterXDTV was kind enough to show me a resolution chart that showed the game was not running at 4K native on the PS5Pro like you wanted people to believe. When you saw MisterXDTV's post with that RE4R resolution chart, you decided to reiterate once again that RE4Remake must run at 4K native.

From ResetERA, containing incorrect information, such as the game in the mode we were referring to being machine-learning (PSSR) upscaled, or it being locked 60fps:

0JEi5YHv_o.png




4EIvxXl.png


When other users, playing the game, were already telling you PSSR is not being used in that mode.

Most people would just call you a liar in this situation, but because I want to be a nice person to other people, I tried to politely communicate that your information was obviously false. You were however still offended, even though I had every right to say what I said.
No, not at all. Most people would simply recognize this as a good-faith mistake caused by incorrect sources:

9JH8xiE.png


Combined with analysis of the game screenshots not matching interlacing issues, not producing checkerboard artifacts, and not using PSSR.

It wouldn't be different than me calling you a liar for repeating what was contained in that ResetERA chart.

And, as you were already told, I was offended because you implied I did this on purpose, to intentionally prop up the PS5 Pro version with malicious intent, when these were my previous posts in that same Thread warning people about the version while I, unknowingly, had a glitch that caused incorrect visuals:

Not sure if this was posted yet, but Resident Evil 4 Remake Pro patch is broken. You select Resolution mode but it is in fact Framerate Mode, so if you recently played RE4 Remake on Pro, you played with base PS5 Framerate settings (meaning much lower resolution textures and many more very noticeable differences).

This is why framerate reports indicated identical performance between Resolution and Framerate mode on Pro, they are the same.

The most ridiculous thing is that the original Resolution Mode finally runs at stable 60fps on Pro, but the only way to access this much better presentation is from unpatched disk version (thank God for the Gold version).*

Hopefully they'll fix this shit as soon as possible, would be awesome if someone could contact John on X or someone from DF and let them know, so they can spread awareness.

*Unfortunately even the Gold Edition is the launch code, so while it has the higher graphical settings it suffers from some original IQ issues like sporadic moire and some aliasing.
To play the ideal version? Unless you installed RE4 on the Pro before the official Patch and never updated, no.
The ideal version was the one right before the latest patch.

Resolution/IQ matter on the game is a little complicated.. even if the patched game on Pro runs at Framerate settings when you select Resolution Mode (much lower res textures etc.), the IQ is better still on Pro than disc 1.00 version* because PSSR (or whatever they're doing) is way ahead at resolving some of the raw aspects affecting the original base PS5 Resolution Mode (aliasing, moire, etc). So even if it is perceived as a little softer, it looks both better and cleaner.

Those unpleasant IQ issues* were later solved with a patch, so if you own a disc version, you're going to get the higher graphics settings and sharper picture, but with those IQ issues the game originally shipped with.


I've played the Chainsaw Demo right after two playthroughs on Pro, and immediately noticed the much higher resolution textures and higher-resolution/sharper IQ.
This Demo never received the Pro patch, but did receive the IQ improvements over 1.00.


Here. This is the Demo and those much higher assets/textures are everywhere, texture difference is night and day in so many spots a proper comparison would take forever:

fODG0TB.gif


Fd5OI9C.gif


k7boZlh.gif


qCwZfFx.gif


Just confirmed the higher settings are present on 1.00.


Unfortunately, you're right. I naively believed Gold Version shipped with updates (due to save file working), but no, it is indeed 1.00.

Capcom needs to fix this shit immediately. We have to do something.

Edit:

*That being said, IQ "issues" are greatly reduced by simply disabling Lens Distortion and CA, maybe on 1.00 they don't work as good as patched version.
I apologize for the many posts on the matter, but base PS5 version of Resident Evil 4 has also been massively affected by patch 1.400.
It runs at much more stable 60fps now, but those higher settings graphics are simply gone.

Basically on PlayStation platforms there's no longer high resolution textures/assets, and there's no way/graphic mode to bring them back.

Sarcastic Oc GIF


Great job Capcom, please stay the fuck away from RE2/RE3 now that the physical versions are coming.

Edit:

Thanks to the amazing @FUBARx89, we discovered it's actually a stupid 1.400 glitch preventing those graphics from activating, on both base PS5 and Pro.
Likely caused by selecting Resolution when first booting the game. To fix this issue you have to switch between modes until you see the background (post first run) changing in real time.

Game is looking SPECTACULAR! And I did two whole playthroughs with the awful textures..

Which is something you did again in this Thread when you implied I was harsh in my definition of how Path Tracing runs on PC while favoring the PS5 version, nothing more than a bald-faced lie given that my only posts ever on said console version were:

Cyberpunk on a 4090 PT is a different game altogether.



If you know what to look for, there's just no going back. You can push it even to these kinds of absurd extents you'll only see next gen:



There are countless games with little to no difference from PC, some even better on Pro (and a few even way better, like DMC5 or Dead Rising Deluxe), but you picked a game that's literally a generation apart.

It was never going to get even close to path traced PC version anyway.





avGqUTf.gif


Maybe they'll release a free "Next-Gen Update" on PS6 along with the native release.

What could they even do on Pro, add those subtle RT shadows at 60fps? Bring one more of many RT features at 30fps, providing something still years behind the full package, which requires insane hardware to run properly?

You appear to have a serious issue with jumping to wrong conclusions, and with being obnoxious when faced with it. And now you have done the same for a third time:

As for our Uncharted 4 discussion, I just realized that you are misleading people once again.
At this point I am confident many people would simply abandon their composure.

That's your comparison.

euNZ7x3.gif


You are telling people that the PC port has been downgraded compared to the original version.
PS4 Version:
1LSZOgq.gif


PC Max 4090:
goqzKdj.gif


As anyone can see, the original version has exactly the same reflections as the PC version.
As also obviously depicted in the image above the one you chose..

yevlf8y.gif


You should not tell them that they are looking at the PS4 2016 graphics, because that is false information.
Every single one of the screenshots I posted (nothing more than a sample of literally hundreds of issues: missing shadows, reflections, shaders, VFX, particles, etc.) was made in regard to the original 2016 PS4 version, not the PS5 version.

And as for your bizarre persistence about your issues with Uncharted 4's lighting, there is not much more I can do other than once more expose them as aimless ramblings: those specific issues have nothing to do with this Thread, given that the situation you described has been addressed by Threat Interactive in the (time-stamped) video in the very first post of this Thread:



And they do not represent what rasterized solutions could ultimately deliver, as already shown in many other games.

Ultimately, your entire point about 2016 software created to run on $399 2013 hardware is an absolutely obvious one that everyone with an IQ above the single-digit threshold already knows, and one I've made myself more than once in this very Thread..

Would the game benefit from additional RT on top, maybe on more performant hardware to reduce the sacrifices? Of fucking course. It is exactly what I asked for for years, and what I hoped would happen with the PC port instead of the crap Iron Galaxy ultimately released.
But this is completely irrelevant here, because we're talking about improving a game with phenomenal baking already, instead of redoing everything from scratch in real time trying to match what it does with alternative methods.
Again, I'm not sure what your point in this entire Thread is. We agree on most things, except that you want to force onto consoles tech that translates into abysmal performance and resolutions. You got so defensive about Cyberpunk that you mentioned how badly it already runs on PS5, and yet you're here clamoring for tech that would make it run drastically worse still..

So when you reveal yourself as an edgy teenager with statements like this:

This discussion will also not end in your favor.

I am only confused once more as to what kind of discussion you believe is taking place, considering the only point advanced is that the gains RT would bring in selected instances, via lighter and alternative solutions, aren't worth the drastic drop in resolution and framerate and the image quality issues such as reconstruction/denoising artifacts, on the target hardware this Thread is about.

You mentioned Metro Exodus multiple times, but this is the kind of image it produces on PS5:



A reconstruction/denoising mess, with shading on trees worthy of PS2 software.

And that's even pretending the base, crude version of Metro Exodus beneath the RT implementation is anywhere close to something like Uncharted 4 in underlying technical structure and hardware cost (assets, mechanics, animations) to begin with, or that 9 out of 10 random people wouldn't pick PS4 Uncharted 4 as the better-looking game even in spite of RT.

The entire concept, simple enough to be toddler-proof, has been pointed out to you repeatedly and yet somehow still hasn't landed.
Would you rather play shimmering/noisy/artifacty (often around 800p internal) software with RT, or something like an evolved 2025 version of 2016's PS4 Uncharted 4/Lost Legacy at native 4K 60fps and 1440p 120fps?

Because it's as simple as that. And:

"Yeah, I'd take the much better resolution, IQ, framerate and clean visuals, often still looking absolutely incredible as you have clearly shown multiple times, that artists and competent people created, over what Alan Wake 2, Wukong, Jedi Survivor or Outlaws delivered on the PS5 I'm playing on, because I'm a fucking anal bastard who loves to put his 2.5K PC and IQ at the center of virtually every single post here, and I could never stand such glaring compromises when playing on this console"

Is the only possible answer here, and happens to immediately put an end to the argument.

As I already said, this game would look much better with modern lighting techniques. As Indiana Jones and Metro Exodus prove, even RT GI can run well on current gen consoles, while Crysis 2 remastered with (SVOGI) dynamic GI could run well even on the PS4. Uncharted 4 would look noticeably better even with cheap SVOGI, as characters and objects would finally blend into the scene.
And SVOGI, not being RT or PT, lands squarely within my own point about what could be achieved without the heavier alternatives RT and PT provide..
 

Mister Wolf

Member
And SVOGI, not being RT or PT, lands squarely within my own point about what could be achieved without the heavier alternatives RT and PT provide..


SVOGI is ray tracing. It's software ray tracing, no different in principle from Lumen: ray tracing against a voxel representation of the environment.





"After Gamescom, I followed up with Warhorse PR Manager Tobias Stolz-Zwilling, who confirmed that the software ray tracing mentioned by Klima is actually an improved version of SVOGI, which stands for Sparse Voxel Octree Global Illumination. SVOGI was already available in the first Kingdom Come Deliverance game and in a few other CryENGINE-powered titles like War of Rights and SNOW. I also got confirmation that SVOGI will be available on consoles, too, for Kingdom Come Deliverance 2."
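To make the "ray tracing against voxels" point concrete, here is a deliberately minimal sketch (not CryENGINE's or Lumen's actual code, and omitting the cone-widening and octree parts of real SVOGI): visibility is resolved by marching a ray through a coarse voxelized scene instead of intersecting triangle geometry.

```python
def march(voxels, origin, direction, max_t=32.0, step=0.5):
    """Return the first occupied voxel hit along the ray, or None.

    `voxels` is a set of integer (x, y, z) cells: the coarse scene
    representation that stands in for real geometry.
    """
    t = 0.0
    while t < max_t:
        cell = (
            int(origin[0] + direction[0] * t),
            int(origin[1] + direction[1] * t),
            int(origin[2] + direction[2] * t),
        )
        if cell in voxels:
            return cell
        t += step
    return None

scene = {(5, 0, 0)}  # a single occupied voxel
print(march(scene, (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))  # prints (5, 0, 0)
print(march(scene, (0.0, 0.0, 0.0), (0.0, 1.0, 0.0)))  # prints None (ray misses)
```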
 