
[Threat Interactive] Dynamic lighting was better nine years ago | A warning about 9th gen neglect.

proandrad

Member
AzwAKWi.gif


⬆️ Runs at native 864p 60fps, looking worse than that, on PS5 Pro.

⬇️ Runs at native 4K 60fps and native 1440p 120fps on the same hardware.

YufLyTD.gif


2dScimV.gif


aIXragn.gif


76lFFLx.gif


hKGvNSS.gif


Qljn9kn.gif


Well There It Is Jurassic Park GIF

The Office Thank You GIF

Unless your game looks twice as good as Uncharted 4, GTFO with your sub-1080p or 30fps bullshit. Ray tracing is a waste of resources, and a good art team doesn't need that crap to make a game look good.
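For a sense of scale, the resolution gap in that comparison works out as below. This is a rough sketch assuming 16:9 render targets, so "864p" is taken as 1536×864; the actual internal target may differ.

```python
# Pixel counts for the resolutions mentioned above (assumes 16:9).
def pixels(width, height):
    return width * height

p864 = pixels(1536, 864)     # "native 864p"
p1440 = pixels(2560, 1440)   # "native 1440p"
p4k = pixels(3840, 2160)     # "native 4K"

print(p4k / p864)              # 6.25 -- 4K is 6.25x the pixels of 864p
print(round(p1440 / p864, 2))  # 2.78
```

So the 4K mode is pushing over six times the pixels per frame at the same framerate.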
 

Buggy Loop

Member
ThatPolishGuy

You'll find this one interesting for TAA.



I'm not sure what I hate more, TAA smearing and ghosting or screen space reflection artifacts. Both high on the list of downgrades compared to some old games.

Vick fucking hell, Sony cooked last gen

There are so many intelligent solutions that appeared for raster to imitate indirect light, dynamic lights, etc. Hours and hours of SIGGRAPH videos from a wide variety of studios, plus research papers on the subject, and we kind of threw all of that under the bus so that artists don't have to place lights manually.

I pushed RT & path tracing a lot in the past years, but for every once-in-a-blue-moon game that is super well optimized, like 4A Games' Metro Exodus EE or Cyberpunk 2077 Overdrive, you get 20 other games that perform like total shit. So in the end, I'm really conflicted.

I hope to see one of Naughty Dog's tours de force by the end of the gen. I think they'll show how it can still be done with raster and meshes.
 

Buggy Loop

Member
Not DLSS/DLAA/TAA's fault for this one but wow







It's Lumen's temporal accumulation coupled with screen-space reconstruction and denoising

Switching to DLSS ray reconstruction can help

But cmon..

barf GIF


What the fuck are we even doing? What's the point of buying a 4K OLED TV or a 4090 GPU when it's a fucking PIXEL SOUP?
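For anyone wondering why temporal accumulation smears: it blends each new noisy sample into a per-pixel history, so stale data lingers. A minimal sketch of the generic idea (not Lumen's actual implementation; the `alpha` value here is illustrative):

```python
# Generic temporal accumulation: blend each new (noisy) sample into a
# running per-pixel history. A small alpha averages away noise but means
# stale data lingers -- the source of smearing/ghosting when the scene
# changes faster than the history decays.
def accumulate(history, sample, alpha=0.1):
    return (1.0 - alpha) * history + alpha * sample

# A pixel that flips from bright (1.0) to dark (0.0):
value = 1.0
for _ in range(10):
    value = accumulate(value, 0.0)
print(round(value, 3))  # 0.349 -- a third of the old value survives 10 frames
```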
 

ap_puff

Banned
Not DLSS/DLAA/TAA's fault for this one but wow

[…]

Can't blame PSSR for this one
 

Shodai

Member
As someone who has been in software development for 20 years, I appreciate this kid's zeal, but he comes off like a lot of my junior developers: talks a big game, but has no perspective as to why things are done the way they are.

Carmack used to take this tone, but he backed it up by churning out revolutionary tech. I'd like to see this guy put his money where his mouth is.
 
I hate the look of most games now, with fuzzy reflections and a noisy FSR image.

Also, modern Call of Duty games (except Cold War) seem to look like shit unless I research a ton of settings to change, which is something I've never had to do for any other game, and I've played PC games for decades.

032rr16f5rn31.png
Totally... I'm so disappointed with CoD campaign visuals, with MW2019 and Cold War being the last two that actually impressed. There's zero fucking excuse for MW2019 to be MUCH better looking than MW2/MW3/Vanguard/BO6, which all came out on a newer generation of hardware than the Xbox One X version I played back then!

Shameful stuff, imo... at least this is the situation on console. Now I'm playing BO6 on the PS5 Pro and it looks all shimmery and blurry compared to One X MW2019... how the fuck?

Alan Wake 2 on the PS5 Pro still has the worst image quality of any major game of the last 5 years. NOTHING has been fixed in terms of shimmer/aliasing, and now that it has RT reflections, the noise is really bad on certain surfaces (subway cars in Caldera Station).

Something has gone very wrong in terms of graphics in the industry. I used to think it was only consoles, but now I'm seeing many PC people with similar complaints. I've reached my limit of frustration after buying the PS5 Pro only to see that almost every game I was hoping would finally have its IQ/aliasing issues resolved on Pro is the same or worse than before! The only exceptions are FF7 Rebirth and Star Wars Jedi: Survivor (Rebirth at 60 fps and Survivor at 30 both look amazing).

Gamers are slowly waking up to this bullshit now but it's always an all too slow realization among the casual crowd.
 
I'm glad people are finally waking the fuck up about upscaling as a silver bullet to solve any problem. It can be effective, but it's not an end-all solution. Somehow we've ended up in the timeline where both image quality and responsiveness are getting worse and worse, and nobody dared to ask why we accepted that.
Gamers sadly will accept anything they get, especially console gamers. Look at the PS5 Pro situation. People who just dropped $700-$800 are defending and justifying the pathetic state that a handful of games are in. Why is "a handful" significant? Because these particular games account for most of the true current-gen titles that have received a patch. Each one of these games not only hasn't resolved the image quality woes it had on base console, but has added new problems that arise from PSSR and low internal resolutions. These are the games people should be focused on, NOT the handful of Sony cross-gen titles that have retained or improved good IQ! As always, these defenders have it backwards as they accept a console that is failing entirely to do what it was supposed to. Sony has been dead silent. Very little communication or acknowledgement from the developers of such games; again, business as usual, and these gamers want to tell us we're overreacting because "most games are great on the Pro"...

It should be an outrage but never is, and Sony and third-party devs know this by now, which is why Sony had the gumption to release a Pro console in the first place that can't run a game like AW2 or SH2 at a minimally acceptable internal resolution so as not to have these artifacts.

Not even the hardcore, Pro-purchasing, "discerning" gamer wants to accept the reality of the situation. Remedy has already patched AW2 4 or 5 times since the Pro released, with zero improvement whatsoever.
 

Vick

Gold Member
Vick fucking hell, Sony cooked last gen

There are so many intelligent solutions that appeared for raster to imitate indirect light, dynamic lights, etc. Hours and hours of SIGGRAPH videos from a wide variety of studios, plus research papers on the subject, and we kind of threw all of that under the bus so that artists don't have to place lights manually.

I pushed RT & path tracing a lot in the past years, but for every once-in-a-blue-moon game that is super well optimized, like 4A Games' Metro Exodus EE or Cyberpunk 2077 Overdrive, you get 20 other games that perform like total shit. So in the end, I'm really conflicted.
I too like RT. A lot, actually, and especially path tracing, which is like the ultimate dream and final destination of real-time rendering.
It is just not worth the cost most of the time, at least on consoles. Even though I find some RT to be almost necessary, like RT reflections, because SSR just sucks so bad (it sucks less when well implemented and used in conjunction with perfectly placed cubemaps):





And whatever the hell obscure solution ND is using for Ellie's reflection in that pool:

PsIATpE.gif


Or for all their reflective mirrors. But reflections still suck for the vast majority of teams that aren't the ND of the industry.

And RT shadows, because traditional shadow maps also often suck so bad, and the only alternative for truly realistic shadows is PCSS, which is also heavy and implemented in so few console games I can count them on one hand.



RT GI is completely useless in static time-of-day games, as you can simulate GI bounce perfectly (and do it much better than the RT + denoise possible on consoles and most PCs) and occlude the bounce in many ways, like the capsule AO ND uses perfectly. It's a different story with dynamic time of day; there it's maybe the most important feature, but it's also extremely taxing and has been convincingly simulated even in PS3 games like GTA V, so...

And with RT AO, I would never be able to tell the difference from a good HBAO+ without a side-by-side, and at times I would even visually prefer an artifact-free SSAO, which appeared in many console games from the most talented devs.

RT is the future; it's just simply sure as shit not worth it when it means I have to play at PS3-like internal resolutions and framerates.

I hope to see one of Naughty Dog's tours de force by the end of the gen. I think they'll show how it can still be done with raster and meshes.
Believe it or not, there are like a dozen users here on GAF in the "Graphical Fidelity I Expect This Gen" thread who believe points should be deducted from ND games for not using ray tracing.
On the same hardware where ND already successfully simulates every RT feature possible on that hardware, but at four times the resolution and twice the framerate actual RT would require, at least to achieve similar results without noise/artifacts.
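Treating "four times the resolution and twice the framerate" as pixels per second, the gap compounds multiplicatively (a simplification, since per-pixel cost isn't constant):

```python
# Pixels-per-second comparison implied above: rendering at 4x the
# resolution and 2x the framerate means pushing 8x the pixels each
# second (a simplification -- per-pixel shading cost is not constant).
resolution_factor = 4
framerate_factor = 2

throughput_factor = resolution_factor * framerate_factor
print(throughput_factor)  # 8
```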
 
It's times like these that I'm really glad that most image quality issues don't bother me in the slightest. I just don't need the crispiest of graphics. Is it nice? Sure. Does it bother me to see some shimmering or graininess? Nah. It's all good.

I also think raytracing is a gamechanger and the most transformative improvement we've seen in a long while. I'm okay accepting some growing pains for what should likely be the standard going forward.
 
AzwAKWi.gif

⬆️ Runs at native 864p 60fps, looking worse than that, on PS5 Pro.

⬇️ Runs at native 4K 60fps and native 1440p 120fps on the same hardware.

YufLyTD.gif

[…]

Sony's premier studio ND, using a proprietary, purpose-built engine for a flagship title with a far bigger budget and all the resources and technical expertise the platform holder can provide, vs. Remedy using UE.

I get the point you are trying to make, but how is this even a fair comparison? It's like thinking a privateer can make an off-the-shelf GT4 car as fast as a prototype with full factory backing.
 

Buggy Loop

Member
Sony's premier studio ND, using a proprietary, purpose-built engine for a flagship title with a far bigger budget and all the resources and technical expertise the platform holder can provide, vs. Remedy using UE.

I get the point you are trying to make, but how is this even a fair comparison? It's like thinking a privateer can make an off-the-shelf GT4 car as fast as a prototype with full factory backing.

Isn't Epic's Fortnite money, billions of it, trickling down into the UE5 engine somehow?

Naughty Dog did it with a Jaguar CPU, under 2 TFLOPs of old AMD tech on a 2012 architecture, and 8GB of shared RAM. No AI, no RT cores.

With UE, we can barely go above, if not even under, what they did once you factor in what happened to image quality to get there, and that's with the best modern CPUs on chiplet designs with 3D V-Cache, 32GB of RAM, and 83 TFLOPs of massive 4090 silicon chock-full of RT cores and tensor cores, on a vastly superior architecture compared to GCN days.
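A back-of-envelope check on that gap, using ~1.84 TFLOPS for the base PS4 GPU and ~82.6 TFLOPS peak FP32 for the RTX 4090. This compares raw FP32 shader throughput only and ignores RT/tensor cores, bandwidth, and architectural differences:

```python
# Raw FP32 shader-compute gap between a base PS4 and an RTX 4090.
ps4_tflops = 1.84       # base PS4 GPU (GCN, 2013)
rtx_4090_tflops = 82.6  # RTX 4090 peak FP32

print(round(rtx_4090_tflops / ps4_tflops))  # ~45x the raw compute
```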

So it was never about hardware, nor an engine's "revolutionary" Nanite or Lumen global illumination, tools to help developers, or a vast database of help and support.

Just raw talent and Sony money to pull off a miracle

I Dont Care Whatever GIF


Well that was a fucking waste of a gen, ain't it.

We effectively plateaued even though we have expensive hardware. We fucked up image quality in general, especially on consoles. We can barely get 60 fps, and only so that big studios can save money by going UE5 and then sending work to cheap-labour countries because it's universally « the » engine now? A bunch of junior devs so studios save costs, no engine R&D, fewer artists placing lights for raster tech, but at the same time budgets ballooned so high that studios are closing? What even is that nonsense? AA studios can go take assets from a store and pump them into a game easily, not optimizing for shit. Wow, that asset-flip game can look good for sure, but is anyone impressed? Did anyone expect the gen to unfold like that? I certainly didn't. The more time passes, the more I think this was a huge mistake.
 

ZehDon

Member
Sony's premier studio ND, using a proprietary, purpose-built engine for a flagship title with a far bigger budget... vs. Remedy using UE...
Slight correction: Remedy used Northlight for Alan Wake 2. It's their own, purpose built engine.

While I certainly don't disagree with the video, I think it's worth considering why this kind of change occurred. Earlier 3D titles, like Quake 2, actually had bounce lighting, albeit quite primitive in its calculations. This type of calculation wasn't feasible in real time, so they used pre-computed lightmaps and were able to use fledgling GPUs to merge the pre-baked lightmap textures with real-time lighting passes (things like muzzle flashes and rocket trails) to create the illusion of interactive light. As computers got faster and GPUs stepped up, we made it to fully real-time lighting; however, the sheer performance cost of this type of approach was often prohibitive relative to what was still achievable with pre-baked lighting. As scenes, materials, and lighting got more and more complicated, the baking requirements in terms of time and hardware grew and grew. Moving to real-time lighting meant taking a hit in the visual arms race that developers engage in, so pre-baked techniques persisted. Oftentimes, games would need to contain half a dozen pre-calculated lighting passes for their entire environments and materials to create various lighting transitions, all to fake large-scale, high-fidelity real-time lighting. This baking process could take literally days to actually render, leaving artists and designers with no real way to see their work in progress. This slowed down production as developers pushed these techniques to their limits. Slower production costs more money.
By the time we got to CryEngine, where the editor rendered everything in real time, production was able to move much faster. More iterations, higher quality, less time. This pushed a lot of engines in this direction, because it allows developers and artists to see their output immediately, as opposed to waiting potentially days only to find you need another pass to get everything looking as you wanted in all scenarios. Faster production costs less money.
What we see with titles like TLOU2 isn't sophisticated real-time lighting; it's just what can be done with the hybrid techniques in 2020 using a $250m budget and literally thousands of artists to pull off the illusion. Even Naughty Dog's smooth animations are actually pre-baked, using a technique called motion matching to merge together thousands of pre-canned animations. The sheer amount of manpower necessary to produce stuff at that scale is prohibitive for everyone except those with the biggest budgets in the history of the medium. Their artists usually create locally rendered versions of their work at significantly lower fidelity before merging it into the nightly builds. This gives them at least some ability to check their work, but the process can still take several days to final something that would take only a couple of hours if everything were real-time. The trade-off is that their games look that much better than everyone else's, because the machines are calculating that much less every frame. For a first party looking to deliver system-selling visuals, that's money well spent for Sony.
In order for real-time to keep up with pre-baked techniques, something has to give, and in this case it's usually image quality and performance. Real-time will look just as good as pre-baked when rendered at native 4K on a bleeding-edge dual-GPU PC, but when upscaling from 600p using FSR2 on an Xbox Series S, it runs and looks like vomit soup. The trade-off is made so that developers can work faster and easier, and change jobs on demand thanks to standardised technology like Unreal Engine, all of which helps games cost less money to make. For example, the biggest selling point of Unreal Engine 5's Nanite wasn't the lack of pop-in for gamers; it was that developers no longer needed to massage LOD systems or have artists hand-craft LODs. Running your game at 30FPS instead of 60FPS, but needing one year less of development, is an easy trade-off to make in this industry.
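The hybrid approach described above can be sketched in a few lines. This is purely illustrative (the names and values are invented, not any engine's actual API): expensive bounce lighting is computed once offline into a lightmap, and only cheap dynamic terms are added per frame.

```python
# Hybrid lighting sketch: pre-baked lightmap sample + cheap per-frame
# dynamic lights (muzzle flashes, etc.). The global-illumination cost
# was paid once, offline, during the bake.
def shade(albedo, baked_light, dynamic_lights):
    total_light = baked_light + sum(dynamic_lights)
    return [round(channel * total_light, 4) for channel in albedo]

# A reddish surface with a mid-brightness bake and one small dynamic light:
print(shade([0.8, 0.2, 0.2], 0.5, [0.3]))  # [0.64, 0.16, 0.16]
```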
 

Rubim

Member
Isn't Epic's Fortnite money, billions of it, trickling down into the UE5 engine somehow?

[…]

Just raw talent.
I think it's more about time.
A UE5 solution will work mostly out of the box.

 

mrqs

Member
The Office Thank You GIF

Unless your game looks twice as good as Uncharted 4, GTFO with your sub-1080p or 30fps bullshit. Ray tracing is a waste of resources, and a good art team doesn't need that crap to make a game look good.

Well, one of those games is from a massive first-party studio with the devs, time, and money to tweak fake lights so it looks 'real'.

The other is an independent game with probably 1/4 of the budget.

Real-time lighting makes game dev faster and allows indie devs to make the best-looking game possible without needing to fake every single corner so the illusion doesn't break.

We're never going back to pre-baked lighting.
 

panda-zebra

Member
Never seen this person's content before and have no idea who he is, but I have to say it was difficult to tell at first if this was a parody video or not. Such anger and edge right from the off; this must be the result of a frustrating, months-long back-and-forth. Coupled with that hair, it made a hell of a first impression.

I told Epic Games during the preview phase of 5.5...

monday night raw whatever GIF by WWE


When a pre-teen finds their voice, sees so clearly in their mind how to right *all* the world's wrongs, doesn't have a second for anyone else's opinion because it's all so obviously flawed... same energy.
 

ZehDon

Member
Never seen this person's content before and have no idea who he is, but have to say it was difficult to tell if this was a parody video or not at first.

[…]
To be fair, the industry has a history of arrogant early-twenty-somethings who know their shit. I'm not saying that's the case here, but his understanding of Nanite's overdraw problem shows he's not blowing smoke. However, some of the info in this video highlights a lack of understanding of the realities of producing a AAA game at scale.

It's all well and good to say "why aren't we using the offline techniques that made great-looking games on the Xbox 360 so performant?" until you understand that doing that type of work for modern film-quality models, targeting 4K displays, using multi-layered PBR materials would require a workforce of hundreds dedicated to those techniques. For example, the directional lighting and shadowing system used in Crysis 2, which created both highlights and lowlights to simulate bounce lighting on the Xbox 360, was programmatically crafted but then hand-tweaked to create its visuals. Doing that for a linear FPS campaign on the Xbox 360 at sub-720p still took their skilled environment artists months and months. Doing that kind of work for a game the size of STALKER 2, with dynamic night and day, would require a massive, dedicated team doing nothing but that work for years. Or I can use Unreal Engine 5 and Lumen and skip it, save for a polish pass on light leaks.
 

Doczu

Member
Thief: Deadly Shadows on UE3 also had very nice lighting and shadows. Although it didn't have soft shadows, it impressed me back in the day.



Same with Doom 3, I think. I always wondered where we'd end up in the future. And yeah... blurry image upscaling tech.

Dude, that game was running on modified UE2, sometimes called 2.5.
Same as Splinter Cell: CT. That game had, IMO, the best lighting when it released.
 

Bernardougf

Member
I already said this... I would be pretty happy playing games with Uncharted 4's IQ and fluidity until the hardware could produce something better while keeping the performance.

We've entered the gens of progressively diminishing returns. Forcing minimal upgrades that are barely noticeable during gameplay but tank performance for the sake of photo mode is the biggest error of some of the games this gen.

Do a photo-mode graphical preset at 15 fps and let people take pictures and jerk off to puddle reflections and muh ray tracing... and keep your game at a locked 60+ fps on the maximum number of hardware configurations you can, including consoles.

Abandoning perfectly successful past techniques that produced great results to brute-force new techniques that produce meh results and/or bad performance on 90% of the hardware available is the joke of this gen.
 

john2gr

Member
Slight correction: Remedy used Northlight for Alan Wake 2. It's their own, purpose built engine.

[…]

QFT. This explains what's been happening in the gaming space with pre-baked and real-time lighting. More people should read this.
 

Clear

CliffyB's Cock Holster
Interesting video, but in my opinion it's pretty naive.

Here's the thing: if you are Epic, when architecting your engine you need to choose a certain path forward, because you can't advance every technique and innovation simultaneously. Which elements to focus on is going to be based on a vision for where the industry is headed, drawn from consumer, client, and technological trends. The goal is to play the "long game" so as to avoid radical re-architecting of core functions and pipelines down the line.

What we're seeing with UE5 is a bunch of choices that will have been made in respect of what they think the overall direction of travel is going to be over the next decade.

The alternative approaches the guy in the video is talking about might be more efficient and effective right now, but the million-dollar question is: will they still be significantly advantageous in the future, as AI becomes more ubiquitous and powerful across all types of hardware?

Even the most brilliant of ideas can become obsolete if their use-case shrinks substantially over the time required to implement, deploy, and demonstrate them.
 

Lootlord

Member
You mean the scripted fake 'gameplay' demo or the released downgrade?


I finished the campaign this year for the first time; I only played the demo back then. I replayed it on my 4090 with DLDSR for AA, and it still looks amazing at times. Then I played the second one, and while it improves on density, it kind of lacks the great depressing winter atmosphere of the original.
 
This is the first gen where I prefer to play last-gen games. Why would I want to play a game running at 720p when I can play something like The Last of Us 2 at 1440p? The worst part is that the game running at 720p looks maybe 10% better at a technical level. 1080p should be the minimum internal res. 40fps should also be used more, as something like Silent Hill 2 might be able to run at 1080p/40. Same goes for Alan Wake 2, etc.
 
This is the first gen where I prefer to play last-gen games. […]
This is the crux of the matter. Outside of R&C and Demon's Souls, no next-gen-only game looks any better than last-gen games on console. I do not care about accurate reflections or indirect/bounce lighting. Just give me an image that does not look like a soupy, dithered mess.
 
I like this guy's channel. If he maintains the same energy he'll bulldoze his way into the industry.
His CV would be binned. The guy is a grifter; most of the stuff in his videos is just plain wrong. He is a one-man studio trying to make a name for himself by bluffing normies. I doubt he has much done besides a few Blueprints and messing about with settings. The guy has a whole video on Nanite overdraw and culling which gets everything about how Nanite works 100% wrong.
 

CrustyBritches

Gold Member
His CV would be binned. Guy is a grifter, most of the stuff in the videos is just plain wrong. […]
Can I have the link to your channel for comparison?
 

Rivdoric

Member
TAA's level of shitness has pushed my GPU requirements sky high. A clusterfuck level of craziness.

I'm playing on a 4K 42" screen, and for every game that uses TAA/DLSS I need to use DLDSR at 2880 or 3240 to compensate for the blurriness it introduces.
That makes a lot of modern games extremely difficult to play even on my 4090, because 2880/3240 + 100 fps is very hard to achieve.

The motion sickness is real.

When I launch an old game with good ol' MSAA, I feel like my eyes are working correctly again.
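For reference, the DLDSR factors those resolutions correspond to on a 2160p panel, assuming 16:9 (so 2880p = 5120×2880 and 3240p = 5760×3240, which match Nvidia's advertised 1.78× and 2.25× DLDSR modes):

```python
# DLDSR render-resolution factors relative to a native 4K (3840x2160) panel.
native_w, native_h = 3840, 2160

for w, h in [(5120, 2880), (5760, 3240)]:
    factor = (w * h) / (native_w * native_h)
    print(round(factor, 2))  # prints 1.78, then 2.25
```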
 

Buggy Loop

Member
His CV would be binned. Guy is a grifter, most of the stuff in the videos is just plain wrong. He is a one man studio trying to make a name for himself by bluffing normies. I doubt he has much done besides a few blue prints and messing about with settings. Guy has a whole video on Nanite overdraw and culling, which, gets everything wrong 100% about how Nanite works.

Hmm? What did he say about Nanite that is 100% wrong?

There are tons of posts on UE forums about Nanite overdrawing too much and so on. It's well documented that when a game is not brimming with geometry, you're throwing performance under the bus. Hell, I think everyone sees it themselves. Remnant 2 without Lumen still performed like absolute dogshit.

There's an overhead cost to Nanite obviously, but it gets super efficient as you get to high complexity / high poly counts. Not a lot of games have really gone there so far, though. It's expensive to make super detailed geometry unless you asset-flip from a store.
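If you'd rather check the overdraw claims on your own project than take anyone's word for it, Nanite ships visualization modes in the editor. A console sketch (the cvar and mode names below are from Epic's Nanite docs as I remember them, so verify against your engine version):

```
r.Nanite.Visualize Overdraw
```

That draws an overdraw heat map in the viewport; clear the cvar to return to normal rendering. Keep in mind the caveat discussed in this thread: the editor visualization is a diagnostic, not a direct measure of shipped-game cost.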

Epic of course recommends everyone go full Nanite, since virtual shadow maps perform well on it, but yeah, those shadows? I spot their artifacts all the time. Those Stalker videos above are full of them. Flickering mess.

Those virtual shadow maps that were basically presented in their demo as the solution for changing time of day with accurate shadows? Well damn, look at Epic's own 5.4 ongoing bugs:

If you would like to be able to have dynamic time of day changes using a directional light such as the sun, be aware that this can incur a heavy performance cost on the VSM. We have introduced some new console variables in 5.4, which you can use to split cache updates into static and dynamic caches. This should hopefully reduce the overhead on your VSM caches.
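For reference, the cache split Epic mentions there is exposed through console variables. A hypothetical Engine.ini sketch (cvar names per UE documentation; they may differ between 5.x versions, so double-check your build):

```ini
; Sketch only: verify these cvars exist in your engine version.
[SystemSettings]
r.Shadow.Virtual.Enable=1                ; virtual shadow maps on
r.Shadow.Virtual.Cache=1                 ; page caching, which a moving sun keeps invalidating
r.Shadow.Virtual.Cache.StaticSeparate=1  ; keep static geometry in its own cache so only the dynamic cache churns
```

The idea is that a slowly moving directional light only invalidates the dynamic cache, instead of forcing every cached page to re-render each frame.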

the naked gun facepalm GIF


I know this is all the future, don't get me wrong, but it came in way too hot. This is an engine for PS6, and by then they'll be pushing some other nonsense heavy feature, I bet.
 
So besides using the editor view to "confirm" his suspicions about overdraw (which itself is misleading, since in practice it won't be as bad as the editor visualization shows, given how Nanite actually handles it), he talks about optimizing models, LODs, etc. for performance while specifically skipping the fact that it's recommended to optimize your meshes and placements for Nanite too, especially arbitrary placements and high-frequency intersections, or intersections that a single pixel covers. The dude is raising funds and throwing shade at UE to boost his subscribers, who kinda-sorta think UE5 performs worse and latch onto THIS guy's reasoning for some reason?

Anyway, leave it to the experts to explain what you should do

 

Branded

Member
So that video was very enlightening (pardon the pun) and I posted it on my own discord, which is evidently still full of TDS and EDS sufferers, as I got these replies:

jxFbjUk.png
hFizi2G.png
RckPTDG.png


Jerry Seinfeld Reaction GIF
 

nkarafo

Member
I was playing Dirt Rally 2 and wondering why the game looked so bad. Not only did the cars look blurry as hell, the distant foliage and trees looked like plastic oil-painting blobs. I thought there was no way the game could look this bad at the highest settings... until I disabled TAA, and now it looks great.

Same with Death Stranding. The whole game looks blurry. Disabling TAA made it look like a completely new game. Even FXAA is sharp in comparison.

TAA and post-processing are ruining games, and I'm glad I'm on PC so I can disable most of this crap.
 

Buggy Loop

Member
So besides using the editor view to "confirm" his suspicions about overdraw (which itself is misleading, since in practice it won't be as bad as the editor visualization shows, given how Nanite actually handles it), he talks about optimizing models, LODs, etc. for performance while specifically skipping the fact that it's recommended to optimize your meshes and placements for Nanite too, especially arbitrary placements and high-frequency intersections, or intersections that a single pixel covers. The dude is raising funds and throwing shade at UE to boost his subscribers, who kinda-sorta think UE5 performs worse and latch onto THIS guy's reasoning for some reason?

Anyway, leave it to the experts to explain what you should do



Latch onto the guy? No

We see the performance ourselves.

Jurassic Park Poop GIF by Vidiots


It’s shit
 

winjer

Gold Member
The problem is not just TAA, though that is a major part of it.
Things like motion blur, chromatic aberration and film grain also contribute to reduced image quality.
 

Buggy Loop

Member
I was playing Dirt Rally 2 and wondering why the game looked so bad. Not only did the cars look blurry as hell, the distant foliage and trees looked like plastic oil-painting blobs. I thought there was no way the game could look this bad at the highest settings... until I disabled TAA, and now it looks great.

Same with Death Stranding. The whole game looks blurry. Disabling TAA made it look like a completely new game. Even FXAA is sharp in comparison.

TAA and post-processing are ruining games, and I'm glad I'm on PC so I can disable most of this crap.

That was still possible in UE4 games.

UE5 forces a temporal solution.

You can mod it out, but it'll look like a dithered mess, because they cheated almost every effect down to half res so that the temporal pass smears it back into shape.

Stalker 2 without TAA has those comical trees


Laughable tech. Rendering almost everything at half res so a temporal pass can smear it into a result is probably some nerd's brilliant idea, but then where's the performance gain? All these shortcuts, and this is how it looks and performs?
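For the modding route mentioned above, the usual approach is Engine.ini cvar overrides. A sketch (these cvars exist in stock UE4/UE5, but individual titles can clamp, ignore, or rename them, so treat this as a starting point):

```ini
; Engine.ini under the game's Saved/Config directory. Sketch only;
; per-game behaviour varies, and disabling TAA exposes the half-res dithering discussed above.
[SystemSettings]
r.PostProcessAAQuality=0   ; UE4: disable post-process AA (TAA/FXAA)
r.AntiAliasingMethod=0     ; UE5: 0 = none, 1 = FXAA, 2 = TAA, 4 = TSR
r.TemporalAA.Upsampling=0  ; stop temporal upsampling from reintroducing the smear
```

That trade-off is exactly the Stalker 2 case: no TAA means no smearing, but effects authored for temporal accumulation fall apart.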
 

Wildebeest

Member
Developers are unable to look at their games with fresh eyes, which is why they think all this blurry, smeary, color-warping "tech" just magically makes their art assets look better, "for almost no cost".
 

Buggy Loop

Member
This tech is not for improving performance, it's to hide incompetence, bloated code and lack of optimizations.

And so we come to what I also said in the graphics thread:

UE5 is fine as a tool; devs are making the decisions to use X or Y, be it Lumen or Nanite, etc. But Epic is selling an ecosystem, not just an engine. The ease of use: almost all devs have toyed with it by now, so you can hire someone and slot them into your project easily, or offload labour to a cheaper country where they already know the engine.

Brilliant right? Cost savings. Studios are saved, accountants are happy!

What happens then is you've fired your in-house engine nerds and pushed away your experienced workforce in favour of juniors, and the result is a bunch of headless chickens running around applying plugins and assets from the store without any optimization. That's the state we're in now. A shitshow.

So yeah, UE5 as an engine is fine, it’s just a tool. The ecosystem though is fucking things up royally.

AND with those cost savings they still manage to miss financial expectations :messenger_tears_of_joy: This industry is more fucked than ever.
 