
[Threat Interactive] Dynamic lighting was better nine years ago | A warning about 9th gen neglect.

CRT remains king

I still own 5 CRTs (mostly mid-sized all connected to retro systems upstairs, also one that is larger late model though).
I agree with that. My Panasonic GT60 plasma has good image quality overall, but CRT motion clarity is unmatched (my plasma has 4ms of persistence blur, while a CRT has about 1ms). And although my plasma is a lot brighter than other plasma TVs, it's still not as bright as my CRT TV from 2001.

I can watch DVD films or play PS2 games on my CRT and still be impressed with the picture quality. If I play the same SD content on my 4K TV, 1440p monitor, or even my 1080p plasma, however, the quality isn't nearly as good. I wish they still made CRTs, because eventually all CRT TVs will die.
 
Last edited:

Vick

Member
Kuro for life, son.
Actually prefer the very latest Panasonic to my KRP-500M. Pretty similar overall, but even though the tweaked/reset Kuro reaches pure black, motion resolution is noticeably better on the Pana.

I find 60hz signals on a Panasonic impossible to describe. Main reason why it's 60fps or bust for me now.
 

Stooky

Banned
So Metro Exodus, a game that uses dynamic GI, is now considered ugly :messenger_grinning_smiling:, while games like Uncharted 4 are so beautiful to look at :messenger_beaming::messenger_savoring: despite using old and flat-looking raster lighting that makes character models look like this:

u4-2024-12-05-14-58-29-725.jpg


u4-2024-12-05-14-55-55-231.jpg


I feel obliged to show the ugly lighting in Metro Exodus. I will show screenshots from the original version (running at 4K native with TAA), not the Enhanced Edition, because this "improved" version has washed-out blacks and generally looks much worse IMO.


Metro-Exodus-2024-12-08-07-13-14-541.jpg


Metro-Exodus-2024-12-08-07-13-37-492.jpg


Metro-Exodus-2024-12-08-07-14-08-225.jpg


Metro-Exodus-2024-12-08-07-14-18-906.jpg


Metro-Exodus-2024-12-08-05-26-03-740.jpg


RT GI vs raster

Metro-Exodus-2024-12-08-07-37-45-541.jpg


Metro-Exodus-2024-12-08-07-37-16-409.jpg


Metro-Exodus-2024-12-08-08-36-43-492.jpg


Metro-Exodus-2024-12-08-08-36-53-914.jpg


Metro-Exodus-2024-12-08-08-42-18-238.jpg


Metro-Exodus-2024-12-08-08-42-31-250.jpg



Metro-Exodus-2024-12-08-08-44-29-030.jpg


Metro-Exodus-2024-12-08-08-44-41-667.jpg


Thanks to RT GI, objects and characters in Metro Exodus are well grounded in the scene. Maybe for some people these are small differences, but I studied the rules of light and use that knowledge in my daily work, so I'm absolutely blown away that we are finally seeing such realism in games.

As for Uncharted 4's graphics, since a certain individual believes the PC version looks much worse than the PS4 version, I decided to replay the PS4 version after all these years and see if I can really notice these huge downgrades.

20241208-025713.jpg


PS4Pro

Uncharted-4-Kres-z-odzieja-20241207165928.jpg


PC

u4-2024-12-07-19-17-31-723.jpg


PS4Pro

Uncharted-4-Kres-z-odzieja-20241207031422.jpg


PC

u4-2024-12-07-20-16-49-753.jpg


a2.gif


PS4Pro

Uncharted-4-Kres-z-odzieja-20241205175618.jpg


PC

u4-2024-12-05-14-26-17-173.jpg


PS4Pro

Uncharted-4-Kres-z-odzieja-20241205180725.jpg


PC

u4-2024-12-05-15-26-48-957.jpg


PS4Pro

Uncharted-4-Kres-z-odzieja-20241208032318.jpg


PC

u4-2024-12-07-21-19-10-436.jpg


PS4Pro

Uncharted-4-Kres-z-odzieja-20241208032534.jpg


PC

u4-2024-12-07-21-20-13-261.jpg


Guys, please feel free to decide for yourself which version looks better. I'm not going to say what I think, because a certain individual might get triggered :messenger_winking_tongue:.

Uncharted-4-Kres-z-odzieja-20241206014639.jpg


Uncharted-4-Kres-z-odzieja-20241206014842.jpg


I can only say that it's VERY EASY to find similar "cherry-picked" locations in both the PC and PS4 Pro versions. If someone has knowledge of lighting and can see the light (or should I say the lack of it) in a scene, similar spots in this game can be found every couple of seconds, at least during gameplay, because in the cutscenes ND's lighting artists used many tricks to make the lighting look more realistic.
The dynamic light kit for gameplay is based on probes from the baked lighting. It will never be super accurate, but it gives you better resolution and frame rate. For cinematics we used a cinematic light kit, used like traditional lighting in CG movies, but limited in how many shadow-casting lights you can use, etc. All this for the best resolution and steady frame rate combo on a PlayStation console.
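The probe-based gameplay lighting described above can be sketched as a toy in Python. This is purely illustrative (the function names are mine, not Naughty Dog's): it blends baked irradiance values from nearby probes by inverse distance, whereas real engines store spherical-harmonic probes and do trilinear or tetrahedral lookups.

```python
# Toy sketch of probe-based lighting: baked irradiance lives at fixed probe
# positions, and a dynamic object samples nearby probes, blending by
# inverse squared distance. Names and data are illustrative only.

def sample_probes(pos, probes):
    """probes: list of ((x, y, z), irradiance) pairs.
    Returns an inverse-distance-weighted blend of the baked values."""
    weights, total = [], 0.0
    for p_pos, irr in probes:
        d2 = sum((a - b) ** 2 for a, b in zip(pos, p_pos))
        if d2 == 0.0:
            return irr  # standing exactly on a probe
        w = 1.0 / d2
        weights.append((w, irr))
        total += w
    return sum(w * irr for w, irr in weights) / total

probes = [((0, 0, 0), 0.2), ((4, 0, 0), 1.0)]
# Midway between a dark probe and a bright probe -> the average of the two
print(sample_probes((2, 0, 0), probes))  # 0.6
```

The point of the trick is that all the expensive light transport was computed offline into the probes; at runtime, sampling them is just a cheap lookup and blend, which is why it helps frame rate even though it can never be perfectly accurate for moving objects.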
 

bender

What time is it?
Actually prefer the very latest Panasonic to my KRP-500M. Pretty similar overall, but even though the tweaked/reset Kuro reaches pure black, motion resolution is noticeably better on the Pana.

I find 60hz signals on a Panasonic impossible to describe. Main reason why it's 60fps or bust for me now.

Once I had my Kuro, I really stopped keeping up with the latest and greatest. That set lasted me over twelve years. It still functions, but one of the control boards is going out and it makes a louder noise than the usual plasma buzz (even at elevation).
 

Stooky

Banned
For the tree texture debacle, if I remember right: for the remaster, if the hi-res texture art was not available, the texture was uprezzed from the PS3 version. So most likely what you are seeing is an uprezzed texture and depth map. For the PS3 version, that tree bark texture was probably hi-res (for the PS3 era) because the player camera would be close to it.
 

Vick

Member
Once I had my Kuro, I really stopped keeping up with the latest and greatest. That set lasted me over twelve years.
Same.

Actually, I have to keep up simply because sooner or later I'll need a new panel for the home theater once my plasma supply is exhausted. But at the moment I leave the new shiny tech for the living room and family, as they'll never be bothered by what bothers me about new panels.

Sorry about the control board issue, at this point when problems occur you're on your own. That said, you can still find Kuro community support of all kinds on the internet.

For the PS3 version, that tree bark texture was probably hi-res (for the PS3 era) because the player camera would be close to it.
Not to open the disgraceful ordeal again, but no.
That tree is placed at an angle, with the texture not facing the main playable area. The player is never supposed to even see it, and there are countless similar/higher-resolution textures in the game.
 

Stooky

Banned
Same.

Actually, I have to keep up simply because sooner or later I'll need a new panel for the home theater once my plasma supply is exhausted. But at the moment I leave the new shiny tech for the living room and family, as they'll never be bothered by what bothers me about new panels.

Sorry about the control board issue, at this point when problems occur you're on your own. That said, you can still find Kuro community support of all kinds on the internet.


Not to open the disgraceful ordeal again, but no.
That tree is placed at an angle, with the texture not facing the main playable area. The player is never supposed to even see it, and there are countless similar/higher-resolution textures in the game.
Bro, I worked on the game and was there when we were sending assets for the remaster. I can get the definitive answer from the artist that worked on the asset, but I kind of know what they are going to say.
 

Vick

Member
bro i worked on the game, and was there when we were sending assets for the remaster.
That is awesome.

But still, "That tree is placed at an angle, with the texture not facing the main playable area. The player is never supposed to even see it, and there are countless similar/higher-resolution textures in the game."
 

Stooky

Banned
That is awesome.

But still, "That tree is placed at an angle, with the texture not facing the main playable area. The player is never supposed to even see it, and there are countless similar/higher-resolution textures in the game."
Yeah, I'm saying it depends; the tree texture could be high-res. I know we were using detail maps that would be dialed in when the camera was close, which helped some textures look higher res. Here's something else: Blu-ray storage size was super limited. I think we used only a single-layer Blu-ray; at that time I don't think it was 50GB. We rendered all the cutscenes, which took up a lot of space, and because the game was 3D we had to have another set of all the movies, plus all the language support. That takes a lot of storage space. So I'm saying all this to say: using a high-res texture on a tree that no one is looking at would not be the best use of limited RAM and storage space. But a detail map would make that texture pop and appear higher res than it is, could be streamed in from disc, and could be done in a shader at lower cost. Also, there were badass artists on that level; they could squeeze water from rocks.
 
Yeah, I'm saying it depends; the tree texture could be high-res. I know we were using detail maps that would be dialed in when the camera was close, which helped some textures look higher res. Here's something else: Blu-ray storage size was super limited. I think we used only a single-layer Blu-ray; at that time I don't think it was 50GB. We rendered all the cutscenes, which took up a lot of space, and because the game was 3D we had to have another set of all the movies, plus all the language support. That takes a lot of storage space. So I'm saying all this to say: using a high-res texture on a tree that no one is looking at would not be the best use of limited RAM and storage space. But a detail map would make that texture pop and appear higher res than it is, could be streamed in from disc, and could be done in a shader at lower cost. Also, there were badass artists on that level; they could squeeze water from rocks.
That's exactly what I see on my TV. The detail map is clearly there and it creates the illusion of tree detail, but that's not the actual tree texture. In the PS3 version, the underlying tree texture is blurry, whereas in the remaster the same texture looks sharp and you can easily see fine detail. PS4 texture could have been up-resampled from the original, but the artist made sure it looked much better.

It was a clever idea to use detail maps on this tree, because if people can be fooled into thinking the tree texture is high-res, that's clever programming. I always liked tricks like this that help overcome the limitations of the hardware and create the illusion of detail.
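For the curious, the detail-map trick discussed here can be sketched as a toy "shader" in Python. This is an illustrative sketch only; the function names, the overlay-style blend, and the fade distances are my assumptions, not the game's actual implementation.

```python
# Toy sketch of a detail map: a low-res base texture is modulated by a tiled
# high-frequency detail texture, with the detail faded out as the camera
# moves away. All names and constants here are illustrative.

def detail_weight(cam_dist, fade_start=2.0, fade_end=10.0):
    """Full detail up close, none beyond fade_end (linear falloff)."""
    if cam_dist <= fade_start:
        return 1.0
    if cam_dist >= fade_end:
        return 0.0
    return 1.0 - (cam_dist - fade_start) / (fade_end - fade_start)

def shade_texel(base, detail, cam_dist):
    """Overlay-style modulation: detail values are centered at neutral 0.5."""
    w = detail_weight(cam_dist)
    # lerp the detail toward neutral grey as distance grows, then modulate
    d = 0.5 + (detail - 0.5) * w
    return max(0.0, min(1.0, base * d * 2.0))

# Up close the detail texture visibly perturbs the base colour...
print(shade_texel(0.5, 0.7, cam_dist=1.0))   # 0.7
# ...while far away the base texture comes back unchanged.
print(shade_texel(0.5, 0.7, cam_dist=20.0))  # 0.5
```

Because the detail texture is small and tiled, it costs very little memory and disc space, which is exactly why it was a good fit for the RAM- and storage-constrained scenario described above.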
 

Kataploom

Gold Member
Disagree. From a rasterized visual standpoint, sure, AC Unity is at the top. But fully ray-traced games like Cyberpunk and Dragon's Dogma 2 have surpassed these games handily. RT is the real deal. Devs need to abandon rasterized lighting, or else choose static baked lighting like AC Unity did.
This so much.
Devs have to stop using RT on top of rasterized lighting like a gimmick. Either go full RT (if not PT) or stay with rasterized baked lighting; pick one and do it well. Trying to appeal to every single computer out there ends up mediocre at best and badly optimized on top.
 

MAX PAYMENT

Member
Games that used MSAA or SMAA always had a razor-sharp image, but those AA techniques do not work well in modern games, so whether people like it or not, temporal AA methods are here to stay. TAA looked pretty awful in old games, but the technology has improved a lot over the years. In RE2 Remake, for example, native TAA looked like an upscaled image to my eyes, so I played it with a 150% resolution scale, but in RE Village or RE4 Remake the image is really sharp even without increasing the resolution scale.


re8-2024-12-02-03-22-32-825.jpg


re8-2024-12-02-03-26-00-910.jpg


re8-2024-12-02-03-21-50-995.jpg


re8-2024-12-02-03-28-33-691.jpg


Or RE4 Remake

4-K-TAA-native-lens-filters.jpg


Even if the TAA looks blurry in certain games, I can always use a ReShade sharpening filter and increase the resolution scale to get rid of the TAA blur. Many games also support DLSS, and because Nvidia allows using DLSS and DLDSR simultaneously, I can get perfect image quality at no performance cost.

DLSS image quality has also improved over the years. In games like RDR2, the DLSS image looked like a blurry mess, but the latest games using this technology look razor sharp.

DLSSQuality

4.jpg



Horizon-DLSSQ.jpg


IMO even DLSS Performance looks very good in this game, and I wouldn't mind playing like that if my PC couldn't run higher resolutions.

Horizon-DLSSP.jpg


In Black Myth: Wukong, I was not happy with the DLSS image quality because that game uses excessive sharpening, but I disabled it with mods and used my own sharpening settings, so now even DLSS Performance looks very good to my eyes.

b1-Win64-Shipping-2024-09-01-00-25-53-709.jpg


b1-Win64-Shipping-2024-09-01-00-07-05-687.jpg


b1-Win64-Shipping-2024-09-01-00-06-20-759.jpg


If all console games had similar image quality, not many people would be complaining. For comparison, this is what the PS5 version looks like.


25d10d16247e97b0712c.jpg


200c0f408188e0bb6aeb.jpg


As for ray tracing, games are obviously more demanding with RT features enabled, but RT is also very scalable. Even at the lowest settings, RT still looks much better than raster without destroying performance (at least on RTX 40-series cards).

Raster / no RT

raster.jpg


RT with minimum settings (rt reflections and shadows).

RT-reflections-shadows.jpg


I'm happy developers have started using RT. Screen-space reflections never looked good to my eyes and were ruining my immersion (they fade as you move the camera, and that's very distracting to me). RT GI also makes a huge difference, especially in sandbox games. Without RT, the lighting in Cyberpunk or The Witcher 3 looks flat.
Would you mind elaborating a bit further on the pros of running DLDSR in conjunction with DLSS? I've never considered trying that. I'm intrigued.
 
Would you mind elaborating a bit further on the pros of running DLDSR in conjunction with DLSS? I've never considered trying that. I'm intrigued.
DLDSR improves image quality (especially in games that have blurry TAA), but at a huge performance cost, as the game will run at a much higher resolution.

As you can see, even DLAA looks somewhat blurry compared to DLDSR.

Thanks to DLSS, you can run DLDSR with a minimal performance penalty. Sometimes (especially in RT games) you can even get better performance than native TAA. For example, in Cyberpunk I get 67 fps at 1440p native (max settings + Psycho RT) and 72 fps using a combination of DLSS Balanced and DLDSR 2.25x. That's a 5 fps boost while still having much better image quality.

If the DLSS implementation is very good, even DLSS Performance + DLDSR 2.25x will look sharper than native TAA while offering a big performance boost.
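The arithmetic behind this combo can be sketched in a few lines of Python. Treat it as a back-of-the-envelope illustration: the per-axis DLSS scale ratios below are the commonly cited values and `internal_res` is a made-up helper, not anything from Nvidia's tooling.

```python
# Why DLSS + DLDSR can be nearly free: DLDSR raises the OUTPUT resolution,
# but DLSS renders INTERNALLY at a fraction of it. Scale ratios here are
# the commonly cited per-axis DLSS values (assumptions, and games can vary).

DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_res(native_w, native_h, dldsr_factor, dlss_mode):
    # DLDSR factors (1.78x, 2.25x) multiply total pixels, so each axis
    # scales by the square root of the factor.
    axis = dldsr_factor ** 0.5
    out_w, out_h = native_w * axis, native_h * axis
    s = DLSS_SCALE[dlss_mode]
    return round(out_w * s), round(out_h * s)

# 1440p display, DLDSR 2.25x (-> 4K output), DLSS Balanced:
print(internal_res(2560, 1440, 2.25, "balanced"))  # (2227, 1253)
```

Note the neat coincidence this exposes: DLDSR 2.25x + DLSS Quality renders internally at exactly the native pixel count (2560x1440 on a 1440p display), which is why that combo costs roughly the same as native rendering while the output gets the downsampled 4K treatment.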
 

MAX PAYMENT

Member
DLDSR improves image quality (especially in games that have blurry TAA), but at a huge performance cost, as the game will run at a much higher resolution.

As you can see, even DLAA looks somewhat blurry compared to DLDSR.

Thanks to DLSS, you can run DLDSR with a minimal performance penalty. Sometimes (especially in RT games) you can even get better performance than native TAA. For example, in Cyberpunk I get 67 fps at 1440p native (max settings + Psycho RT) and 72 fps using a combination of DLSS Balanced and DLDSR 2.25x. That's a 5 fps boost while still having much better image quality.

If the DLSS implementation is very good, even DLSS Performance + DLDSR 2.25x will look sharper than native TAA while offering a big performance boost.
Interesting. Is there a chart or anything for this? Or does 2.25x + quality = similar to DLAA/ native performance?
 
Interesting. Is there a chart or anything for this? Or does 2.25x + quality = similar to DLAA/ native performance?
You have to test it yourself. Based on my experience, in RT games DLSS Balanced + DLDSR 2.25x improves performance compared to native TAA, but in raster games performance will be a little lower (around 2-3 fps).
 
QD-OLED can beat plasma when it comes to brightness, resolution, and refresh rate, but not every aspect of picture quality is better. Motion clarity is a lot worse.
the only thing plasma/CRT has over OLED is motion clarity (big win there though, no joke)... and CRT has no native resolution (so it plays nicely with all resolutions).

CRTs also have a buttload of issues though, like geometry wonkiness (can I get a straight line?), convergence issues (this dot is supposed to be one color... why is it 3?), phosphor decay (when I move my white crosshair fast in this dark cave, it ghosts/smears), etc.

plasma can't do perfect blacks, can have a "noisy" image, many had color accuracy issues (non-Elite Kuros were infamous for their inability to be properly calibrated), only went up to 1080p, etc.
(but I love you, plasma. You got me through those dark pre-OLED days)
 
the only thing plasma/CRT has over OLED is motion clarity (big win there though, no joke)... and CRT has no native resolution (so it plays nicely with all resolutions).

CRTs also have a buttload of issues though, like geometry wonkiness (can I get a straight line?), convergence issues (this dot is supposed to be one color... why is it 3?), phosphor decay (when I move my white crosshair fast in this dark cave, it ghosts/smears), etc.

plasma can't do perfect blacks, can have a "noisy" image, many had color accuracy issues (non-Elite Kuros were infamous for their inability to be properly calibrated), only went up to 1080p, etc.
(but I love you, plasma. You got me through those dark pre-OLED days)
I'm aware that CRTs had these issues, but I didn't notice them much on my CRTs, so they weren't a real issue for me. The poor blacks, washed-out colors and extreme motion blur on LCDs were much more noticeable to me and affected my gaming experience a lot.

A typical plasma had average black levels (black looked more like dark grey in a pitch-dark room), but strong contrast made dark scenes look good anyway. However, my GT60 has exceptionally good black levels for a plasma TV (it's the last plasma panel Panasonic ever made). OLED has just a hair better blacks in a pitch-dark room, and only if there's nothing displayed on the screen, because one small bright object (or stars) will force my eyes to adjust and the blacks will appear perfect. I have noticed dithering/noise, but only with dynamic settings that raise the blacks. When I use calibrated settings, there is no dithering/noise that I can see. The picture quality on my GT60 is close to perfection, at least as far as SDR is concerned.
 
This doesn't surprise me in the least. Anyone who has gamed through these periods could see it right in front of their face when playing the games. None of the new games, or those in the last 5 years or so, have impressed me or wowed me with graphics or visuals; that hasn't happened since Crysis in 2007, with a few exceptions here and there since. New games either look cartoony, like Avowed does to me, or the same as games from the past, as Indiana Jones does, looking like Wolfenstein!
Hellblade 2, Wukong or Horizon Burning Shores isn’t impressive to you?
 

Don Carlo

Member


Nice channel I stumbled upon a while ago covering the SH2 situation.

Seems to connect to this vid



While also pointing out the errors in this vid.

Apparently they take optimization seriously and are working on their own UE5 fork; since it's the industry standard, they want to fix it. Worth a watch.

Great post and honestly really surprising. One would think that with the advancement of game making engines and GPUs, there's little room for neglect, but here we are.
 

‘MxBenchmarkPC’ has compared the MegaLights version with the Software Lumen and the Hardware Lumen versions. And, as we can see, MegaLights can bring up to 50% better performance.
This right here shows why MegaLights is one of the most important new features of UE5.5. With MegaLights, an identical scene (with all the benefits of Hardware Lumen) can run way, way faster. And that’s without any image reduction.
The important thing to note is that MegaLights does not “de-activate” Hardware Lumen. So, this is a MAJOR optimization and performance improvement. And yes, this is a feature more and more games should start using.
But you know what? If you believe you can optimize UE5 better than Epic Games or the triple-A devs, develop and release a game. If you are truly capable of something like that, do it. And if it looks better than Hellblade 2 or Indiana Jones (with Path Tracing), a lot of people will buy it. You’ll make a lot of money. Then, you’ll have proven yourself. Until then, you sound like all those “fake” companies that promise “unlimited detail and graphics” in their games.
 

Buggy Loop

Gold Member
^

A lot of people still don’t realize that Lumen is a form of Ray Tracing. And Ray Tracing IS expensive. But no. Suddenly, they want to run it – even in Software Mode – at Native 4K with 60FPS. You have to be STUPID to expect something like that, even on an RTX 4090

The fuck is this article? Is it written by Tim himself?
 

Laptop1991

Member
You aren’t impressed by any game today? Also The Matrix Awakens is a playable demo..
No, I'm not impressed by any game today. That's my opinion; it doesn't matter whether you like or agree with it. You can have yours, but mine won't change. And I haven't played the Matrix demo, so why is it relevant for you to use the laughing emoji? Am I supposed to play it and be wowed by it?
 

True. Devs do have a skill issue, brought on by tight timelines and high budgets. Devs need to make both short-term and long-term development games. Publishers are the enemy, setting these timelines.

But either way, we need to get back to custom, efficient engines instead of relying on UE or Unity, the Swiss Army knives of game development, as one size doesn't fit all.
 

Bojji

Member
RDR2 with all (advanced) settings maxed out is extremely demanding. I know how this game runs on my RTX 4080S OC, and I doubt the RTX 2080 Ti can run this game at native 4K even at high settings, let alone maxed out.

Probably something like Xbox One X settings. The One X ran this at native 4K/30fps, so it's no problem for a 2080 Ti to run it at least 2x that.

Most of the settings in RDR2 are hard to tell apart above a certain quality level (yet the game becomes more demanding). One X settings are a good starting point when tweaking the game.
 

Zathalus

Member
8 years of development, almost 2,000 people working on it, and a $400-500 million budget does indeed contribute to a good-looking game. Getting 4K/60 at max settings does require a bit more grunt than a 2080 Ti has; a 3090 just about does it. Most of the visual gains from some settings are really not worth the performance cost.

That being said, a number of games do exceed it in the graphics department these days, if perhaps not in the small details that such a massive development allows for.
 
No, I'm not impressed by any game today. That's my opinion; it doesn't matter whether you like or agree with it. You can have yours, but mine won't change. And I haven't played the Matrix demo, so why is it relevant for you to use the laughing emoji? Am I supposed to play it and be wowed by it?
Laughing emoji where?
 
Probably something like Xbox One X settings. The One X ran this at native 4K/30fps, so it's no problem for a 2080 Ti to run it at least 2x that.

Most of the settings in RDR2 are hard to tell apart above a certain quality level (yet the game becomes more demanding). One X settings are a good starting point when tweaking the game.
Yes, there are much less demanding settings that still look good; many games on PC offer good scalability.

That's how the RTX2080ti runs RDR2 with maxed out settings.




JoTiuGT.jpeg
 

Laptop1991

Member
Laughing emoji where?
Maybe you took it off, or I saw the original one on my post about Crysis. Anyway, are you on the drink or just trolling for fun lol. I'm still not impressed with your games; not that they are bad or look terrible, but I'm still happy with my opinion lol.
 

SlimySnake

Flashless at the Golden Globes
True. Devs do have a skill issue, brought on by tight timelines and high budgets. Devs need to make short term development games and long term development games. Publishers are the enemy making these timelines.

But either way, we need to get back to custom, efficient engines and not relying on UE or Unity the Swiss Army knives of game development as one size doesn’t fit all.
I think people need to understand how RDR2 was made. All 7 Rockstar studios stopped working on games like Max Payne, GTA5, L.A. Noire and Midnight Club and focused exclusively on RDR2 for FIVE straight years. Rockstar only released some DLC and literally cancelled heist missions because the entire company was focused on making missions for RDR2.

Even bigger studios like CD Projekt max out at 500 devs. ND has 400 devs, 250 of which are working on Intergalactic. No one can compete with GTA6, which has been in development since 2018 and has 3,100 Rockstar devs working on it simultaneously.

Ubisoft's Snowdrop engine is very efficient. I can run high settings at 4K 60fps using DLSS Quality, whereas DLSS Quality gets me around 80 fps in RDR2. Honestly not bad considering it has RTGI, RT reflections, RT shadows and an insane amount of foliage compared to RDR2.

Outlaws is a bit more expensive, and I can only do 4K 40fps maxed out at DLSS Quality, but again, it's using a lot of new techniques that make it look better than RDR2. Remedy's Northlight engine is also very good, and of course there is Decima, which produced the stunning-looking HFW that finally took RDR2's throne as the best-looking open-world game at the time.

RDR2 is my game of the generation, but people really need to stop posting it as something that hasn't been topped. RDR2's PC port itself was criticized back when it came out because it couldn't run at native 4K 60fps on the 2080 Ti while maxed out.

yVs19rP.gif


AZ6eTG6.gif


aYlPhWL.gif


And UE5's performance issues have been sorted out. The video posted above shows that software Lumen and hardware Lumen are now almost the same performance-wise. The CPU bottleneck that plagued early UE5 titles will be a thing of the past as games switch to UE5.4. MegaLights will allow for even better graphics with far more light sources than we are seeing today.

Even on UE5.1, with all its CPU single-threadedness, Black Myth ran at around 40 fps on my 3080 using DLSS 4K Quality with maxed-out cinematic settings. And while that's literally half of what RDR2 gives me, I think the visual upgrade is generational.

oFClfMM.gif
 

Killer8

Member
He's a moron with no real world development experience who's been ripped to shreds repeatedly on Unreal-focused forums. He basically advocates for nuking the engine's tentpole features like Lumen and Nanite and going back to working developers to the bone doing arduous things like hand-placing GI probes and making loads of LOD models. To make himself sound more reasonable with that last one, he once suggested that magic AI will come along and automate the LOD production process for developers - even though that's what fucking virtualized geometry was already designed to do in real time.

He also fails to realize that a lot of the features of the engine, like Lumen and Nanite, are being pushed for more important reasons than just performance. I think a great many people will readily admit that UE5 can be heavy with the whole feature set enabled. However, it strikes a fairly robust balance between significantly raising the quality floor this generation, while massively reducing development costs and man hours, while still being relatively performant (especially in newer versions). This bigger picture approach is the reason why so many studios are flocking to it.

People like to nitpick but the fact is that even AA developers can now create games on a modest budget that look as good as RoboCop or Silent Hill 2, which simply would've been unheard of 5 years ago.
 