Graphical Fidelity I Expect This Gen



kojipro priorities. Ray tracing and virtualized geometry support? No. Pregnant bitches posting tiktoks? Hell yes.
 



well, PS5 hasn't been "a big enough leap" over PS4. so what did you expect?!?!
I don't mean to disparage Kojima's work, but yes, I agree: Kojima doesn't prioritise the way other studios do, and that is 100% intentional (e.g. stylistic animations instead of realistic ones)
I am sure a PS5-only HFW would crush DS2 in every technical aspect, GG are just more skilled and experienced with the engine
 

I recently grabbed dead island 2 for free on the Epic Store and I'm in love with their gore system.
FLESH puts most AAA efforts in that regard to shame.
Also the game looks pretty decent for what it is and runs at Dual-UHD (7680 x 2160) ultrawide with DLSS Quality on my 5080 without even taxing it. I had almost forgotten how "light" non-RT games were.
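For reference on what "DLSS Quality" costs in render terms: each DLSS mode renders internally at a fixed fraction of the output resolution per axis and upscales the rest. A quick illustrative sketch using Nvidia's published per-axis ratios (the function and dict names are just for illustration):

```python
# Per-axis render-scale ratios for the standard DLSS quality modes
# (Nvidia's published values).
DLSS_SCALE = {
    "quality": 2 / 3,            # 66.7%
    "balanced": 0.58,            # 58%
    "performance": 0.5,          # 50%
    "ultra_performance": 1 / 3,  # 33.3%
}

def internal_resolution(out_w, out_h, mode="quality"):
    """Internal resolution DLSS actually rasterizes before upscaling."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 7680x2160 at DLSS Quality really renders ~5120x1440 internally,
# i.e. fewer pixels than native 4K (3840x2160).
print(internal_resolution(7680, 2160, "quality"))  # -> (5120, 1440)
```

Which is part of why a non-RT game at "Dual-UHD" can still be light: the GPU is rasterizing well under native-4K pixel counts.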
 
I am sure a PS5-only HFW would crush DS2 in every technical aspect, GG are just more skilled and experienced with the engine
This looks a touch sharper and has those better screen-space reflections, but the sacrifice of halving your frame-rate is ultimately too costly to really recommend - and as even Guerrilla's own tech director said to us at the PS5 Pro preview event, "friends don't let friends play at 30Hz".
let it go.

We have done this time and time again with every single Sony studio. First it was Insomniac, then SSM, then PD, then Sucker Punch. And now Kojima himself has been infected with the 60 fps bug.

If you want RTGI and other next-gen tech like Nanite on consoles, you are going to have to drop down to 720p for 60 fps, and Sony studios seem to be targeting 1440p. Insomniac did allow Spider-Man 2 to drop to 1080p, but they only had RT reflections and likely passed on RTGI in order to not drop below that.

While the rest of the industry has embraced realtime GI, be it RTGI or software based, Sony studios go by Kevboard's baked-only philosophy. And they don't seem to want to invest in virtualized geometry either. Hence the consistently last-gen-looking asset quality we've seen in Spider-Man 2, Astro Bot, DS2 and Ghost of Yotei. I think it's time to give up. GG could've implemented RT reflections, better draw distance, faster traversal, etc. with Burning Shores, which was PS5-only and came out in year 3 of the PS5, but they didn't. Different priorities at Sony nowadays.
 
lazy and safe
 
Sony studios go by Kevboard's baked-only philosophy.

it's not a baked-only philosophy. it's the philosophy to use what works best.
in Doom: The Dark Ages real-time GI works best. in Doom Eternal baked works best.

in Silent Hill 2, Expedition 33, RoboCop and Ninja Gaiden 2 Black, baked would have worked best, but Lumen is a shortcut that was sadly taken.

in The Finals, realtime GI works best, because nearly the entire map is destructible... and so they used that... thankfully they used RTXGI and not Lumen :) RTXGI is faster and better looking... but then again that's basically true for nearly any alternative to the cancer called Lumen
 
I recently grabbed dead island 2 for free on the Epic Store and I'm in love with their gore system.
FLESH puts most AAA efforts in that regard to shame.
Also the game looks pretty decent for what it is and runs at Dual-UHD (7680 x 2160) ultrawide with DLSS Quality on my 5080 without even taxing it. I had almost forgotten how "light" non-RT games were.

Try breaking knees with blunt weapons 🥲
 
I need to avoid that best console graphics topic or my brain is gonna implode 😂

Slimy, Shadows over Wukong? Really?

You can't just ignore character rendering in a best-graphics list, c'mon.
 
let it go.

We have done this time and time again with every single Sony studio. First it was Insomniac, then SSM, then PD, then Sucker Punch. And now Kojima himself has been infected with the 60 fps bug.

If you want RTGI and other next-gen tech like Nanite on consoles, you are going to have to drop down to 720p for 60 fps, and Sony studios seem to be targeting 1440p. Insomniac did allow Spider-Man 2 to drop to 1080p, but they only had RT reflections and likely passed on RTGI in order to not drop below that.

While the rest of the industry has embraced realtime GI, be it RTGI or software based, Sony studios go by Kevboard's baked-only philosophy. And they don't seem to want to invest in virtualized geometry either. Hence the consistently last-gen-looking asset quality we've seen in Spider-Man 2, Astro Bot, DS2 and Ghost of Yotei. I think it's time to give up. GG could've implemented RT reflections, better draw distance, faster traversal, etc. with Burning Shores, which was PS5-only and came out in year 3 of the PS5, but they didn't. Different priorities at Sony nowadays.

you do know that I am pro-30fps, right?
Burning Shores was a missed opportunity, but ultimately an understandable one. the time gap was just not enough for the tech to develop.
also, I wouldn't want my best and latest tech to be revealed in a DLC.
that's just my opinion of course.
but I do get your point: if it were up to GG themselves, I don't think they would choose 60fps. at least not as the default mode.
 
I didn't know this was also in UE5


This game is never coming out, is it? Didn't it originally start as a Wii U game on Kickstarter lol. I've been following it for years as it looks so unique, but I just think they have no idea what they actually want to make anymore.
 
First off…this is MY thread and topic. Second I'm not stupid, I have a long history in gaming and technology. There are open world games that look much better than single player linear games, why? You tell me. I'll wait. Best looking games are simply…the best looking games regardless of anything else, do you understand that?
"MY THREAD!!!" What are you 12? rofl...

Why? budget. TLOU II was the best looking PS4 game closely followed by The Order and U4. Shock horror all linear or "wide" linear games. RDR II is also up there and a great looking game for an open World game but again Rockstar are the exception not the rule (same as a Sony funded Kojima game) where budget trumps everything.

MY THREAD MY THREAD MY THREAD!!! lol.
 
you do know that I am pro-30fps, right?
Burning Shores was a missed opportunity, but ultimately an understandable one. the time gap was just not enough for the tech to develop.
also, I wouldn't want my best and latest tech to be revealed in a DLC.
that's just my opinion of course.
but I do get your point: if it were up to GG themselves, I don't think they would choose 60fps. at least not as the default mode.
I know you are. I am just saying that things have changed at Sony studios. it was GG's own tech director who said the "friends don't let friends play at 30Hz" nonsense.

We shall see, I guess. To me, any team that's targeting 1440p 60 fps is essentially working with a 5 tflops console. Not much you can do with such a small tflops increase.
 
lol the lack of AO is hilarious.

that said, Doom and Indy's RT issues are basically down to poor implementation by MachineGames and id Software. And I shouldn't even say poor, it's just them hitting the limits of the consoles, which simply can't do RT at 60 fps and higher resolutions.

I saw an interview with the Indy team and they made it sound like Nvidia did the path tracing implementation themselves. My guess is some engineers from Nvidia went in and did most of the work, and that's probably what's happening with Doom as well. I doubt the studio itself had much to do with the game's RT implementation. They just wanted 60 fps and higher resolutions, and there is only so much you can do. AC Shadows has the best RTGI on consoles and they didn't bother shipping the 60 fps mode with RTGI.

Doom simply wouldn't work at 720p 60 fps like Avatar and other UE5 games on consoles, because there is so much going on at all times, so they had to target 1080p at the minimum. Indy targets 1440p on the PS5, which is the bare minimum IQ in my opinion. But because both games have to run at 60 fps, they are essentially only able to tap into 5 of the 10 tflops to render next-gen visual effects. they are in effect targeting Series S specs.

Yes exactly, it's related to performance constraints, not even limited by the consoles alone, but also by mainstream hw in general. At least when it comes to Doom, they are barely within performance targets on mainstream GPUs, like the 4060, which runs like crap in the busiest fights, even tho' it's running at 960p internal:

[screenshot: Doom: The Dark Ages on a 4060 during a busy fight, 960p internal]


As a reference, the 4060 runs Eternal at native 1080p with RT ON (which is just reflections, but better reflections than TDA) at ~170fps. One hundred and seventy frames per second. This is actually the central point in the "forced ray tracing" debate as well. I was reading the thread a few days ago and had quite a lengthy post I wanted to make, as I think lots of people got baited by DF's strawman ("people with a 1060 want to run 2025 AAA titles") and ended up missing the point entirely. That was a very clever way to derail the argument and avoid talking about the real problem, or it was just dishonest framing.

Mainstream hardware is still not good enough to run RT-only games at 1080p 60fps locked. Perhaps it will be from now on with the 5060. From now on, 2025. This is also to debunk the other argument DF made, "we have had RT-capable hardware since 2018 now". Looking at the screenshot above, I'd say nah, still not capable. To make a 4060 capable you'd need to run most of the settings at LOW, de facto losing reflections almost entirely and so making the game look kind of worse (or not much better) than its predecessor, which ran at more than double the framerate.

The real issue when it comes to "forced RT": it's not people with 1060s protesting forced RT, it's people with mainstream RT-capable GPUs who are getting perceptually worse graphics at less than half the framerate, compared to what they had been getting in the last few years.

They just wanted 60 fps and higher resolutions and there is only so much you can do

There is something you can do: hybrid raster + ray tracing, which is the way things should be done right now, at this point in time. If you go RT-only then yes, there is only so much you can do. You can't do proper GI plus proper reflections and stay within acceptable performance targets on mainstream hardware. If you try, you're going to leave stuff out, like contact and self shadows in Indiana Jones, or specular highlights in Doom TDA.

Now, when it comes to F1 25, I honestly don't know what is going on. I don't see why SSAO can't be enabled together with RT, given that Codemasters (RIP and bless them o7) seems to offer the usual plethora of graphical options in the settings menu. The pit stop comparison was posted by Compusemble, the race track comparison by Nvidia. Lots of people were negatively surprised by that 1st comparison, including Seba Aaltonen (ex-Ubisoft):



And Nicolas Lopez (Ubisoft):

 
let it go.

We have done this time and time again with every single Sony studio. First it was Insomniac, then SSM, then PD, then Sucker Punch. And now Kojima himself has been infected with the 60 fps bug.

If you want RTGI and other next-gen tech like Nanite on consoles, you are going to have to drop down to 720p for 60 fps, and Sony studios seem to be targeting 1440p. Insomniac did allow Spider-Man 2 to drop to 1080p, but they only had RT reflections and likely passed on RTGI in order to not drop below that.

While the rest of the industry has embraced realtime GI, be it RTGI or software based, Sony studios go by Kevboard's baked-only philosophy. And they don't seem to want to invest in virtualized geometry either. Hence the consistently last-gen-looking asset quality we've seen in Spider-Man 2, Astro Bot, DS2 and Ghost of Yotei. I think it's time to give up. GG could've implemented RT reflections, better draw distance, faster traversal, etc. with Burning Shores, which was PS5-only and came out in year 3 of the PS5, but they didn't. Different priorities at Sony nowadays.
the 60 fps "elitism" is really tiring.
We all played 30fps for the whole 360/PS3/PS4 gens and it was great. It's not our fault devs forgot how to minimize input lag.
Look at Remedy. They cannot make a game that controls well on console for the life of them. Control controls awfully and Alan Wake 2... even worse. It's the worst-controlling 30fps game I've played this gen. The deadzones are wrong, the acceleration on the sticks is wrong... and who knows what they are doing with vsync. It's so delayed. Even feels bad in 40fps mode.
Launch a game that controls and looks great (30fps really helps), like Uncharted 4 on PS4 or FF16 this gen... in comparison, FF16 looks and feels way faster.

I like 60fps and I am satisfied with graphics nowadays. So it's good. But I never minded 30 either. Not even when I got a 240hz monitor, and not when I got a 120hz OLED... ok, 240hz made it way more difficult to even play 60fps, let alone 30. But it's all a matter of adjusting.
 

I agree. I personally have zero issues with 30 fps as long as frame times are stable and the framerate in general stays consistent.

Some people speak as if 30fps were some sort of heresy, and honestly it's quite cringe in general. I can understand preferring 60fps, but demonizing 30fps is silly. In fact many will be disappointed when reality hits them in the face with GTA6 being 30fps.
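The "stable frame times" point can be made concrete with the metric capture tools report: average fps versus 1% lows. A minimal sketch (illustrative function name, not any specific tool's API):

```python
def frametime_stats(frametimes_ms):
    """Average fps and '1% low' fps from a list of per-frame times in ms.
    The 1% low averages the worst 1% of frames, exposing stutter that a
    plain fps average completely hides."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)[: max(1, n // 100)]
    low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_fps

# A locked 30 fps cadence vs. the same average with one huge spike:
locked = [33.3] * 100
spiky = [30.0] * 99 + [360.0]  # identical average frame time (33.3 ms)
```

Both captures average roughly 30 fps, but the second one's 1% low collapses to a few fps. That second capture is the kind of "30 fps" that feels bad; the first is the kind nobody minded last gen.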
 
They are mostly new gamers who just now got high-refresh screens or got into 60fps games for the first time ever. edit: (and they kinda think they are smart for noticing how 60 is smoother than 30)
People who have been playing games since the 90s or so have been through all framerates, and I know it doesn't matter that much to me. not anymore.
Unreal was like 20fps on 3dfx lol. I was so surprised to find this out recently, since I could've sworn it was smooth as butter.
The worst gen was the 360 one, because sub-30 was the norm, but even then I got used to it. playing Earth Defense Force 2017 in co-op at 7fps changes a man :P
 
Now, when it comes to F1 25, I honestly don't know what is going on. I don't see why SSAO can't be enabled together with RT, given that Codemasters (RIP and bless them o7) seems to offer the usual plethora of graphical options in the settings menu. The pit stop comparison was posted by Compusemble, the race track comparison by Nvidia. Lots of people were negatively surprised by that 1st comparison, including Seba Aaltonen (ex-Ubisoft):



And Nicolas Lopez (Ubisoft):



Path traced lighting yet it's struggling to look any more realistic than a rasterized Gran Turismo 7.



Color grading is absolutely everything. But we knew this already from the GT7 vs Forza Motorsport showdown, and how transformative Assetto Corsa LUT mods can be.
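The LUT point, concretely: a color grade is just a 3D lookup table mapping every RGB value to a new one, which is why swapping the LUT (as Assetto Corsa mods do) can transform a game's look at near-zero runtime cost. A minimal sketch with numpy, using nearest-neighbour sampling for brevity (real implementations interpolate trilinearly); the names and the toy "green filter" are my own illustration:

```python
import numpy as np

def apply_lut(image, lut):
    """image: HxWx3 floats in [0,1]; lut: an NxNxNx3 table mapping RGB -> RGB.
    Nearest-neighbour sampling for brevity; real graders interpolate trilinearly."""
    n = lut.shape[0]
    idx = np.clip(np.rint(image * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

n = 17  # a common .cube LUT resolution
grid = np.linspace(0.0, 1.0, n)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
identity = np.stack([r, g, b], axis=-1)  # maps every color to itself

# a crude "green filter" grade: boost green, pull red and blue down a touch
green_grade = np.clip(identity * np.array([0.9, 1.1, 0.9]), 0.0, 1.0)

img = np.full((2, 2, 3), 0.5)         # flat mid-gray frame
graded = apply_lut(img, green_grade)  # mid-gray comes out as (0.45, 0.55, 0.45)
```

The per-frame cost is one table lookup per pixel regardless of how dramatic the grade is, which is exactly why the grade, not the rendering tech, so often decides which racer "looks more real" in screenshots.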
 


That green filter in GT7 is bizarre. That's not how real life works.
 
Took some Space Marine 2 gifs to compare against the pre-release footage that was downgraded.

Ignore the shimmering and squiggly lines in my capture. the game doesn't look like that with DLSS 4K Quality.

The only difference I see is in the volumetric lighting and smoke effects. I wish they had added a higher setting for those effects on PC. They gave it a very CG look that's completely missing from the release version.

[four before/after comparison gifs]
 
Even Fallout 4 had volumetric lighting, it can't be that expensive, right?
 
Path traced lighting yet it's struggling to look any more realistic than a rasterized Gran Turismo 7.



Color grading is absolutely everything. But we knew this already from the GT7 vs Forza Motorsport showdown, and how transformative Assetto Corsa LUT mods can be.

Materials are just so off in GT7, they refuse to bounce the sunlight, either on cars or on track. I don't understand how people don't see that. it's so plainly obvious to me. It's like they just absorb the light, and it ends up looking dull. People then attribute it to looking realistic, but it just looks bland.

Even DF missed it when they did the 1-hour-long comparison with Forza. Now path-traced F1 looks so clearly superior, but people still rate GT7 above it. I actually went back and played the free demo again today and I see the same issues. it looks good for a last-gen racer, but even on the Pro with ray-traced reflections, I'm not seeing the same kind of lighting I do in Forza.
 
Even Fallout 4 had volumetric lighting, it can't be that expensive, right?
nah, it's actually rather expensive. it's basically dynamic lighting in otherwise baked-GI games, because it has to be calculated in realtime. it can't be baked into textures.
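To put a number on "rather expensive": real-time volumetrics are typically raymarched, dozens of samples along every pixel's view ray, every frame, because the result depends on the current camera and light positions. A toy single-scattering march, purely illustrative (not any engine's actual implementation; constants are made up):

```python
import math

def volumetric_scatter(origin, direction, light, density=0.05,
                       max_dist=10.0, steps=64):
    """March a view ray through constant-density fog, accumulating light
    in-scattered from a point source. `steps` samples per pixel, per frame,
    is the cost that cannot be baked away."""
    dt = max_dist / steps
    radiance, transmittance = 0.0, 1.0
    for i in range(steps):
        t = (i + 0.5) * dt
        p = tuple(origin[k] + direction[k] * t for k in range(3))
        d2 = sum((p[k] - light[k]) ** 2 for k in range(3))
        in_scatter = density / max(d2, 1e-4)       # inverse-square falloff
        transmittance *= math.exp(-density * dt)   # fog absorbs along the ray
        radiance += transmittance * in_scatter * dt
    return radiance

# a ray passing near the light picks up far more scattered light than one far from it
near = volumetric_scatter((0, 0, 0), (0, 0, 1), light=(0, 0.5, 5))
far = volumetric_scatter((0, 0, 0), (0, 0, 1), light=(0, 8.0, 5))
```

At 4K that inner loop runs on the order of eight million rays per frame, which is why shipping games usually evaluate volumetrics in a quarter-resolution (or coarser) froxel buffer and upsample.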
 
outlaws really puts Jedi Survivor to shame.
can't believe Koboh, with far fewer NPCs and buildings, is full of stutters and judders and gives you really low performance, yet Jaunta's Hope, with an open world around it, wind physics, higher-quality textures, much better LODs, many more NPCs, random NPCs with speeders speeding through town, random activities and interactions, and better animations, runs smoothly at 60+ fps on moderate hardware without stutters and with incredible frametime stability

[screenshot]

like Survivor literally has a special loading corridor before you enter the bar, stutters and judders while doing so, and makes you wait in front of a door. yet in Outlaws, everything just runs like a dream

[gif]
 


Yeah, massive difference here. One engine was designed to run open-world games with lots of elements, a full RT suite and no loading/stuttering, and the other one is UE4...

They pushed UE4 to the point of breaking and the game is still in a sorry state 2 years later. They will probably switch to UE5 for the next game and I doubt it will run any better.
 
It's one of the most impressive-looking games on console this gen, I'm sorry to say, but only if you're willing to accept 30 fps and play on the Pro. I see why PC gamers are pissed, though. I can't get too mad at Respawn on console, because they put in effort to push visuals, unlike so many other devs.
 
Materials are just so off in GT7, they refuse to bounce the sunlight, either on cars or on track. I don't understand how people don't see that. it's so plainly obvious to me. It's like they just absorb the light, and it ends up looking dull. People then attribute it to looking realistic, but it just looks bland.

Even DF missed it when they did the 1-hour-long comparison with Forza. Now path-traced F1 looks so clearly superior, but people still rate GT7 above it. I actually went back and played the free demo again today and I see the same issues. it looks good for a last-gen racer, but even on the Pro with ray-traced reflections, I'm not seeing the same kind of lighting I do in Forza.
Did you try GT7 without PSSR? Because it looks awful whenever I try to play it with PSSR and RT... downright terrible, but when turned off it looks good.

Unless having an HDMI 2.1 TV allows it to be played differently on the Pro, this is one of the worst Pro updates we've seen. Blurry, aliased, and lower resolution...
 
Took some Space Marine 2 gifs to compare against the pre-release footage that was downgraded.
Damn, another lying, shameless developer...
 
the 60 fps "elitism" is really tiring.
We all played 30fps for the whole 360/PS3/PS4 gens and it was great. It's not our fault devs forgot how to minimize input lag.
Look at Remedy. They cannot make a game that controls well on console for the life of them. Control controls awfully and Alan Wake 2... even worse. It's the worst-controlling 30fps game I've played this gen. The deadzones are wrong, the acceleration on the sticks is wrong... and who knows what they are doing with vsync. It's so delayed. Even feels bad in 40fps mode.
Launch a game that controls and looks great (30fps really helps), like Uncharted 4 on PS4 or FF16 this gen... in comparison, FF16 looks and feels way faster.

I like 60fps and I am satisfied with graphics nowadays. So it's good. But I never minded 30 either. Not even when I got a 240hz monitor, and not when I got a 120hz OLED... ok, 240hz made it way more difficult to even play 60fps, let alone 30. But it's all a matter of adjusting.
Bruh, he was talking about Sony studios not having better RT because of their resolution targets, and you went on your 30 vs 60 crusade again.
 
Holy shit, Pekora!
Today Kojima the man has died and Pekora the god is born


Referring to anything anime as hentai is just pathetic.
good anime games from a decade ago still look better than anything realistic.


GG Strive is freakin' gorgeous! It has a ton of characters and stages now too. Anyone who likes fighters should pick it up. I love the various stage graphics in this game too.
 
Now, when it comes to F1 25, I honestly don't know what is going on. I don't see why SSAO can't be enabled together with RT, given that Codemasters (RIP and bless them o7) seems to offer the usual plethora of graphical options in the settings menu. The pit stop comparison was posted by Compusemble, the race track comparison by Nvidia. Lots of people were negatively surprised by that 1st comparison, including Seba Aaltonen (ex-Ubisoft):
Wait... so only the path-traced version of F1 25 has any AO? Is that the takeaway?
 